Is Mindfulness Meditation worth it?

Emma Pallen

In the past, the word meditation was associated with Tibetan monks chanting on isolated
mountaintops. But nowadays, it seems that everyone and their cat are espousing the benefits of
the mindfulness-based practice. However, instead of aiming to achieve spiritual enlightenment,
modern meditation is far more concerned with the health benefits, both mental and physical.
With claims such as decreased anxiety and depression, boosted immune systems and even
being linked to a longer life span, it all sounds too good to be true. Is this all just pseudo-science
mumbo-jumbo, or have the Tibetan monks really been sitting on a panacea for human health
problems all this time?
Mindfulness meditation is the practice of focusing on the present moment, instead of
deliberating over past failures, or worrying about future problems. It makes sense that
something like this could improve mental health, especially in modern Western society, where
there are so many competing calls for our attention. Numerous studies have found that this
process of stopping and refocusing your attention on the present, whether that’s through
breathing, focusing on bodily sensations or simply by mindfully enjoying the food you’re eating,
leads to decreased rumination and worry. This in turn is linked to decreased anxiety and depression.


Image credit: Wikimedia commons

As well as being beneficial for our mental wellbeing, mindfulness has been shown to have
numerous physical health benefits as well. Recently, researchers at Coventry University
investigated the effects of mind-body interventions on gene activity. Remarkably, they found that
for participants who practiced mind-body interventions such as mindfulness, gene activity was
reduced in genes related to inflammation. This is the opposite effect of chronic stress. Not only
does this reinforce the notion that mindfulness reduces stress, it also suggests that practicing
mind-body interventions may even reduce the risk of physical inflammation-related disorders
such as arthritis and asthma.
Practicing mindfulness meditation will not only lead you to a happier and healthier life, it may
also lead you toward a longer one. Researchers at the University of California showed that
participants who had attended a three-month meditation retreat had greater levels of telomerase, the enzyme that rebuilds telomeres, than a control group did. Telomeres are regions of DNA at the end of
chromosomes that get shorter every time a cell divides. The length of telomeres is related to
ageing and longevity, so it appears that mindfulness could be linked to a longer life span.
Clearly, meditation has its benefits. But, like many things that seem too good to be true, it may
also have a dark side. For some people, instead of leading to peace and enlightenment,
mindfulness meditation can lead to panic, depression or even psychosis. According to a study
conducted by David Shapiro at the University of California, 7% of people who have tried
mindfulness meditation reported anxiety, depression, pain, or panic. There is little published
research on these potential negative effects of mindfulness, perhaps because of its ‘trending’
status at the moment, publication bias towards studies with positive results, or simply because
those who experience these negative effects stop practicing and don’t report it.


However, there are some potential explanations as to why some people have such negative
experiences. Meditation involves sitting with and accepting your own thoughts and feelings,
positive or negative. This can sometimes be difficult even for mentally healthy people, so for
people who are already suffering with poor mental health or negative feelings, this could
potentially make things worse. Similarly, for patients with post-traumatic stress disorder (PTSD),
mindfulness can be difficult as traumatic memories can rise to the surface.
Nonetheless, the potential negative effects of mindfulness need not put us off. It may simply be
a case of weighing up the risks versus the rewards. Speaking to the Guardian in 2016, Florian
Ruths, a mindfulness researcher and practicing psychiatrist, compared the cost-benefit
calculations of meditation to how we think about exercise. “If we exercise, we live longer, we’re
slimmer, we’ve got less risk of dementia, we’re happier and less anxious,” he said. “People don’t
talk about the fact that when you exercise, you are at a natural risk of injuring yourself.” And as
with exercise, some people may be unable to meditate safely due to a pre-existing condition, or
may face a higher risk of harm.
Image credit: Wikimedia commons

Another potential explanation as to why some people have negative experiences meditating is poor practice, whether that’s down to a lack of information on the correct ways to
meditate or due to a poor teacher. Indeed, unlike other forms of therapy such as cognitive
behavioral therapy (CBT), there is no professionally accredited training for mindfulness
teachers, and anyone can call themselves a mindfulness coach. This may have led to the
‘pseudo-science’ perception of mindfulness. Additionally, many studies that have found positive
effects of mindfulness only compared the effects of mindfulness to ‘treatment as usual’ (TAU),
such as seeing a GP, or to waiting list controls. This makes it unclear as to whether the positive
effects of mindfulness are simply due to placebo, spending more time with a therapist and
becoming more aware of emotions, or whether there is indeed an ‘active component’ of
mindfulness that specifically causes the observed benefits.
So, while it seems that mindfulness meditation does have positive effects, much more research
needs to be done. It is still unclear how long-lasting the effects of mindfulness are, and clearly
not everyone will benefit. The mechanism of action of mindfulness, and how it compares with
other forms of therapy such as CBT or talking therapy, also remains unclear.
What mindfulness does have going for it is that it is quick and cheap, and can be done by anyone
at any time. And unlike other forms of therapy that require a diagnosis before they can be
accessed on the NHS, mindfulness can also be used to ‘maintain’ mental health, hopefully
avoiding the need for other mental health services.

Will we ever ‘cure’ Mental Illness?


Jonathan Cooke

People do not wake up one day realising that they have a mental illness; that their view of the world is clouded by a poorly defined alteration of their brain chemistry. It can take days, months or even years before a person comes to terms with the fact that what they are experiencing is not ‘normal’. Even then, they may not immediately seek medical advice; to some, seeking such advice is an admission of weakness, an inability to deal with what everyone else is dealing with.

That is not to say that people cannot recover from mental illness. The flood of different pills and tablets from the pharmaceutical industry, prescribed to help people with their conditions, would lend credibility to the idea that these conditions can be coped with. However, medication doesn’t work for some, and for others it can make the situation even worse than before. Pills are not a one-stop solution; they do not suddenly fix your brain chemistry overnight. Even SSRIs (selective serotonin reuptake inhibitors) only slow the reabsorption of serotonin at the synapse; they do not alter the amount your brain produces.

The negative reactions to these tablets betray a more important point: the cause of mental illness is still hotly debated. There is undeniably a genetic component to many disorders. But does this make the development of mental illness inevitable? Or does it merely increase the chance of mental illness arising, with a person’s environment providing the trigger?

In addition, there are other ‘cures’ that have been used over the years to try and treat ‘mental illness’. It was not that long ago that electroconvulsive therapy (ECT) was prescribed as the most efficient therapy for almost anything considered a mental illness. Whilst its efficacy at treating some conditions has been noted in the literature, very few therapies have generated such heated debate, perhaps due to how the treatment is perceived. After all, no one is likely to warmly receive the idea of having an electric current shot through their brain.

The evolving definition of what is and what isn’t a mental illness should give pause to the idea of a cure. It wasn’t until 1987 that homosexuality was fully removed from the textbooks which listed psychiatric disorders, and being transgender is soon to be removed as well, replaced with the more accurate but no less weighted term ‘gender dysphoria’. Society’s need for a cure for conditions it doesn’t understand is perhaps its greatest failing. If we don’t understand or accept something, it is that something which is regarded as being in the wrong and having to change, rather than our attitude towards it. It is therefore the reaction that these marginalized minorities receive that is probably the root cause of their higher-than-average rates of depression and suicide, rather than who they are themselves.

What use is a cure if it does not also cure the stigma that comes with mental illness? A book by Nunnally (1981) looked at the words people typically associate with those who have a mental illness. Respondents, when describing a mentally ill man, were most likely to use words like “dangerous, dirty, unpredictable and worthless”.

That may have been several decades ago, and times have changed: there are more public advocates of mental health awareness, and the advent of the internet has allowed people to find others experiencing similar symptoms, helping them forge support networks. But what greets those unaware of these advocates or support networks? Most ‘mainstream’ shows that try to portray mental illness inevitably demonize their mentally ill characters as either violent or unlikeable.

Full disclosure: I have not watched either 13 Reasons Why or To the Bone, arguably the two biggest recent attempts to portray characters with mental illnesses. However, both were widely criticized, by the depression and anorexia communities respectively, for their inadequate portrayal of the issues they raised. It would be naïve to suggest that people are not heavily influenced by what they watch on television. A 1978 paper showed that people who watch a lot of crime-related television and police dramas are more likely to vastly overestimate their chances of being a victim of crime, as well as overestimate how many police officers and judges there are (Gerbner et al. 1978).

These criticisms are not based on wild speculation either; Granello & Pauley (2000) demonstrated that mentally ill characters on TV and film are typically made out to be “violent and unpredictable”. This is negative not just for those who wish to see themselves represented in a character on screen, but also for the general public. With ever-emerging evidence that genetics play a part in the development of mental illness, such demonization of the mentally ill allows the rest of the public to separate the mentally ill into an ‘other’ group of people, different and separate from themselves.

This separation of the population into ‘normal’ and ‘other’ leads to a disassociation and an inability to understand that mental illness is a sliding scale of grey with no two conditions exactly alike. My depression and anxiety do affect me, but they affect others differently to me. There are similarities, but also differences. It is this nuance that is missing in our discourse when we discuss mental illness in the media and with the public.

Some people get better without a recognized ‘cure’. They open up, discuss their problems and find they are not quite as alone as they thought they were. There is power in the ability to talk with your fellows about how you are feeling. But how can they hope to ever feel they are better if society refuses to acknowledge that someone can recover from mental illness without the need for a specially crafted ‘happy pill’ that solves all their problems? Curing mental illness is a lofty and admirable goal; but my training is not in that area and so it would be unwarranted of me to posit that such a cure is achievable.

Cures begin by having an accurate picture of what we are trying to cure. We could not cure the plague by ‘bleeding’ the badness away. To help those with mental illness, we have to understand that many of the most common mental illnesses, such as depression and anxiety, are exacerbated by the society in which we live. Should the conversation therefore not be about curing society, rather than those who live within it?

Often Misunderstood: Schizophrenia


Rhiannon Freya Lyon

Often misunderstood, schizophrenia is possibly the most stigmatised of mental illnesses. This is largely down to a lack of public education about it, leading to misconceptions that it is some sort of split-personality disorder that causes those with it to be violent towards others. The word may conjure up images of padded cells, straitjackets, and someone who must be kept isolated from society for the good of everyone. The media definitely doesn’t do anything to help with this image.

In reality, this could not be further from the truth. Schizophrenia is complex, made up of many different types of symptoms, and does not cause a person to be any more violent than someone without schizophrenia. Although there is currently no cure for schizophrenia, as is unfortunately the trend with mental illnesses, there are many medications and talking therapies that together can work to alleviate an individual’s symptoms, greatly increasing their quality of life.


When one thinks of the symptoms of schizophrenia, the first things that come to mind are auditory hallucinations (hearing voices) and delusions (e.g. paranoia). These are known as the ‘positive’ or ‘psychotic’ symptoms of schizophrenia (not positive as in good, but positive as in they are in addition to ‘normal’ experiences). Although auditory hallucinations are the most common, hallucinations of all the other senses can occur too, such as visual hallucinations, the sense of being touched when you are not, and even sensing smells and tastes that are not there. Delusions are beliefs that do not line up with reality; for example, those suffering from delusions may feel that they are being followed or plotted against, or that they have committed a terrible crime. These delusions can cause them to feel overwhelmed and act in ways that may not make sense to others. Another positive symptom is disorganised thinking, which may cause the person to talk more quickly or slowly, and jump from topic to topic with no obvious link.

However, positive symptoms are only a part of schizophrenia. There are also ‘negative symptoms’, which are more similar in character to depression and usually involve a lack of something. They include things like loss of motivation and enjoyment of life, changing sleep patterns, withdrawal from social activities, and memory problems. Negative symptoms are much less dramatic than positive ones, but they generally last longer, and those with schizophrenia often say that the negative symptoms have the biggest impact on their life.


There are a number of forms of schizophrenia, distinguished by their different combinations of various types of positive and negative symptoms. Paranoid schizophrenia is the most common and well-known type, often developing in a person’s 20s, and includes prominent hallucinations and delusions. Other types of schizophrenia may be more focused on negative symptoms (simple and residual schizophrenia), or on a specific type of hallucination, such as experiencing unusual bodily sensations in cenesthopathic schizophrenia.


It is not entirely clear what causes schizophrenia, although many risk factors have been identified. Schizophrenia is thought to have some genetic component, as demonstrated by twin studies, but this alone does not cause a person to have schizophrenia, which also requires environmental stressors such as losing a loved one or going through big life changes. Subtle differences in brain structure are also seen in some people with schizophrenia, but not all.

High levels of the neurotransmitter dopamine are associated with hallucinations and delusions. Drugs that lower dopamine levels are known to relieve some of the positive symptoms of schizophrenia, suggesting that those with schizophrenia either have excessively high levels of this neurotransmitter in the brain, or are somehow overly sensitive to it. Recreational drugs such as amphetamines and cannabis with a high THC content are also associated with the development of schizophrenia, although it is unclear whether these directly trigger the disease or whether people more likely to develop schizophrenia are also more likely to use these drugs.

There is also evidence that birth complications such as not getting enough oxygen during birth, being born prematurely, or having a low birth weight also increase the risk of developing schizophrenia later in life. This may be due to subtle changes in the brain caused by these complications.


A combination of medication and talking therapies is usually used to combat the symptoms of schizophrenia. The main medications used are antipsychotics, which help alleviate the positive symptoms. There are two main classes of antipsychotics: typical and atypical. Typical antipsychotics were long the standard treatment for psychosis, but often produced Parkinson’s-like side effects (these drugs block dopamine signalling, and Parkinson’s disease involves the death of dopamine-producing neurons), so they have more recently been replaced with atypical antipsychotics. Antidepressants can also sometimes be used to help with the negative symptoms.

Cognitive behavioral therapy can be useful in allowing the individual to manage their symptoms more easily, recognising delusions and hallucinations for what they are and making them less overwhelming. Education about the illness and how to spot the early signs of a psychotic episode is helpful both for the individual and for those close to them; it’s very important for family and friends of someone with schizophrenia to understand the condition and how to help.

A Case of the Blues? What causes Depression?

Vanessa Kam

I felt dubious about seasonal depression until I moved to England.  Can cold, dark winters really dampen our spirits?  I thought I was immune to these effects until December drew nigh and daylight slumped by 4pm…

 While a number of cross-sectional studies have cast doubt on the existence of seasonal depression (wittily termed SAD, for seasonal affective disorder), the abundance of media coverage on this phenomenon echoes the general ‘down in the dumps’ mood many endure at certain times of the year.  But when does occasionally feeling blue, part of the human condition, cross the line into depression, a debilitating mood disorder?

 With depression being a major risk factor for suicide, and suicide among the leading causes of death worldwide, in recognition of National Suicide Prevention Day we explore what morphs the mind into a ‘bad neighbourhood’.

What is depression?

 Clinically, ‘depression’ encompasses several disorders where patients are absorbed by feelings of sadness, emptiness or irritability, with physical and mental changes which impair everyday functioning.

Major depressive disorder, often referred to simply as depression, is the most common form.  In its manual, the American Psychiatric Association requires the following symptoms to be consistently present for a minimum of two weeks:

[Image: checklist of diagnostic symptoms for major depressive disorder]

Depressed individuals have been shown to possess altered thought processes, falling into subconscious negative self-representations reinforced by biases in attention and memory to negative stimuli.  A key cognitive feature of depression is rumination, with sufferers repeatedly mulling over the causes and consequences of their current state.

Laboratory tests to diagnose depression do not exist, hinting towards a murky understanding of its pathophysiology.  Yet as the second leading cause of disability worldwide, it remains a major global health issue, affecting more than 300 million people.  In England, depression is the most common mental illness, with one in five of 5,450 respondents in a national survey having been diagnosed in 2014, and an estimated £10.96 billion cost to the country in 2010.

Considering its detriment to society and the individual, what is known about the underlying cause of depression?  Is there a single cause?


Causes of depression

 The NHS webpage on causes of clinical depression kicks off by saying “There’s no single cause of depression”.

Well, that was easy.

Instead, a combination of biological, psychological and social factors intertwine for each individual, elegantly demonstrated by the diathesis-stress model.

This considers a person’s vulnerability or predisposition (diathesis) alongside internal and external stresses in precipitating a depressive episode.  Those with a high diathesis require a lower stress level to trigger depression, while the less predisposed can cope with more setbacks.
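The threshold logic of the diathesis-stress model can be sketched in a few lines of code. This is purely a toy illustration: the additive form, the 0–1 scales and the threshold value are assumptions for demonstration, not clinical quantities.

```python
def episode_risk(diathesis: float, stress: float, threshold: float = 1.0) -> bool:
    """Toy diathesis-stress rule: an episode is predicted when
    vulnerability plus current stress exceeds a fixed threshold.
    All quantities are on an arbitrary 0-1 scale."""
    return diathesis + stress > threshold

# A highly vulnerable person needs only modest stress...
print(episode_risk(diathesis=0.8, stress=0.3))  # True
# ...while a less predisposed person copes with the same setback.
print(episode_risk(diathesis=0.2, stress=0.3))  # False
```

The same setback (stress of 0.3) tips one hypothetical person over the threshold but not the other, which is exactly the asymmetry the model describes.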

[Image: diagram of the diathesis-stress model]

But what might give someone a high diathesis?

Genetic components play a role.  Depression has a heritability of about 40%, meaning 40% of the variation in vulnerability amongst individuals is down to differences in genes.  This is comparable to type 2 diabetes, another common illness riddled with lifestyle and genetic factors, but lower than schizophrenia, which has a heritability of about 80%.
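What a 40% heritability means can be illustrated with a small simulation: model each person’s vulnerability as the sum of an independent genetic component and an environmental component, with variances chosen in a 40:60 ratio. All numbers here are illustrative assumptions, not estimates from any real dataset.

```python
import random

random.seed(0)
N = 100_000

# Vulnerability (liability) = genetic component + environmental component.
# Variances are chosen so that genes account for 40% of total variation.
genetic = [random.gauss(0, 0.4 ** 0.5) for _ in range(N)]
environment = [random.gauss(0, 0.6 ** 0.5) for _ in range(N)]
liability = [g + e for g, e in zip(genetic, environment)]

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

# Heritability = share of total variance explained by the genetic part.
h2 = variance(genetic) / variance(liability)
print(f"estimated heritability: {h2:.2f}")  # close to 0.40
```

The recovered figure hovers around 0.40, showing that heritability is a statement about variation across a population, not a claim that 40% of any one person’s depression is genetic.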

Despite this genetic contribution, genome-wide association studies (GWAS) have repeatedly failed to find significant gene variants associated with depression.  One study combed through the genes of 9,240 patients and 9,520 controls–the largest study as of 2012–and came out empty-handed.  Considering the success of GWAS in many other complex human diseases and traits, this points towards an exceptional heterogeneity within depression.

One study that did find the tip of the iceberg restricted its subjects to those with recurrent depression only.  By scrutinising 5,300 Chinese women who had suffered repeated bouts of depression, two loci were finally identified.

 Intriguingly, one lies close to a gene required for making mitochondria, in line with recent hypotheses involving mitochondrial dysfunction in depression and findings of increased mitochondrial DNA with increased life adversities in depressed individuals.

 In fact several early life experiences contribute to diathesis.  Childhood abuse is plain to see, but even growing up in a negative environment with constant criticism, rejection or a depressed parent can mould the negative cognitive processes associated with depression.

Personality, largely a product of genetics and early life experiences, also ties in to depression.  A study of female twins over time found neuroticism, a personality trait characterised by moodiness, irritability, anxiety and self-consciousness, to mediate symptoms of anxiety and depression, perhaps due to a common negative bias in information-processing.  More alluringly, researchers have come to view depressed, neurotic individuals as active contributors in snowballing their afflictions, interacting with others in stress-generating ways.

With a foundational vulnerability, what about stress?  What factors may push individuals above the threshold?

Below are some common examples, from major adverse experiences like the loss of a loved one to cumulative, minor chronic stresses like caring for several young children.

[Image: examples of common stressors]

Of most clinical relevance is co-morbidity.  Those who suffer from chronic physical diseases have higher rates of depression, leading to worse outcomes and significant healthcare costs.  A 2012 report estimated that £1 in every £8 the NHS spent on long-term conditions is linked to poor mental health, pointing towards a need for more holistic attention towards patient health.

But how exactly does stress invite depression?  The prevailing model taught to this day is the monoamine hypothesis: the idea that a chemical imbalance, the depletion of serotonin, noradrenaline and/or dopamine in the central nervous system, produces depressive symptoms.  Prolonged exposure to cortisol under chronic stress increases the enzymatic breakdown of these neurotransmitters.  The hypothesis stems from the action of antidepressants like Prozac, which increase the availability of serotonin outside cells to act at synapses.

Yet mounting bad press uncovering the buried data, skewed positives, exaggerated efficacy and hidden harms of antidepressants adds to the list of limitations of this model.  An analysis of 70 trials found that suicidal ideation and aggression doubled for children and teens on certain antidepressants, an eerie finding considering the symptoms of untreated depression itself.

As such, researchers are looking into inflammation, cell death in certain brain regions–depressed individuals have smaller hippocampi–and reductions in the already limited generation of neurons in the adult brain, all of which can be linked to chronic stress, as causes.  In the case of depression, despite the phrase being banned in the BMJ, more research is needed.

For the sake of current sufferers and the many to come.

How has our understanding of Mental Health Changed?


Diego Vieira

The idea of what constitutes health is a product of its time, yet the question itself has followed humanity throughout history.

Greek philosophers used reason to contemplate the world, but it was only much later, with the thoughts of the famous French philosopher René Descartes, that the idea of using reason to improve health and the world was first popularised. This directly opposed older thinking, in which the rage of the gods was the reason behind suffering and disease, and the possibility of a cure was pure fiction. In that world, faith was the first kind of ‘treatment’ that could possibly change what was called fate, giving people something to do other than nothing. The exploration of nature and the use of herbs were then brought into consideration, and humans were no longer at the mercy of fate or the gods: they now had an actual method for fighting disease and became more responsible for their own recovery, as the concepts of health, prevention and cure grew more prevalent in human society and thinking. Some ages later, ‘material’ health was considered the only possible basis for evaluation; in other words, what could be seen could be treated. Much was studied about the human body: anatomy and physiology were vastly explored, drawing a path to the field of psychology.

The great contributors to the first steps in the study of the mind were English and German philosophers and physiologists like Francis Bacon, Ernst Weber, John Stuart Mill, and many others. They helped lead science into the study of the mind that later became psychology. In fact, even though psychology was being developed as a science, what was studied was how the mind functions; mental illness was not yet even a concept. Whilst mental illnesses have always existed, our acknowledgement of them and our attempts to deal with them have only recently become mainstream.

Some of the most remarkable research on the human mind was carried out by the neurologist Sigmund Freud, founder of psychoanalysis, which tried to explain the formation of personality through the conflicts of the conscious and the unconscious mind, and how the human mind is driven by Trieb (the German word for drive or instinct). Freud’s work is certainly not without controversy; the accuracy of its findings remains a matter of great debate. Other memorable researchers included Watson, Skinner, and Pavlov, who studied how external stimuli could influence our behaviour, leading to the development of mental functions.

These two schools approached the study of the mind and the environment to understand how humans live from a psychological perspective, but the understanding of mental health was not yet what it is today. The concept of madness is a historical construction; before the 19th century there was neither the concept of mental illness nor a division between reason and madness. The path of development from the Renaissance era to today is marked by a growing separation between those experiencing mental conditions and the rest of society, with the development of asylums being the most famous example.

The Renaissance is regarded as the era of self-realization and the growth of the scientific model. To be a true Renaissance person required one to develop one’s intellectual, moral, religious, physical, and aesthetic capacities. The Renaissance was strongly characterized by art and literature, and people were expected to attain the grace of culture and internalize a sense of civilization and refinement, meaning that those straying from the circle of culture were excluded and taken out of the sight of civilization. The criteria used to determine who was or was not eligible to be part of society had no solid foundations. For that reason, substance abusers, the homeless, homosexuals, and anyone who deviated from “normal” acceptable behavior could be locked in mental asylums, where they had no basic living conditions. These disorders were seen as an incapacity to handle normal life situations, and so began attempts to understand patterns based on biological and sociological knowledge. This resulted, in 1952, in the creation of the Diagnostic and Statistical Manual of Mental Disorders by the American Psychiatric Association. One major flaw was that there was no dividing line between normality and abnormality; this was left up to interpretation.

The creation of a manual to identify mental disorders was to have a large effect on the number of mental health diagnoses, and a standard was needed because professionals around the world had their own ways of treating these cases; unifying the knowledge from different diagnoses explored by professionals all around the world would serve as a standard for better medical practice and facilitate research in the field of mental health. The DSM was made possible by commissions that brought together mental health professionals to create criteria to better understand and deal with people affected by mental disorders.  Since its creation in 1952, the DSM has gone through five revisions to review its contents and expand the number of recognised diagnoses, drawing on psychiatric research conducted by figures such as Robert Spitzer and Emil Kraepelin.

The U.S. National Institute of Mental Health sponsored research between 1977 and 1979 to test the validity of the diagnoses, adding knowledge to complement these understandings. The International Statistical Classification of Diseases and Related Health Problems (ICD), first created in 1893, has provided further contributions and collaborative work alongside the DSM.

The DSM’s fifth and most up-to-date revision, from 2013, contains approximately 300 categories of disorder and is used internationally as an instrument to guide treatment and research into mental health. It allows professionals to approach patients correctly and practice in an evidence-oriented way, grounding their work in scientific research rather than the presumptions or personal perspectives that previously clouded the ability of psychologists to accurately diagnose disorders.

The Science of Sexuality


Sintija Jurkevica and Jonathan James

The struggle to understand sexuality begins to muddle even before sexual orientation can be defined. Some sources describe it as a person's capacity to have erotic experiences and responses. In general, however, sexual orientation or preference can be defined as "the sex (biological aspects of maleness and femaleness) of those whom one feels romantically and sexually attracted to", where one's sexual orientation may be categorised as heterosexual, bisexual, homosexual, queer, pansexual or asexual, among others. However, categorisation of identifiable preferences is more nuanced than it appears; whilst some research describes orientation as a set of discrete categories, substantial evidence supports the existence of a sexual continuum or spectrum.

But how does one develop a sexual preference? This riddle is a classic psychological argument of nature versus nurture: do genes, the environment, or a mixture of both influence one's sexual attraction to others? This is an ongoing debate and a matter for significantly more research. A recent September publication by psychology researcher Michael Bailey and his colleagues in the peer-reviewed journal Psychological Science in the Public Interest set out to objectively review previous scientific research on sexual orientation and draw impartial conclusions on the topic, free from scientific bias and political influence.

Bailey's review paper concluded that non-social causes, such as an individual's genetic make-up, play a larger role than environmental influences in establishing one's sexuality. The evidence supporting this claim includes genetic influences found in twin studies, and the unchanged sexual orientation of infant boys after they are surgically or socially "converted" into girls. Bailey and colleagues also argue that the commonly assumed environmental causes of homosexuality are weak and distorted in comparison to alternative explanations.

Various genetic hypotheses have been proposed to explain differences in sexuality. Several studies found that certain genetic markers (i.e. genetic elements) were more likely to be found in gay men than in their straight counterparts. When this news was first published, it caused an outpouring in the media about the discovery of a so-called "gay gene", but the media failed to report one significant factor – a genetic influence by itself does not determine a trait. In other words, simply carrying a genetic element does not automatically result in a particular sexual orientation. To make matters more complicated, scientists were unable to reproduce these findings for same-sex attraction in women, suggesting that sexual orientation is far more complex than a few genetic differences.

Other scientists have conducted studies examining the seemingly well-established theory that each additional older brother increases the odds of a male being gay by approximately 33%, with something like 1 in 7 gay males owing their sexual orientation to having older male siblings. These findings remain controversial: several scientific studies support the proposal, while several others have found no link.

One attempt to explain this apparent effect is through the maternal immune response. Male fetuses produce H-Y antigens (small proteins) that play a role in sexual development in the womb (i.e. the development of male sex organs). In response to these antigens, the mother sometimes mounts an immune response, which grows stronger with each successive male fetus and results in decreased activity of these antigens in later males. One suggestion is that this leads to less "masculinisation" of the male brain, resulting in the development of same-sex attraction. The major flaw with this explanation is simple – the occurrence of the maternal immune response is significantly lower than the prevalence of homosexuality, suggesting it cannot be the major cause.

The truth of the matter is that, despite several attempts to better understand the genetics behind human sexual orientation, scientists know very little about what causes it, or even the true significance of any environmental factors. As Bailey argues in his paper, "Sexual orientation is an important human trait, and we should study it without fear, and without political constraint. The more controversial a topic, the more we should invest in acquiring unbiased knowledge and science is the best way to acquire unbiased knowledge." We should therefore look forward to developing a better understanding in the future, in the hope that a better understanding of ourselves results in a better understanding of each other.



A Profile of Margaret Mead


Rhiannon Freya Lyon

Born in the US in 1901, Margaret Mead is recognised as one of the most influential anthropologists of the 20th century, often seen as the woman who laid the foundations for second-wave feminism and the sexual revolution of the 1960s. Through her studies of isolated civilisations in the South Pacific, Mead was a pioneer of the idea that behaviour is culturally learned rather than innate. She focused in particular on gender roles (the expected behaviour of an individual based on their gender) and how these are shaped by the society we grow up in.

During her early academic career, Mead was especially interested in studying cultures uninfluenced by Westernisation. This led to her first Pacific island field study, in Samoa, which largely consisted of interviews with adolescent girls; observations from these laid the groundwork for her first book, Coming of Age in Samoa, published in 1928. In this book she put forward the idea that Samoan culture did not adhere as strictly to gender roles as the US: adolescents had more freedom to explore their sexuality, extra-marital sex was less taboo, and these attitudes led to healthier development. She advanced the then-controversial view that the Western way of doing things was not necessarily the best or most progressive.

In 1935 Mead started digging into the differences in gender roles and temperament across different cultures in New Guinea, recorded in her book Sex and Temperament in Three Primitive Societies. She found that different cultures had different attitudes towards aggression and towards the roles of men and women in society. For example, the Arapesh people were peaceful, and neither men nor women were involved in war. By contrast, among the Mundugumor people both men and women took part in war. Among the Tchambuli, women were responsible for the catching and trading of food while men were more involved in the politics of the tribe, with neither gender dominant over the other. Mead found that across cultures men and women were responsible for different things, but that whatever the man's role was, it was generally held in higher regard. This observation broke ground by separating biological sex from socially constructed gender.

During World War 2, access to the South Pacific was cut off and Mead’s focus therefore shifted to the US. During this time Mead and her former academic mentor Ruth Benedict founded The Institute for Intercultural Studies.

As with anything that challenges the status quo, Mead's work attracted a lot of criticism. People did not like the suggestion that their notions of gender and gender roles were not as set in stone as they might have thought. One of Mead's most prominent critics was Derek Freeman, who was determined to discredit her and her findings, publishing several books on her supposed "hoaxing". There are of course legitimate criticisms to make of Mead's work – her downplaying of some of the negative elements of Samoan development, for example – but Freeman's criticisms went beyond this in his (somewhat successful) attempts to damage her reputation. His work has now largely been rejected by the anthropological community, owing to his unreliable methods and tendency to cherry-pick data while misrepresenting Mead's work.

After the Second World War, Mead went back to New Guinea to study the impact that wartime exposure to the wider world had had on the people living there. She found that after contact with the wider world, societal ideas among previously cut-off cultures had changed. This trip informed her beliefs about the way cultural ideas shape social problems such as racism and disregard for the environment, and led to her famous quote: "never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it's the only thing that ever has".

Although her mother was a suffragist, Mead never publicly labelled herself a feminist. She was, however, very outspoken on women's equality and civil rights. Her work contributed to the rise of second-wave feminism by focusing on how gender roles are shaped by the society you live in, rather than being inherent.

Later in life Mead became a curator at the American Museum of Natural History, President of the American Anthropological Association, Vice President of the New York Academy of Sciences, and served in various positions in the American Association for the Advancement of Science. She was a public speaker and university lecturer on a wide variety of subjects. In total Mead authored 12 books and co-authored many more. She is regarded as a very accessible writer and speaker, able to engage members of the general public and spread her ideas far beyond the circle of academia.

Mead said of relationships that "one can love several people and that demonstrative affection has its place in different types of relationships". This illustrates her view, unconventional at the time and possibly even now, that romance need not be heterosexual or monogamous to be valid. These views were reflected in her own personal life, although her relationships with women were not public knowledge at the time. Mead had three successive husbands, the last of whom she had a child with; alongside her marriages she also had a long-term lover, Ruth Benedict, her former mentor. She spent the later years of her life living with fellow anthropologist Rhoda Metraux, with whom she had a romantic relationship.

Over her lifetime Mead was awarded many accolades for her contributions to anthropology and wider society, including being posthumously awarded the Presidential Medal of Freedom.