The Anti-Vax Movement: More Dangerous Than a Disease? by Hedda Belsnes

Going to the doctor’s or the school nurse to have a needle stuck in your arm is a painful childhood memory most of us share. For those who travel or require regular flu jabs, it may even be a recurring one. But how much thought goes into the science behind those sharp needles? Vaccinations are available on the NHS and can protect an individual against a multitude of diseases, including diphtheria, tetanus, whooping cough, polio, Hib, hepatitis B, pneumococcal disease, rotavirus, meningococcal disease, measles, mumps, rubella and HPV. Protecting against these diseases not only relieves strain on the NHS, it also saves lives. So why do some choose not to vaccinate?

In 1998, the medical journal The Lancet published the findings of Dr Andrew Wakefield, whose research proposed a link between the MMR vaccine (protecting against measles, mumps and rubella) and autism. This sparked global panic and a dramatic drop in MMR vaccination rates, which inevitably caused a rise in measles cases. Not only did this “doctor” conduct unnecessary, invasive tests on children without ethical approval or appropriate qualifications; the General Medical Council also discredited the entire study for lacking any medical basis.

Despite repeated research – including an extensive review conducted by the World Health Organisation – showing no evidence of a link between autism and the MMR vaccine, the anti-vax movement remains strong. Internet propaganda spouts arguments against vaccinations, yet these arguments are mainly built upon myths: that vaccines are made using aborted fetal tissue, that they contain mercury, and that vaccination itself is dangerous. Numerous extensive academic studies and medical research have disproven these myths, while also showing that any side effects from vaccinations are mild and short-lived. More severe adverse reactions are incredibly rare, and doctors and nurses are trained to treat them.

It could be argued that the choice to vaccinate is a personal one. This argument is flawed in several ways. The first is that the fate of a young child isn’t their personal choice; it comes down to what the parents believe is best. It could be argued that the child’s right to safety and good health is being stripped from them before they are old enough and informed enough to make their own choices. By the time the child is ready to decide for themselves, it could be too late. Does this mean that parents have a moral obligation to vaccinate their children?

There is also an argument that the choice to vaccinate isn’t a personal one, because of the dangers unvaccinated individuals can pose. The common idea is that if vaccinations are so effective, nobody else is at risk if someone chooses not to vaccinate their kids. On the contrary, many children would be at risk, such as those who are currently too young to be vaccinated, alongside individuals, young and old, who are unable to be vaccinated due to immune system problems, such as cancer patients. Parents have also claimed that if they didn’t vaccinate their child and the child became ill, they could keep them home from nursery or school to avoid infecting others. This option is also unsuitable, as those infected are often contagious before symptoms properly begin.
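To see the point about protecting others concretely, here is a minimal toy simulation (my own illustration, not taken from the article’s sources; the numbers are invented, with only the contact rate chosen to be roughly measles-like). It shows how outbreaks that fizzle out at high vaccination coverage tear through the unprotected minority when coverage drops:

```python
import random

def outbreak_size(coverage, n=10_000, contacts=15, seed_cases=5):
    """Toy model: each infected person meets `contacts` random people;
    anyone who is neither vaccinated nor already infected catches it.
    A contact number of ~15 is often quoted for measles; everything
    else here is illustrative, not epidemiological fact."""
    random.seed(1)
    vaccinated = set(random.sample(range(n), int(coverage * n)))
    unprotected = [p for p in range(n) if p not in vaccinated]
    infected = set(random.sample(unprotected, seed_cases))
    frontier = list(infected)
    while frontier:
        new_cases = []
        for _ in frontier:
            for contact in random.choices(range(n), k=contacts):
                if contact not in vaccinated and contact not in infected:
                    infected.add(contact)
                    new_cases.append(contact)
        frontier = new_cases
    return len(infected)

for coverage in (0.80, 0.90, 0.95, 0.99):
    print(f"{coverage:.0%} vaccinated -> {outbreak_size(coverage):5d} cases")
```

In this sketch the outbreak collapses once enough of the population is immune, because each case can no longer find a susceptible contact – and every parent who opts out moves their community a little closer to the point where it no longer collapses.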

It can be easy to underestimate the importance of vaccinations, and thus the dangers of the anti-vax movement. Vaccination remains one of the easiest ways to stay healthy, protecting against serious illnesses which still pose a very real threat. Vaccine-preventable diseases such as measles, mumps and whooping cough still result in hospitalisations and deaths every year. This is largely because these diseases are more common in other countries, meaning children can be infected by travellers, or by travelling themselves. Ultimately, a reduction in vaccination rates could result in an epidemic, where diseases which are virtually eradicated, such as meningitis C, return with a vengeance.

Yes, many may have their fears about the dangers of vaccinations. Yes, many may believe they are unnecessary and invasive. Yes, many may believe it’s safer not to vaccinate. Yet you are at a far, far greater risk by choosing not to vaccinate. Not only are you at greater risk of severe infections, which could result in death, but you could also be neglecting your moral duties. As individuals, we all have a public health commitment to protect our families, friends and communities, and the only way to do this sufficiently is to vaccinate.

https://www.verywellhealth.com/anti-vaccine-myths-and-misinformation-2633730

https://www.nhs.uk/conditions/vaccinations/

http://www.vaccineinformation.org/vaccines-save-lives/

The Chemical Quest for Love by Sophie Ball

With Valentine’s Day just passed, everyone’s emotions have been tested, whether you were one of the lucky bunch who had the romantic company of a significant other or you shared brunch with your ‘galentines’ to feel a bit better. So what actually is the chemistry behind that fuzzy feeling?

Throughout history, there have been many suggestions as to how and why we fall in love. One scientist from Germany even suggested that relationships are affinity reactions that can be measured using tables of chemical affinity. More recently, although the process is still not completely understood, different chemicals within the body are thought to control the feelings of love.

When you bump into someone who takes your fancy, you tend to find your palms go sweaty, you stutter and your heart feels like it’s physically pumping out of your chest, so it’s no wonder love was thought to come from the heart. However, *spoiler alert* love isn’t actually found in the blood-pumping organ but rather in the brain (how unromantic!), which makes the rest of your body go a bit mad.

This feeling of lust is driven by the sex hormones testosterone and oestrogen, stemming from our evolutionary need to reproduce. Both hormones increase in men and women alike when you see features that you desire – a symmetrical face and/or proportional body dimensions. This probably explains why the majority of people say they believe in love at first sight.

Next follows attraction – being ‘star struck’. Once the initial butterflies have settled, the ‘reward pathway’ kicks in. When you do things that feel good, such as spending time with your partner or having sexy time (whitwooo), the neurotransmitters dopamine and norepinephrine are released. This initiates feelings of excitement and makes you feel giddy when thinking about that special someone. Wow, doesn’t that play on your heart strings (wait sorry, *chemical reactions).

Attraction can also cause a decrease in serotonin, a hormone involved in regulating appetite and mood. Scientists have found a similar pattern in people with OCD, suggesting this is what causes the brain to fixate constantly on your love and nothing else.

Finally, to keep out of that annoying friend zone, attachment is the final contributor to falling in love. Attachment is also involved in friendships, mother-baby bonding and other relationships, but the addition of lust and attraction is what separates romantic relationships from these other intimacies (well, those that don’t have their own problems to deal with).

Oxytocin, known as the ‘cuddle hormone’, is released to make us feel this attachment and want to be close to our other half. It is stimulated by touch and trust – from feeling supported to an orgasm. This increase in oxytocin over time builds a cycle of social trust.

Given their powerful nature, these oxytocin-driven attachments are hard to break, causing severe heartbreak when we lose a loved one. This is why falling in love is seen to follow similar behaviour to drug addiction (although it is much healthier than recreational drugs, of course). Similarly, endorphins are released in response to physical pain, dulling its effect and triggering a positive feeling. If it’s a loved one that stimulates this hormone, the brain can learn to associate the pain with a better feeling, allowing people to tolerate painful relationships.

So although there is no particular ‘formula’ for love, our understanding of it is improving. Love can be one of the best and worst things that happen to you, but everyone is capable of it – it’s just a bit of hormone fluctuation!

Hopefully you will find that chemistry soon, if you haven’t already. Happy belated Valentine’s!


References

https://www.psychologytoday.com/gb/blog/your-neurochemical-self/201802/the-neurochemistry-love

http://www.eoht.info/m/page/Love+the+chemical+reaction

http://sitn.hms.harvard.edu/flash/2017/love-actually-science-behind-lust-attraction-companionship/

Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences – if you have not seen it, I highly recommend giving it a watch!

Yet we might be closer to the futuristic world of Black Mirror than we think. Around the world, people are pushing the gruesome boundaries of how far we integrate tech with our lives, through a series of implants and body modifications. This is a branch of biohacking – a blanket term for a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as grinders or transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of a person’s skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. Nearby magnetic and electrical fields – along with their strength and shape – become detectable to the user, thanks to the subtle currents they induce. As a party trick, the person can also pick up metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead it is carried out either at home with DIY kits purchased online or in piercing shops, but without an anaesthetic (which you need a licence for). If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums such as biohack.me.

Britain is the birthplace of grinders: in 1998, Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights to his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payment systems and clothing tags, and Motorola is developing an RFID-activated ‘password pill’ that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by Biohack.me offshoot company Grindhouse Wetware, is another implantable device; it constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential for this device is vast, and it has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat, within a rounded titanium shell, to be implanted under the skin. An ultra-thin whisker juts out, which is activated when the user faces north, lightly brushing an alert on the underside of the skin.
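The control logic of such a device is simple enough to sketch in a few lines. The following is purely hypothetical code (the real Southpaw is a sealed mechanical device; the `compass` and `whisker` objects here are invented stand-ins), just to show the idea of a heading-triggered alert:

```python
import time

NORTH_TOLERANCE_DEG = 10  # alert when facing within 10 degrees of north

def angular_distance(a, b):
    """Smallest angle between two compass bearings, in degrees."""
    return abs((a - b + 180) % 360 - 180)

def run(compass, whisker, poll_seconds=0.1):
    """Poll a (hypothetical) magnetometer; extend the whisker to brush
    the skin while the wearer faces roughly north, retract otherwise."""
    while True:
        heading = compass.read_heading_degrees()  # 0 = north, 90 = east
        if angular_distance(heading, 0) <= NORTH_TOLERANCE_DEG:
            whisker.extend()
        else:
            whisker.retract()
        time.sleep(poll_seconds)
```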

Rich Lee, a star of the biohack.me forum, has magnets embedded in each ear so he can listen to music through them, via a wire coil worn around his neck that converts sound into electromagnetic fields – creating the first ‘internal headphones’. The implants can also pick up signals from different sensors, so he can ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is this: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it’s unnatural. For most people, even those who benefit from medical advancements like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognise the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognised as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement offers a solution to this problem as part of its central ethos: open-source hardware and the free access of information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind’s evolution that may bring many medical advancements. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s and other diseases, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, able to predict dangerous episodes and to stimulate weakened neurons. Researchers at the US military research agency Darpa are also experimenting with a chip implant in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing-impaired people using a chip that translates words and distances into sound, which could mean the end of Braille and white sticks. Neil Harbisson is the founder of the non-profit Cyborg Foundation in Barcelona and was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg – a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.


https://www.techopedia.com/definition/29897/biohacking

http://www.abc.net.au/news/2017-02-23/biohackers-transhumanists-grinders-on-living-forever/8292790

http://www.slate.com/articles/technology/superman/2013/03/cyborgs_grinders_and_body_hackers_diy_tools_for_adding_sensory_perceptions.html

https://gizmodo.com/the-most-extreme-body-hacks-that-actually-change-your-p-1704056851

https://hackaday.com/2015/10/12/cyberpunk-yourself-body-modification-augmentation-and-grinders/

https://www.wired.com/story/hannes-wiedemann-grinders/

https://www.theverge.com/2012/8/8/3177438/cyborg-america-biohackers-grinders-body-hackers

http://edition.cnn.com/2014/04/08/tech/forget-wearable-tech-embeddable-implants/index.html

https://www.digitaltrends.com/cool-tech/coolest-biohacking-implants/

Face Blindness – what is it (and does it actually exist)? Gege Li

What’s the first physical thing you notice about someone when you meet them for the first time? Is it whether they’re male or female? Maybe it’s their eye colour or freckles. Their unusually shaped nose, even. Whatever it may be, the combination of these unique traits is what we use to recognise the important people in our lives, as well as all the others we know and encounter in between.

So imagine what it would be like if you couldn’t use characteristics like these to distinguish between your mum and your best friend, your old high school teacher and your postman. That’s exactly what sufferers of face blindness – also known by its fancy name, ‘prosopagnosia’ – must endure for most, if not all, of their lives (the condition is usually present from birth).

The inability to recognise people by their face alone affects approximately two in one hundred people in the UK. Sufferers may fail to judge a person’s age, gender or emotional expression from their face or spot similarities and differences between two different faces.

Unsurprisingly, this can have a profound impact on the behaviour and even mental health of sufferers. Although many cope by using alternative strategies such as a person’s distinctive voice, hairstyle or way of walking as identifiers, for others this doesn’t always work, especially if their acquaintance has recently changed their appearance. What are your options when even secondary clues become impossible to spot?

For some, the condition will cause them to deliberately avoid social interactions altogether, which can lead to relationship and career problems, bouts of depression and, in extreme cases, the development of social anxiety disorder. The latter may prevent a person from even leaving their house for overwhelming fear of social situations and embarrassment.

And to make matters worse, it isn’t only other people that someone with face blindness might not recognise. Objects, including places, cars and animals, also present difficulties, particularly for navigation and memory – even their own face staring back at them in the mirror could be an alien sight.

However, if you’re an avid watcher of Arrested Development, you might find yourself questioning the legitimacy or even the existence of face blindness as it’s played out through the eccentric and often ridiculous character of Marky Bark. On the show, Marky’s affliction with face blindness, exaggerated to the point that it seems at times almost unbelievable, doesn’t appear to cause him half the inconvenience or trauma that it can in reality. Though viewers are made aware of the condition and its characteristics, does the show’s overtly comedic context promote public health misconceptions while hindering an educational message?

It is important to establish that face blindness really does exist and is much less a quirky trait than a cognitive impairment with potentially serious social and emotional consequences. But what causes it exactly? Although it’s true that it can develop following brain damage (from a particularly nasty knock to the head, for example), it has become increasingly clear that ‘developmental prosopagnosia,’ where people simply don’t develop facial recognition, is the most common cause. In fact, as many as one in fifty people might have it, equating to a potential one and a half million sufferers in the UK.

What’s more, many sufferers report that a parent or sibling experiences the same kinds of difficulties, so it’s likely that genetics play a role in the occurrence of face blindness within families.

With all the trouble that face blindness can cause to someone’s livelihood and even health, it might be reassuring to point out that there are ways to determine whether you might suffer from the condition.

Often, doctors will use computer-based tests that require people to memorise and identify a set of faces, including those of celebrities. Most recently, in 2015, a partnership between several doctors and universities in London developed a new questionnaire that can aid both the diagnosis of face blindness and the measurement of its severity.

The inevitable bad news, however, is that there isn’t currently a way to treat face blindness directly other than to help sufferers improve their facial recognition with training and rehabilitation programmes.

There’s still hope for the future though – prosopagnosia research currently taking place at Bournemouth University has hinted that pharmaceuticals could be used to intervene temporarily with face blindness, with some success. Once these techniques have been developed further, a successful cure or therapy might not be too far off.

In the meantime though, if you’re able to recognise a familiar face, why not consider taking the time to appreciate the people close to you (especially their mug) just that little bit more? Don’t take face perception for granted – you’re luckier than you think…

References

  1. https://www.nhs.uk/conditions/face-blindness/
  2. https://prosopagnosiaresearch.org/index/information
  3. http://www.bbc.com/news/health-34709004

Brain Study Evidence Shows Women are More Intelligent Than Men – Ciara Barrett

A recent study across a team of Russell Group universities provides evidence that women are more intelligent than men, a source from the University of Sheffield’s neuroscience department says. The study was performed on a group of approximately 500 British women and men and came to a quite frankly controversial result.

A previous study on brain damage from the University of Illinois (see further reading) places the intelligence sectors of the brain in the left prefrontal cortex, left parietal cortex and left temporal cortex. These areas are vital to what we define as general intelligence – planning, verbal comprehension and working memory – and are the same sections of the brain that were found to be “subtly more pronounced in the female subjects than the males”. Ironically, although not directly mentioned in the study, the researchers working on this were 50% female, a pleasantly surprising statistic for the face of women in science.

Subjects were tested on short- and mid-term memory, verbal reasoning, simple mental maths and non-verbal aptitude while under a newly designed brain scanner. The scanner detected the parts of the brain that were most in use during each activity using thermal and electrical impulse imaging. White matter helps to connect the different sectors of the brain, and women are thought to have ten times as much as men. This could be why women were also found to use both sides of their brains more equally, especially when listening (men predominantly use the left side). Connections are made in the brain via networks of neurons using “electro-chemical signalling”, and since women have so much more white matter, it makes some sense that their brains work faster and are better at solving problems: the improved connectivity means more links can be made.

Historically, females have been shown to be more rational and better at spatial learning, particularly regarding breeding, choosing a mate and a place to live, but these traits don’t specifically involve the left side of the brain. As mentioned, women’s brains have been proven to use both the left and right sides equally, so why did the “intelligence sector” light up in the women’s brains? Although there is little evidence that brain size affects brain power, the women’s frontal cortex was larger, allowing more space for neural connections to be made. This makes women hardwired to be better problem solvers, which was said to be one of the criteria for intelligence according to the report. The hippocampus (the centre of the brain that converts short-term memory to long-term and helps us to remember details) is also proven to be bigger in women, and this may be why women performed better in the memory tests too.

These are just a couple of unexpected physiological differences between male and female brains, none of which were newly found during this study, but they have been stated as possible explanations for the fascinating result. Another explanation listed in the discussion of the report is that women, “after being oppressed for so long have begun to evolve slightly faster than men and have done so to compensate for being underestimated in their natural habitat” – very unlike the original natural habitat that humans first grew up in, where the males were the sole providers for their families and women were nothing but caregivers. The researchers themselves put the result down to the size differences in the intelligence sector and the enhanced ability for connections in the female brain, but they don’t completely discount the possibility of recent accelerated evolution in women. No matter how much we all evolve, it still comes down to survival of the fittest, and here, it seems, women are truly enduring.

For the study from the University of Illinois: http://www.kurzweilai.net/where-is-intelligence-located-in-the-brain

http://neurorelay.com/2012/10/07/female-brain-versus-male-brain/

**DISCLAIMER: This was an article written for April Fool’s Day, 2018. The above article was intended for entertainment purposes only and may include completely fabricated facts.**

Sugar Certified as MORE ADDICTIVE Than Crack Cocaine – Vanessa Kam

In a brave, valiant study against the sugar-loving food conglomerates dominating our kitchen cupboards, Dr Dia Beatez1 of the University of Duncee has proven that sugar is more addictive than crack cocaine.

Using state-of-the-art facilities in Yu Chun, China, Dr Beatez attached pedometers to adult pandas2 and fed them regularly with sugar water for three weeks.  Upon withdrawing the sugar solution in place of GMO low-sugar bamboo shoots, the pedometers recorded a staggering 401% increase in the pandas’ physical activity, with the youngest subject Benben even exhibiting tree-climbing activity in search of sugar water, much to the surprise of the team.

The experiment was repeated using 1 tsp of crack cocaine3 in solution, but this failed to elicit the same effects as the everyday commodity – sugar.

Commenting on her findings, Dr Beatez says the evidence is clear cut.

“Pandas share 68% of their DNA with us.  Accounting for the small difference in genetic makeup, our pedometer results are still significant.  Sugar is more addictive than crack cocaine.  It’s a fact.”

The research has yet to be published, but Beatez is driven to enlighten the masses before reporting her results in the likes of Nature and the BMJ, acting in the interest of the public.

“Just as craving ice cream will make the couch potato stand up, walk 10m to the kitchen, open the freezer and dig in to a tub of cookie dough ice cream, the panda too will quadruple its physical activity in search of sugar.”

Other experts echo these findings.  An article published last year in the British Journal of Sports Medicine demonstrated that sugar is more addictive than opioid drugs in rats, causing behavioural problems and depression on withdrawal, due to dopamine-related changes in the brain.

“Cocaine?  It’s as irrelevant to pandas as it is to most of us in our daily lives,” Beatez added.

These findings come in light of the Soft Drinks Levy sweeping England, which charges manufacturers per litre of sugar-loaded beverage produced.

However, Dr Beatez insists the government is not doing enough, neglecting the heavy consumption of sugary snacks.  “I have had patients come in because they simply cannot just eat two squares of Cadbury’s Dairy Milk—they must finish the pack, all 200g of it.”

Speaking on condition of anonymity, a student at the University of Duncee confesses to daily binge-eating of Dairy Milk.

“I used to beeline to Sainsbury’s right after lectures to pick up a bar of that milky sweet stuff, demolishing it before even reaching the library.  …  Thanks to Dia’s help, I have been taking DIAgram© twice daily and it is a true miracle!  I walk down the vegetable aisle now!”

When asked to compare sugar to crack, the student was unable to testify, having never been on the drug.

“I am convinced that sugar is more addictive than any Class A drug, just because Dia says so.”

Inspired by her patients’ success, Dr Beatez has founded the Group for Large Unhealthy Companies’ imprisOnment & Sentencing (GLUCOSE), fighting for the incarceration of “global sugar daddies” and the criminalisation of “sugar-dealing”.

The sugar war is one to match the war on drugs.

Footnotes:

  1. Dr Dia Beatez is a registered dietician, with a Masters in Nutritional Science from the Universidad de Noticias Falsas.  Beatez is not a medical doctor, but prefers the title “Dr” and is a recognised expert in the field.
  2. Beatez’ research on pandas was conducted in accordance with the Good Laboratory Pandas and approved by the Medical Religious Church.
  3. Beatez declined to comment on the procurement of crack cocaine for the experiment, but assures it was ethically-sourced from organic, Fairtrade croppers just south of North America.


**DISCLAIMER: This was an article written for April Fool’s Day, 2018. The above article was intended for entertainment purposes only and may include completely fabricated facts.**

Author’s notes

  • Dr Dia Beatez – Diabetes
  • University of Duncee – Dunce
  • Yu Chun, Benben – Chinese words for stupid, dumb
  • 401% for April Fool’s
  • British Journal of Sports Medicine study real, and was criticised in the following article https://amp.theguardian.com/society/2017/aug/25/is-sugar-really-as-addictive-as-cocaine-scientists-row-over-effect-on-body-and-brain
  • DIAgram© – Mirroring how “experts” who give nutritional advice on the internet often sell their own health products as well
  • La Universidad de noticias falsas – “The University of Fake News” in Spanish
  • GLP – Play on “Good Laboratory Practice”
  • MRC – Play on “Medical Research Council”


Epilepsy Awareness Day: The Science of Epilepsy – Emily Vincent

Today is Epilepsy Awareness Day, a fitting time to shed some light on a common, varied, and poorly understood condition. Of course, the science surrounding epilepsy is almost endless (as a sufferer I am all too aware of this), so this article is by no means exhaustive!

What is epilepsy?

The word “epilepsy” actually applies to a group of neurological disorders, and it has many forms. 60% of seizures are convulsive – the shaking and aggressive movement most people imagine when thinking about epilepsy. Other forms include “absences”, where consciousness is lost temporarily and the person “blanks out” or stares into space, and focal seizures, where the person may experience a huge variety of effects such as stiff or floppy limbs, repeated actions such as picking at clothes, numbness or tingling, or sensing an unusual smell or taste.

While relatively little is known about the mechanism, we know that seizures are brought on by an abnormal, excessive, and synchronised firing of neurons in the brain.

Interestingly, epilepsy and seizures have distinct meanings. Anyone can have a seizure, and one incident wouldn’t be classed as epilepsy, nor would a seizure clearly brought on by a specific event or injury. The term epilepsy is generally applied to people who experience, or have experienced, multiple seizures. It can begin at any time of life (quite commonly around puberty) and can also stop affecting people at any time of life (for example, some types of epilepsy mostly affect children).

A key fact to take home here is that epilepsy can mean many different things, and affects people in dramatically different ways.

 What causes epilepsy?

The short answer is that we’re not really sure. A common misconception is that flashing lights are the trigger – for some people this is definitely true, but in fact only three percent of people with epilepsy have this as their trigger. Genetics, brain tumours, sleep deprivation, strokes, periods, stress, and alcohol are just a few of the causes or triggers of seizures – most of which aren’t really understood. For example, some people know their seizures are tied to sleep deprivation but their doctors can’t tell them why this is.

How do we treat and manage epilepsy?

Medication, brain surgery, and implants are used, and lifestyle management such as avoiding triggers and adopting the ketogenic diet can also be effective. For some people it seems that nothing works, while others can go for years without any seizures.

70% of people have their epilepsy controlled by medication, including anticonvulsants, and there is a long list of possible options – suitability depends on the type of seizure experienced, the age and gender of the person, side effects, and more. Just as we don’t quite know how epilepsy works, the mechanisms of action of these medications are often unknown.

Brain surgery is reserved for cases where medication is ineffective and it is known (through tests including brain scans and recordings of the brain’s electrical activity) that the seizures originate from a small part of the brain. The surgery involves removal of part of the brain under general anaesthetic, and while often effective, it of course carries risks such as memory problems and vision loss.

What other effects can epilepsy have on sufferers?

As with most medical conditions, epilepsy can mean many extra things for people with it. Here, just a few of those of a “scientific” nature will be covered.

SUDEP stands for Sudden Unexpected Death in EPilepsy and is applied to instances where a person with epilepsy dies suddenly with no apparent cause. It occurs most commonly during sleep, and those who experience convulsive seizures are at extra risk. SUDEP is obviously a tragic complication, and while it is poorly understood, a lot of work is being put into understanding and preventing it.

One of the most effective anticonvulsants, Epilim (sodium valproate), is known to cause physical birth defects (at three times the rate seen in the general population) and has been linked to autism if women take it while pregnant – reducing the options women have for epilepsy treatment. On a similar note, many pairings of anticonvulsants and contraceptives are unsuitable or need modification – for example, those taking both Lamotrigine and the pill may be advised to take higher doses of both, as there is a risk of reduced effectiveness due to interference between the drugs. Another example is the warning against combining some medications with the contraceptive injection, as both can lead to bone loss – meaning some people consider it a risky combination.

Unfortunately, people with epilepsy are also unable to give blood – while epilepsy is not transmissible through blood or any other bodily fluid, giving blood comes with a chance of triggering a seizure, meaning it is considered unsafe for sufferers.

Important information

Straying away from the science here, on Epilepsy Awareness Day it would seem wrong not to signpost to practical information. There are loads of amazing organisations which give information for the public, fundraisers, and sufferers alike – just a few are listed below:

https://www.epilepsysociety.org.uk/

http://www.fable.org.uk/

https://www.epilepsy.org.uk/info/what-is-epilepsy

https://sudep.org/

The following is a general idea of what to do if you see someone having a seizure (taken from https://www.nhs.uk/Livewell/Epilepsy/Pages/Ifyouseeaseizure.aspx), and lots more information is available online.

If you’re with someone having a seizure:

• only move them if they’re in danger – such as near a busy road or hot cooker

• cushion their head if they’re on the ground

• loosen any tight clothing around their neck – such as a collar or tie – to aid breathing

• when their convulsions stop, turn them so they’re lying on their side (the recovery position)

• stay with them and talk to them calmly until they recover

• note the time the seizure starts and finishes

Dial 999 and ask for an ambulance if:

• it’s the first time someone has had a seizure

• the seizure lasts for more than 5 minutes

• the person doesn’t regain full consciousness, or has several seizures without regaining consciousness

• the person is seriously injured during the seizure

*** Myth Busted *** Do not put a spoon, your fingers, or any object into the person’s mouth. They will not swallow their tongue, but they may well bite your fingers.

New Test Can Detect Autism in Children – Keerthana Balamurugan

Autism is a condition that stays with a person throughout their entire life, as there is no cure; in some cases it can be managed to an extent with support from specialised behavioural psychologists and therapists. It is a developmental disability that shapes how a person communicates with other people and the world, affecting their communication skills and relationships. Autism affects 1 in every 100 people in the United Kingdom alone. Hence, numerous organisations, universities and hospitals conduct research on the condition, and small breakthroughs constantly occur. In the latest such breakthrough, scientists may have found a new test to detect autism in children, and clues to its causative factors.

Autism is usually present from early childhood, so parents are often the first to notice differences in their child compared to other children. Common symptoms include avoiding eye contact, preferring a familiar routine, hyperactivity, anxiety and much more. As autism is a spectrum, each person affected has a different set of symptoms, but the common characteristics include difficulties with social communication and interaction, and repetitive patterns of behaviour or interests. Autism affects a person in so many ways that it is hard to set a fixed list of symptoms. And despite being termed “symptoms”, some of these are unique characteristics that people put to good use – for example, channelling an intense interest in one topic into constantly building up knowledge of that field.

Currently, there is no proven medical test to diagnose autism. For children, a series of specialists have to be seen before a proper diagnosis is made, such as a developmental paediatrician, a child neurologist and a child psychologist. As this is not the best or most accurate way to diagnose autism, scientists at the University of Warwick have been, and still are, conducting research on a revolutionary blood and urine test for children. This has the potential to uncover the medical reasoning behind the condition, allowing patients to receive appropriate treatment much earlier on.

The scientists collected blood and urine samples from thirty-eight children with autism spectrum disorder (ASD) and thirty-one controls. Upon analysing the samples, they found that those with ASD had increased levels of advanced glycation endproducts (AGEs) and of the oxidative damage marker dityrosine (DT) in plasma proteins, with respect to controls. The research also supported other hypotheses, including that mutations in amino acid transporters are common in those with ASD. These chemical differences between controls and those with ASD indicate that there is a way to medically test for autism, which could help hundreds of thousands of those affected to be properly diagnosed. The next step for the research team at Warwick is to recruit more children in order to further improve the test’s diagnostic performance. Hopefully, once this step is complete, we will see the test implemented in clinics and hospitals worldwide.
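To give a sense of what “diagnostic performance” means here, the sketch below trains a simple classifier on synthetic marker levels and scores it with the area under the ROC curve, a standard measure of how well a test separates cases from controls. The numbers are made up to mirror only the study’s group sizes; this is an illustration of the general method, not Warwick’s actual analysis:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic cohort: 38 ASD samples and 31 controls, as in the study.
# The two features stand in for plasma AGE and DT levels, with the ASD
# group shifted slightly upwards (the shift itself is invented).
n_asd, n_ctrl = 38, 31
X = np.vstack([
    rng.normal(loc=[1.3, 1.2], scale=0.4, size=(n_asd, 2)),   # ASD
    rng.normal(loc=[1.0, 1.0], scale=0.4, size=(n_ctrl, 2)),  # controls
])
y = np.array([1] * n_asd + [0] * n_ctrl)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"toy AUC: {auc:.2f}")  # 1.0 = perfect separation, 0.5 = coin flip
```

Recruiting more children enlarges exactly this kind of training and test data, which is what would let the team measure, and improve, the test’s accuracy.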

This test would not only reveal whether a person potentially has ASD; the research behind it may also reveal new causative factors. Research so far has shown that 30-35% of ASD cases are caused by genetic factors, and the remaining 65-70% are thought to be caused by a combination of genetic variants, multiple mutations and environmental factors. With this new research, there is hope that new causes can be identified to further our understanding of the condition and give us the tools to combat it efficiently and appropriately.

How likely is a zombie apocalypse? Emma Hazelwood

Zombies have been limping around Hollywood for almost 100 years. They’re the stars of popular TV series, movies and video games. The Centers for Disease Control and Prevention even give information on how to survive a zombie invasion. So, how likely is a real-life zombie apocalypse?

In theory, a virus could evolve that induces rage and the need to eat human flesh. Scientists have actually identified a structure that could be targeted by a zombie virus – the olfactory nerve. This leads to parts of the brain affecting hunger, emotions, memory and morality. By attacking this nerve, a virus could produce hungry, aggressive, brain-dead victims who can’t recognise their own family and friends, and have no control over their bodies other than to feed.

In fact, several viruses already exist which seem to take the first step towards zombifying people. The rabies virus, which causes violent movements, anxiety, hallucinations and aggressiveness, is even transmitted through biting. However, fewer than three people a year die of rabies in the US, so it’s hardly apocalyptic. This could be because it isn’t transmitted between humans, other than in a few transplants. There was one instance of a rabid kiss, but that’s hardly the stuff of 28 Days Later. The reason animals bite and transmit the disease is that they are confused, so they turn to their natural defences.

It might not be a virus that causes the zombie apocalypse, but a parasite (as seen in Resident Evil IV). Parasites are capable of entering the brain and altering behaviour. Toxoplasma gondii is a microbe which infects rats. To reproduce, the parasite (and the rat) must be eaten by a cat. To maximise the chances of this happening, T. gondii actually changes the rat’s perception of cats so that, instead of being afraid of them, the rats seek them out. Scientists are already working on weaponising these bugs for use in wars!

Which brings us on to the idea that maybe nature isn’t the serial killer – maybe the human race will be its own demise. A plausible method that scientists (or a supervillain) could use to create a zombie army is nanobots. Within a decade, we may have nanobots capable of crawling inside our heads to repair neural connections. If the host dies, these tiny robots could be capable of keeping parts of the brain alive – specifically, the parts for motor function and the desire to feed. They may even be able to reprogram the brain to bite surrounding humans, in order to be transmitted to another host.

So, scary as it is, a human with zombie-like characteristics is possible. However, one key feature of zombies is the fact they are dead – or, rather, undead. There are instances of people being declared dead, then waking up. Clairvius Narcisse from Haiti (where zombie myths originated) was declared dead and buried in 1962. He was found wandering around town, alive and well, 18 years later. Apparently, local Voodoo priests were using Japanese blowfish poison to zombify their workers. The poison slows all bodily functions to the point of being considered medically dead. When victims wake up, they are in a trance-like state, capable of tasks like eating and sleeping, but with no emotional connection to the world.

So if someone could “die”, then wake up with a zombie virus – we could, theoretically, one day have “zombies”. But would this actually lead to an apocalypse?

Probably not – firstly, zombies aren’t that well adapted to being active predators. With open wounds and rapidly decaying flesh, they wouldn’t be able to go out in the sun, especially on a diet consisting solely of human flesh. They would also be more sensitive to cold – limited blood and fluids mean frostbite would easily set in, and that’s not even mentioning their lack of an immune system.

They not only have weak bodies, but also lack the intelligence that sets humans apart. This means they wouldn’t be able to communicate a joint attack or problem solve in any way. They wouldn’t be able to drive; they probably wouldn’t even be able to use a door handle.

If Hollywood is to be believed, the zombie plague can only be transmitted through being bitten by someone with the infection. This is obviously problematic – have you ever tried to bite through denim? All we’d have to do is wear clothes and we’d be protected. Plus, their only food source is the world’s top predator.

In today’s world, as soon as there was an outbreak, a video of it would be online, easily warning people to just stay inside. In films, the world’s armies are somehow overthrown by the infected (even though the untrained civilian protagonist seems to take out whole swarms). This just seems unrealistic – nothing about zombies would make them bulletproof. Also, even if they were superior to us, they could never wipe us out completely or even retreat, as we’re their only prey.

In conclusion, a zombie apocalypse is possible. However, it’s so unlikely that the human race will die off before it ever happens anyway.

Denying the evidence – Why do people stick to their beliefs in the face of so much evidence? Emma Hazelwood

For almost twenty years, it has been accepted in the scientific community that climate change is a result of human activity. However, a study in 2016 found that less than half of U.S. adults believe that global climate change is due to human activity. In 2012, Trump tweeted that “The concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive”. In a world with overwhelming evidence to the contrary, how can people continue to believe that global warming doesn’t exist?

Once people believe an argument, it is very hard to persuade them otherwise, even if they are told that the information they based their opinion on is incorrect. In a study conducted at Stanford University, two groups of students were given information about a firefighter named Frank. One group were told that Frank was a good firefighter; the other that Frank was a poor firefighter. Participants were then told that the information they’d been given was fake. Afterwards, they were asked to give their own opinion on how Frank would respond to a high-risk situation. Those who had initially been told that Frank was a good firefighter thought that he would stay away from risks, but those who had been told that he was a poor firefighter thought that he would take risks. This study shows that, even though they were then told it was fabricated, the initial information influenced participants’ opinions.

Confirmation bias is when people are more likely to believe facts which support an opinion they already hold than evidence to the contrary. A study at Stanford in 1979 involved two groups of students: one group was for capital punishment, the other against. Both groups were shown two fabricated articles – one containing data that supported capital punishment, the other data that opposed it (the statistics were designed to be equally strong in each article). Both groups stated that the source which supported their position was more reliable. Furthermore, when asked to express their opinions on capital punishment after the study, both groups supported their standpoint even more strongly than before. This demonstrates the human tendency to selectively believe what we want to be true.
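A toy model makes this polarising effect easy to reproduce. In the sketch below (my illustration, not part of the study), two agents start on opposite sides of an issue and see exactly the same balanced stream of evidence, but each weights confirming evidence more heavily than disconfirming evidence – and both end up more extreme than they started:

```python
def update(belief, evidence, bias=3.0, step=0.05):
    """Nudge a belief (in [-1, 1]) using one piece of evidence (+1 or -1).
    Evidence that agrees with the current belief counts `bias` times more
    than evidence that contradicts it -- the confirmation-bias assumption."""
    weight = bias if evidence * belief > 0 else 1.0
    return max(-1.0, min(1.0, belief + step * weight * evidence))

supporter, opponent = 0.2, -0.2      # mildly for and against
for evidence in [+1, -1] * 20:       # perfectly balanced evidence
    supporter = update(supporter, evidence)
    opponent = update(opponent, evidence)

print(f"supporter ends at {supporter:+.2f}, opponent at {opponent:+.2f}")
```

Identical evidence, opposite conclusions: just as both groups in the 1979 study came away more convinced of their original positions.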

It is believed that humans act this way because it was beneficial in early hunter-gatherer societies. Confirmation bias not only encouraged humans to collaborate; being considered correct was also important for social status. One theory for why seemingly rational humans continue to think irrationally is that we get a rush of dopamine when we see evidence which validates our opinion.

However, early human societies were not teeming with “fake news” and fabricated studies as we are now. It is increasingly clear how having a public swayed by confirmation bias can be dangerous to modern society.

We live in an illusion, where we think we know more than we actually do. For instance, one study told people about the (fictitious) discovery of a rock that glowed. Participants who were told that the scientists who discovered it did not know why it glowed claimed to know less about the rock than those who were told that scientists understood how it worked – even though neither group was given any information on why the rock glowed. This phenomenon of people thinking they understand more than they do is common, and has actually been advantageous in terms of scientific progress. As scientists, we do not need to understand every scientific discovery there has ever been – we rely on the knowledge of our ancestors and those around us.

Humans are programmed to be influenced by information which they are then told is fake, and to think of sources which support their pre-existing opinion as more reliable than those which question it. However, this can be dangerous in areas such as politics. For example, if people around an individual claim to know why Brexit would be economically beneficial to the country, then even when presented with evidence to the contrary the individual is less likely to believe it. Likewise, if a person believes that global warming is a conspiracy, they are more likely to believe Trump when he says it was created by the Chinese than ecologists who say we are pushing our planet to critical levels. In a world where we are bombarded with clickbait and fake news, it is more important than ever to think rationally and critically about every piece of information.