Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? – Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences – if you have not seen it, I highly recommend giving it a watch!

Yet, we might be closer to the futuristic world of Black Mirror than you think. Around the world, people are pushing the gruesome boundaries of how far we integrate tech with our lives, through a series of implants and body modifications. This is a branch of biohacking – a blanket term used to describe a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as Grinders or Transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of a person’s skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This allows the user to sense nearby magnetic and electric fields – including their strength and shape – thanks to the subtle currents they induce. For a party trick, the person can also pick up small metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead, it is carried out either at home with DIY kits purchased online or in piercing shops – but without an anaesthetic (which you need a licence for). If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums.

Britain is the birthplace of the grinder movement. In 1998, Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment that he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights in his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payment systems and clothing tags, and Motorola has been developing an RFID-activated ‘password pill’ that a user can swallow to unlock their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by the offshoot company Grindhouse Wetware, is another implantable device; it constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential for this device is vast, and of the current implants it arguably has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat, within a rounded titanium shell, to be implanted under the skin. An ultra-thin whisker juts out, which is activated when the user faces north, lightly brushing an alert against the underside of the skin.

Rich Lee, a star of the online grinder community, has magnets embedded in each ear so he can listen to music through them via a wire coil he wears around his neck, which converts sound into electromagnetic fields – creating the first ‘internal headphones’. The implants can also be paired with different sensors, so he can ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is: at what point does the incorporation of all this technology make us a different species, and what are the ethics of that?

The bluntest argument against biohacking is that it’s unnatural. For most people, even those who benefit from medical advancements like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognise the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognised as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement provides a solution to this problem as part of its central ethos: open-source hardware and free access to information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind’s evolution that may bring many medical advancements. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s and other diseases, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, designed to predict dangerous episodes and to stimulate weakened neurons. The US military research agency DARPA is also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing impaired people by using a chip that translates words and distances into sound, which could mean the end of Braille and white canes. Neil Harbisson, founder of the non-profit Cyborg Foundation in Barcelona, was born with achromatopsia – the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg: a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.

Face Blindness – what is it (and does it actually exist)? – Gege Li

What’s the first physical thing you notice about someone when you meet them for the first time? Is it whether they’re male or female? Maybe it’s their eye colour or freckles. Their unusually-shaped nose, even. Whatever it may be, the combination of these unique traits is what we use to recognise the important people in our life, as well as all the others we know and encounter in between.

So imagine what it would be like if you couldn’t use characteristics like these to distinguish between your mum and your best friend, or your old high school teacher and your postman. That’s exactly what sufferers of face blindness – also known by its fancy name, ‘prosopagnosia’ – must endure for most, if not all, of their lives (the condition is usually present from birth).

The inability to recognise people by their face alone affects approximately two in one hundred people in the UK. Sufferers may fail to judge a person’s age, gender or emotional expression from their face, or to spot similarities and differences between two different faces.

Unsurprisingly, this can have a profound impact on the behaviour and even mental health of sufferers. Although many cope by using alternative strategies such as a person’s distinctive voice, hairstyle or way of walking as identifiers, for others this doesn’t always work, especially if their acquaintance has recently changed their appearance. What are your options when even secondary clues become impossible to spot?

For some, the condition will cause them to deliberately avoid social interactions altogether, which can lead to relationship and career problems, bouts of depression and, in extreme cases, the development of social anxiety disorder. The latter may prevent a person from even leaving their house for overwhelming fear of social situations and embarrassment.

And to make matters worse, it isn’t only other people that someone with face blindness might not recognise. Objects, including places, cars and animals, also present difficulties, particularly for navigation and memory – even their own face staring back at them in the mirror could be an alien sight.

However, if you’re an avid watcher of Arrested Development, you might find yourself questioning the legitimacy or even the existence of face blindness as it’s played out through the eccentric and often ridiculous character of Marky Bark. On the show, Marky’s face blindness, exaggerated to the point that it seems at times almost unbelievable, doesn’t appear to cause him half the inconvenience or trauma that it can in reality. Though viewers are made aware of the condition and its characteristics, does the show’s overtly comedic context promote public health misconceptions while hindering an educational message?

It is important to establish that face blindness really does exist, and that it is much less a quirky trait than a cognitive impairment with potentially serious social and emotional consequences. But what causes it exactly? Although it’s true that it can develop following brain damage (from a particularly nasty knock to the head, for example), it has become increasingly clear that ‘developmental prosopagnosia’, where people simply never develop facial recognition, is the most common cause. In fact, as many as one in fifty people might have it, equating to a potential one and a half million sufferers in the UK.

What’s more, many sufferers have reported a parent or sibling experiencing the same kinds of difficulties, so it’s likely that genetics play a role in the occurrence of face blindness within families.

With all the trouble that face blindness can cause to someone’s livelihood and even health, it might be reassuring to point out that there are ways to determine whether you suffer from the condition.

Often, doctors will use computer-based tests that require people to memorise and identify a set of faces, including those of celebrities. Most recently, in 2015, a partnership between several doctors and universities in London developed a new questionnaire that can aid both the diagnosis of face blindness and the measurement of its severity.

The inevitable bad news, however, is that there isn’t currently a way to treat face blindness directly other than to help sufferers improve their facial recognition with training and rehabilitation programmes.

There’s still hope for the future though – prosopagnosia research currently taking place at Bournemouth University has also hinted at using pharmaceuticals to intervene temporarily in face blindness, with some success. Once these techniques have been developed further, a successful cure or therapy might not be too far off.

In the meantime though, if you’re able to recognise a familiar face, why not consider taking the time to appreciate the people close to you (especially their mug) just that little bit more? Don’t take face perception for granted – you’re luckier than you think…



Brain Study Evidence Shows Women are More Intelligent Than Men – Ciara Barrett

A recent study across a team of Russell Group universities provides evidence that women are more intelligent than men, a source from the University of Sheffield’s neuroscience department says. The study was performed on a group of approximately 500 British women and men and came to a quite frankly controversial result.

A previous study on brain damage from the University of Illinois (see further reading) places the intelligence sectors of the brain in the left prefrontal cortex, left parietal cortex and left temporal cortex. These areas are vital to what we define as general intelligence: planning, verbal comprehension and working memory and are the same sections of the brain that were found to be “subtly more pronounced in the female subjects than the males”. Ironically, although not directly mentioned in the study, the researchers working on this were 50% female, a pleasantly surprising statistic for the face of women in science.

Subjects were tested on short and mid-term memory, verbal reasoning, simple mental maths and non-verbal aptitude while under a newly designed brain scanner. The scanner detected the parts of the brain that were the most in use during each activity using thermal and electrical impulse imaging. White matter helps to connect the different sectors of the brain and women are thought to have 10 times as much as men. This could be why women also were found to use both sides of their brains more equally, especially when listening (men predominantly use the left side). Connections are made in the brain via networks of neurons using “electro-chemical signalling” and since women have so much more white matter, it makes some sense that their brains work faster and are better at solving problems due to the improved connectivity within the brains so that more links can be made.

Historically, females have been shown to be more rational and better at spatial learning, particularly regarding breeding, choosing a mate and place to live, but these traits don’t specifically involve the left side of the brain. As mentioned, women’s brains have been proven to use both the left and right side equally, so why did the “intelligence sector” light up for the women’s brains? Although there is little evidence that brain size affects brain power, the women’s frontal cortex was larger, allowing more space for neural connections to be made. This makes women hardwired to be better problem solvers, which was said to be one of the criteria for intelligence according to the report. The hippocampus (the centre of the brain that converts short term memory to long term and helps us to remember details) is also proven to be bigger in women, and this also may have had an effect on why women performed better in the memory tests too.

These are just a couple of unexpected physiological differences between male and female brains, none of which were newly founded during this study, but have been stated as possible explanations for the fascinating result. Other possible explanations listed in the discussion of the report are that women, “after being oppressed for so long have begun to evolve slightly faster than men and have done so to compensate for being underestimated in their natural habitat”, which is very unlike the original natural habitat that humans first grew up in where the males were the sole providers for their families and women were nothing but caregivers. The researchers themselves place the result down to the size differences in the intelligence sector and enhanced ability for connections in the female brain, but don’t completely discount the possibility of recent accelerated evolution in women. No matter how much we all evolve, it still comes down to survival of the fittest, and here, it seems, women are truly enduring.

For the study from the University of Illinois:

**DISCLAIMER: This was an article written for April Fool’s Day, 2018. The above article was intended for entertainment purposes only and may include completely fabricated facts.**

Sugar Certified as MORE ADDICTIVE Than Crack Cocaine – Vanessa Kam

In a brave, valiant study against the sugar-loving food conglomerates dominating our kitchen cupboards, Dr Dia Beatez1 of the University of Duncee has proven that sugar is more addictive than crack cocaine.

Using state-of-the-art facilities in Yu Chun, China, Dr Beatez attached pedometers to adult pandas2 and fed them regularly with sugar water for three weeks.  Upon withdrawing the sugar solution in place of GMO low-sugar bamboo shoots, the pedometers recorded a staggering 401% increase in the pandas’ physical activity, with the youngest subject Benben even exhibiting tree-climbing activity in search of sugar water, much to the surprise of the team.

The experiment was repeated using 1tsp crack cocaine3 in solution, but failed to elicit the same effects as the everyday commodity—sugar.

Commenting on her findings, Dr Beatez says the evidence is clear cut.

“Pandas share 68% of their DNA with us.  Accounting for the small difference in genetic makeup, our pedometer results are still significant.  Sugar is more addictive than crack cocaine.  It’s a fact.”

The research has yet to be published, but Beatez is driven to enlighten the masses before reporting her results in the likes of Nature and the BMJ, acting in the interest of the public.

“Just as craving ice cream will make the couch potato stand up, walk 10m to the kitchen, open the freezer and dig in to a tub of cookie dough ice cream, the panda too will quadruple its physical activity in search of sugar.”

Other experts echo these findings.  An article published last year in the British Journal of Sports Medicine demonstrated that sugar is more addictive than opioid drugs in rats, causing behavioural problems and depression on withdrawal, due to dopamine-related changes in the brain.

“Cocaine?  It’s as irrelevant to pandas as it is to most of us in our daily lives,” Beatez added.

These findings come in light of the Soft Drinks Levy sweeping England, which charges manufacturers per litre of sugar-loaded beverage produced.

However Dr Beatez insists the government is not doing enough, neglecting the heavy consumption of sugary snacks.  “I have had patients come in because they simply cannot just eat two squares of Cadbury’s Dairy Milk—they must finish the pack, all 200g of it.”

Speaking on condition of anonymity, a student at the University of Duncee confesses to daily binge-eating of Dairy Milk.

“I used to beeline to Sainsbury’s right after lectures to pick up a bar of that milky sweet stuff, demolishing it before even reaching the library.  …  Thanks to Dia’s help, I have been taking DIAgram© twice daily and it is a true miracle!  I walk down the vegetable aisle now!”

When asked to compare sugar to crack, the student was unable to testify, having never been on the drug.

“I am convinced that sugar is more addictive than any Class A drug, just because Dia says so.”

Inspired by her patients’ success, Dr Beatez has founded the Group for Large Unhealthy Companies’ imprisOnment & Sentencing (GLUCOSE), fighting for the incarceration of “global sugar daddies” and the criminalisation of “sugar-dealing”.

The sugar war is one to match the war on drugs.


  1. Dr Dia Beatez is a registered dietician, with a Masters in Nutritional Science from the Universidad de Noticias Falsas.  Beatez is not a medical doctor, but prefers the title “Dr” and is a recognised expert in the field.
  2. Beatez’ research on pandas was conducted in accordance with the Good Laboratory Pandas and approved by the Medical Religious Church.
  3. Beatez declined to comment on the procurement of crack cocaine for the experiment, but assures it was ethically-sourced from organic, Fairtrade croppers just south of North America.


**DISCLAIMER: This was an article written for April Fool’s Day, 2018. The above article was intended for entertainment purposes only and may include completely fabricated facts.**

Author’s notes

  • Dr Dia Beatez – Diabetes
  • University of Duncee – Dunce
  • Yu Chun, Benben – Chinese words for stupid, dumb
  • 401% for April Fool’s
  • British Journal of Sports Medicine study is real, and was subsequently criticised
  • DIAgram© – Mirroring how “experts” who give nutritional advice on the internet often sell their own health products as well
  • La Universidad de noticias falsas – “The University of Fake News” in Spanish
  • GLP – Play on “Good Laboratory Practice”
  • MRC – Play on “Medical Research Council”



Epilepsy Awareness Day: The Science of Epilepsy – Emily Vincent

Today is Epilepsy Awareness Day, a fitting time to shed some light on a common, varied, and poorly understood condition. Of course the science surrounding epilepsy is almost endless (as a sufferer I am all too aware of this), so this article is by no means exhaustive!

What is epilepsy?

The word “epilepsy” actually applies to a group of neurological disorders, and has many forms. 60% of seizures are convulsive – the shaking and aggressive movement most people imagine when thinking about epilepsy. Other forms include “absences”, where consciousness is lost temporarily and the person “blanks out” or stares into space, and focal seizures, where the person may experience a huge variety of effects such as stiff or floppy limbs, repeated actions such as picking at clothes, numbness or tingling, or sensing an unusual smell or taste.

While relatively little is known about the mechanism, we know that seizures are brought on by an abnormal, excessive, and synchronised firing of neurons in the brain.

Interestingly, epilepsy and seizures have distinct meanings. Anyone can have a seizure, and a single incident wouldn’t be classed as epilepsy, nor would a seizure clearly brought on by a specific event or injury. The term epilepsy is generally applied to people who experience, or have experienced, multiple seizures. It can begin at any time of life (quite commonly around puberty) and can also stop affecting people at any time of life (for example, some types of epilepsy mostly affect children).

A key fact to take home here is that epilepsy can mean many different things, and affects people in dramatically different ways.

What causes epilepsy?

The short answer is that we’re not really sure. A common misconception is that flashing lights are the trigger – for some people this is definitely true, but in fact only three percent of people with epilepsy have this as their trigger. Genetics, brain tumours, sleep deprivation, strokes, periods, stress, and alcohol are just a few of the causes or triggers of seizures – most of which aren’t really understood. For example, some people know their seizures are tied to sleep deprivation but their doctors can’t tell them why this is.

How do we treat and manage epilepsy?

Medication, brain surgery, and implants are used, and lifestyle management such as avoiding triggers and adopting the ketogenic diet can also be effective. For some people it seems that nothing works, while others can go for years without any seizures.

70% of people with epilepsy have their condition controlled by medication, including anticonvulsants, and there is a long list of possible options – suitability depends on the type of seizure experienced, the age and gender of the person, side effects, and more. Just as we don’t quite know how epilepsy works, the mechanisms of action of these medications are often unknown.

Brain surgery is reserved for cases where medication is ineffective and it is known (through tests including brain scans and recordings of the brain’s electrical activity) that the seizures originate from a small part of the brain. The surgery involves removal of part of the brain under general anaesthetic, and while often effective, it of course carries some risks, such as memory problems and vision loss.

What other effects can epilepsy have on sufferers?

As with most medical conditions, epilepsy can mean many extra things for people with it. Here, just a few of those of a “scientific” nature will be covered.

SUDEP stands for Sudden Unexpected Death in EPilepsy and is applied to instances where a person with epilepsy dies suddenly with no apparent cause. It occurs most commonly during sleep, and those who experience convulsive seizures are at extra risk. SUDEP is obviously a tragic complication and, while it is poorly understood, a lot of work is being put into understanding and preventing it.

One of the most effective anticonvulsants, Epilim (sodium valproate), is known to cause physical birth defects (at roughly three times the rate seen in the general population) and has been linked to autism if women take it while pregnant – reducing the treatment options available to women with epilepsy. On a similar note, many pairings of anticonvulsants and contraception are unsuitable or need modification – for example, those taking both lamotrigine and the pill may be advised to take higher doses of both, as there is a risk of reduced effectiveness due to interference between the drugs. Another example is the warning against combining some medications with the contraceptive injection, as both can lead to bone loss – meaning some people consider it a risky combination.

Unfortunately, people with epilepsy are also unable to give blood – while epilepsy cannot be transmitted through blood or any other bodily fluid, giving blood comes with a chance of triggering a seizure, meaning that it is considered unsafe for sufferers.

Important information

Straying away from the science here, on Epilepsy Awareness Day it would seem wrong not to signpost practical information. There are loads of amazing organisations which provide information for the public, fundraisers, and sufferers alike, and their resources are easy to find online.

The following is a general idea of what to do if you see someone having a seizure; lots more information is available online.

If you’re with someone having a seizure:

  • only move them if they’re in danger – such as near a busy road or hot cooker
  • cushion their head if they’re on the ground
  • loosen any tight clothing around their neck – such as a collar or tie – to aid breathing
  • when their convulsions stop, turn them so they’re lying on their side (the recovery position)
  • stay with them and talk to them calmly until they recover
  • note the time the seizure starts and finishes

Dial 999 and ask for an ambulance if:

  • it’s the first time someone has had a seizure
  • the seizure lasts for more than 5 minutes
  • the person doesn’t regain full consciousness, or has several seizures without regaining consciousness
  • the person is seriously injured during the seizure

*** Myth Busted *** Do not put a spoon, your fingers, or any other object into the person’s mouth. They will not swallow their tongue, but they may well bite your fingers.

New Test Can Detect Autism in Children – Keerthana Balamurugan

Autism is a lifelong condition – there is no cure, and in some cases it can only be managed, to an extent, with support from specialised behavioural psychologists and therapists. It is a developmental disability that shapes how a person communicates with other people and the world, affecting their communication skills and relationships. Autism affects around 1 in every 100 people in the United Kingdom alone. Hence, numerous organisations, universities and hospitals conduct research on the condition, and small breakthroughs constantly occur. The latest research describes one such breakthrough: scientists might have found a new test to detect autism in children, and also its causative factors.

Autism is usually present from early childhood, so parents are often the first to notice differences in their child compared with other children. Common symptoms include avoiding eye contact, preferring a familiar routine, hyperactivity, and anxiety, among many others. As autism is a spectrum, each person affected has a different set of symptoms, but common characteristics include difficulties with social communication and interaction, and repetitive patterns of behaviour or interests. Autism affects people in so many ways that it is hard to set a fixed list of symptoms. And despite these being termed “symptoms”, there are those who put their unique characteristics to good use – for example, channelling an intense interest in one topic into constantly gaining knowledge about that field.

Currently, there is no proven medical test to diagnose autism. For children, a series of specialists – such as a developmental paediatrician, a child neurologist and a child psychologist – have to be seen before a proper diagnosis is made. As this is neither the most reliable nor the most accurate way to diagnose autism, scientists at the University of Warwick have been conducting research on a revolutionary blood and urine test for children. This has the potential to uncover the medical basis of the condition, allowing patients to receive appropriate treatment much earlier on.

The scientists collected blood and urine samples from thirty-eight children with autism spectrum disorder (ASD) and thirty-one controls. Upon analysing the samples, they found that those with ASD had increased levels of advanced glycation endproducts (AGEs) and of the oxidative damage marker dityrosine (DT) in plasma proteins, compared with controls. The research also confirmed other hypotheses, including that mutations of amino acid transporters are common in those with ASD. These chemical differences between controls and those with ASD suggest that there is a way to medically test for autism, which could help hundreds of thousands of those affected to be properly diagnosed. The next step for the research team at Warwick is to recruit more children in order to further improve the test’s diagnostic performance. Hopefully, once this step is complete, we will see the test implemented in clinics and hospitals worldwide.

This test not only reveals whether a person potentially has ASD – the research behind it may also uncover new causative factors. Research so far has shown that 30–35% of ASD cases are caused by genetic factors, and the remaining 65–70% are thought to be caused by a combination of genetic variants, multiple mutations and environmental factors. With this new research, there is hope that new causes can be identified, furthering our understanding of the condition and giving us the tools to address it efficiently and appropriately.

How likely is a zombie apocalypse? – Emma Hazelwood

Zombies have been limping around Hollywood for almost 100 years. They’re the stars of popular TV series, movies and video games. The US Centers for Disease Control and Prevention even gives information on how to survive a zombie invasion. So, how likely is a real-life zombie apocalypse?

In theory, a virus could evolve that induces rage and the need to eat human flesh. Scientists have even identified a nerve which could be targeted by a zombie virus – the olfactory nerve. This leads to parts of the brain affecting hunger, emotions, memory and morality. By attacking this nerve, a virus could produce hungry, aggressive, brain-dead victims who can’t recognise their own family and friends, and have no control over their bodies other than the urge to feed.

In fact, several viruses already exist which seem to take the first step towards zombifying people. The rabies virus, which causes violent movements, anxiety, hallucinations and aggressiveness, is even transmitted through biting. However, fewer than three people a year die of rabies in the US, so it’s hardly apocalyptic. This could be because it isn’t transmitted between humans, other than in a few transplant cases. There was one instance of a rabid kiss, but that’s hardly the stuff of 28 Days Later. The reason animals bite and transmit the disease is that they are confused, so they turn to their natural defences.

It might not be a virus that causes the zombie apocalypse, but a parasite (as seen in Resident Evil IV). Parasites are capable of entering the brain and altering behaviour. Toxoplasma gondii is a microbe which infects rats. To reproduce, the parasite (and the rat) must be eaten by a cat. To maximise the chances of this happening, T. gondii actually changes the rat’s perception of cats: instead of being afraid of them, infected rats seek them out. There has even been speculation about weaponising such parasites!

Which brings us to the idea that maybe nature isn’t the serial killer – maybe the human race will be its own demise. A plausible method that scientists (or a supervillain) could use to create a zombie army is nanobots. Within decades, we may have nanobots capable of crawling inside our heads to repair neural connections. If the host died, these tiny robots could be capable of keeping parts of the brain alive – specifically, the parts for motor function and the desire to feed. They might even be able to reprogram the brain to bite surrounding humans, in order to be transmitted to another host.

So, scary as it is, a human with zombie-like characteristics is possible. However, one key feature of zombies is the fact that they are dead – or, rather, undead. There are instances of people being declared dead, then waking up. Clairvius Narcisse from Haiti (where zombie myths originated) was declared dead and buried in 1962. He was found wandering around town, alive and well, 18 years later. Apparently, local Voodoo priests were using blowfish poison to zombify their workers. The poison slows all bodily functions to the point of the person being considered medically dead. When victims wake up, they are in a trance-like state, capable of tasks like eating and sleeping, but with no emotional connection to the world.

So if someone could “die”, then wake up with a zombie virus – we could, theoretically, one day have “zombies”. But would this actually lead to an apocalypse?

Probably not – firstly, zombies aren’t that well adapted to being active predators. With open wounds and rapidly decaying flesh, they wouldn’t fare well out in the sun, especially with a diet consisting solely of human flesh. They would also be more sensitive to cold – limited blood and fluids mean frostbite would easily set in, and that’s not even mentioning their lack of an immune system.

They not only have weak bodies, but also lack the intelligence that sets humans apart. This means they wouldn’t be able to communicate a joint attack or problem solve in any way. They wouldn’t be able to drive; they probably wouldn’t even be able to use a door handle.

If Hollywood is to be believed, the zombie plague can only be transmitted through being bitten by someone with the infection. This is obviously problematic – have you ever tried to bite through denim? All we’d have to do is wear clothes and we’d be protected. Plus, their only food source is the world’s top predator.

In today’s world, as soon as there was an outbreak a video of it would be online, easily warning people to just stay inside. In films, somehow the world’s army is overthrown by the infected (even though the untrained civilian protagonist seems to take out whole swarms). This just seems unrealistic – nothing about zombies would make them bulletproof.  Also, even if they were superior to us, they could never wipe us out completely or even retreat, as we’re their only prey.

In conclusion, a zombie apocalypse is possible. However, it’s so unlikely that the human race will probably have died out before it ever happens.