Learning from our ancestors – how early humans worked together to survive a changing climate – Emily Farrell

If Yellowstone were to erupt tomorrow, America might not make it through the night. Yellowstone is a supervolcano that erupts roughly every 650,000 years; the last eruption was 640,000 years ago. So while it is not “overdue” for an eruption, as some conspiracy theorists believe, there is one on the way, and it could spell disaster for the continent. Last time, 1,000 cubic km of rock, dust and volcanic ash were blown into the sky, blocking sunlight from reaching plants, catastrophically polluting the air and drastically changing the climate. That was a disaster for the animals living there at the time, and a future eruption could be one again.

But would this mean the end for humankind? Not if we follow in the footsteps of our ancestors.

Around 40,000 years ago, southern Italy had its own super-eruption in the volcanic Phlegraean Fields, and archaeologists have been studying a site in Liguria to see how humans were affected by it. Humans had only been in the area for about 1,000 years before the event occurred. It would have changed their climate, and possibly other aspects of their lives, such as the food available and the quality of the air and water.

Researchers believe that this change in climate is what drove the Neanderthals out of the area. Current theories suggest that they were not especially adaptable and would not have survived well in a suddenly altered environment.

But regardless of how well Neanderthals coped, some humans survived and even flourished in these conditions. Their tactic, it appears, was to maintain links between groups. The evidence for this comes from the Italian site: tools, ornaments and human remains from an ancient rock shelter were analysed, and some of the flint turned out to have come from hundreds of kilometres away. Such a network would mean that knowledge of how to cope in different situations and habitats was shared between groups. When the climate did change, whether due to a super-eruption or other conditions, the information on how to survive in an unfamiliar environment would already be available.

We can apply this theory to our modern-day communities. By learning from each other, we can share knowledge of how to cope with changes in our climate. Globalisation has increased our capacity for this: instead of hundreds of kilometres, we can draw on networks that span the world. We can learn how to build houses on the water from the Pacific Islands, and how to make the most of a limited water supply from Singapore. Why spend time creating novel solutions when the perfect one may already be in place somewhere else on the globe?

If a super-eruption occurs and dramatically changes our climate, or even if we continue to change the climate ourselves, we will need to adapt to make our lives sustainable and to endure the changes. By networking and sharing our knowledge, we can follow the example of our ancestors and survive whatever this world throws at us.

Why do we procrastinate? Emily Farrell

Everyone procrastinates. No one wants to write that essay or clean the bathroom. If it’s not food, sex or sleep, your body is just not interested. Sure, in the long run you might need to write that essay, to get that degree, to get that job, to earn money to buy food to survive. But your body doesn’t understand, or care about, any of that. Your body was made in simpler times. It is built for an age when survival entailed going off to pick some plants to eat, doing some reproducing, and maybe having a bit of sleep afterwards. Modern, Western lifestyles are a horrible mismatch for this way of living.

Imagine giving a caveman a long, boring task to do, such as moving numbers from one column to another (maybe with sticks; it could take a while to explain the concept of computers). Why should he do it? He gets no food from it. He gets no joy from it. Doing this task does not make him any more attractive to cavewomen who might then want to have his babies. It takes a reasonable amount of energy that would be better spent on other labours. So why should he do it? To him, the answer is that he shouldn’t. And this is the thought process your brain goes through when faced with a task. While the conscious parts of your brain know the real reason for the task, the ancient parts, which we share with our ancestors and other animals, do not.

Think about it. How do you procrastinate? Making a snack (means you won’t starve to death)? Taking a nap (means you won’t be too tired to see the tiger of death headed your way)? Talking to friends (maintaining social bonds which one day might lead to you making tiny replicas of yourself via someone else’s genitals)? Watching cat videos? (Evolution can’t explain the internet, but taking joy from something which costs you none of the resources you may have gained from the other tasks means your body agrees to it.)

Cleaning your own room is therapeutic: it has actually been shown to improve your mood, both while you’re doing it and afterwards, when you’re in your nice clean room. But when it comes to the gross shared bathroom every uni student has encountered, you put it off for longer. You procrastinate away from it. This is because you gain no real benefit from cleaning it. It’s not dirty enough to give you diseases (yet), and you don’t spend enough time in it for cleanliness to benefit your mental health. If you can’t see an immediate advantage, you won’t do it.

Procrastination is all about cost and benefit, and the balance between the two. If the immediate payout does not equal or outweigh the energy required to perform the task, the inclination to do it disappears.
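For the playfully minded, this cost-benefit balance can be written down as a toy model. The tasks, benefit scores and energy costs below are purely illustrative inventions, not figures from any study:

```python
# Toy model of the cost-benefit trade-off behind procrastination.
# All benefit and cost values are made up for illustration.

def feels_worth_it(immediate_benefit: int, energy_cost: int) -> bool:
    """The inner caveman's verdict: act only if the immediate
    payout equals or outweighs the energy required."""
    return immediate_benefit >= energy_cost

tasks = {
    "make a snack": (8, 2),      # food now: high payout, low effort
    "take a nap": (6, 1),        # rest now: easy win
    "write that essay": (1, 9),  # the payoff is years away, so it scores low
}

for task, (benefit, cost) in tasks.items():
    verdict = "do it" if feels_worth_it(benefit, cost) else "procrastinate"
    print(f"{task}: {verdict}")
```

The essay loses not because it has no value, but because its value is too distant to register as an immediate benefit, which is exactly the mismatch the article describes.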

Think about this the next time you put something off and do something else instead. Would the thing you are putting off benefit a caveman? Would he benefit from what you are doing now? But don’t listen to your inner caveman. Listen to your inner modern human, who wants that essay done, because they know you really need to do it. Don’t let them in only at the last second to write it. Go and do something productive! Go!

Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences – if you haven’t seen it, I highly recommend giving it a watch!

Yet we may be closer to the futuristic world of Black Mirror than we think. Around the world, people are pushing the gruesome boundaries of how far we integrate tech into our lives through a series of implants and body modifications. This is a branch of biohacking – a blanket term for a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as grinders, or transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of the skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This makes nearby magnetic and electrical fields – along with their strength and shape – detectable to the user, thanks to the subtle currents they induce. As a party trick, the person can also pick up metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead, it is carried out either at home, with DIY kits purchased online, or in piercing shops – but without anaesthetic, which requires a licence. If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums such as biohack.me.

Britain is the birthplace of grinders. In 1998, Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights in his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payments and clothing tags, and Motorola is developing an RFID-activated ‘password pill’ that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by the Biohack.me offshoot company Grindhouse Wetware, is another implantable device. It constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential for such a device is vast, and of these projects it has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat within a rounded titanium shell, implanted under the skin. An ultra-thin whisker juts out, which is activated when the user faces north, lightly brushing an alert against the underside of the skin.

Rich Lee, a star of the biohack.me forum, has magnets embedded in each ear so he can listen to music through them, via a wire coil worn around his neck that converts sound into electromagnetic fields – the first ‘internal headphones’. Connected to different sensors, the implants also let him ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals. There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it’s unnatural. For most people, especially those who benefit from medical advances like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognise the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognised as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement offers a solution to this problem as part of its central ethos: open-source hardware and free access to information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind’s evolution, one that may bring many medical advances. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s disease and other conditions, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, designed to predict dangerous episodes and to stimulate weakened neurons. The US military research agency DARPA is also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is also potential to help visually and hearing-impaired people with a chip that translates words and distances into sound, which could mean the end of Braille and white canes. Neil Harbisson, founder of the non-profit Cyborg Foundation in Barcelona, was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg: a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.

Face Blindness – what is it (and does it actually exist)? Gege Li

What’s the first physical thing you notice about someone when you meet them for the first time? Is it whether they’re male or female? Maybe it’s their eye colour or freckles. Their unusually-shaped nose, even. Whatever it may be, the combination of these unique traits is what we use to recognise the important people in our life, as well as all the others we know and encounter in between.

So imagine what it would be like if you couldn’t use characteristics like these to distinguish between your mum and your best friend, or your old high-school teacher and your postman. That’s exactly what sufferers of face blindness – also known by its fancier name, ‘prosopagnosia’ – must endure for most, if not all, of their lives (the condition is usually present from birth).

The inability to recognise people by their face alone affects approximately two in every hundred people in the UK. Sufferers may fail to judge a person’s age, gender or emotional expression from their face, or to spot similarities and differences between two different faces.

Unsurprisingly, this can have a profound impact on the behaviour and even mental health of sufferers. Although many cope by using alternative strategies such as a person’s distinctive voice, hairstyle or way of walking as identifiers, for others this doesn’t always work, especially if their acquaintance has recently changed their appearance. What are your options when even secondary clues become impossible to spot?

For some, the condition will cause them to deliberately avoid social interactions altogether, which can lead to relationship and career problems, bouts of depression and, in extreme cases, the development of social anxiety disorder. The latter may prevent a person from even leaving their house for overwhelming fear of social situations and embarrassment.

And to make matters worse, it isn’t only other people that someone with face blindness might not recognise. Objects, including places, cars and animals, also present difficulties, particularly for navigation and memory – even their own face staring back at them in the mirror could be an alien sight.

However, any avid watcher of Arrested Development might find themselves questioning the legitimacy, or even the existence, of face blindness as it’s played out through the eccentric and often ridiculous character of Marky Bark. On the show, Marky’s face blindness, exaggerated at times to the point of unbelievability, doesn’t appear to cause him half the inconvenience or trauma that it can in reality. Though viewers are made aware of the condition and its characteristics, does the show’s overtly comedic framing promote public health misconceptions while hindering an educational message?

It is important to establish that face blindness really does exist and is much less a quirky trait than a cognitive impairment with potentially serious social and emotional consequences. But what causes it, exactly? Although it can develop following brain damage (from a particularly nasty knock to the head, for example), it has become increasingly clear that ‘developmental prosopagnosia’, where people simply never develop facial recognition, is the most common form. In fact, as many as one in fifty people might have it, equating to a potential one and a half million sufferers in the UK.

What’s more, many sufferers have reported a parent or sibling experiencing the same kinds of difficulties, so it’s likely that genetics plays a role in the occurrence of face blindness within families.

With all the trouble that face blindness can cause for someone’s livelihood and even health, it might be reassuring to know that there are ways to determine whether you have the condition.

Often, doctors will use computer-based tests that require people to memorise and identify a set of faces, including those of celebrities. Most recently, in 2015, a partnership between several doctors and universities in London developed a new questionnaire that can aid both the diagnosis of face blindness and the measurement of its severity.

The inevitable bad news, however, is that there isn’t currently a way to treat face blindness directly other than to help sufferers improve their facial recognition with training and rehabilitation programmes.

There’s still hope for the future, though – prosopagnosia research under way at Bournemouth University has hinted at using pharmaceuticals to intervene temporarily in face blindness, with some success. Once these techniques have been developed further, a successful cure or therapy might not be too far off.

In the meantime though, if you’re able to recognise a familiar face, why not consider taking the time to appreciate the people close to you (especially their mug) just that little bit more? Don’t take face perception for granted – you’re luckier than you think…



The Science of a Happy Marriage – Ciara Barrett

“Marriage is like a deck of cards: you start with two hearts and a diamond, and by the end you wish you had a club and a spade.” If my future spouse ever says that about me, you can bet they’ll be right. Recent statistics show that 50% of marriages in the United States end in divorce, while the figure for first marriages across the rest of the world is 41%. Couples without children are 40% less likely to divorce, and the average age at divorce is 30. These are some scary statistics, and as someone who doesn’t believe in soulmates, they only reinforce my personal scepticism about marriage.

Marriage is said to come in five stages which most couples will experience: romance, disillusionment, power struggle, awakening, then long-term marriage. Romance is the fluttery-stomach, heart-eyes, blissfully ignorant stage – the honeymoon phase. Couples have high pheromone levels and experience increased oxytocin, a hormone which lets them ignore each other’s irritating behavioural traits. Disillusionment is when this fades and they see each other for who they really are: the hormones wear off, they want to spend more time apart, and their flaws become visible. Couples who got married in the romance phase are likely to have second thoughts now. The misleadingly named power struggle is when couples try to revert to who they were during the romance phase to try to “fix” the relationship. They want to spend more time with friends and family, but may become jealous of their partner doing the same. The awakening phase is the resolution of this: the couple realise they need to give each other space, accept each other for who they are, flaws and all, and reclaim some of their individuality. This leads into the long-term marriage phase, where they resolve conflicts easily and are very comfortable with each other.

During these stages, there are little things you can do to keep the day to day marriage alive and studies show there are some sure-fire handy tips for a happy relationship.

The number one tip might be surprising: it’s how you react to your partner’s good news. If they just got a promotion, or the package they ordered has arrived and they’re delighted, mirroring your significant other’s reaction to a positive situation has more of an impact on the relationship than being a shoulder to cry on during a tough time. Of course, both are important, but your response to a positive situation can make them feel so much better, and it shows you care about their successes as well as their sorrows.

Next is the 5:1 rule: for every bad interaction you have, there must be five good ones to balance it out. Every time you fight over the remote or make a snide comment about their cooking, there need to be at least five shared moments of laughter, dinner dates or meaningful compliments to make up for it.
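As a bit of fun, the 5:1 rule is simple enough to write down as a check. Only the ratio itself comes from the rule above; the function and example counts are purely illustrative:

```python
# A light-hearted check of the 5:1 rule: at least five positive
# interactions for every negative one keeps the balance healthy.

def balance_ok(positive: int, negative: int) -> bool:
    """True if positive interactions outnumber negative ones 5:1 or better."""
    if negative == 0:
        return True  # no bad interactions: nothing to balance out
    return positive >= 5 * negative

print(balance_ok(25, 5))  # five good moments per fight: just enough
print(balance_ok(4, 1))   # one snide comment, only four laughs: in the red
```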

An often forgotten piece of advice is to stay close to family and friends. Don’t rely on your significant other for all your happy moments; they should be a big part of your life, not all of it. Remember to see your other circles and get emotional fulfilment from them – after all, they were probably in your life before your partner was.

Keep it exciting! This applies both in and out of the sex department (which should be happening regularly anyway): keep going on dinner dates, buying flowers and having movie nights in, because spending time together should always be fun. When it stops being fun, you need to seriously consider why you’re in the relationship and how you can fix it. Remember to talk to your partner, and don’t let any of your happy memories become tainted. Couples who reminisce about shared moments of laughter report more emotional satisfaction than those who merely have good experiences. So laugh with them, be stupid and funny, and don’t ever let it get boring.

On the other hand, there are some serious red flags to avoid: criticism, defensiveness, contempt and stonewalling. These are when your partner attacks you as a person rather than the individual mistake you made, whether during an argument or in daily life. Watch out for eye-rolling, accusatory use of the word “you”, and the silent treatment. If and when this happens, it is important to talk it out, and to be prepared to take a break and move through the stages mentioned above. Remember that everyone has bad days, and don’t attack back even when they’re hurting you.

Any marriage sceptic will say that all this is easier said than done, which isn’t wrong. But couples fortunate enough to want to get married will want to make it last, and they can at least now use the science to their advantage. Love is but a chemical reaction, after all.

New Test Can Detect Autism in Children – Keerthana Balamurugan

Autism is a condition that stays with a person throughout their entire life, as there is no cure; in some cases it can be managed, to an extent, only with support from specialised behavioural psychologists and therapists. It is a developmental disability that shapes how a person communicates with other people and the world, affecting their communication skills and relationships. Autism affects 1 in every 100 people in the United Kingdom alone, so numerous organisations, universities and hospitals conduct research on the condition, and small breakthroughs constantly occur. The latest research describes one such breakthrough: scientists might have found a new test to detect autism in children, and also some of its causative factors.

Autism is usually present from early childhood, so parents are most likely the first to notice differences between their child and other children. Common symptoms include avoiding eye contact, preferring a familiar routine, hyperactivity and anxiety, among many others. As autism is a spectrum, each person affected has a different set of symptoms, but the common characteristics are difficulties with social communication and interaction, and repetitive patterns of behaviour or interests. Autism affects a person in so many ways that it is hard to set a fixed list of symptoms. And despite these being termed “symptoms”, there are those who put their unique characteristics to good use – for example, channelling an intense interest in one topic into a constant drive to gain knowledge of that field.

Currently, there is no proven medical test to diagnose autism. For children, a series of specialists has to be seen before a proper diagnosis is made, such as a developmental paediatrician, a child neurologist and a child psychologist. As this is neither the simplest nor the most accurate way to diagnose autism, scientists at the University of Warwick have been conducting research on a revolutionary blood and urine test for children. This has the potential to uncover the medical basis of the condition, allowing patients to receive appropriate treatment much earlier.

The scientists collected blood and urine samples from thirty-eight children with autism spectrum disorder (ASD) and thirty-one controls. Upon analysing the samples, they found that those with ASD had increased levels of advanced glycation endproducts (AGEs) and of the oxidative damage marker dityrosine (DT) in plasma proteins, with respect to controls. The research also confirmed other hypotheses, including that mutations of amino acid transporters are common in those with ASD. These chemical differences between controls and those with ASD show that there is a way to test medically for autism, which could help hundreds of thousands of those affected to be properly diagnosed. The next step for the research team at Warwick is to recruit more children in order to further improve the test’s diagnostic performance. Hopefully, after this step is complete, we will see the test implemented in clinics and hospitals worldwide.

This test does not only reveal whether a person potentially has ASD; the research behind it may also uncover new causative factors. Research so far has shown that 30-35% of ASD cases are caused by genetic factors, while the remaining 65-70% are thought to result from a combination of genetic variants, multiple mutations and environmental factors. With this new work, there is hope that new causes can be identified, furthering our understanding of the condition and giving us the tools to address it efficiently and appropriately.

How likely is a zombie apocalypse? Emma Hazelwood

Zombies have been limping around Hollywood for almost 100 years. They’re the stars of popular TV series, movies and video games. The US Centers for Disease Control and Prevention even gives advice on how to survive a zombie invasion. So, how likely is a real-life zombie apocalypse?

In theory, a virus could evolve that induces rage and the need to eat human flesh. Scientists have even identified a route a zombie virus could take into the brain – the olfactory nerve. This nerve leads to parts of the brain governing hunger, emotions, memory and morality. By attacking it, a virus could produce hungry, aggressive, brain-dead victims who can’t recognise their own family and friends, and who have no control over their bodies other than the urge to feed.

In fact, several viruses already exist which seem to take the first step towards zombifying people. The rabies virus, which causes violent movements, anxiety, hallucinations and aggressiveness, is even transmitted through biting. However, fewer than three people a year die of rabies in the US, so it’s hardly apocalyptic. This could be because it isn’t transmitted between humans, other than in a few transplant cases. There was one instance of a rabid kiss, but that’s hardly the stuff of 28 Days Later. The reason animals bite and transmit the disease is that they are confused, and so turn to their natural defences.

It might not be a virus that causes the zombie apocalypse, but a parasite (as seen in Resident Evil 4). Parasites are capable of entering the brain and altering behaviour. Toxoplasma gondii is a microbe which infects rats. To reproduce, the parasite (and the rat) must be eaten by a cat. To maximise the chances of this happening, T. gondii actually changes the rat’s perception of cats: instead of being afraid of them, infected rats seek them out. It has even been suggested that such bugs could be weaponised for use in wars!

Which brings us to the idea that maybe nature isn’t the serial killer – maybe the human race will be its own demise. A plausible method that scientists (or a supervillain) could use to create a zombie army is nanobots. Some predict that within a decade we will have nanobots capable of crawling inside our heads to repair neural connections. If the host died, these tiny robots might be capable of keeping parts of the brain alive – specifically, the parts for motor function and the desire to feed. They might even be able to reprogram the brain to bite surrounding humans, in order to be transmitted to another host.

So, scary as it is, a human with zombie-like characteristics is possible. However, one key feature of zombies is the fact that they are dead or, rather, undead. There are instances of people being declared dead, then waking up. Clairvius Narcisse, from Haiti (where zombie myths originated), was declared dead and buried in 1962. He was found wandering around town, alive and well, 18 years later. Apparently, local Voodoo priests were using pufferfish (blowfish) toxin to zombify their workers. The poison slows all bodily functions to the point of the victim being considered medically dead. When victims wake up, they are in a trance-like state, capable of tasks like eating and sleeping, but with no emotional connection to the world.

So if someone could “die”, then wake up with a zombie virus, we could, theoretically, one day have “zombies”. But would this actually lead to an apocalypse?

Probably not. Firstly, zombies aren’t that well adapted to being active predators. With open wounds and rapidly decaying flesh, they wouldn’t be able to go out in the sun, especially with a diet consisting solely of human flesh. They would also be more sensitive to cold – limited blood and fluids mean frostbite would easily set in – and that’s not even mentioning their lack of an immune system.

They not only have weak bodies but also lack the intelligence that sets humans apart. This means they wouldn’t be able to coordinate a joint attack or problem-solve in any way. They wouldn’t be able to drive; they probably wouldn’t even be able to use a door handle.

If Hollywood is to be believed, the zombie plague can only be transmitted through being bitten by someone with the infection. This is obviously problematic – have you ever tried to bite through denim? All we’d have to do is wear clothes and we’d be protected. Plus, their only food source is the world’s top predator.

In today’s world, as soon as there was an outbreak, a video of it would be online, warning people to just stay inside. In films, the world’s armies are somehow overthrown by the infected (even though the untrained civilian protagonist seems to take out whole swarms). This just seems unrealistic – nothing about zombies would make them bulletproof. Also, even if they were superior to us, they could never wipe us out completely, or even retreat, as we’re their only prey.

In conclusion, a zombie apocalypse is possible. However, it’s so unlikely that the human race will die off before it ever happens anyway.