Running around like a headless… Pig? Hundreds of pig brains kept alive after decapitation – Rachel Jones

On March 28th, at a National Institutes of Health meeting on ethics in US neuroscience, Yale neuroscience professor Nenad Sestan announced that his team had experimented on 100 to 200 brains from pigs decapitated at slaughterhouses, keeping the organs alive with heaters and pumps that circulated artificial blood through them. Billions of cells in these brains were found to be healthy and capable of normal activity, despite the decapitation. This is the first reported success in keeping the brains of large mammals alive outside the body without using cooling.

Sestan proposed that the brains could be used as models for treating diseases such as cancer and Alzheimer's disease, informing therapies for humans, since models with large amounts of intact brain are needed to see the full effect of treatments. The research was initially funded to help produce an atlas of the brain, as the brain's connections are not yet well understood. Seventeen neuroscientists and bioethicists, including Sestan, published a Nature article in April 2018 proposing methods to ensure that human brain tissue harvested using these techniques is not conscious during experimentation (experimenting on live human brain tissue is ethically complex because it is potentially conscious, making testing and termination of samples problematic). Suggestions included producing small amounts of nervous tissue, known as organoids, that lack the capacity for consciousness; preserving living human brain tissue removed in surgery; and inserting human brain tissue into mice.

The immediate media response to the news focused on the possibility of using the method in human brain transplants, to keep the tissue alive between bodies. Since live brain maintenance has only been demonstrated in pigs, we cannot assume it can be done in humans, but Sestan claims that the techniques could apply to other species. In 1970 a head transplant between rhesus monkeys by Robert White produced a recipient that survived for eight days, yet the method was abandoned because the spinal cords could not be connected. Italian neurosurgeon Sergio Canavero announced in November 2017 that his colleague Xiaoping Ren had transplanted the head of one human cadaver onto another, as a 'rehearsal' for a live human head transplant. Canavero and Ren have previously experimented with transplanting live rat, mouse, dog and primate heads. They predicted that their first attempt at a live human head transplant would take place in late 2017, but have since revised their estimate to 'imminent'. Canavero also claims to have a method for connecting the spinal cords.

However, Canavero and Ren have only performed human head transplants on cadavers and live head transplants on animals, so they cannot claim to be capable of performing live human head transplants. Arthur Caplan, head of medical ethics at the New York University School of Medicine, does not believe that Canavero will ever receive the ethical go-ahead, and suggests that Canavero is conducting his research in China because its ethics laws are more relaxed than those of the US and Europe. Professor Sestan notes that there is no evidence the pig brains could regain consciousness if transplanted into another pig, and says he has no intention of using this technology to test brain transplantation. Tests suggested that the brains were not conscious, although this negative result may be due to the chemicals used to prevent swelling, which also suppress neuronal signals. Steve Hyman, director of psychiatric research at the Broad Institute in Cambridge, Massachusetts, has said that brain transplants are "not remotely possible", criticising the idea that we could treat the brain in the same way as organs that are routinely transplanted.

Professor Sestan has declined to comment on his findings, as the research is yet to be published and he had not wished for the news to become public before publication. This means we do not know whether the research will stand up to the rigorous scrutiny of a journal's peer review. If it does, however, it seems we may be conducting research on whole pig brains in the future, mapping brain cell connections and testing drugs and therapies for human brains in pig brains. Testing on human brains raises concerns including consent, the definition of death and ownership of living human brains, but using pig brains to inform us about human disease largely avoids these issues, so long as unnecessary suffering is not inflicted on the animal. As for using these pig brains to study human head transplants, historical experiments show that head transplants are possible in a number of animals, but even if brain transplants to lengthen human life become possible, they will not be an option any time soon, given the huge range of ethical concerns, the lack of evidence of consciousness and the loss of spinal cord connection.

Is sitting too close to the TV really that bad for you? Ciara Barrett

"Watching too much TV will give you square eyes!" Imagine a few brothers and sisters sat in front of a classic 1960s rectangular box television, glued to their favourite afternoon show so they don't miss a moment. Their mum walks in and exclaims this phrase, completely in vain. Did they all grow up to need glasses?

The phrase originates from certain 1960s models of TV that were found to emit radiation at 100,000 times the safe rate, so at that time sitting too close to the TV really was a health hazard, though for a completely different reason than expected. The TVs were quickly recalled.

Overall, children are better at focusing on close objects than adults, so they are more likely to sit close to the TV or hold books near their faces, but they should grow out of this, unless, of course, there is underlying short-sightedness.

Staring at a screen less than 40 cm from your eyes is known as 'near work', and most studies show that near work doesn't usually cause permanent harm to our eyes (although links between near work and short-sightedness are being investigated), which is fortunate given the number of screens we're surrounded by today. However, it can cause fatigue and eyestrain. Eyestrain is something most people have experienced, see: submitting the final draft of a report you've been working on for the last five hours and your head, neck and eyes hurt, but you need to finish before the 9am deadline. Or: it's the seventh episode of that show you watch, and the screen is too bright for this time of night, but you need to keep going to find out if she killed her fiancée. If you feel personally attacked by these scenarios, then congratulations, you've experienced eyestrain. You probably also know that it can be fixed by a good night's sleep.

Symptoms of eyestrain are a product of computer vision syndrome. This occurs when you look at a screen for too long and stop blinking enough, which affects tear flow and can in turn cause headaches, dry eyes and difficulty focusing. This isn't permanent damage and can be remedied by taking breaks, concentrating on something else and blinking more often. A few cases have been reported where extensive video game play or TV watching damaged the viewer's retina or cornea, but this is unlikely.

When you see something, light passes through the dome-shaped cornea and is focused by the lens onto the retina, the light-sensitive layer at the back of the eye. The ciliary muscle changes the shape of the lens to focus at the right distance and, like any muscle, can begin to ache if you hold it in one position for too long; combined with squinting at a bright screen, this causes the discomfort of eyestrain. Close focusing also stops us blinking as often as we need to, so the outer layer of the cornea dries out, causing foggy vision.

As mentioned, none of these symptoms are permanent, but they are better avoided. One way to remember this is the 20-20-20 rule: after every 20 minutes in front of a screen, look at something about 20 feet (roughly 6 metres) away for 20 seconds. Easy to forget during your 5am essay endeavours, but a good guideline nonetheless. Another useful measure is to get a good night's sleep (another possibly unachievable suggestion), which will help your eyes and your overall health in the long run. If all else fails, try changing the brightness, glare and text size on your screen. However, if you regularly need to sit closer to the screen, it could be a sign of short-sightedness.

The mothers from the 1960s who coined the phrase are going to need a stronger argument than square eyes, it seems, as the effects of sitting close to the TV or watching too much of it really aren’t that bad or permanent. Just don’t forget to blink.

 Further reading:

https://www.scientificamerican.com/article/earth-talk-tv-eyesight/

https://athome.readinghorizons.com/blog/why-sitting-too-close-to-the-television-makes-your-eyes-go-square

https://www.scientificamerican.com/article/is-sitting-too-close-to-screen-making-you-blind/

Why do we procrastinate? Emily Farrell

Everyone procrastinates. No one wants to write that essay, or clean the bathroom. If it's not food, sex or sleep, your body is just not interested. Sure, in the long run you might need to write that essay, to get that degree, to get that job, to earn money to buy food to survive. But your body doesn't understand, or care, about that. Your body was made for simpler times. It is built for when survival meant going off to pick some plants to eat, a bit of reproducing and maybe some sleep afterwards. But modern, Western lifestyles are a horrible mismatch for this way of living. Imagine giving a caveman a long, boring task such as moving numbers from one column to another (maybe with sticks; it could take a while to explain the concept of computers). Why should he do it? He gets no food from it. He gets no joy from it. Doing this task does not make him any more attractive to cavewomen who might then want to have his babies. It takes a reasonable amount of energy that is better spent on other labours. So why should he do it? To him, the answer is that he shouldn't. And this is the thought process your brain goes through when faced with a task. While the conscious parts of your brain know the real reason for the task, the ancient parts of the brain, which we share with our ancestors and other animals, do not.

Think about it. How do you procrastinate? Making a snack? (Means you won't starve to death.) Taking a nap? (Means you won't be too tired to see the tiger of death headed your way.) Talking to friends? (Maintaining social bonds, which one day might lead to you making tiny replicas of yourself via someone else's genitals.) Watching cat videos? (Evolution can't explain the internet, but something that brings joy without using up any of the resources you might have gained from the other tasks is something your body will happily agree to.)

Cleaning your own room is therapeutic and has actually been shown to improve your mood, both while you're doing it and afterwards, when you're in your nice clean room. But when it comes to the gross shared bathroom every uni student has encountered, you put it off for longer. You procrastinate away from it. This is because you gain no real benefit from it. It's not dirty enough to give you diseases (yet), and you don't spend enough time in it for it to benefit your mental health. If you can't see an immediate advantage, you won't do it.

Procrastination is all about cost and benefit and finding the balance between the two. If the immediate payout does not equal or outweigh the energy expenditure required to perform the task, then the inclination to do it will disappear.

Think about this the next time you put something off and do something else instead. Would what you are putting off benefit a caveman? Would he benefit by doing what you are doing now? But don’t listen to your inner caveman. Listen to your inner modern human who wants that essay done, because they know that you really need to do it. Don’t let them in only at the last second to write it. Go and do something productive! Go!

237 Million Medication Errors Occur in NHS England Annually – an Interview with Researcher Fiona Campbell by Emma Hazelwood

A recent report revealed that 237 million medication errors occur in NHS England annually. The study found that these mistakes cause 712 deaths, could be a contributory factor in thousands more, and are estimated to cost the NHS £98.5 million a year.

Fiona Campbell, a research fellow at the University of Sheffield, was involved with the study. She met up with pH7 writer Emma Hazelwood to provide some more information on the report.

How did the project come about?

The team at the School of Health and Related Research, which included Marrissa Martyn-St James and Eva Kaltenthaler, were asked by the Department of Health to look at how prevalent medication errors are in the NHS, and to estimate their cost. The project was a collaboration between researchers in Sheffield, Manchester and York, with the team at Sheffield identifying and synthesising the relevant literature.

How were the figures calculated?

Studies have measured medication errors in many different ways – some look at past prescribing practices, others at adverse drug events (ADEs). The threshold for counting an error is very low – the figure of 237 million includes even the smallest errors.

What were the limitations of the study?

As with any study, there were some limitations. First, there was a strict time limit set by the Department of Health – the team at Sheffield had about six weeks to analyse a mammoth amount of data. Second, calculating medication error rates is complicated, for several reasons: definitions of an error differ between studies, and sometimes no one realises an error has been made, so it goes unrecorded. There are also ethical implications of studying medication errors – if researchers spot an error, they may feel a moral obligation to stop it before it results in harm to a patient, which makes it difficult to calculate what the impact of these mistakes would have been. In this study, some data goes back as far as ten years, and our healthcare system may have changed since then.

Is it a serious problem?

Considering that there are only around 56 million people in England, the figure of 237 million medication errors per year seems shocking. However, what is lost in this figure is that over a billion prescriptions are issued every year. Furthermore, the threshold for an error is very low – even if one is noticed by healthcare professionals and stopped before it reaches the patient, it is still included in these calculations. Of course, some catastrophic errors result in severe patient harm, or even death, but not all – in fact, three out of four result in no harm. Having 237 million medication errors does not mean that people have taken the wrong medication 237 million times. Although it is estimated that these errors are a contributory factor in 1,700 – 22,303 deaths a year, the true figure is most likely at the lower end of this range. Again, the threshold is very low – if someone dies and a medication error was made, even if it is unlikely to have been related to their death, it must be recorded as a potential contributory factor.

Although the errors result in hundreds of deaths and cost the NHS £98.5 million per year, it seems that we are no worse than anybody else. In many countries, errors are not even recorded, and where they are, rates are similar to those found in this study. The fact that the team was able to undertake this project could be seen as a sign of the NHS's commitment to transparency, and of its determination to reduce these errors.

What are the possible improvements for the NHS?

In order to stop these errors, we must continue to be vigilant in recording them. We rely on healthcare professionals to record their own mistakes, so it is vital that there is not an environment of guilt and shame. There are currently trials seeking to reduce error rates, in particular researching where errors occur and testing new systems for flagging them up. There are already several different checks within the NHS, and for an error to reach the patient it has to slip through every one of them. The report supports more funding for research into what we can do to reduce medication errors.

What was the impact of the study?

This study has attracted a great deal of media attention, from BBC News to Radio 4. Studies such as this highlight the role scientists have in discussing research and making it accessible to the public, without allowing it to be used as a political football.

Overall, it's clear that medication errors are prevalent in our healthcare system. On occasion they have devastating effects, and this quantification of the errors is shocking. That said, we can see that our system has a good rate of preventing these errors from reaching patients, and the fact that studies and trials are taking place shows that the problem could be improved dramatically over the coming years.

 

Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who's watched the futuristic Netflix show 'Black Mirror' will know how emerging technology, and our reliance on it, can have unanticipated consequences. If you have not seen it, I highly recommend giving it a watch!

Yet we might be closer to the futuristic world of Black Mirror than you think. Around the world, people are pushing the gruesome boundaries of how far we integrate technology with our lives, through a series of implants and body modifications. This is a branch of biohacking – a blanket term for a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as grinders or transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of the skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This allows the user to sense nearby magnetic and electric fields, along with their strength and shape, thanks to the subtle currents they induce. As a party trick, the person can also pick up small metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead it is carried out either at home, with DIY kits purchased online, or in piercing shops, but without anaesthetic (which you need a licence to administer). If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums such as biohack.me.

Britain is the birthplace of grinders: in 1998 Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment he called Project Cyborg. The chip didn't do much – it mainly tracked him around the university and turned on the lights in his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world's first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless payment cards and clothing tags, and Motorola has been developing an RFID-activated 'password pill' that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by the biohack.me offshoot company Grindhouse Wetware, is another implantable device. It constantly gathers the user's biometric data, for example transmitting temperature readings via Bluetooth. The medical potential for this device is vast, and it has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the 'Southpaw', has been invented. It works by sealing a miniature compass inside a silicone coat within a rounded titanium shell, which is implanted under the skin. An ultra-thin whisker juts out and is activated when the user faces north, lightly brushing an alert against the underside of the skin.

Rich Lee, a star of the biohack.me forum, has magnets embedded in each ear so he can listen to music through them, via a wire coil worn around his neck that converts sound into electromagnetic fields, creating the first 'internal headphones'. The implants can also be hooked up to different sensors, so he can 'hear' heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee's experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it's unnatural. For most people, especially those who benefit from medical advancements like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognise the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognised as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement offers a solution to this problem as part of its central ethos: open-source hardware and free access to information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind's evolution, one that may bring many medical advancements. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson's disease and other conditions, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer's patients carry a silicon chip on the brain itself, designed to predict dangerous episodes and stimulate weakened neurons. The US military research agency DARPA is also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing-impaired people with a chip that translates words and distances into sound, which could mean the end of Braille and white canes. Neil Harbisson, founder of the non-profit Cyborg Foundation in Barcelona, was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg, a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson "hears" colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.

 

Further reading:

https://www.techopedia.com/definition/29897/biohacking

http://www.abc.net.au/news/2017-02-23/biohackers-transhumanists-grinders-on-living-forever/8292790

http://www.slate.com/articles/technology/superman/2013/03/cyborgs_grinders_and_body_hackers_diy_tools_for_adding_sensory_perceptions.html

https://gizmodo.com/the-most-extreme-body-hacks-that-actually-change-your-p-1704056851

https://hackaday.com/2015/10/12/cyberpunk-yourself-body-modification-augmentation-and-grinders/

https://www.wired.com/story/hannes-wiedemann-grinders/

https://www.theverge.com/2012/8/8/3177438/cyborg-america-biohackers-grinders-body-hackers

http://edition.cnn.com/2014/04/08/tech/forget-wearable-tech-embeddable-implants/index.html

https://www.digitaltrends.com/cool-tech/coolest-biohacking-implants/

The Meat Industry: friend or foe? Keerthana Balamurugan

Meat has been, and still is, a universal ingredient in numerous societies, not to mention a major part of many traditions, but recent studies have found that meat consumption is slowly decreasing. Those who have turned vegan, or who now eat meat only in moderation, praise the countless health benefits. Eating less meat has also proved its value for the environment, as problems created by the meat industry diminish as it recedes. Counter-claims have also arisen, declaring that the trend is damaging the multi-billion-dollar meat industry and the economy. Where should we stand between the two sides?

Slowly replacing meat products with healthier options such as vegetables, whole grains and even seafood can alter your health immensely for the better. The World Health Organization (WHO) released a report linking the consumption of red meat with certain types of cancer, stating that eating just 100 grams of red meat daily can increase cancer risk by up to 20%. This statistic jolted people into awareness of the drawbacks. In certain countries, trying the vegan diet has become the new trend, with seemingly everyone raving about it on social media, as people caught wind of how replacing meat with healthier alternatives can aid weight loss. Currently more people suffer from obesity than from starvation, and nutritionists cite meat as one of the causes. From this perspective, consuming less meat would do us all a favour.

Even with such statistics backing up the claimed benefits of eating less meat, there are those who question it. If we remove meat from our diets, what happens to our bodies with the decreased protein and iron intake? One of the most common disadvantages of not eating enough meat is iron deficiency, which can drastically affect our immune system and how well our body functions. It cannot be denied that meat supplies a dense source of protein, but studies from Harvard Medical School suggest that a healthy diet of leafy greens, mushrooms, legumes and other iron-rich plant foods can easily compensate for the nutrients meat provides. It is simply a balancing act.

It comes as no surprise that the multi-billion-dollar meat industry is damaging our ecosystem, tearing down acres upon acres of woodland and increasing carbon emissions. Agricultural emissions alone account for around 30% of global emissions. Producing just 1 kg of beef requires around 15,000 litres of water and releases up to 30 kg of carbon dioxide, a greenhouse gas. Now imagine this multiplied by thousands upon thousands of kilograms of meat. Livestock production is humankind's single largest use of land, making it the largest contributor to deforestation. In Brazil, large-scale commercial beef farming is responsible for around 70% of forest clearance in the Amazon. Precious water being used up and wasted compared with vegan alternatives, ecosystems being destroyed by land clearing, and worsening climate change are all effects of this unsustainable industry. Many would agree that consuming less meat is worthwhile to lessen the harm being done to the planet.

In the US alone, the meat industry is worth more than 800 billion dollars annually and provides over 6 million jobs. Huge numbers of people see the colossal benefit of cutting down on meat, but what would that mean for the economy, and for the millions of people who rely on it for their wages? It is true that a sudden shift towards consuming less meat would affect a country's gross domestic product and employment rates, but only in the short term. Many protest against cutting down on meat for these reasons, so is the long-term benefit worth the short-term cost? Meanwhile, a whole new industry has been booming in the market: vegan alternatives. This relatively new category of food products has brought a whole new economy to the table, providing more jobs with higher wages and less gruelling working conditions.

Consuming less meat has more benefits than drawbacks, leading to a much healthier lifestyle and a cleaner environment for our planet and its inhabitants. If everyone on the planet ate meat in moderation, fewer people would suffer from obesity and certain types of cancer, not to mention that the effects of climate change would be less severe. We live in a day and age when there are so many options available to replace meat in our diets; with just a change in mindset and perspective, many more people can get on board with the change, realising the environmental and health benefits of eating less meat.

Face Blindness – what is it (and does it actually exist)? Gege Li

What’s the first physical thing you notice about someone when you meet them for the first time? Is it whether they’re male or female? Maybe it’s their eye colour or freckles. Their unusually-shaped nose, even. Whatever it may be, the combination of these unique traits is what we use to recognise the important people in our life, as well as all the others we know and encounter in between.

So imagine what it would be like if you couldn't use characteristics like these to distinguish between your mum and your best friend, or your old high school teacher and your postman. That's exactly what sufferers of face blindness – also known by its fancier name, 'prosopagnosia' – must endure for most, if not all, of their life (people usually have it from birth).

The inability to recognise people by their face alone affects approximately two in one hundred people in the UK. Sufferers may fail to judge a person's age, gender or emotional expression from their face, or to spot similarities and differences between two different faces.

Unsurprisingly, this can have a profound impact on the behaviour and even mental health of sufferers. Although many cope by using alternative strategies such as a person’s distinctive voice, hairstyle or way of walking as identifiers, for others this doesn’t always work, especially if their acquaintance has recently changed their appearance. What are your options when even secondary clues become impossible to spot?

For some, the condition will cause them to deliberately avoid social interactions altogether, which can lead to relationship and career problems, bouts of depression and, in extreme cases, the development of social anxiety disorder. The latter may prevent a person from even leaving their house for overwhelming fear of social situations and embarrassment.

And to make matters worse, it isn’t only other people that someone with face blindness might not recognise. Objects, including places, cars and animals, also present difficulties, particularly for navigation and memory – even their own face staring back at them in the mirror could be an alien sight.

However, if you're an avid watcher of Arrested Development, you might find yourself questioning the legitimacy or even existence of face blindness as it's played out through the eccentric and often ridiculous character of Marky Bark. On the show, Marky's affliction with face blindness, exaggerated to the point that it seems at times almost unbelievable, doesn't appear to cause him half the inconvenience or trauma that it can in reality. Though viewers are made aware of the condition and its characteristics, does the show's overtly comedic context promote public health misconceptions while hindering an educational message?

It is important to establish that face blindness really does exist, and that it is less a quirky trait than a cognitive impairment with potentially serious social and emotional consequences. But what causes it exactly? Although it's true that it can develop following brain damage (from a particularly nasty knock to the head, for example), it has become increasingly clear that 'developmental prosopagnosia', where people simply never develop facial recognition, is the most common form. In fact, as many as one in fifty people might have it, equating to potentially one and a half million sufferers in the UK.

What's more, many sufferers report that a parent or sibling experiences the same kinds of difficulties, so it's likely that genetics play a role in how face blindness runs in families.

With all the trouble that face blindness can cause at the expense of someone’s livelihood and even health, it might be reassuring to point out that there are ways to determine whether you might suffer from the condition.

Often, doctors will use computer-based tests that require people to memorise and identify a set of faces, including those of celebrities. Most recently, in 2015, a partnership between several doctors and universities in London developed a new questionnaire that can aid both diagnosis of face blindness and measurement of its severity.

The inevitable bad news, however, is that there isn't currently a way to treat face blindness directly, other than helping sufferers improve their facial recognition through training and rehabilitation programmes.

There's still hope for the future though – prosopagnosia research currently taking place at Bournemouth University has hinted at using pharmaceuticals to temporarily alleviate face blindness, with some success. Once these techniques have been developed further, a successful cure or therapy might not be too far off.

In the meantime though, if you’re able to recognise a familiar face, why not consider taking the time to appreciate the people close to you (especially their mug) just that little bit more? Don’t take face perception for granted – you’re luckier than you think…

References

  1. https://www.nhs.uk/conditions/face-blindness/
  2. https://prosopagnosiaresearch.org/index/information
  3. http://www.bbc.com/news/health-34709004