Running around like a headless… Pig? Hundreds of pig brains kept alive after decapitation – Rachel Jones

On March 28th, at a National Institutes of Health meeting on ethics in US neuroscience, Yale neuroscience professor Nenad Sestan announced that, in experiments on 100 to 200 brains of decapitated pigs from slaughterhouses, he had kept the organs alive using heaters and pumps to circulate artificial blood through them. Billions of cells were found to be healthy and capable of working as normal, despite the decapitation. This is the first reported success in keeping the brains of large mammals alive outside the body without using cooling.

Sestan proposed that the brains could serve as models for the treatment of diseases such as cancer and Alzheimer’s disease, informing therapy in humans, since models with large amounts of intact brain are needed to see the full effect of treatments. The research was initially funded to help produce an atlas of the brain, as the brain’s connections are not yet well understood. Seventeen neuroscientists and bioethicists, including Sestan, published a Nature article in April 2018 proposing methods to ensure that human brain tissue harvested using these techniques is not conscious during experimentation (experimenting on live human brain tissue is ethically complex, as it is potentially conscious, making testing and termination of samples problematic). Suggestions included producing small amounts of nervous tissue, known as organoids, that lack the capacity for consciousness; preserving living human brain tissue removed in surgery; and inserting human brain tissue into mice.

The immediate media response to the news was the possibility of using the method in human brain transplants, to keep the tissue alive between bodies. Since live brain maintenance has only been achieved in pigs, we cannot assume it can be done in humans, but Sestan claims the techniques could apply to other species. In 1970 a head transplant between rhesus monkeys, performed by Robert White, produced a recipient that survived for eight days, yet the method was abandoned because the spinal cords could not be connected. Italian neurosurgeon Sergio Canavero announced in November 2017 that his colleague Xiaoping Ren had transplanted the head of one human cadaver onto another, as a ‘rehearsal’ for a live human head transplant. Canavero and Ren have previously experimented with transplanting live rat, mouse, dog and primate heads. They predicted their first attempt at a live human head transplant would take place in late 2017, but have since changed their estimate to ‘imminent’. Canavero also claims to know a method for connecting the spinal cords.

However, Canavero and Ren have only performed human head transplants on cadavers and live head transplants on animals, so they cannot claim to be capable of performing live human head transplants. Arthur Caplan, head of medical ethics at the New York University School of Medicine, does not believe that Canavero will ever receive ethical approval, and suggests that Canavero is conducting his research in China because the ethics laws there are more relaxed than those of the US and Europe. Professor Sestan notes that there is no evidence the pig brains could regain consciousness if transplanted into another pig, and says he has no intention of using this technology to test brain transplantation. Tests suggested that the brains were not conscious, although this negative result may be due to the chemicals used to prevent swelling, which also block neuronal signals. Steve Hyman, director of psychiatric research at the Broad Institute in Cambridge, Massachusetts, has said that brain transplants are “not remotely possible”, criticising the idea that we could treat the brain in the same way we treat organs that are routinely transplanted.

Professor Sestan has declined to comment on his findings, as the research is yet to be published and he had not wished the news to become public before publication. This means we do not know whether the research will stand up to the rigorous scrutiny of a journal’s peer review. If it does, however, we may well see research on whole pig brains in the future: mapping brain cell connections and testing drugs and therapies for human brains in pig brains. Testing on human brains raises concerns including consent, the definition of death and the ownership of living human brains, but using pig brains to inform us about human disease largely avoids these issues, so long as no unnecessary suffering is inflicted on the animal. As for using these pig brains to study human head transplants, historical experiments show that head transplants are possible in a number of animals, but even if brain transplants to lengthen human life become possible, they will not be an option any time soon, owing to a huge range of ethical concerns, a lack of evidence of consciousness and the loss of spinal cord connection.

Learning from our ancestors – how early humans worked together to survive a changing climate – Emily Farrell

If Yellowstone were to erupt tomorrow, America might not make it through the night. Yellowstone is a supervolcano that erupts roughly every 650,000 years; the last eruption was 640,000 years ago. So while it is not “overdue” for an eruption, as some conspiracy theorists think, there is one on the way. This could spell disaster for the continent. Last time, 1,000 cubic kilometres of rock, dust and volcanic ash were blown into the sky, blocking sunlight from reaching plants, catastrophically polluting the air and massively changing the climate. That spelt disaster for the animals living there at the time, and an eruption could do so again.

But would this mean the end for humankind? Not if we follow in the footsteps of our ancestors.

Around 40,000 years ago, southern Italy had its own super-eruption in the volcanic Phlegraean Fields, and archaeologists have been studying a site in Liguria to see how humans were affected by it. Humans had only been in the area for about 1,000 years before the event occurred. It would have changed their climate, and possibly other aspects of their lives such as the food available and the quality of the air and water.

Researchers believe that this change in climate is what drove the Neanderthals out of this area. Current theories suggest that they were not especially capable of adapting and would not have survived well in a suddenly new environment.

But regardless of how well Neanderthals coped, it seems some humans survived and even flourished in these conditions. Their tactic, it appears, was to maintain links between groups. The evidence for this comes from the Italian site: tools, ornaments and human remains from an ancient rock shelter were analysed, and some of the flint in use was found to have come from hundreds of kilometres away. Such a network would mean that knowledge of how to cope in different situations and habitats was shared between groups. When the climate did change, whether through a super-eruption or otherwise, the information on how to survive in an unfamiliar environment would already be available.

We can apply this theory to our communities today. By learning from each other, we can share the knowledge of how to cope with changes in our climate. Globalisation has increased our capacity for this: instead of hundreds of kilometres, we can draw on networks across the world. We can learn how to build houses on the water from the Pacific Islands; we can learn how to make the most of a limited water supply from Singapore. Why spend time creating novel solutions when the perfect one may already be in place somewhere else on the globe?

If a super-eruption occurs and dramatically changes our climate, or even if we continue to change the climate ourselves, we will need to adapt to make our lives sustainable and to endure the changes. By networking and sharing our knowledge, we can follow the example of our ancestors and survive whatever this world throws at us.

Is sitting too close to the TV really that bad for you? – Ciara Barrett

“Watching too much TV will give you square eyes!” Imagine the classic 1960s rectangular box television sat in front of a few brothers and sisters watching their favourite afternoon show so they don’t miss it. Their mum walks in and exclaims this phrase at them, entirely in vain. Did they all grow up to need glasses?

The phrase originates from TV models of that era: some 1960s sets were found to emit 100,000 times the safe radiation rate, so at that time sitting too close to the TV really was a health hazard, though for a completely different reason than expected. The TVs were quickly recalled.

Overall, children are better at focusing on close objects than adults, so they are more likely to sit close to the TV or hold books near their faces, but they should grow out of this – unless, of course, there is underlying short-sightedness.

Staring at a screen less than 40 cm from your eyes is known as ‘near work’, and most studies show that near work doesn’t usually harm our eyes permanently (although links between near work and short-sightedness are being investigated), which is fortunate given the number of screens we’re surrounded by today. It can, however, cause fatigue and eyestrain. Eyestrain is something most people have experienced, see: submitting the final draft of a report you’ve been working on for the last five hours, your head, neck and eyes hurting, but you need to finish before the 9am deadline. Or: it’s the seventh episode of that show you watch, and the screen is too bright for this time of night, but you need to keep going to find out if she killed her fiancée. If you feel personally attacked by these scenarios, then congratulations – you’ve experienced eyestrain. You probably also know that it can be fixed by a good night’s sleep.

Symptoms of eyestrain are the product of computer vision syndrome: looking at a screen for too long and not blinking enough, which affects tear flow and can in turn cause headaches, dry eyes and difficulty focusing. This isn’t permanent damage, and it can be remedied by taking breaks, concentrating on something else and blinking more often. A few cases have been studied in which extensive video game play or TV watching damaged the viewer’s retina or cornea, but this is rare.

When you see something, light passes through the dome-shaped cornea and is focused, together with the lens, onto the retina at the back of the eye, which interprets the image. The ciliary muscle changes the shape of the lens to adjust the focus and, like any muscle, can begin to ache if held in one position for too long; combined with squinting at a bright screen, this causes the discomfort of eyestrain. Close focusing also stops us blinking as often as we need to, so the outer layer of the cornea dries out, causing foggy vision.

As mentioned, none of these symptoms is permanent, but they are better avoided. One way to remember is the 20-20-20 rule: after every 20 minutes in front of a screen, look at an object about 20 feet (6 m) away for 20 seconds – easy to forget during your 5am essay endeavours, but a good guideline nonetheless. Another useful measure is a good night’s sleep (another possibly unachievable suggestion), which will help your eyes and your overall health in the long run. If all else fails, try changing the brightness, glare and text size on your screen. However, if you regularly need to sit closer to the screen, it could be a sign of short-sightedness.

The mothers from the 1960s who coined the phrase are going to need a stronger argument than square eyes, it seems, as the effects of sitting close to the TV or watching too much of it really aren’t that bad or permanent. Just don’t forget to blink.


Why do we procrastinate? – Emily Farrell

Everyone procrastinates. No one wants to write that essay or clean the bathroom. If it’s not food, sex or sleep, your body is just not interested. Sure, in the long run you might need to write that essay, to get that degree, to get that job, to earn money to buy food to survive. But your body doesn’t understand, or care about, that. Your body was made in simpler times. It is built for an age when survival entailed going off to pick some plants to eat, some reproducing and maybe a bit of sleep afterwards. But modern, Western lifestyles are a horrible mismatch for this way of living. Imagine giving a caveman a long, boring task such as moving numbers from one column to another (with sticks, perhaps – it could take a while to explain the concept of computers). Why should he do it? He gets no food from it. He gets no joy from it. Doing this task does not make him any more attractive to cavewomen who might then want to have his babies. It takes a reasonable amount of energy that is better spent on other labours. So why should he do it? To him, the answer is: he shouldn’t. And this is the thought process your brain goes through when faced with a task. While the conscious parts of your brain know the real reason for the task, the ancient parts – the ones we share with our ancestors and other animals – do not.

Think about it. How do you procrastinate? Making a snack? (Means you won’t starve to death.) Taking a nap? (Means you won’t be too tired to see the tiger of death headed your way.) Talking to friends? (Maintaining social bonds, which one day might lead to you making tiny replicas of yourself via someone else’s genitals.) Watching cat videos? (Evolution can’t explain the internet, but taking joy from something that takes away none of the resources you may have gained from the other tasks means your body agrees to it.)

Cleaning your own room is therapeutic and has actually been shown to improve your mood, both while you’re doing it and afterwards, when you’re in your nice clean room. But when it comes to the gross shared bathroom every uni student has encountered, you put it off for longer. You procrastinate away from it. This is because you gain no real benefit from it: it’s not dirty enough to give you diseases (yet), and you don’t spend enough time in it for it to benefit your mental health. If you can’t see an immediate advantage, you won’t do it.

Procrastination is all about cost and benefit and finding the balance between the two. If the immediate payout does not equal or outweigh the energy expenditure required to perform the task, then the inclination to do it will disappear.

Think about this the next time you put something off and do something else instead. Would the thing you are putting off benefit a caveman? Would he benefit from doing what you are doing now? But don’t listen to your inner caveman. Listen to your inner modern human, who wants that essay done, because they know you really need to do it. Don’t let them in only at the last second to write it. Go and do something productive! Go!

The Science of Iron Man – Ciara Barrett

With the shadow of Infinity War looming over our heads, and given the role-model status I currently place in Iron Man’s hands, I wanted to find out whether our (my) favourite genius billionaire playboy philanthropist could actually exist. The issues discussed below are drawn mostly from the films, with fewer references to the comics.

Arc reactor-

From the comics and films, we know the arc reactor in Tony’s chest is a small fusion reactor built into his body that runs on palladium – the specific isotope evidently matters because of its fusion properties. Setting aside the fact that the continuous collision of particles, and the resulting emission of beta and gamma rays, would seriously damage Tony’s health, if this power source could produce the same amount of energy as a full-sized reactor then it could plausibly power the suit. However, the physics of building a small-scale nuclear reactor – with its ring of electromagnets, heat containment and electron recovery – into someone’s body is still far on the horizon. The arc reactor would also need to store its power, because the power needed when Tony is and isn’t wearing the suit would vary greatly. In Iron Man 2, Tony is seen burning through arc reactors quickly, which suggests their power output can be controlled depending on whether he’s fighting or just going about daily life, even if it’s usually the former.

Flying suit-

Iron Man has four visible thrusters, one on each hand and foot. When he flies horizontally in the films, however, nothing opposes the downward force of gravity, so he should keep moving horizontally while losing altitude. He usually combats this by flying in a parabolic or slightly ascending path, or with his arms outstretched to mimic a plane, where the pressure difference above and below his arms would generate lift. Next, the four thrusters would need to generate enough lift not only for Tony’s weight but also for heavy objects such as cars or an aircraft carrier, which he has been known to lift while flying (see The Avengers (2012)). This means the thrusters must be able to lift up to 100,000 tonnes (100 million kg), based on the Nimitz class, the world’s largest aircraft carrier – which is not impossible, given that we’re already assuming the arc reactor works to power the suit.
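As a rough sanity check on that number, here is a back-of-the-envelope sketch (the 200 kg figure for Tony plus armour is an assumption; the carrier mass comes from the figures above) of the thrust the four thrusters would need just to hover while holding the carrier:

```python
# Back-of-the-envelope hover-thrust estimate for lifting a Nimitz-class
# carrier. suit_and_pilot_kg is an assumed figure; carrier_kg is the
# ~100,000 tonnes quoted in the article.
G = 9.81                     # gravitational acceleration, m/s^2
suit_and_pilot_kg = 200      # assumed mass of Tony plus armour
carrier_kg = 100_000 * 1000  # 100,000 tonnes converted to kg

total_thrust = (suit_and_pilot_kg + carrier_kg) * G  # newtons needed to hover
per_thruster = total_thrust / 4                      # split across 4 thrusters

print(f"Total thrust: {total_thrust:.2e} N")
print(f"Per thruster: {per_thruster:.2e} N")
```

That works out to roughly 2.5 × 10⁸ N per thruster – on the order of a thousand times the maximum thrust of a modern fighter-jet engine – so “not impossible” is doing a lot of work in that sentence.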

Jarvis (or Friday, his badass Irish counterpart)-

Considering Jarvis is basically a better version of Alexa, this is completely within reach. He can do facial recognition, use the Power of The Internet, listen to all your problems and do the heavy brainpower lifting, like calculating the outcomes of different scenarios. He is peak AI technology, which he proves when he hides from Ultron by uploading himself to the internet. Again, this is possible, and maybe even in the near future.

Ammo firing-

The chest ray the suit is capable of firing is made up of photons from the reactor, so it is also possible: the arc reactor can produce photons during the collision process, and, as mentioned, the only issue is storing them. He has guns built into his arms and shoulders, like many fighter planes. One issue that could arise is how he could fire them while flying without the recoil pushing him backwards; the guns themselves, however, are reasonably realistic in terms of being built into a suit.

Suit features: calling from afar, self-fitting, freeze prevention-

In Iron Man 3, Tony develops the suit so that he can call it from miles away to fly to him, whereupon it fits itself onto his body. Breaking this down, the first part of the problem is calling the suit, which obviously isn’t possible via Bluetooth, or even radio, over the distances involved. A reasonable solution, since the films never explain how he manages it, could be satellite signalling, much as a satellite phone call works. He also fixes the freezing problem mentioned in Iron Man 2 by using a “gold-titanium alloy from an earlier satellite design”, as mentioned in a comic. The next issue is the self-fitting suit, which is entirely possible through the use of an algorithm, since the suit is tuned to his shape. Put simply, the suit knows to first secure the feet, then clasp the legs, and so on.

Depending on how you choose to rank these issues, a suit like Iron Man’s isn’t completely out of reach, and the parts of the suit that don’t defy the laws of physics are amazing feats of technology – even if, for now, they are just special effects.


237 Million Medication Errors Occur in NHS England Annually – an Interview with Researcher Fiona Campbell by Emma Hazelwood

A recent report revealed that 237 million medication errors occur in NHS England annually. Not only did the study reveal that these mistakes cause 712 deaths and could be a contributory factor in thousands more, but it is estimated that they cost the NHS £98.5 million a year.

Fiona Campbell, a research fellow at the University of Sheffield, was involved with the study. She met up with pH7 writer, Emma Hazelwood, to provide some more information on the report.

How did the project come about?

The team, which included Marrissa Martyn-St James and Eva Kaltenthaler, at the School of Health and Related Research were asked by the Department of Health to look at how prevalent medication errors in the NHS are, and to estimate the cost of these errors. The project was a collaboration between researchers in Sheffield, Manchester and York, with the team at Sheffield identifying and synthesising relevant literature.

How were the figures calculated?

Studies have measured medication errors in many different ways – for example, by reviewing past prescribing practices or adverse drug reactions (ADRs). The threshold for counting an error is very low: the figure of 237 million includes even the smallest mistakes.

What were the limitations of the study?

As with any study, there were some limitations. First, there was a strict time limit set by the Department of Health – the team at Sheffield had about six weeks to analyse a mammoth amount of data. Secondly, calculating medication errors is complicated, for several reasons: definitions of an error differ across studies, and sometimes no one realises an error has been made, so it may go unrecorded. There are also ethical implications to studying medication errors – if researchers spot an error, they may feel morally obliged to stop it before it harms a patient, which makes it difficult to calculate what the impact of these mistakes would have been. In this study, some data goes back as far as ten years, and our healthcare system may have changed since then.

Is it a serious problem?

Considering that there are only about 50 million people in England, the figure of 237 million medication errors per year seems shocking. What this figure hides, however, is that billions of prescriptions are issued every year. Furthermore, the threshold for an error is very low – even one noticed by healthcare professionals and stopped before it reaches the patient is still included in the calculations. Of course, some errors are catastrophic, resulting in severe patient harm or even death, but not all – in fact, three out of four result in no harm. Having 237 million medication errors does not mean that people have taken the wrong medication 237 million times. Although it is estimated that these errors are a contributory factor in 1,700–22,303 deaths a year, the true figure is most likely at the lower end of this range. Again, the threshold is very low – if someone dies and a medication error was made, even one unlikely to be related to the death, it must be recorded as a potential contributory factor.

Although the errors result in hundreds of deaths, and cost the NHS £98.5 million per year, it seems that we are no worse than anybody else. In many countries, errors are not even recorded, and, when they are, rates are similar to those in this study. The fact that the team was able to undertake this project could be seen as a commitment to transparency within the NHS, and of the determination to reduce these errors.

What are the possible improvements for the NHS?

In order to stop these errors, we must continue to be vigilant in recording them. We rely on healthcare professionals to record their own mistakes, so it is vital that there is not an environment of guilt and shame. Trials are currently under way that seek to reduce error rates, in particular by researching where errors occur and developing new systems for flagging them. The NHS already has several layers of checks, and for an error to reach the patient it has to slip through every one of them. The report supports more funding for research into how medication errors can be reduced.

What was the impact of the study?

This study has attracted a lot of media attention, from BBC News to Radio 4. Studies such as this highlight the role scientists have in discussing research and making it accessible to the public, without allowing it to be used as a political football.

Overall, it is clear that medication errors are prevalent in our healthcare system. On occasion they have devastating effects, and this quantification of the errors is shocking. That said, our system has a good record of preventing errors from reaching patients, and the fact that such studies and trials are taking place demonstrates that the problem could be improved dramatically over the coming years.


Have we really found Amelia Earhart’s bones? – Fatima Sheriff

Amelia Earhart is one of the most famous aviators of her time, and of history – breaking record after record, blazing a trail for female pilots and then, in July 1937, disappearing over the Pacific Ocean, never to be seen again. But this may not be the end of her story: a new study in Forensic Anthropology claims that a set of bones found in 1940, in the vicinity of her disappearance, are in fact her remains.

Born at the turn of the 20th century, Earhart found herself in a society ingrained with “age-old customs”, where women were “bred to timidity”. However, after her first time in a plane in December 1920, she found her place in the world, starting lessons six days later and, within a year, passing the test for her National Aeronautics Association licence (only the 16th woman ever to do so). She quickly rose to fame, becoming in 1922 the first woman to fly solo above 14,000 feet.

In 1928 she crossed the Atlantic as a passenger – the first woman to make the flight – and whereas others on board were paid thousands of dollars, she was paid in… “experience and opportunity”. Confident in her own ability, she followed suit in 1932, becoming the first woman to fly solo across the Atlantic. Along the way she had to deal with leaking fuel, flames in the engine and ice on the wings of her plane, with “only tomato juice to keep her own energy levels up”. Despite all these challenges, her piloting skill and quick problem-solving meant she landed safely. She was awarded the Distinguished Flying Cross and continued to add to her astonishing list of achievements, with the mentality that “women must try to do things that men have tried. When they fail, their failure must be but a challenge to others”. She inspired other women to join her, founding the Ninety-Nines, an organisation for the advancement of licensed female pilots.

Her spirit of adventure led her to plan her next ambitious trip: a round-the-world flight of 29,000 miles over 40 days, with 20 stops. Leaving Oakland, California, on the 1st of June 1937, she flew east with her navigator, Fred Noonan. After many successful stops, and with only 7,000 miles to go before reaching Oakland again, on the 2nd of July they went off course and were never found.

The stop they were heading for was Howland Island, only a square mile in size and therefore difficult to find. Radio messages from the pair stated: “we must be on you but we cannot see you, fuel is running low, been unable to reach you by radio, flying at 1000 ft”. Their last ‘frantic’ communication was “on the line 157, 337”. Searches were conducted, covering 250,000 square miles, but on the 19th of July the plane was officially declared lost at sea.


Various conspiracy theories have gripped the world in the decades since, but renewed efforts to find the plane have turned up nothing. One theory that gained a lot of momentum was that she had been stranded southwest along the line 157, 337 on Gardner Island (now called Nikumaroro), near Howland Island. Although planes passing over it on the 9th of July 1937 reported no visible activity on the island, items recovered later included a woman’s shoe and the box for a sextant (a navigation device that could have been Earhart’s). Most importantly, 13 bones were recovered and examined in 1941 by a D. W. Hoodless in Fiji. He determined the bones to be those of a ‘short, stocky European man’ and, in shocking scientific practice (some may say suspiciously so), discarded them.

Many people have doubted his identification. Using his records, in 1998 TIGHAR (The International Group for Historic Aircraft Recovery) re-estimated the bones to be those of a European woman between 5 ft 5 in and 5 ft 9 in tall, fitting Amelia’s biological profile. She was several inches taller than the average woman of her time, which could account for the skeleton’s sex being dismissed as male. Richard Jantz released a paper in March this year further disputing Hoodless’ original assertion. He compared the lengths of the humerus, radius and tibia to photos of Amelia, information on her pilot’s licence and historic seamstress measurements. His conclusion was that “Earhart is more similar to the Nikumaroro bones than 99% of individuals in a large reference sample.” The fact that only 13 bones were found could be down to the native coconut crabs – the largest living arthropods on land (and, frankly, terrifying) – which could have carried off the rest of the remains.

However, the lack of a complete skeleton remains an obstacle to definitive identification of the remains: a pelvis, for instance, could clear up any ambiguity over the skeleton’s sex. Without the original remains it is also impossible to conduct DNA analysis to confirm identity – something soil analysis and bone-sniffing dogs have failed to do. The argument is therefore compelling, but nowhere near indisputable certainty. Whatever you choose to believe, Amelia left a legacy that won’t be forgotten…

“Adventure is worthwhile in itself” – Amelia Earhart (1897-1937)