How to not become a Zombie – through pills, exercise and fasting by Abdullah Iqbal

What causes aging?

Is it possible to reverse it?

What can you do to live a longer and healthier life?

One theory about aging involves the accumulation of senescent cells, or, as I like to call them, ‘zombie cells’, because these cells are very different from normal cells (emphasis on very). Instead of functioning as a normal cell does, their gene expression changes, which just means they produce a different set of proteins.

Can’t be that bad? Oh, but it is, because these proteins are pro-inflammatory, meaning they cause a whole host of effects such as uncontrolled tissue death. Worst of all, though, zombie cells spread: like a rotten strawberry in a bowl of fruit, they release toxins that cause the healthy cells around them to become dysfunctional.

The negative effects of zombie cell accumulation have been demonstrated in recent experiments by Ming Xu and others at the Mayo Clinic. Injecting previously healthy young adult mice with one million senescent cells caused significantly lower maximal walking speed, hanging endurance and grip strength one month after transplantation, compared with mice transplanted with control cells.

One million may sound like a lot, but the average mouse is made up of over a trillion cells.
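
To put that in perspective, a quick back-of-envelope calculation using the article’s own figures (one million transplanted cells against roughly a trillion in the mouse):

```latex
\frac{\text{transplanted cells}}{\text{total cells}}
  \approx \frac{10^{6}}{10^{12}} = 10^{-6} = 0.0001\%
```

Just 0.0001% of the body’s cells was enough to measurably weaken the animals.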

Chronic inflammation can lead to a whole host of diseases, from rheumatoid arthritis to Alzheimer’s.

What are scientists doing to stop this?

Don’t worry, there is a lot of research in this field.

One company, Oisín, is planning to load a suicide gene into nanoparticles (tiny packets that can be modified to make sure the payload is not broken down on its way to its target). These will be delivered to all cells but only activated in cells that express high levels of p16, a marker for zombie cells.

However, we must be careful, for “Not every cell that expresses high p16 is senescent; and not every senescent cell has high p16,” says James Kirkland, a researcher who studies aging at the Mayo Clinic.
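
Kirkland’s caveat is essentially a point about classifier error. As a toy illustration (not real biology; every number and the threshold below are invented), selecting cells on a p16 cutoff alone produces both kinds of mistake:

```python
# Toy illustration of Kirkland's caveat: p16 level alone is an imperfect
# proxy for senescence. Every number here is invented for illustration.

cells = [
    # (p16 expression level, truly senescent?)
    (9.1, True),
    (8.5, False),  # high p16, but NOT senescent
    (7.8, True),
    (4.2, True),   # senescent, but LOW p16
    (3.0, False),
    (2.3, False),
]

THRESHOLD = 7.0  # hypothetical cutoff for activating the suicide gene

for p16, senescent in cells:
    targeted = p16 > THRESHOLD
    if targeted and not senescent:
        print(f"p16={p16}: healthy cell wrongly targeted (false positive)")
    elif not targeted and senescent:
        print(f"p16={p16}: zombie cell escapes (false negative)")
```

However the cutoff is set, missed zombie cells trade off against healthy casualties, which is why a therapy keyed to a single marker needs careful validation.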

Another technique is senolytics: killing senescent cells using drugs. Two drugs which have been tested on human adipose (fat) tissue are dasatinib and quercetin. “We observed a reduction in the inflammatory cytokines in these tissues, while key adipokines were not affected,” Xu said. “This demonstrates that these senolytic drugs can decrease inflammation without a global killing effect.”

Adipokines are signalling proteins released by fat cells, some of which have beneficial anti-inflammatory and metabolic effects.

Higher levels of adiponectin (an adipokine) are associated with greater insulin sensitivity and metabolic health. You wouldn’t want senolytic drugs to inhibit adiponectin at the same time as they are reducing inflammatory cytokine levels.

 

What can you do to remove your zombie cells?

Yes, there are ways that you can help ensure you age healthily. They may be hard to implement at the start, but remember, it will be worth it in the end.

Exercise and fasting have both been shown to reduce the number of senescent cells.

Don’t think you have to go all out and fast for days: intermittent fasting has been shown in animal models to promote autophagy, or cellular “self-eating”, which helps clear out damaged cellular components, including misfolded proteins.

Intermittent fasting may also help reduce inflammation and oxidative stress, processes associated with cellular senescence. For example, oxidative stress shortens telomeres, the protective DNA caps at the ends of your chromosomes, and this can push a cell into senescence.

Researchers have also found that exercise reduced the number of p16-positive senescent cells in transgenic mice fed a fast food diet.

 

Problems?

Research into reversing aging, or stopping it for good, raises many questions. If we do discover a way to increase our lifespan or end aging, what effects will this have on society, culture and our already struggling planet?

We need to discuss these questions before any viable treatments arrive, so that we are ready for the consequences instead of being caught unaware, as we have been with climate change.

Bibliography

Xu, M., et al. (2018) ‘Senolytics improve physical function and increase lifespan in old age’, Nature Medicine, 24, pp. 1246–1256. Available at: https://www.nature.com/articles/s41591-018-0092-9

Toussaint, O., Salmon, M., Pascal, T., Magalhaes, J. P. and Chainiaux, F. (2005) ‘Stress-induced Premature Senescence (SIPS)’, in eLS. doi:10.1038/npg.els.0003865

Jarreau, P. B. (2018) Don’t Be a Zombie: Senolytics, Exercise and Fasting Fight Off Senescent Cells. Available at: https://medium.com/lifeomic/dont-be-a-zombie-senolytics-exercise-and-fasting-fight-off-senescent-cells-cc720d88240 (Accessed: 2 March 2019).

Corbyn, Z. (2018) Want to live for ever? Flush out your zombie cells. Available at: https://www.theguardian.com/science/2018/oct/06/race-to-kill-killer-zombie-cells-senescent-damaged-ageing-eliminate-research-mice-aubrey-de-grey (Accessed: 3 March 2019).

The Iron Man Suit for Walking by Yasmine A van Domburg

Paralysis affects about one in fifty people – amounting to approximately 5.4 million people in the United States alone (Christopher & Dana Reeve Foundation). With spinal cord injury (SCI) accounting for about 27.3% of paralysis cases, and an estimated 500,000 people suffering SCIs per year worldwide (WHO), it is quickly becoming clear that a large proportion of the population lives with a significantly decreased standard of life as a consequence.

There are three types of major paralysis: quadriplegia (also called tetraplegia), affecting all four limbs; hemiplegia, affecting one side of the body; and paraplegia, affecting only the lower limbs. Although people suffering from paraplegia are still able to use their upper body, they have a lower life expectancy and often require assistance with tasks such as getting dressed, or personal grooming (WHO).

Enter Wandercraft, a French robotics start-up. The first video footage of their innovative Atalante exoskeleton was released on La Chaîne Info just two weeks ago. The video shows a girl with paraplegia wearing the suit and walking while dribbling a basketball.

An exoskeleton is a wearable device that works together with the user: placed on the body, it supports and strengthens the user’s capabilities. Exoskeletons are often used in the military, but in recent years have gained popularity in healthcare. Suits like the Atalante have been designed by other companies, but they require crutches for stability, which heavily stresses patients’ shoulder muscles. This causes pain and fatigue, limits rehabilitation, and occupies the patients’ hands so they are not free for other uses. Being upright could also alleviate further issues patients often have, such as cardiovascular problems and muscle loss (Engadget, WHO).

Wandercraft had run successful trials with clients for several years before releasing the video of Atalante in use. According to Managing Director Matthieu Masselin, there has been a massive emotional response from test subjects, many of whom had not been able to walk since their accidents (Engadget). Floriane, who can be seen using the Atalante in the video, says the suit “offers hope” to her (BBC).

The exoskeleton has two moveable legs and a back rest, attached to the user with multiple straps that evenly distribute pressure for comfort; the straps fully support the patient’s weight. Moving the upper body automatically causes the exoskeleton to move – Atalante mimics human movement with sensors and motors. Sensor information runs through a microcomputer attached to the back, and algorithms compute the motor outputs needed for walking and self-stability. The price of that self-balancing, however, is that the exoskeleton weighs a hefty 60 kg (Engadget).
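
The article only describes this sense–compute–actuate loop at a high level, so the sketch below is a generic illustration of the idea rather than Wandercraft’s actual algorithm: a simple proportional–derivative (PD) controller driving one joint toward a commanded angle. The gains, target and single-joint toy model are all assumptions.

```python
# Generic sketch of a sense -> compute -> actuate loop like the one the
# article describes (sensor data -> microcomputer -> motor torques).
# Wandercraft's real controllers are not public; the gains, target and
# single-joint toy model below are illustrative assumptions only.

def pd_torque(target, angle, velocity, kp=80.0, kd=6.0):
    """Proportional-derivative control: drive the joint toward `target`."""
    return kp * (target - angle) - kd * velocity

# Toy simulation of one hip joint settling onto a commanded gait angle.
angle, velocity, dt = 0.0, 0.0, 0.01  # rad, rad/s, s (100 Hz loop)
target = 0.3                          # desired joint angle from the gait plan
for _ in range(200):                  # simulate 2 seconds
    torque = pd_torque(target, angle, velocity)
    velocity += torque * dt           # unit inertia, gravity ignored: toy model
    angle += velocity * dt
print(f"final angle: {angle:.3f} rad (target {target})")
```

In a real suit, many such loops would run at high frequency across the leg joints, with a gait planner updating the targets from the torso sensors.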

The company hopes to reduce Atalante’s size and bulkiness in order to eventually make it available for home use. Further aims are to make the suit work on uneven surfaces, such as pavements or slopes, and to support more complex movements such as climbing stairs. This will require much more research, however. As it stands, the suit is being marketed at rehab centres to help stroke patients recover their motor ability. About 33.7% of cases of paralysis are caused by strokes (Christopher & Dana Reeve Foundation) – improving rehabilitation could drastically reduce the lasting disability behind that figure.

Watch the Atalante in action here


The Anti-Vax Movement: More Dangerous Than a Disease? by Hedda Belsnes

Going to the doctors’ or the school nurse to have a needle stuck in your arm is a painful childhood memory most of us share. For those who travel or require regular flu jabs, it may even be a recurring one. But how much thought goes into the science behind those sharp needles? Vaccinations available on the NHS can protect an individual against a multitude of diseases, including diphtheria, tetanus, whooping cough, polio, Hib, hepatitis B, pneumococcal disease, rotavirus, meningococcal disease, measles, mumps, rubella and HPV. Protecting against these diseases not only relieves strain on the NHS, it also saves lives. So why do some choose not to vaccinate?

In 1998, the Lancet medical journal published the findings of Dr Andrew Wakefield, who had conducted research proposing a link between the MMR vaccine (protecting against measles, mumps and rubella) and autism. This sparked global panic, with a dramatic drop in MMR vaccination rates, which inevitably caused a rise in measles cases. Not only did this “doctor” conduct unnecessary, invasive tests on children without ethical approval or the appropriate qualifications; the General Medical Council also discredited the entire study for its lack of medical basis.

Despite repeated research, including an extensive review conducted by the World Health Organisation, showing no evidence of a link between autism and the MMR vaccine, the anti-vax movement remains strong. Internet propaganda spouts arguments against vaccination, yet these arguments are mainly built upon myths: that vaccines are made using aborted fetal tissue, that they contain mercury, and that vaccination itself is dangerous. Numerous extensive academic studies and medical research have disproven these myths, while also showing that any side effects from vaccination are mild and short-lived. More severe adverse reactions are incredibly rare, and doctors and nurses are trained to treat them.

It could be argued that the choice to vaccinate is a personal one. This argument is flawed in many ways. The first flaw is that the fate of a young child isn’t their personal choice at all; it comes down to what the parents believe is best. The child’s right to safety and good health is stripped from them before they are old enough and informed enough to make their own choices, and by the time they are ready to decide for themselves, it could be too late. Does this mean that parents have a moral obligation to vaccinate their children?

There is also an argument that the choice to vaccinate isn’t a personal one, because of the danger unvaccinated individuals can pose. A common idea is that if vaccinations are so effective, others aren’t at risk if someone chooses not to vaccinate their kids. On the contrary, many children would be at risk, such as those who are currently too young to be vaccinated, alongside individuals, young and old, who cannot be vaccinated due to immune system problems, such as cancer patients. Some parents have also claimed that if they didn’t vaccinate their child and the child became ill, they could keep them home from nursery or school to avoid infecting others. This option is also unsuitable, as those infected are often contagious before symptoms properly begin.
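
This community-wide protection is what epidemiologists call herd immunity, and there is a standard back-of-envelope formula for it (textbook epidemiology, not a figure from this article). If each infected person would, on average, pass the disease to R0 susceptible people, sustained spread stops once the immune fraction of the population exceeds

```latex
p_{\text{crit}} = 1 - \frac{1}{R_0}
```

For measles, R0 is commonly estimated at 12–18, putting the threshold around 92–94% – which is why even a modest dip in vaccination rates can reopen the door to outbreaks.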

It can be easy to underestimate the importance of vaccinations, and thus to underestimate the dangers of the anti-vax movement. Vaccination remains one of the easiest ways to stay healthy, protecting against serious illnesses which still pose a very real threat. Vaccine-preventable diseases such as measles, mumps and whooping cough still result in hospitalisations and deaths every year. This is largely because these diseases remain common in other countries, meaning children can be infected by travellers, or by travelling themselves. Ultimately, a reduction in vaccination rates could result in an epidemic, where diseases which are virtually eradicated, such as meningitis C, return with a vengeance.

Yes, many may have their fears about the dangers of vaccinations. Yes, many may believe they are unnecessary and invasive. Yes, many may believe it’s safer not to vaccinate. Yet you are at a far, far greater risk by choosing not to vaccinate. Not only are you at greater risk of severe infections, which could result in death, but you could also be neglecting your moral duties. As individuals, we all have a public health commitment to protect our families, friends and communities, and the only way to do this sufficiently is to vaccinate.

https://www.verywellhealth.com/anti-vaccine-myths-and-misinformation-2633730

https://www.nhs.uk/conditions/vaccinations/

http://www.vaccineinformation.org/vaccines-save-lives/

Does sunbathing in February feel right to you? by Libby Pool

The last week of February saw temperatures soar across the UK. Everywhere, people took to the parks, ice creams in hand, to bask under bright blue skies and glorious sunshine.

No one can deny that temperatures this high are unheard of this early in the year: the UK’s previous record for February was set in 1998, when temperatures in Greenwich reached 19.7 °C. But on 25th February 2019, a high of 20.6 °C was recorded in Trawsgoed, Wales, marking the first time temperatures have topped 20 °C in winter.

After the chilly winter months, I enjoyed the warmth as much as the next person – but there was a sense of uneasiness impossible to ignore. This time last year, ‘the Beast from the East’ was raging across the UK: temperatures were sub-zero, and snow covered the landscape. People took to the internet to post side-by-side pictures of the same locations in February 2018 and 2019. We are experiencing more extreme weather events year on year… but why? Are these one-off freak events, or evidence of a long-term climate trend?

High pressure air and the Foehn effect:

The Met Office attributes the warm weather of February 2019 to two things. The first is unusually high pressure across continental Europe, which brought warm air from the Canary Islands and North Africa across the continent, warming the UK in the process. Secondly, the Foehn effect is thought to have boosted temperatures: humid winds flow up over mountains, where the moisture condenses and forms clouds. Heavy rain falls on one side of the mountain, while the air becomes warmer and drier as it sinks down the other side. This played a part in creating the sunny conditions seen in the last week of February.
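
The warming comes from an asymmetry in lapse rates (standard meteorology; the numbers here are illustrative, not from the Met Office analysis). Rising saturated air cools at the moist adiabatic rate of roughly 6 °C per kilometre, because condensation releases latent heat, while the dried-out air descending the lee side warms at the dry adiabatic rate of about 9.8 °C per kilometre. Over a 2 km ridge, for example:

```latex
\Delta T \approx h\,(\Gamma_{\text{dry}} - \Gamma_{\text{moist}})
         = 2\,\text{km} \times (9.8 - 6.0)\,^{\circ}\mathrm{C/km}
         \approx +7.6\,^{\circ}\mathrm{C}
```

So air can arrive on the sheltered side several degrees warmer than it left the windward side.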


While these climatic processes explain the unusual heat of February 2019, we need to think about why climatic interactions themselves are changing. Meteorologists are now studying how much of this heat can be attributed to man-made climate change.

What do the experts say?

Geert Jan van Oldenborgh, a climate researcher at the Royal Netherlands Meteorological Institute, conducted a preliminary study on UK temperature data and found February’s high temperatures to be “at least a one in 200-year event.” Temperatures deviated so far from the norm that current climate models couldn’t account for the change. Professor James Screen, a climate scientist from the University of Exeter, claims “it’s very hard to say that a couple of days of good weather is because of climate change.” However, he confirms that we are seeing an increase in extreme heat events as mean global temperatures rise, as the past decades show. For example, in around 500 years of temperature data, the five coldest years all fall between 1695 and 1902, while the five hottest years have all occurred since 2005. While experts are not yet sure the events of February 2019 are fully accounted for by climate change, there is no doubt it played a part.

Why should we care?

When the sun is shining, the last thing you want to think about is the impending threat of extreme heat events. However, in 2003, a heatwave across Europe caused 70,000 deaths, and climate studies showed this extreme heat was attributable to anthropogenic warming. Carbon emissions are severely impacting human wellbeing: we have already caused a 1 °C increase in global temperatures, and according to a major UN report we have ‘locked in’ an additional 0.5 °C of warming. If we reach 2 °C of warming, it is predicted that 411 million people will suffer from water shortages.

As an ecology student, I’m concerned with how extreme events impact wildlife. The early heat has seen species such as hedgehogs, bats and dormice coming out of hibernation too early. This puts them at risk, as they use up fat stores they still need. They also wake before their primary plant food sources have bloomed, so they suffer food shortages, a process known as ‘trophic mismatch.’

Final thoughts

We really need to ‘sober up’ to the realities of our impact on the climate – a thought shared by Green Party MP, Caroline Lucas. The government is going backwards on climate action, unsurprising when you hear the first climate debate in parliament for two years was barely attended by MPs. In the wake of issues like Brexit, climate action seems to be low on the government’s priority list. However, Lucas claims she finds, “huge hope from the rising tide of activism” after students took to the streets to demand climate action last month. Public acknowledgment of climate change pressures the government to change legislation so, in Lucas’ words: “if sunbathing in February doesn’t feel right to you, get out on the streets instead.”

 

What a difference a year makes…check out these photos comparing February 2018 to February 2019:

Sources:

https://www.bbc.co.uk/news/uk-47360952

https://www.bbc.co.uk/news/newsbeat-47371648

https://www.theguardian.com/uk-news/2019/mar/02/uk-temperature-jump-february-incredible-climate-weather-carbon

https://www.independent.co.uk/voices/uk-weather-heatwave-climate-change-global-warming-february-met-office-a8797136.html

https://www.telegraph.co.uk/news/2019/02/25/uk-weather-hottest-february-day-record-britain-temperature-reaches/

Lecture by Professor Gareth Phoenix, University of Sheffield, 2018.

Running around like a headless… Pig? Hundreds of pig brains kept alive after decapitation – Rachel Jones

On March 28th, at a National Institutes of Health meeting on ethics in US neuroscience, Yale neuroscience professor Nenad Sestan announced that, by experimenting on 100 to 200 brains of decapitated pigs from slaughterhouses, he could keep the organs alive using heaters and pumps to circulate artificial blood through them. Billions of cells were found to be healthy and capable of working as normal, despite decapitation. This is the first reported success in keeping live brains separate from the bodies of large mammals without using cooling.

Sestan proposed that the brains could be used as models for treating diseases such as cancer and Alzheimer’s disease, informing therapy for humans, since we need models with large amounts of intact brain to see the full effect of treatments. The research was initially funded to help produce an atlas of the brain, as the brain’s connections are not yet well understood. Seventeen neuroscientists and bioethicists, including Sestan, published a Nature article in April 2018 proposing methods to ensure that human brain tissue harvested using these techniques is not conscious during experimentation (experimenting on live human brain tissue is ethically complex precisely because it is potentially conscious, making testing and termination of samples problematic). Suggestions included producing small amounts of nervous tissue, known as organoids, lacking the capacity for consciousness; preserving living human brain tissue removed in surgery; and inserting human brain tissue into mice.

The immediate media response to the news focused on the possibility of using the method in human brain transplants, to keep the tissue alive between bodies. Since live brain maintenance has only been done in pigs, we cannot assume it can be done in humans, but Sestan claims the techniques could apply to other organisms. In 1970 a head transplant between rhesus monkeys by Robert White produced a recipient that survived for eight days, yet the method was discarded as connection of the spinal cords was not possible. Italian neurosurgeon Sergio Canavero announced in November 2017 that his colleague Xiaoping Ren had transplanted the head of one human cadaver onto another, as a ‘rehearsal’ for a live human head transplant. Canavero and Ren have previously experimented with transplanting live rat, mouse, dog and primate heads. They predicted their first attempt at a live human head transplant would be in late 2017, yet have now changed their estimate to ‘imminent’. Canavero also claims to know a method of connecting the spinal cords.

However, Canavero and Ren have only managed human head transplants on cadavers and live head transplants on animals, so they cannot claim to be capable of performing live human head transplants. Arthur Caplan, head of ethics at the New York University medical school, does not believe that Canavero will ever receive the ethical go-ahead, and suggests that Canavero is conducting research in China because the ethics laws there are more relaxed than those of the US and Europe. Professor Sestan notes that there is no evidence the pig brains could regain consciousness if transplanted into another pig, and claims no intention of using this technology to test brain transplantation. Tests suggested that the brains were not conscious, although this negative result may be because the chemicals used to prevent swelling also suppress neuronal signals. Steve Hyman, director of psychiatric research at the Broad Institute in Cambridge, Massachusetts, has said that brain transplants are “not remotely possible”, as he is critical of the idea that we could treat the brain in the same way as organs that are routinely transplanted.

Professor Sestan has refused to comment on his findings as the research is yet to be published, and he had not wished for the news to become public before publication. This means we do not know whether the research will stand up to the rigorous scrutiny of a journal’s peer review. If the experiments are accepted, however, it seems we may be conducting research on whole pig brains in the future, mapping brain cell connections and testing drugs and therapies for human brains in pig brains. Testing on human brains raises concerns including consent, the definition of death and the ownership of living human brains, but using pig brains to inform us about human disease avoids these issues for the most part, so long as unnecessary suffering is not inflicted upon the animal. As for the use of these pig brains in studying human head transplants, historical experiments show that head transplants are possible in a number of animals. But even if brain transplants to lengthen human life become possible, they will not be an option any time soon, due to a huge range of ethical concerns, a lack of evidence of consciousness and the loss of spinal cord connection.

Is sitting too close to the TV really that bad for you? Ciara Barrett

“Watching too much TV will give you square eyes!” Imagine the classic 1960s rectangular box television sat in front of a few brothers and sisters watching their favourite afternoon show so they don’t miss it. Their mum walks in and exclaims this phrase at them, completely in vain. Did they all grow up to need glasses?

The phrase originates from certain 1960s TV models that were found to emit radiation at 100,000 times the safe rate, so at that time sitting too close to the TV really was a health hazard – but for a completely different reason than expected. The TVs were quickly recalled.

Overall, children are better at focusing on close objects than adults, so they are more likely to sit close to the TV or hold books near their faces, but they should grow out of this – unless, of course, the cause is underlying short-sightedness.

Staring at a screen less than 40 cm from your eyes is known as ‘near work’, and most studies show that near work doesn’t usually harm our eyes permanently (although links between near work and short-sightedness are being investigated), which is fortunate given the number of screens we’re surrounded by today. However, it may cause fatigue and eyestrain. Eyestrain is something most people have experienced, see: submitting the final draft of a report you’ve been working on for the last five hours, and your head, neck and eyes hurt but you need to finish before the 9am deadline. Or: it’s the seventh episode of that show you watch, and the screen is too bright for this time of night, but you need to keep going to find out if she killed her fiancée. If you feel personally attacked by these scenarios, then congratulations, you’ve experienced eyestrain. You probably also know that it can be fixed by a good night’s sleep.

Symptoms of eyestrain are a product of Computer Vision Syndrome. This is when you look at a screen for too long and stop blinking enough, which affects tear flow and can in turn cause headaches, dry eyes and difficulty focusing. This isn’t permanent damage, and it can be remedied by taking breaks, concentrating on something else and blinking more often. Cases have been reported where extensive video game play or TV watching damaged the viewer’s retina or cornea, but this is unlikely.

When you see something, light is bent by the dome-shaped cornea so that it hits the retina, which interprets the image at the back of the eye. The ciliary muscle changes the shape of the lens behind the cornea to fine-tune that focus and, like any muscle, can begin to ache if you keep it in one position for too long; combined with squinting from the light, this causes the discomfort of eyestrain. Close focusing also stops us blinking as often as we need to, so the outer layer of the cornea gets dry, causing foggy vision.

As mentioned, none of these symptoms are permanent, but they are better avoided. One way of remembering to rest your eyes is the 20-20-20 rule: after every 20 minutes in front of a screen, look at an object 20 feet (about six metres) away for 20 seconds – a practical guideline even for your 5am essay endeavours. Another useful measure is to get a good night’s sleep (another possibly unachievable suggestion), which will help your eyes and overall health in the long run. If all else fails, try changing the brightness, glare and text size on your screen. However, if you regularly need to sit closer to the screen, it could be a sign of short-sightedness.

The mothers from the 1960s who coined the phrase are going to need a stronger argument than square eyes, it seems, as the effects of sitting close to the TV or watching too much of it really aren’t that bad or permanent. Just don’t forget to blink.

 Further reading:

https://www.scientificamerican.com/article/earth-talk-tv-eyesight/

https://athome.readinghorizons.com/blog/why-sitting-too-close-to-the-television-makes-your-eyes-go-square

https://www.scientificamerican.com/article/is-sitting-too-close-to-screen-making-you-blind/

Why do we procrastinate? Emily Farrell

Everyone procrastinates. No one wants to write that essay or clean the bathroom. If it’s not food, sex or sleep, your body is just not interested. Sure, in the long run you might need to write that essay, to get that degree, to get that job, to earn money to buy food to survive. But your body doesn’t understand, or care, about that. Your body is a thing made in simpler times. It is built for a time when survival entailed going off to pick some plants to eat, some reproducing and maybe a bit of sleep afterwards. But modern, western lifestyles are a horrible mismatch for this way of living. Imagine giving a caveman a long, boring task such as moving numbers from one column to another (maybe with sticks; it could take a while to explain the concept of computers). Why should he do it? He gets no food from it. He gets no joy from it. Doing this task does not make him any more attractive to cavewomen who might then want to have his babies. It takes a reasonable amount of energy that is better spent on other labours. So why should he do it? To him, the answer is he shouldn’t. And this is the thought process your brain goes through when faced with a task. While the conscious parts of your brain know the real reason for the task, the ancient parts, which we share with our ancestors and other animals, do not.

Think about it. How do you procrastinate? Making a snack? (Means you won’t starve to death.) Taking a nap? (Means you won’t be too tired to see the tiger of death headed your way.) Talking to friends? (Maintaining social bonds which might one day lead to you making tiny replicas of yourself via someone else’s genitals.) Watching cat videos? (Evolution can’t explain the internet, but taking joy from something which costs you none of the resources gained from the other tasks means your body agrees to it.)

Cleaning your own room is therapeutic and has actually been shown to improve your mood while doing it and afterwards when you’re in your nice clean room. But when it comes to the gross shared bathroom every uni student has encountered, you put it off for longer. You procrastinate away from it. This is because you gain no real benefit from it. It’s not dirty enough to give you diseases (yet), and you don’t spend enough time in it for it to benefit your mental health. If you can’t see an immediate advantage, you won’t do it.

Procrastination is all about cost and benefit and finding the balance between the two. If the immediate payout does not equal or outweigh the energy expenditure required to perform the task, then the inclination to do it will disappear.

Think about this the next time you put something off and do something else instead. Would what you are putting off benefit a caveman? Would he benefit by doing what you are doing now? But don’t listen to your inner caveman. Listen to your inner modern human who wants that essay done, because they know that you really need to do it. Don’t let them in only at the last second to write it. Go and do something productive! Go!

237 Million Medication Errors Occur in NHS England Annually – an Interview with Researcher Fiona Campbell by Emma Hazelwood

A recent report revealed that 237 million medication errors occur in NHS England annually. Not only did the study reveal that these mistakes cause 712 deaths and could be a contributory factor to thousands more, but it is estimated that this costs the NHS £98.5 million a year.

Fiona Campbell, a research fellow at the University of Sheffield, was involved with the study. She met up with pH7 writer Emma Hazelwood to provide some more information on the report.

How did the project come about?

The team at the School of Health and Related Research, which included Marrissa Martyn-St James and Eva Kaltenthaler, was asked by the Department of Health to look at how prevalent medication errors are in the NHS, and to estimate their cost. The project was a collaboration between researchers in Sheffield, Manchester and York, with the team at Sheffield identifying and synthesising the relevant literature.

How were the figures calculated?

There are many different ways that studies have measured medication errors. Some examples are to look at past prescribing practices, or at adverse drug reactions (ADRs). The threshold for counting an error is very low – the figure of 237 million includes any small error at all.

What were the limitations of the study?

As with any study, there were some limitations. First, there was a strict time limit, set by the Department of Health – the team at Sheffield had about six weeks to analyse a mammoth amount of data. Secondly, calculations for medication errors are complicated. This is for several reasons – there are different definitions of an error across studies, and sometimes no one realises an error has been made so it may go unrecorded. There are also ethical implications of studying medication errors – if a researcher spots an error, they may feel that they have a moral obligation to stop it before it results in harm to a patient. Therefore, it is difficult to calculate what the impact of these mistakes would have been. In this study, some data goes back as far as ten years. Our healthcare system may have changed since then.

Is it a serious problem?

Considering that there are only about 50 million people in England, the figure of 237 million medication errors per year seems shocking. However, what is lost in this figure is that there are billions of prescriptions issued every year. Furthermore, the threshold for an error is very low – even if one is noticed by healthcare professionals and stopped before it reaches the patient, it is still included in these calculations. Of course, there are catastrophic errors which result in severe patient harm, or even death, but not all – in fact, three out of four result in no harm. Having 237 million medication errors does not mean that people have taken the wrong medication 237 million times. Although it is estimated that these errors are a contributory factor in 1,700–22,303 deaths a year, the true figure is most likely at the lower end of this range. Again, the threshold is very low – if someone dies and there was a medication error, even if it is unlikely the error was related to their death, it must be recorded as a potential contributory factor.
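
A rough sense of scale, using only the figures quoted in this interview (a sketch, not part of the published analysis):

```python
# Back-of-envelope scale check using only figures quoted in the article.
errors_per_year = 237_000_000
population_england = 50_000_000  # the article's round figure
harmless_fraction = 3 / 4        # "three out of four result in no harm"

per_person = errors_per_year / population_england
potentially_harmful = errors_per_year * (1 - harmless_fraction)

print(f"~{per_person:.1f} recorded errors per person per year")
print(f"~{potentially_harmful:,.0f} errors with any potential for harm")
```

Spread over billions of prescriptions, and with most errors caught before reaching patients, the headline number looks less apocalyptic than it first sounds.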

Although the errors result in hundreds of deaths, and cost the NHS £98.5 million per year, it seems that we are no worse than anybody else. In many countries, errors are not even recorded, and, when they are, rates are similar to those in this study. The fact that the team was able to undertake this project could be seen as a commitment to transparency within the NHS, and of the determination to reduce these errors.

What are the possible improvements for the NHS?

In order to stop these errors, we must continue to be vigilant in recording them. We rely on healthcare professionals to record their own mistakes, so it is vital that there is not an environment of guilt and shame. There are currently trials seeking to reduce error rates, in particular researching where errors occur and testing new systems for flagging them up. There are already several layers of checks within the NHS, and for an error to reach the patient, every one of them has to be missed. The report supports more funding for research into what we can do to reduce medication errors.

What was the impact of the study?

This study has attracted a lot of media attention, from BBC News to Radio 4. Studies such as this highlight the role scientists have in discussing research and making it accessible to the public, without allowing it to be used as a political football.

Overall, it’s clear that medication errors are prevalent in our healthcare system. On occasion they have devastating effects, and this quantification of the errors is shocking. That said, we can see that our system has a good rate of preventing these from reaching patients, and the fact that studies and trials are taking place demonstrates that the problem could be improved dramatically over the coming years.

 

Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences – if you have not seen it, I highly recommend giving it a watch!

Yet we might be closer to the futuristic world of Black Mirror than you think. Around the world, people are pushing the gruesome boundaries of how far we integrate tech with our lives through a series of implants and body modifications. This is a branch of biohacking – a blanket term for a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as grinders or transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of the skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This lets the user sense nearby magnetic and electric fields, along with their strength and shape, thanks to the subtle tugs they exert on the magnet. For a party trick, the person can also pick up metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead it is carried out either at home with DIY kits purchased online or in piercing shops, but without an anaesthetic (which you need a licence for). If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums such as biohack.me.

Britain is the birthplace of grinders: in 1998 Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights in his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payments and clothing tags, and Motorola has been developing an RFID-activated ‘password pill’ that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by Biohack.me offshoot company Grindhouse Wetware, is another implantable device; it constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential for this device is vast, and it has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat, within a rounded titanium shell, to be implanted under the skin. An ultra-thin whisker juts out and is activated when the user faces north, lightly brushing an alert against the underside of the skin.

Rich Lee, a star of the biohack.me forum, has magnets embedded in each ear so he can listen to music through them via a wire coil he wears around his neck, which converts sound into electromagnetic fields – the first ‘internal headphones’. The implants can also be paired with different sensors, so he can ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is this: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it’s unnatural. For most people, especially those who benefit from medical advancements like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognize the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognized as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement offers a solution to this problem as part of its central ethos: open-source hardware and the free sharing of information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind’s evolution that may bring many medical advancements. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s and other diseases, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, designed to predict dangerous episodes and to stimulate weakened neurons. Military researchers at Darpa are also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing-impaired people with a chip that translates words and distances into sound, which could mean the end of Braille and white canes. Neil Harbisson is the founder of the non-profit Cyborg Foundation in Barcelona and was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg: a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.

 

https://www.techopedia.com/definition/29897/biohacking

http://www.abc.net.au/news/2017-02-23/biohackers-transhumanists-grinders-on-living-forever/8292790

http://www.slate.com/articles/technology/superman/2013/03/cyborgs_grinders_and_body_hackers_diy_tools_for_adding_sensory_perceptions.html

https://gizmodo.com/the-most-extreme-body-hacks-that-actually-change-your-p-1704056851

https://hackaday.com/2015/10/12/cyberpunk-yourself-body-modification-augmentation-and-grinders/

https://www.wired.com/story/hannes-wiedemann-grinders/

https://www.theverge.com/2012/8/8/3177438/cyborg-america-biohackers-grinders-body-hackers

http://edition.cnn.com/2014/04/08/tech/forget-wearable-tech-embeddable-implants/index.html

https://www.digitaltrends.com/cool-tech/coolest-biohacking-implants/

The Meat Industry: friend or foe? Keerthana Balamurugan

Meat has been, and still is, a universal ingredient in numerous societies, not to mention a major part of many traditions, but recent studies have found that meat consumption is slowly decreasing. Those who have turned vegan, or simply eat meat in moderation, praise its countless health benefits. Eating less meat has also proved its value for the environment, as the problems created by the meat industry diminish as it recedes. Counter-claims have arisen too, declaring that the trend is damaging the multi-billion-dollar meat industry and the economy. Where should we stand between the two sides?

Slowly replacing meat products with healthier options such as vegetables, whole grains and even seafood can improve your health immensely. The World Health Organization (WHO) released a report linking the consumption of red meat with certain types of cancer, stating that consuming up to 100 grams of meat daily can increase cancer risk by up to 20%. This statistic jolted people into awareness of the drawbacks. In certain countries trying the vegan diet has become the new trend, with seemingly everyone raving about it on social media, as people caught wind of how replacing meat with healthier alternatives can aid weight loss. Currently more people suffer from obesity than from starvation, and nutritionists cite meat as one of the causes. From this perspective, consuming less meat would do us all a favour.

Even with such statistics backing up the claimed benefits of eating less meat, there are those who question them. If we remove meat from our diets, what happens to our body with the decreased protein and iron intake? One of the most common disadvantages of not eating enough meat is iron deficiency, which can drastically affect our immune system and the speed at which our body functions. There is no denying that meat supplies us with a dense source of protein, but studies from Harvard Medical School show that a healthy diet of leafy greens, mushrooms, legumes and other iron-rich plant foods can easily compensate for the nutrients meat provides. It is simply a balancing act.

It comes as no surprise that the multi-billion-dollar meat industry is damaging our ecosystem, tearing down acres upon acres of woodland and increasing carbon emissions. Agricultural emissions alone account for 30% of global emissions. Producing just 1 kg of beef requires around 15,000 litres of water and releases up to 30 kg of carbon dioxide in greenhouse gases. Now imagine this multiplied by thousands and thousands of kilograms of meat. Livestock production is humankind’s single largest use of land, making it the biggest contributor to deforestation on the planet. In Brazil, large-scale commercial beef farming is the cause of 70% of cleared forest in the Amazon. Precious water being used up and wasted compared with vegan alternatives, ecosystems destroyed by land clearing, and worsening climate change are all effects of this unsustainable industry. Many would agree to consuming less meat in order to lessen the harm being done to the planet.
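
To see how quickly those per-kilogram figures scale, here is a quick illustrative calculation using the numbers above (the tonnage is hypothetical, chosen only for the example):

```python
# Scaling the article's per-kilogram beef figures (illustrative only;
# the 10-tonne volume is a made-up example, not a real statistic).
WATER_L_PER_KG = 15_000  # litres of water per kg of beef (article figure)
CO2_KG_PER_KG = 30       # kg of CO2 per kg of beef (article figure)

beef_kg = 10_000         # hypothetical: ten tonnes of beef
print(f"water: {beef_kg * WATER_L_PER_KG / 1e6:.0f} million litres")
print(f"CO2:   {beef_kg * CO2_KG_PER_KG / 1000:.0f} tonnes")
```

Ten tonnes of beef, on these figures, already means 150 million litres of water and 300 tonnes of CO2 – and global production runs to tens of millions of tonnes.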

In the US alone, the meat industry is worth more than 800 billion dollars annually, providing over 6 million jobs. Huge numbers of people see the colossal benefit of cutting down on meat, but what would that mean for the economy, and for the millions of people who rely on it for their wages? Yes, a sudden shift to consuming less meat would affect a country’s gross domestic product and employment rates, but only in the short term. Many protest against the cut-down on meat for these reasons – but is avoiding short-term disruption worth the long-term risk? Meanwhile, a whole new industry has been booming in the market: vegan alternatives. This relatively new category of food products has brought a whole new economy to the table, providing more jobs with higher wages and less gruelling working conditions.

Consuming less meat has more benefits than drawbacks, leading to a much healthier lifestyle and a cleaner environment for our planet and its inhabitants. If everyone on the planet were to eat meat in moderation, we would have lower rates of obesity and of certain types of cancer, not to mention less severe effects of climate change. We live in a day and age where there are so many options available to replace meat in our diets; with just a change in mindset and perspective, many more people can get on board, realising the environmental and health benefits of eating less meat.