Running around like a headless… Pig? Hundreds of pig brains kept alive after decapitation – Rachel Jones

On March 28th, at a National Institutes of Health meeting on ethics in US neuroscience, Yale neuroscience professor Nenad Sestan announced that, in experiments on 100 to 200 brains from pigs decapitated at slaughterhouses, he had kept the organs alive using heaters and pumps to circulate artificial blood through them. Billions of cells in these brains were found to be healthy and capable of working as normal, despite decapitation. This is the first reported success in keeping the brains of large mammals alive outside the body without cooling.

Sestan proposed that the brains could be used as models for studying diseases such as cancer and Alzheimer's disease, informing therapy for humans, since models with large amounts of intact brain are needed to see the full effect of treatments. The research was initially funded to help produce an atlas of the brain, as the brain's connections are not yet well understood. Seventeen neuroscientists and bioethicists, including Sestan, published a Nature article in April 2018 proposing methods to ensure that human brain tissue kept alive with these techniques is not conscious during experimentation (experimenting on live human brain tissue is ethically complex because it is potentially conscious, making testing and termination of samples problematic). Suggestions included producing small amounts of nervous tissue, known as organoids, that lack the capacity for consciousness; preserving living human brain tissue removed in surgery; and inserting human brain tissue into mice.

The immediate media response to the news was speculation about using the method in human brain transplants, to keep the tissue alive between bodies. Since live brain maintenance has only been achieved in pigs, we cannot assume it can be done in humans, but Sestan claims that the techniques could apply to other species. In 1970, Robert White performed a head transplant between rhesus monkeys in which the recipient survived for 8 days, yet the method was abandoned because the spinal cords could not be connected. The Italian neurosurgeon Sergio Canavero announced in November 2017 that his colleague Xiaoping Ren had transplanted the head of one human cadaver onto another, as a 'rehearsal' for a live human head transplant. Canavero and Ren have previously experimented with transplanting live rat, mouse, dog and primate heads. They predicted that their first attempt at a live human head transplant would take place in late 2017, but have since revised their estimate to 'imminent'. Canavero has also claimed to have a method for connecting the spinal cords.

However, Canavero and Ren have only performed human head transplants on cadavers and live head transplants on animals, so they cannot claim to be capable of performing live human head transplants. Arthur Caplan, head of medical ethics at the New York University School of Medicine, does not believe that Canavero will ever receive the ethical go-ahead, and suggests that Canavero is conducting his research in China because its ethics laws are more relaxed than those of the US and Europe. Professor Sestan notes that there is no evidence the pig brains could regain consciousness if transplanted into another pig, and says he has no intention of using this technology to test brain transplantation. Tests suggested that the brains were not conscious, although this negative result may be because the chemicals used to prevent swelling also block neuronal signals. Steve Hyman, director of psychiatric research at the Broad Institute in Cambridge, Massachusetts, has said that brain transplants are "not remotely possible", criticising the idea that we could treat the brain in the same way as organs that are routinely transplanted.

Professor Sestan has declined to comment on his findings, as the research is yet to be published and he had not wished for the news to become public before publication. This means we do not know whether the research will stand up to the rigorous scrutiny of a journal's peer review. If the experiments are accepted, however, it seems we may be conducting research on whole pig brains in the future, mapping brain cell connections and testing drugs and therapies destined for human brains. Testing on human brains raises concerns including consent, the definition of death and the ownership of living human brains, but using pig brains to inform us about human disease largely avoids these issues, so long as unnecessary suffering is not inflicted on the animal. As for using these pig brains to study human head transplants, historical experiments show that head transplants are possible in a number of animals, but even if brain transplants to lengthen human life become possible, they will not be an option any time soon, given the huge range of ethical concerns, the lack of evidence of consciousness and the loss of spinal cord connection.

Learning from our ancestors – how early humans worked together to survive a changing climate – Emily Farrell

If Yellowstone were to erupt tomorrow, America might not make it through the night. Yellowstone is a supervolcano that erupts roughly every 650,000 years; the last eruption was 640,000 years ago. So while it is not "overdue" for an eruption, as some conspiracy theorists may think, there is one on the way. This could spell disaster for the continent. Last time, 1,000 cubic kilometres of rock, dust and volcanic ash were blown into the sky, blocking the light plants need, catastrophically polluting the air and massively changing the climate. It spelt disaster for the animals living there at the time and could do so again.

But would this mean the end for humankind? Not if we follow in the footsteps of our ancestors.

Around 40,000 years ago, southern Italy had its own super-eruption in the volcanic Phlegraean Fields, and archaeologists have been studying a site in Liguria to see how humans were affected by it. Humans had only been in the area for about 1,000 years before the event occurred. It would have changed their climate, and possibly other things such as the food available and the quality of the air and water.

Researchers believe that this change in climate is what drove the Neanderthals out of the area. Current theories suggest that they were not especially capable of adapting and would not have survived well in a suddenly changed environment.

But regardless of how well the Neanderthals coped, it seems some humans survived and even flourished in these conditions. Their tactic, it appears, was to maintain links between groups. The evidence for this comes from the Italian site: tools, ornaments and human remains from an ancient rock shelter were analysed, and some of the flint in use turned out to have come from hundreds of kilometres away. Such a network would mean that knowledge of how to cope in different situations and habitats was shared between groups. When the climate did change, whether through a super-eruption or otherwise, information on how to survive in an unfamiliar environment would already be available.

We can apply this theory to our communities today. By learning from each other, we can share the knowledge of how to cope with changes in our climate. Globalisation has increased our capacity for this: instead of hundreds of kilometres, we can draw on networks spanning the world. We can learn how to build houses on the water from the Pacific Islands, and how to make the most of a limited water supply from Singapore. Why spend time creating novel solutions when the perfect one may already be in place somewhere else on the globe?

If a super-eruption occurs and dramatically changes our climate, or even if we continue to change the climate ourselves, we will need to adapt to make our lives sustainable and to endure the changes. By networking and sharing our knowledge, we can follow the example of our ancestors and survive whatever this world throws at us.

Why do we procrastinate? Emily Farrell

Everyone procrastinates. No one wants to write that essay or clean the bathroom. If it's not food, sex or sleep, your body is just not interested. Sure, in the long run you might need to write that essay, to get that degree, to get that job, to earn money to buy food to survive. But your body doesn't understand, or care, about that. Your body was made in simpler times. It is built for an age when survival meant going off to pick some plants to eat, some reproducing and maybe a bit of sleep afterwards. Modern, Western lifestyles are a horrible mismatch for this way of living.

Imagine giving a caveman a long, boring task such as moving numbers from one column to another (maybe with sticks; it could take a while to explain the concept of computers). Why should he do it? He gets no food from it. He gets no joy from it. Doing the task does not make him any more attractive to cavewomen who might then want to have his babies. And it takes a reasonable amount of energy that would be better spent on other labours. So why should he do it? To him, the answer is that he shouldn't. This is the thought process your brain goes through when faced with a task: while the conscious parts of your brain know the real reason for the task, the ancient parts, which we share with our ancestors and other animals, do not.

Think about it. How do you procrastinate? Making a snack? (Means you won't starve to death.) Taking a nap? (Means you won't be too tired to see the tiger of death headed your way.) Talking to friends? (Maintaining social bonds, which one day might lead to you making tiny replicas of yourself via someone else's genitals.) Watching cat videos? (Evolution can't explain the internet, but taking joy from something which takes away none of the resources you may have gained from the other tasks means your body agrees to it.)

Cleaning your own room is therapeutic and has actually been shown to improve your mood, both while you are doing it and afterwards, when you're in your nice clean room. But when it comes to the gross shared bathroom every uni student has encountered, you put it off for longer. You procrastinate away from it. This is because you gain no real benefit from it: it's not dirty enough to give you diseases (yet), and you don't spend enough time in it for it to benefit your mental health. If you can't see an immediate advantage, you won't do it.

Procrastination is all about cost and benefit, and finding the balance between the two. If the immediate payoff does not equal or outweigh the energy required to perform the task, the inclination to do it disappears.
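In code, the inner caveman's decision rule might look something like this toy sketch (the payoff and cost numbers are invented, purely for illustration):

```python
# Toy model of the inner caveman's cost-benefit rule: a task gets done
# only if its immediate payoff at least matches its energy cost.
def will_do_task(immediate_payoff, energy_cost):
    return immediate_payoff >= energy_cost

print(will_do_task(immediate_payoff=2, energy_cost=8))  # that essay: False
print(will_do_task(immediate_payoff=5, energy_cost=1))  # a snack: True
```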

Think about this the next time you put something off and do something else instead. Would what you are putting off benefit a caveman? Would he benefit by doing what you are doing now? But don’t listen to your inner caveman. Listen to your inner modern human who wants that essay done, because they know that you really need to do it. Don’t let them in only at the last second to write it. Go and do something productive! Go!

237 Million Medication Errors Occur in NHS England Annually – an Interview with Researcher Fiona Campbell by Emma Hazelwood

A recent report revealed that 237 million medication errors occur in NHS England annually. The study found that these mistakes cause 712 deaths and could be a contributory factor in thousands more, and it estimated that they cost the NHS £98.5 million a year.

Fiona Campbell, a research fellow at the University of Sheffield, was involved with the study. She met up with pH7 writer Emma Hazelwood to provide some more information on the report.

How did the project come about?

The team at the School of Health and Related Research, which included Marrissa Martyn-St James and Eva Kaltenthaler, was asked by the Department of Health to look at how prevalent medication errors are in the NHS, and to estimate their cost. The project was a collaboration between researchers in Sheffield, Manchester and York, with the team at Sheffield identifying and synthesising the relevant literature.

How were the figures calculated?

There are many different ways that studies have measured medication errors; examples include looking at past prescribing practices, or at adverse drug reactions (ADRs). The threshold for counting an error is very low – the figure of 237 million includes any small error at all.

What were the limitations of the study?

As with any study, there were some limitations. First, there was a strict time limit set by the Department of Health – the team at Sheffield had about six weeks to analyse a mammoth amount of data. Secondly, calculating medication errors is complicated, for several reasons: different studies define an error differently, and sometimes no one realises an error has been made, so it goes unrecorded. There are also ethical implications to studying medication errors – if a researcher spots an error, they may feel a moral obligation to stop it before it results in harm to a patient, which makes it difficult to calculate what the impact of these mistakes would have been. Finally, some of the data in this study goes back as far as ten years, and our healthcare system may have changed since then.

Is it a serious problem?

Considering that there are only about 50 million people in England, the figure of 237 million medication errors per year seems shocking. What is lost in this figure, however, is that billions of prescriptions are issued every year. Furthermore, the threshold for an error is very low – even one that is noticed by healthcare professionals and stopped before it reaches the patient is still included in the calculations. Of course, some errors are catastrophic, resulting in severe patient harm or even death, but most are not – in fact, three out of four result in no harm. Having 237 million medication errors does not mean that people have taken the wrong medication 237 million times. Although it is estimated that these errors are a contributory factor in 1,700–22,303 deaths a year, the true figure is most likely at the lower end of this range. Again, the threshold is very low – if someone dies and there was a medication error, even if it is unlikely to have been related to their death, it must be recorded as a potential contributory factor.
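To see why the raw count needs context, here is a rough back-of-the-envelope sketch (the prescription volume is an illustrative assumption, not a figure from the report):

```python
# Back-of-the-envelope scale check; the prescription volume below is an
# illustrative assumption, not a figure from the report.
errors_per_year = 237_000_000
assumed_prescription_items = 1_000_000_000  # "billions" issued per year

# Per the interview, roughly three out of four errors cause no harm.
no_harm_fraction = 3 / 4

errors_per_item = errors_per_year / assumed_prescription_items
potentially_harmful = errors_per_year * (1 - no_harm_fraction)

print(f"~{errors_per_item:.2f} errors per prescription item")  # ~0.24
print(f"~{potentially_harmful:,.0f} errors with any potential for harm")
# ~59,250,000
```

On these (assumed) numbers, most prescription items involve no error at all, and most errors involve no harm.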

Although the errors result in hundreds of deaths, and cost the NHS £98.5 million per year, it seems that we are no worse than anybody else. In many countries, errors are not even recorded, and, when they are, rates are similar to those found in this study. The fact that the team was able to undertake this project could be seen as a sign of the NHS's commitment to transparency, and of its determination to reduce these errors.

What are the possible improvements for the NHS?

In order to stop these errors, we must continue to be vigilant in recording them. We rely on healthcare professionals to record their own mistakes, so it is vital that there is not an environment of guilt and shame. Trials are currently under way seeking to reduce error rates, in particular researching where errors occur and new systems for flagging them up. There are already several layers of checks within the NHS, and for an error to reach the patient, every mistake has to align. The report supports more funding for research into what we can do to reduce medication errors.
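The point that "every mistake has to align" can be made concrete with a toy probability model, sketched below with invented catch rates and an independence assumption that real checks only approximate:

```python
# Toy "Swiss cheese" model of layered checking: an error reaches the
# patient only if it slips past every layer. The catch rates are
# invented, and real checks are not truly independent.
def p_error_reaches_patient(catch_rates):
    p = 1.0
    for rate in catch_rates:
        p *= (1 - rate)  # the error must evade this check as well
    return p

# Hypothetical layers: prescriber review, pharmacist check, nurse check
print(f"{p_error_reaches_patient([0.7, 0.8, 0.5]):.0%}")  # 3%
```

Even three imperfect checks, on these made-up rates, stop 97% of errors before they reach the patient.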

What was the impact of the study?

This study has attracted a lot of media attention, from BBC News to Radio 4. Studies such as this highlight the role scientists have in discussing research and making it accessible to the public, without allowing it to be used as a political football.

Overall, it is clear that medication errors are prevalent in our healthcare system. On occasion they have devastating effects, and this quantification of the errors is shocking. That said, our system has a good record of preventing these errors from reaching patients, and the fact that studies and trials are taking place demonstrates that the problem could be improved dramatically over the coming years.


Have we really found Amelia Earhart’s bones? Fatima Sheriff

Amelia Earhart is one of the most famous aviators of her time, and throughout history – breaking record after record, blazing a trail for female pilots and then, in July 1937, disappearing over the Pacific Ocean, never to be seen again. But this may not be the end of her story: a new study in Forensic Anthropology claims that a set of bones found in 1940, in the vicinity of her disappearance, are in fact her remains.

Born at the turn of the 20th century, Earhart found herself in a society ingrained with "age-old customs", where women were "bred to timidity". However, after her first time in a plane in December 1920, she found her place in the world, starting lessons 6 days later and passing the test for her National Aeronautics Association licence within a year (the 16th woman ever to do so). She quickly rose to fame, becoming the first woman to fly solo above 14,000 feet in 1922.

As a passenger on the first flight across the Atlantic by a woman, in 1928, she was paid in… "experience and opportunity", whereas others on the flight were paid thousands of dollars. Confident in her own ability, she followed suit in 1932, becoming the first woman to fly solo across the Atlantic. Along the way she had to deal with leaking fuel, flames in the engine and ice on the wings of her plane, with "only tomato juice to keep her own energy levels up". Despite all these challenges, her piloting skills and quick problem-solving meant she landed safely. She was awarded the Distinguished Flying Cross and continued to add to her astonishing list of achievements with the mentality that "women must try to do things as men have tried. When they fail, their failure must be but a challenge to others." She inspired other women to join her, founding the Ninety-Nines, an organisation for the advancement of licensed female pilots.

Her spirit of adventure led her to plan her next ambitious trip: a round-the-world flight of 29,000 miles over 40 days, with 20 stops. Leaving on the 1st of June 1937 from Oakland, California, she flew east with her navigator, Fred Noonan. After many successful stops, and with only 7,000 miles to go before reaching Oakland again, they went off course on the 2nd of July and were never found.

The stop they were heading for was Howland Island, only a square mile in size and therefore difficult to find. Radio messages from the pair stated: "we must be on you but we cannot see you, fuel is running low, been unable to reach you by radio, flying at 1000 ft". Their last, 'frantic' communication was "on the line 157, 337" – a navigational line of position. Searches were conducted covering 250,000 square miles, but on the 19th of July the plane was officially declared lost at sea.


Various conspiracy theories have gripped the world in the decades since, but renewed efforts to find the plane have found nothing. A theory that gained a lot of momentum was that she had been stranded southwest along the line 157, 337 on an island near Howland Island: Gardner Island, now called Nikumaroro. Although planes passing over it on the 9th of July 1937 reported no visible activity on the island, items recovered later included a woman's shoe and the box for a sextant (a navigation device that could have been Earhart's). Most importantly, 13 bones were recovered in 1940 and examined in 1941 by one D. W. Hoodless in Fiji. He determined the bones to be those of a 'short, stocky European man' and, in shocking scientific practice (some may say suspiciously so), he discarded the bones.

Many people have doubted his identification. Using his records, in 1998 TIGHAR (The International Group for Historic Aircraft Recovery) re-estimated the bones to be those of a European woman between 5 ft 5 in and 5 ft 9 in tall, fitting Amelia's biological profile. She was several inches taller than the average woman of the time, potentially accounting for the dismissal of the skeleton's sex as male. Richard Jantz released a paper in March this year, further disputing Hoodless' original assertion. He compared the lengths of the humerus, radius and tibia with photos of Amelia, information on her pilot's licence and historic seamstress measurements. His conclusion was that "Earhart is more similar to the Nikumaroro bones than 99% of individuals in a large reference sample." The reason why only 13 bones were found could be down to the native coconut crabs, the largest living arthropods on land (and, frankly, terrifying), which could have carried off the rest of the remains.

However, the lack of a complete skeleton remains an obstacle to definitive identification of the remains. A pelvis, for instance, could clear up any ambiguity over the skeleton's sex. Without the original remains it is also impossible to conduct DNA analysis to confirm identity, something that soil analysis and bone-sniffing dogs have so far failed to do. This is therefore a compelling argument, but nowhere near indisputable certainty. Whatever you choose to believe, Amelia left a legacy that won't be forgotten…

“Adventure is worthwhile in itself” – Amelia Earhart (1897-1937)

Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences – if you have not seen it, I highly recommend giving it a watch!

Yet you might be closer to the futuristic world of Black Mirror than you think. Around the world, people are pushing the gruesome boundaries of how far we integrate technology with our lives, through a series of implants and body modifications. This is a branch of biohacking – a blanket term used to describe a whole spectrum of ways in which people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as grinders or transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of the skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This allows the user to detect nearby magnetic and electrical fields, along with their strength and shape, thanks to the subtle currents they induce in the magnet. As a party trick, the person can also pick up small metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead, it is carried out either at home with DIY kits purchased online or in piercing shops, but without anaesthetic (which requires a licence). If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums such as biohack.me.

Britain is the birthplace of the grinder movement. In 1998, Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights in his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payments and clothing tags, and Motorola has been developing an RFID-activated ‘password pill’ that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by the biohack.me offshoot company Grindhouse Wetware, is another implantable device; it constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential of such a device is vast, and of these projects it has the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat, within a rounded titanium shell, to be implanted under the skin. An ultra-thin whisker juts out, and is activated when the user faces north, lightly brushing an alert against the underside of the skin.

Rich Lee, a star of the biohack.me forum, has magnets embedded in each ear so he can listen to music through them, via a wire coil worn around his neck that converts sound into electromagnetic fields – the first ‘internal headphones’. The implants also allow him to pick up different sensors, so he can ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it’s unnatural. Very few people fail to recognise the benefits of technological progress and how it has helped humanity, especially through medical advances such as pacemakers and cochlear implants. Grinding, however, is often not recognised as such an advancement: to most people, adding RFID chips or magnets to the body appears to have little value.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement offers a solution to this problem as part of its central ethos: open-source hardware and free access to information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in humankind’s evolution, one that may bring many medical advances. And indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s and other diseases, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, designed to predict dangerous episodes and to stimulate weakened neurons. The US military research agency DARPA is also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing-impaired people with a chip that translates words and distances into sound, which could mean the end of Braille and white sticks. Neil Harbisson, founder of the non-profit Cyborg Foundation in Barcelona, was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg: a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.



The Meat Industry: friend or foe? Keerthana Balamurugan

Meat has been, and still is, a universal ingredient in numerous societies, not to mention a major part of many traditions, but recent studies have found that meat consumption is slowly decreasing. Those who have turned vegan, or who simply eat meat in moderation, praise the countless health benefits of doing so. Eating less meat has also proved its value for the environment, as the problems created by the meat industry diminish as it recedes. Counter-claims have arisen too, declaring that the trend is damaging the multi-billion-dollar meat industry and the economy. Where should we stand between the two sides?

Slowly replacing meat products with healthier options such as vegetables, whole grains and even seafood can alter your health immensely for the better. The World Health Organization (WHO) released a report last year linking the consumption of red meat with certain types of cancer, stating that eating just 100 grams of meat daily can increase cancer risk by up to 20%. This statistic jolted people into awareness of the drawbacks. In certain countries, the vegan diet has become the new trend, with seemingly everyone raving about it on social media, as people caught wind of how replacing meat with healthier alternatives can aid weight loss. Currently, more people suffer from obesity than from starvation, and nutritionists cite meat as one of the causes. From this perspective, consuming less meat would do us all a favour.
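Note that the 20% is a relative increase, which is easy to misread as an absolute one. A minimal sketch, assuming a purely illustrative baseline risk (the report quotes only the relative figure):

```python
# Relative vs absolute risk; the baseline here is an assumption for
# illustration, as only the relative increase is quoted.
baseline_lifetime_risk = 0.05  # assume a ~5% baseline lifetime risk
relative_increase = 0.20       # "up to 20%" increased risk

new_risk = baseline_lifetime_risk * (1 + relative_increase)
print(f"{baseline_lifetime_risk:.0%} -> {new_risk:.0%}")  # 5% -> 6%
```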

Even with such statistics backing up the claimed benefits of eating less meat, there are those who question them. If we remove meat from our diets, what happens to our bodies with the decreased protein and iron intake? One of the most common disadvantages of not eating enough meat is iron deficiency, which can drastically affect our immune system and the speed at which our body functions. There is no denying that meat supplies us with a dense source of protein, but studies from Harvard Medical School suggest that a healthy diet of leafy greens, mushrooms, legumes and other iron-rich plant foods can easily compensate for the nutrients meat provides. It is simply a balancing act.

It comes as no surprise that the multi-billion-dollar meat industry is damaging our ecosystems, tearing down acres upon acres of woodland and increasing carbon emissions. Agricultural emissions alone account for 30% of global emissions. Producing just 1 kg of beef requires around 15,000 litres of water and releases up to 30 kg of carbon dioxide, a greenhouse gas. Now imagine this multiplied across the thousands upon thousands of kilograms of meat produced. Livestock production is humankind's single largest use of land, making it the largest contributor to deforestation; in Brazil, large-scale commercial beef farming is the cause of 70% of the forest cleared in the Amazon. Precious water used up and wasted compared with vegan alternatives, ecosystems destroyed by land clearing, and worsening climate change are all effects of this non-sustainable industry. Many would agree to consuming less meat in order to lessen the harm being done to the planet.
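To make that multiplication concrete, here is a rough sketch; the annual consumption figure is an illustrative assumption, not a number from the studies mentioned:

```python
# Scaling the per-kilogram figures quoted above to a year of beef eating;
# the consumption figure is an illustrative assumption.
WATER_L_PER_KG = 15_000  # litres of water per kg of beef (figure above)
CO2_KG_PER_KG = 30       # kg of CO2 per kg of beef (figure above)

assumed_kg_per_person_per_year = 25  # hypothetical beef eater

water_litres = WATER_L_PER_KG * assumed_kg_per_person_per_year
co2_kg = CO2_KG_PER_KG * assumed_kg_per_person_per_year
print(f"~{water_litres:,} L of water and ~{co2_kg:,} kg of CO2 per year")
# ~375,000 L of water and ~750 kg of CO2 per year
```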

In the US alone, the meat industry is worth more than 800 billion dollars annually and provides over six million jobs. Huge numbers of people see the colossal benefit of cutting down on meat, but what would that mean for the economy, and for the millions of people who rely on the industry for their wages? It is true that a sudden shift towards consuming less meat would affect a country's gross domestic product and employment rates, but only in the short term, and many protest against cutting down on meat for these reasons. Is the short-term cost worth the long-term gain? A whole new category of products has meanwhile been booming in the market: vegan alternatives. This relatively new sector has brought a whole new economy to the table, providing more jobs with higher wages and less gruelling working conditions.

Consuming less meat has more benefits than drawbacks, leading to a much healthier lifestyle and a cleaner environment for our planet and its inhabitants. If everyone on the planet ate meat in moderation, fewer people would suffer from obesity and certain types of cancer, not to mention that the effects of climate change would be less severe. We live in a day and age with many options available to replace meat in our diets; with just a change in mindset and perspective, many more people can get on board with the change, realising the environmental and health benefits of eating less meat.