Biohacking: an upgrade to “wearable tech”, or turning ourselves into cyborgs? Ellie Marshall

Anyone who’s watched the futuristic Netflix show ‘Black Mirror’ will know how emerging technology, and our reliance on it, can have unanticipated consequences. If you have not seen it, I highly recommend giving it a watch!

Yet, we might be closer to the futuristic world of Black Mirror than you think. Around the world, people are pushing the gruesome boundaries of how far we integrate tech with our lives, through a series of implants and body modifications. This is a branch of biohacking – a blanket term used to describe a whole spectrum of ways that people modify or improve their bodies. People who hack themselves with electronic hardware to extend and improve human capacities are known as Grinders or Transhumanists.

Common procedures

A common procedure is to implant a strong magnet beneath the surface of a person’s skin, often in the tip of the ring finger. Nerves in the fingertip then grow around the magnet. This allows the user to sense nearby magnetic and electric fields, along with their strength and shape, thanks to the subtle currents they induce. For a party trick, the person can also pick up metal objects or make other magnets move around.

Calling this a procedure, though, gives rather the wrong impression. Biohacking is not a field of medicine. Instead it is carried out either at home with DIY kits purchased online or in piercing shops – but without an anaesthetic, which you need a licence to administer. If you think this sounds painful, you are correct. With no corporate help, the only way grinders can accomplish their goals is by learning from other grinders, mainly through online forums.

Britain is the birthplace of grinding. In 1998 Kevin Warwick, professor of cybernetics at the University of Reading, had a simple radio-frequency identification (RFID) transmitter implanted in his upper left arm, in an experiment that he called Project Cyborg. The chip didn’t do much – it mainly just tracked him around the university and turned on the lights to his lab when he walked in. Still, Warwick was thrilled, and the media were enchanted, declaring him the world’s first cyborg.

RFID implants are now common among grinders and allow users to unlock physical and electronic barriers. Similar technology is already widely used in contactless card payment systems and clothing tags, and Motorola are developing an RFID-activated ‘password pill’ that a user can swallow to access their devices without the hassle of remembering passwords.

Other examples of biohacking

Circadia, developed by the offshoot company Grindhouse Wetware, is another implantable device; it constantly gathers the user’s biometric data, for example transmitting temperature readings via Bluetooth. The medical potential of this device is vast, and of the implants described here it offers the most immediately practical benefits.

Additionally, the first internal compass, dubbed the ‘Southpaw’, has been invented. It works by sealing a miniature compass inside a silicone coat, within a rounded titanium shell, to be implanted under the skin. An ultra-thin whisker juts out, which is activated when the user faces north, lightly brushing an alert on the underside of the skin.

Rich Lee, a star of the grinder forums, has magnets embedded in each ear so he can listen to music through them via a wire coil he wears around his neck that converts sound into electromagnetic fields – the first ‘internal headphones’. The implants can also be paired with different sensors, so he can ‘hear’ heat from a distance and detect magnetic fields and Wi-Fi signals too! There is a practical purpose to Lee’s experiments: he suffers from deteriorating eyesight and hopes to improve his orientation through greater sensory awareness.

A damaging concept to users and society?

The question we must ask ourselves is: at what point does the incorporation of all this technology make us a different species, and what are the ethics behind that?

The bluntest argument against biohacking is that it’s unnatural. To most people, even those who benefit from medical advancements like pacemakers and cochlear implants, adding RFID chips or magnets to the body appears to have little value. Very few people fail to recognize the benefits of technological progress and how it has helped humanity; grinding, however, is often not recognized as an advancement.

Another argument against human augmentation mirrors the worries that commonly surround genetic engineering. A thought-provoking possibility is that those who have access to (and can afford) augmentation procedures and devices will gain unfair advantages over those who do not. Over generations, this could create a large rift between the augmented and the unaugmented. Luckily, the grinder movement provides a solution to this problem as part of its central ethos: open-source hardware and free access to information.

A benefit to the individual and society?

To some, implanted technology represents the next stage in mankind’s evolution that may bring many medical advancements. And, indeed, the idea is not outlandish. Brain stimulation from implanted electrodes is already a routine treatment for Parkinson’s and other diseases, and there are prototypes that promise to let paralysed people control computers, wheelchairs and robotic limbs.

The Wellcome Trust has begun a trial in which Alzheimer’s patients carry a silicon chip on the brain itself, able to predict dangerous episodes and to stimulate weakened neurons. The US military research agency DARPA is also experimenting with chip implants in humans to help control the mental trauma suffered by soldiers.

There is potential to help visually and hearing impaired people by using a chip that translates words and distances into sound, which could mean the end of Braille and white canes. Neil Harbisson is the founder of the non-profit Cyborg Foundation in Barcelona and was born with achromatopsia, the inability to see colours. Since 2004, Harbisson has worn a device he calls the eyeborg, a head-mounted camera that translates colours into soundwaves and pipes them into his head via bone conduction. Today Harbisson “hears” colours, including some beyond the visible spectrum.

These experimental grinders are certainly laying the groundwork for more powerful and pervasive human enhancements in the future, but for now, a Fitbit is more than enough for me.

The Meat Industry: friend or foe? Keerthana Balamurugan

Meat has been and still is a universal ingredient in numerous societies, not to mention a major part of many traditions, but recent studies have found that the consumption of meat is slowly decreasing. Those who have turned vegan or begun eating meat in moderation praise its countless health benefits. Eating less meat has also proved its value for the environment, as the problems created by the meat industry diminish as it recedes. Counter-claims have arisen too, declaring that the trend is damaging the multi-billion-dollar meat industry and the economy. Where should we stand between the two sides?

Slowly replacing meat products with healthier options such as vegetables, whole grains and even seafood can alter your health immensely for the better. The World Health Organization (WHO) released a report last year linking the consumption of red meat with certain types of cancer, stating that consuming just 100 grams of meat daily can increase cancer risk by up to 20%. This statistic jolted people into awareness of the setbacks. In certain countries the vegan diet has become the new trend, with seemingly everyone raving about it on social media, as people caught wind of how replacing meat with healthier alternatives can aid weight loss. Currently more people suffer from obesity than from starvation, and nutritionists cite meat as one of the causes. From this aspect, consuming less meat would do us all a favour.

Even with such statistics backing up the claimed positives of eating less meat, there are those who question them. If we remove meat from our diets, what happens to our bodies with the decreased protein and iron intake? One of the most common disadvantages of not eating enough meat is iron deficiency, which can drastically affect our immune systems and the speed at which our bodies function. It cannot be denied that meat supplies us with a dense source of protein, but studies from Harvard Medical School suggest that a healthy diet of leafy greens, mushrooms, legumes and other iron-rich plant foods can easily compensate for the nutrients meat provides. It is simply a balancing act.

It comes as no surprise that the multi-billion-dollar meat industry is damaging our ecosystem, tearing down acres upon acres of woodland and increasing carbon emissions. Agricultural emissions alone account for 30% of global emissions. Producing just 1 kg of beef requires 15,000 litres of water and releases up to 30 kg of carbon dioxide, adding to greenhouse gases. Now imagine this multiplied by thousands and thousands of kilograms of meat. Livestock production is humankind’s number-one use of land, making it the largest contributor to deforestation on our planet; in Brazil, large-scale commercial beef farming is the cause of 70% of cleared forests in the Amazon. Wasted water, destroyed ecosystems and worsening climate change are all effects of this unsustainable industry. Many would agree to consuming less meat in order to lessen the harm being done to the planet.
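To get a sense of the scale the paragraph gestures at, here is a quick back-of-the-envelope calculation using the per-kilogram figures above (the 1,000-tonne batch is purely an illustrative assumption):

```python
# Per-kilogram footprint of beef, as quoted in the text
water_litres_per_kg = 15_000   # litres of water to produce 1 kg of beef
co2_kg_per_kg = 30             # kg of CO2 released per 1 kg of beef

# "Now imagine this multiplied by thousands and thousands of kilograms":
batch_kg = 1_000 * 1_000       # an illustrative 1,000 tonnes of beef

print(batch_kg * water_litres_per_kg)  # 15,000,000,000 litres of water
print(batch_kg * co2_kg_per_kg)        # 30,000,000 kg of CO2
```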

In the US alone, the meat industry is worth more than 800 billion dollars annually, providing over 6 million jobs. Huge numbers of people see the colossal benefit of cutting down on meat, but what would that mean for the economy, and for the millions of people who rely on it for their wages? Yes, it is true that a sudden shift to consuming less meat would affect a country’s gross domestic product as well as its employment rate, and many protest against the cut-down for these reasons – but the damage would only be short term. Is the long-term benefit worth the short-term risk? A whole new industry has been booming in the market: vegan alternatives. This relatively new category of food products has brought a whole new economy to the table, providing more jobs, often with higher wages and less gruelling working conditions.

Consuming less meat has more benefits than drawbacks, leading to a much healthier lifestyle and a cleaner environment for our planet and its inhabitants. If everyone on the planet ate meat in moderation, we would see lower rates of obesity and of certain types of cancer, not to mention less severe effects of climate change. We live in a day and age with so many options available to replace meat in our diets; with just a change in mindset and perspective, many more people can get on board with the change, realising the environmental and health benefits of eating less meat.

What is Earth Day, and do we really need it? Emma Hazelwood

Happy Earth Day!

Earth Day is the world’s largest environmental movement, celebrated by more than a billion people every year. It is a day dedicated to raising awareness of various environmental issues worldwide.

Earth Day was started in 1970. Its founder, Gaylord Nelson, had witnessed the appalling consequences of a massive oil spill in California, 1969. Inspired by the student anti-war movement in the US, he wanted to have a similar campaign for environmental protection, in the hopes that politicians would have no choice but to start taking the conservation of our planet seriously.

It is believed that on the first Earth Day, 22nd April 1970, twenty million Americans took part in rallies across the country. It united different groups of activists, as well as people from all walks of life, and is often credited with launching the modern environmental movement. The demonstration led to the creation of the Environmental Protection Agency, and the passing of the Clean Air, Clean Water, and Endangered Species Acts.

Since then, the campaign has grown beyond what anyone could have predicted. An effort to make it go global in 1990 paved the way for the United Nations Earth Summit in 1992. With the 50th anniversary of Earth Day on the horizon, the campaign is now aiming to reignite the flame of environmental activism, in a bid to fight the rising atmosphere of cynicism and distrust surrounding climate change.

The Earth Day 2018 campaign is based on ending plastic pollution. Plastic doesn’t biodegrade, so it piles up in the environment, and poisons marine life and our water systems. More than eight million tonnes of plastic are dumped into our oceans every year, and it is predicted that by 2050 the oceans will contain more plastic than fish by weight. Plastic build-up results in animal entanglement, ingestion, or habitat disruption. This not only depletes fish stocks, but can result in the build-up of toxins in our food, leading to higher rates of cancer, birth defects, impaired immunity and many more health issues for humans. As is often the case with rising environmental issues, the communities which suffer the most from plastic pollution are often already vulnerable. With the rise of zero waste shops (and David Attenborough’s calls for no more plastic straws), plastic-free life is becoming more and more achievable.


Plastic washed ashore on a beach in San Francisco.

However, plastic pollution is just one of the issues facing our planet. The recent death of the last male Northern white rhino has rendered the species functionally extinct (though there are hopes that IVF may be able to bring the species back from the brink). Unfortunately, this is not an isolated issue, with 5,583 animal species considered to be critically endangered. Habitat loss due to human expansion and hunting are often major causes.

Another area for concern is the bleaching of coral reefs. We have already lost half of the world’s coral reefs, and 90% may be gone by 2050. The death of the organisms which used to inhabit reefs is linked to a rise in ocean temperature as a result of climate change. Bleaching happens when corals get so stressed by extreme temperature that they release the tiny algae, known as zooxanthellae, which provide the coral with their food. A higher concentration of CO2 in the atmosphere means that more is being dissolved into the ocean, causing ocean acidification. This, along with overfishing, is another problem for coral reefs, as it makes it harder for vital reef organisms to build their exoskeletons. Coral reefs are often described as “underwater rainforests” because of the vast numbers of species that rely on them. Their death is not only a tragedy for biodiversity, but could actually make global warming even worse, as they also produce oxygen. They are also important for tourism, and bring in billions of dollars in revenue in some places.


A bleached coral reef, which was once teeming with life and full of vibrant colours.

This Earth Day, there is a drive to cut down on plastic use; online tools let you calculate your plastic footprint, and guides to living plastic-free are widely available. Hopefully Earth Day 2018 will convince both governments and individuals to start making meaningful and much-needed changes to cut down on plastic pollution. However, with so many ecosystems being threatened by human activity, it is vital that we start to consider the wider effects of our lifestyle every day.


A few zero waste shops will soon be opening in Sheffield – one in our very own SU and one in Crookes.



Shoot for the Moon: Would the USA’s Cold War plan to blow it out of our night sky really work? Fiona McBride

In 1958 – the year after the Soviet Union’s Sputnik became the first object launched into space by humankind – the government of the USA began to work on a secret plan to assert its dominance on the stage of world power: by blowing up the moon. Known covertly as ‘Project A119’, the intention was to make the military might of the USA abundantly clear to all on earth.

 Of course, the first question this raises is: would such a show of force actually be possible? Though it may look small from down here, and is supposedly made of green cheese, the moon is actually a seventy trillion megaton rock located four hundred thousand kilometers away. That’s quite a big thing to blow up, and a significant distance to send explosives. The explosion would have to have enough energy to not only break the moon into pieces, but also send them far enough away from one another that their gravitational fields – the attractive forces that act between all objects – wouldn’t be able to pull them back together. Otherwise, the single lump of geological matter we call our moon would simply be replaced by a pile of lunar rubble. It is estimated that such an explosion would be equivalent to the detonation of thirty trillion megatons of TNT; given that the Tsar Bomba – the most powerful nuclear bomb ever built – had an explosive power of fifty megatons, blowing up the moon would require six hundred billion of these. Humanity has neither the uranium supplies to build such a bomb, nor the rocket technology to get it there.
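The bomb count above is easy to verify; here is the arithmetic as a short sketch, using only the figures quoted in the text:

```python
# Figures quoted in the text, in megatons of TNT
moon_explosion_mt = 30e12   # thirty trillion megatons needed to blow the moon apart
tsar_bomba_mt = 50          # yield of the Tsar Bomba, the most powerful bomb ever built

bombs_needed = moon_explosion_mt / tsar_bomba_mt
print(f"{bombs_needed:.0e}")  # 6e+11 – six hundred billion Tsar Bombas
```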

Other options include creating a ‘moon quake’ to split apart the internal structure of the rock; this would need to be equivalent to a 16.5 on the Richter scale. The most violent earthquake ever recorded read just 9.5 on the Richter scale, so it’s unlikely that such a quake could be artificially produced on the moon. Alternatively, the moon could be zapped with a giant laser; however, this would need to deliver instantaneously the same amount of energy as the sun outputs every six minutes. Humans don’t really have the resources to power such a thing.

It seems, therefore, that blowing up the moon to assert dominance over the space and nuclear spheres wasn’t really an option for the USA in 1958 – or even sixty years later – due to a lack of both technology and resources. However, another idea was considered: blowing a large hole in the moon, producing a giant explosion to demonstrate the might of the USA to the world and leaving behind a crater visible from earth as a permanent reminder. This, too, was dismissed in 1959; the reasons are not clear, but perhaps those in charge of the project realised how utterly ridiculous their own idea sounded.

But let’s just take a step back for a moment and imagine exploding the moon were possible: what would the consequences be here on earth? Would lumps of moonrock kill us all? What would life be like on a moonless planet?

So the moon has exploded. The first thing most humans notice is a big, bright cloud spreading out through the sky where the moon used to be: the light from the explosion illuminating the moon debris. Dust then covers the sky for a while, making daylight darker and air travel impossible for a few months. Our seas and lakes are still tidal – the sun also exerts a gravitational pull on the earth – but because the sun’s pull does not vary the way the moon’s did, there will be no spring or neap tides; instead the water will rise to one-quarter the height of a spring tide and return to the same lower level each day. Fragments of moon start to fall to earth; some burn up as they enter our atmosphere, while others hit the ground and wreak havoc where they land, though it is unlikely that this would be catastrophic for humanity, as they would move slowly in comparison to other astronomical objects that fall to earth, such as asteroids.

Once the dust clouds have cleared, the next noticeable thing is a lot more stars. The moon is by far the brightest object in the night sky, so with it out of the way, nighttime will be darker and the stars much brighter by comparison. One – or more – smaller ‘moon replacements’ may also appear in the sky, if the explosion leaves some larger chunks of rock as well as debris and dust. Of course, this debris and dust continues to rain down on the earth whenever a piece falls out of orbit.

Only after the majority of this debris has cleared – in perhaps a few thousand years – is the next major effect noticeable to humans: the earth will tip over. Gravitational interactions between the earth and the moon are what currently prevent this; without them, the earth will tip on its axis, causing the poles to melt and an ice age to occur every few thousand years on whichever part of the planet is furthest from the sun at that point.

So, although exploding the moon isn’t really possible – and certainly wasn’t in the 1950s – it wouldn’t have utterly catastrophic consequences for the earth, just bring significant change. However, as a show of force, it still seems somewhat excessive.


From Rulers of Countries to Rulers of Length – Chloe McCole

From the watch around your wrist to the speedometer in your car, our lives are filled with measuring devices. Many believe we can trace our love of measuring to an ancient obsession with the length of a certain body part – no, I don’t mean that one.

In Ancient Egypt the ‘cubit’ was an important measurement: the length of the arm, from elbow to outstretched fingertips. This simple measurement was used to realise the design and construction of arguably their greatest achievement, the Great Pyramids.

However, like us, the Ancient Egyptians weren’t all the same size or shape, so how did they cut all their stones to an accurate and consistent size? They managed this by standardising the length in the form of the Royal Cubit, a piece of black granite cut to a fixed length. This was used as a guide for the production of the wooden cubits used on building sites throughout Egypt.

Jump forward 4,000 years and head 3,000 kilometres west, to the people of France, who weren’t so much building pyramids as storming palaces. It’s the 18th century and the French Revolution is in full swing. While a certain monarch purportedly mocked the starving peasants with taunts of “let them eat cake”, the Academy of Sciences decided the time had come for a total overhaul of the measurement system: they wanted to be able to measure the exact amount of said brioche in standardised units.

Whilst the timing of this may seem a little strange, in a time of great confusion the development of the metric system provided much-needed order. France, like many other countries, had already defined units of measure; the problem was that even though some units shared a name across many countries, their magnitudes varied – sometimes even from town to town. Imagine Rotherham measuring things differently to Sheffield! So the Academy wanted to define a set of base units of measurement from which all other measurements could be derived. First, though, they had to agree on how to determine the unit for distance.

There were initially two front runners for solving this problem. The first, using pendulums, was dismissed due to subtle differences in the force of gravity across the world affecting the pendulum; the second involved a much trickier proposition. Pierre Méchain and Jean Baptiste Joseph Delambre were assigned the task of working out the distance from the Equator to the North Pole along the invisible meridian through Paris. The Academy then decided that the base unit of length would be set as one ten-millionth of this calculated distance. This unit was to be known as a metre.
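The resulting definition is simple to check. Taking the quarter-meridian at its nominal 10,000 km (modern surveys put it nearer 10,002 km, so the original metre came out very slightly short):

```python
# One metre = one ten-millionth of the Equator-to-pole distance through Paris
quarter_meridian_m = 10_000 * 1_000   # nominal quarter-meridian, in metres
metre = quarter_meridian_m / 10_000_000
print(metre)  # 1.0
```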

As you can imagine, this assignment took a while, and the Academy quickly grew impatient. So, whilst these calculations were going ahead, they had a number of platinum rods commissioned; just like the Royal Cubit, these bars were used as the calibration standard for all measurements.

It actually took six long years for Méchain and Delambre to finally report their findings and the platinum rod that most closely corresponded to their resulting value gained a spot in the National Archives.
The Academy didn’t stop there: a total of seven base Système International d’Unités (SI units) have been established since the French Revolution. Up next was the kilogram or, as it was known at the time, le grave. Using the newly defined metre, a base unit of mass was determined, set as the mass of one decimetre cubed of water at a temperature of 4˚C. Engineers then created a platinum weight corresponding to this mass and sent it to sit next to the metre rod in the National Archives.

Three of the remaining five units define the everyday quantities of electrical current (ampere), temperature (kelvin) and time (second), while the other two are more specialist and refer to amount of substance (mole) and luminous intensity (candela).

Interestingly, the kilogram is the only base unit still defined by the mass of a physical object, although even this is set to change, with a push to define units in terms of measurable natural constants rather than the properties of manufactured objects. For example, the metre is now defined as the length of the path travelled by light in a vacuum during a time interval of 1/299,792,458 of a second. That second is no longer defined by the archaic value set as a fraction of a day, but rather as the time taken for 9,192,631,770 cycles of the radiation emitted when a caesium atom transitions between two defined energy states – simple, right?
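The two modern definitions slot together neatly; here is a small sketch of the arithmetic:

```python
c = 299_792_458             # speed of light in a vacuum, m/s (exact by definition)
dt = 1 / 299_792_458        # the time interval in the metre's definition, in seconds

metre = c * dt              # light covers exactly one metre in that interval
caesium_hz = 9_192_631_770  # caesium cycles per second, which define the second

print(round(metre, 12))           # 1.0
print(round(caesium_hz * dt, 2))  # ~30.66 caesium cycles while light crosses a metre
```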

Face Blindness – what is it (and does it actually exist)? Gege Li

What’s the first physical thing you notice about someone when you meet them for the first time? Is it whether they’re male or female? Maybe it’s their eye colour or freckles. Their unusually-shaped nose, even. Whatever it may be, the combination of these unique traits is what we use to recognise the important people in our life, as well as all the others we know and encounter in between.

So imagine what it would be like if you couldn’t use characteristics like these to distinguish between your mum and your best friend, or your old high school teacher and your postman. That’s exactly what sufferers of face blindness – also known by its fancy name, ‘prosopagnosia’ – must endure for most, if not all, of their life (people usually have it from birth).

The inability to recognise people by their face alone affects approximately two in one hundred people in the UK. Sufferers may fail to judge a person’s age, gender or emotional expression from their face or spot similarities and differences between two different faces.

Unsurprisingly, this can have a profound impact on the behaviour and even mental health of sufferers. Although many cope by using alternative strategies such as a person’s distinctive voice, hairstyle or way of walking as identifiers, for others this doesn’t always work, especially if their acquaintance has recently changed their appearance. What are your options when even secondary clues become impossible to spot?

For some, the condition will cause them to deliberately avoid social interactions altogether, which can lead to relationship and career problems, bouts of depression and, in extreme cases, the development of social anxiety disorder. The latter may prevent a person from even leaving their house for overwhelming fear of social situations and embarrassment.

And to make matters worse, it isn’t only other people that someone with face blindness might not recognise. Objects, including places, cars and animals, also present difficulties, particularly for navigation and memory – even their own face staring back at them in the mirror could be an alien sight.

However, for anyone who’s an avid watcher of Arrested Development, you might find yourself questioning the legitimacy or even existence of face blindness as it’s played out through the eccentric and often ridiculous character of Marky Bark. On the show, Marky’s affliction with face blindness, exaggerated to the point it seems at times almost unbelievable, doesn’t appear to cause him half the inconvenience or trauma that it can in reality. Though viewers are made aware of the condition and its characteristics, does the show’s overtly comedic context promote public health misconceptions while hindering an educational message?

It is important to establish that face blindness really does exist and is much less a quirky trait than a cognitive impairment with potentially serious social and emotional consequences. But what causes it exactly? Although it’s true that it can develop following brain damage (from a particularly nasty knock to the head, for example), it has become increasingly clear that ‘developmental prosopagnosia,’ where people simply don’t develop facial recognition, is the most common cause. In fact, as many as one in fifty people might have it, equating to a potential one and a half million sufferers in the UK.

What’s more, many sufferers report a parent or sibling experiencing the same kinds of difficulties, so it’s likely that genetics play a role in the occurrence of face blindness within families.

With all the trouble that face blindness can cause at the expense of someone’s livelihood and even health, it might be reassuring to point out that there are ways to determine whether you might suffer from the condition.

Often, doctors will use computer-based tests that require people to memorise and identify a set of faces, including those of celebrities. Most recently, in 2015, a partnership between several doctors and universities in London developed a new questionnaire that can aid both the diagnosis of face blindness and the measurement of its severity.

The inevitable bad news, however, is that there isn’t currently a way to treat face blindness directly other than to help sufferers improve their facial recognition with training and rehabilitation programmes.

There’s still hope for the future though – the prosopagnosia research currently taking place at Bournemouth University has also hinted at using pharmaceuticals to temporarily intervene in face blindness, with some success. Once these techniques have been developed further, a successful cure or therapy might not be too far off.

In the meantime though, if you’re able to recognise a familiar face, why not consider taking the time to appreciate the people close to you (especially their mug) just that little bit more? Don’t take face perception for granted – you’re luckier than you think…



The Science of a Happy Marriage- Ciara Barrett

“Marriage is like a deck of cards: you start with two hearts and a diamond, and by the end you wish you had a club and a spade.” If my future spouse ever says that about me, you can bet they’ll be right. Recent statistics show that 50% of marriages in the United States end in divorce, while the figure for all first marriages across the rest of the world is 41%. Couples without children are 40% less likely to divorce, and the average age at which people divorce is 30. These are scary statistics, and as someone who doesn’t believe in soulmates, they only reinforce my personal scepticism about marriage.

Marriage is said to come in five stages which most couples will experience: romance, disillusionment, power struggle, awakening, then long-term marriage. Romance is the fluttery-stomach, heart-eyes, blissfully ignorant stage – the honeymoon phase. Couples have high pheromone levels and experience increased oxytocin, a hormone which lets them ignore each other’s irritating behavioural traits. Disillusionment is when this fades and they see each other for who they really are: the hormones wear off, they want to spend more time apart, and their flaws become visible. Couples who got married in the romance phase are likely to have second thoughts now. The misleadingly named power struggle is when couples try to revert to who they were during the romance phase to try and “fix” the relationship. They want to spend more time with friends and family but may become jealous of their partner doing this too. The awakening phase is the resolution of this: the couple realise they need to give each other space, accept each other flaws and all, and reclaim some of their individuality. This leads into the long-term marriage phase, where they resolve conflicts easily and are very comfortable with each other.

During these stages, there are little things you can do to keep the day to day marriage alive and studies show there are some sure-fire handy tips for a happy relationship.

The number one tip might be surprising: how do you react to your partner’s good news? If they just got a promotion, or that package they ordered arrived and they’re delighted, then mirroring your significant other’s reactions to a positive situation has more of an impact on the relationship than being a shoulder to cry on during a tough time. Of course, both are important, but your response to a positive situation can make them feel so much better and it shows you care about their success as well as when they’re upset.

Next is the 5:1 rule: for every bad interaction you have, there must be 5 good ones to balance it out. Every time you fight over the remote or make a snide comment about their cooking, there needs to be at least 5 shared moments of laughter, dinner dates or meaningful compliments to make up for it.

An often forgotten piece of advice is to stay close to family and friends. Don’t rely on your significant other for all your happy moments; they should be a big part of your life, not all of it. Remember to see your other circles and get emotional fulfilment from them, since they were probably part of your life before your partner was.

Keep it exciting! This is both in and out of the sex department (which should be happening regularly anyway); you can keep going on dinner dates and buying flowers and having movie nights in because spending time together should always be fun. When it stops being fun then you need to seriously consider why you’re in the relationship and how you can fix it. Remember to talk to them and don’t let any of your happy memories become tainted. Couples who reminisce about shared moments of laughter have more emotional satisfaction than those who just have good experiences. So laugh with them and be stupid and funny and don’t let it get boring ever.

On the other hand, there are some serious red flags to avoid: criticism, defensiveness, contempt and stonewalling. These arise when your partner attacks you as a person rather than the individual mistake you made, whether during an argument or in daily life. Watch out for eye-rolling, accusatory use of the word “you”, and the silent treatment. If and when this happens it is important to talk it out, and be prepared to take a break and move through the stages mentioned above. Remember that everyone has bad days; don’t attack them back, even if they’re hurting you.

Any marriage sceptic will say all this is easier said than done, which isn’t wrong, but couples fortunate enough to want to get married will want to make it last – and at least now they can use the science to their advantage. Love is but a chemical reaction, after all.