Clearing Up Our Space Junk – Matt Jones

Over the last sixty years or so, space exploration has been at the forefront of the public’s imagination, and our desire to learn more about the universe we live in has driven the advancement of space technology. This has resulted in many test launches and experiments, during which a whole array of spacecraft and satellites have been sent into space. Consequently, we have slowly been contributing to an ever-growing jumble of junk that is now orbiting Earth. Although it’s out of sight, the University of Surrey are working to make sure that it is not kept out of mind. Later this year, they are launching a spacecraft on a mission called RemoveDebris, which will hopefully do exactly what it says on the tin before burning up in the Earth’s atmosphere.

Broadly speaking, the term “space junk” refers to any man-made object in space that no longer serves a useful purpose. This definition encompasses objects such as used boosters, dead satellites and even Elon Musk’s Tesla Roadster, the sports car owned by the CEO of Tesla and SpaceX, which was used as a dummy payload for the test flight of the SpaceX Falcon Heavy earlier this month. Even though the car is technically space junk, it is following an orbit around the sun and so gives little cause for concern. Unfortunately, the same cannot be said for the estimated 7,500 tonnes of junk that the European Space Agency says is orbiting the Earth.

Having so much debris orbiting the Earth is a problem because even the smallest objects can cause a lot of damage. In 2016, a fleck of paint chipped a window on the International Space Station (ISS), which regularly has to manoeuvre out of the way of bigger pieces of junk. A piece of junk just 10 centimetres across could devastate a satellite, with detrimental effects on communication and weather forecasting. Clearing up our cluttered low-orbit environment is therefore a shared responsibility.
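
To get a feel for how destructive even a small fragment can be, it helps to put numbers on the kinetic energy involved. The sketch below is a back-of-envelope estimate only; the 1 kg mass and 10 km/s closing speed are illustrative assumptions, not figures from any specific impact.

```python
# Back-of-envelope estimate of debris impact energy, E = 1/2 m v^2.
# The fragment mass and closing speed are assumed, illustrative values.

TNT_JOULES_PER_KG = 4.184e6  # energy released by 1 kg of TNT

def impact_energy(mass_kg, relative_speed_m_s):
    """Kinetic energy of a debris fragment in joules."""
    return 0.5 * mass_kg * relative_speed_m_s ** 2

energy = impact_energy(mass_kg=1.0, relative_speed_m_s=10_000)
print(f"{energy:.1e} J, roughly {energy / TNT_JOULES_PER_KG:.0f} kg of TNT")
# ~5.0e7 J, comparable to about 12 kg of TNT
```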

Another dangerous aspect of the debris is the potential for a cascading collision effect known as the Kessler Syndrome. This is a scenario in which the collision of two large objects triggers a self-sustaining chain reaction of further collisions, each producing more debris. The large inactive satellite Envisat, which is owned by the European Space Agency, has been listed as a potential trigger for a Kessler event. It weighs roughly 8 tonnes and passes within 200 metres of two other pieces of catalogued space junk every year.

Research into ways of clearing up our space junk is therefore of immediate relevance and we will hopefully be able to learn a lot from the RemoveDebris mission, which is being led by the Surrey Space Centre at the University of Surrey.

The small RemoveDebris spacecraft – the size of a washing machine – was shipped to the Kennedy Space Center in Florida in December. It will be launched into space later this year on an ISS resupply mission. Once at the ISS it will be unpacked by astronauts and deployed on its mission to trial techniques by which debris can be collected and removed from orbit.

In the first scheduled experiment, a cubesat (a miniaturised satellite used for space research) will be ejected from the spacecraft. A net will then be fired from the spacecraft to ensnare the cubesat. The development of this kind of capture technique could lead to space junk being hauled out of orbit by spacecraft in the future, with the heat of re-entry into the Earth’s atmosphere causing the captured junk to burn up.

The second capture experiment is due to test a harpoon system. In this experiment, a target made out of the same materials as satellite panels will be extended out from the spacecraft, and a harpoon will then be fired at it. If a successful hit is made, this will be the first harpoon capture in orbit.

The third experiment will test vision-based navigation. In this experiment, another cubesat will be ejected from the spacecraft. Cameras on board the spacecraft will collect data, which will then be sent to Earth and processed on the ground. If successful, this will validate the use of vision-based navigation equipment, and ground-based image processing, in the context of active debris removal.

Finally, at the end of the mission, the spacecraft will deploy a large drag sail. The sail is made out of a reflective material and uses radiation pressure exerted by photons of light from the sun to produce thrust in a phenomenon known as solar sailing. The sail will cause the spacecraft to gently de-orbit before it violently burns up upon re-entry into the Earth’s atmosphere so that, crucially, it doesn’t become space junk itself.
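
As a sense of scale for the radiation pressure described above, the sketch below estimates the force sunlight exerts on a small reflective sail. The 10 m² area and perfect reflectivity are illustrative assumptions, not RemoveDebris specifications.

```python
# Rough estimate of solar radiation pressure thrust on a reflective sail.
# Sail area and perfect reflectivity are illustrative assumptions.

SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight intensity near Earth
SPEED_OF_LIGHT = 3.0e8    # m/s

def radiation_pressure_force(area_m2, reflectivity=1.0):
    """Force on a sail facing the sun; perfect reflection doubles the push."""
    pressure = (1 + reflectivity) * SOLAR_CONSTANT / SPEED_OF_LIGHT  # N/m^2
    return pressure * area_m2

print(f"{radiation_pressure_force(10.0):.1e} N")  # ~9e-5 N for a 10 m^2 sail
```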

Whether the RemoveDebris mission is a success or not, it is important that we keep the ball rolling into the future. Clearing up our skies is a responsibility shared by governments and space agencies, as the consequences of an essential satellite being damaged, or of space missions being grounded, will affect us all. Space exploration has an inexhaustible ability to inspire and thrill us, so let us not call time early on this journey simply because of our inability to look after the planet we live on.

 

First Primates Cloned – Keerthana Balamurugan

On the 5th of December 2017, a pair of identical macaques, a type of monkey, was born in China through somatic cell nuclear transfer (SCNT). This is the same method that was used to create Dolly the sheep, the first animal successfully cloned from an adult cell. When news of the monkeys’ successful birth came out, much of the world was in awe of the possibilities that could arise from cloning animals: using them to model diseases, to make stem cells and to produce drugs. Others were sceptical: what could this mean for the future of human cloning, and for gene manipulation aimed at creating a race of perfect humans? Ethical questions are also raised as to how far humans should be involved in the production of new life.

Let us start off with SCNT, or somatic cell nuclear transfer. The process starts with a cell containing a donor nucleus from female monkey A, which is fused with an egg from female monkey B that has been stripped of its own nucleus. The resulting embryo is placed in the uterus of monkey C, and the infant that is born is therefore a clone of monkey A. This is only a small part of a long and complicated procedure: enzymes are needed to return the fused cell to an early embryonic state, from which it can differentiate into every cell type in the body.

This new version of SCNT is slightly different from the one used for Dolly the sheep. When Dolly was created, the scientists wanted to clone cells taken from the udder of a pregnant sheep. The researchers starved the cells for a week in order to stop them from dividing. From there they followed the steps described above, with one small difference: gentle pulses of electricity were applied to fuse the egg and the new nucleus together. After the successful implantation of the embryo into the uterus of a surrogate, Dolly was born, witnessed by only a select few as the project was conducted in great secrecy. Sadly, Dolly was euthanised in 2003 because she had lung disease. She lived for six years, but most sheep live twice as long, so questions were raised as to what went wrong in the cloning process to make her age so quickly.

So how easy or hard was it to produce the pair of identical macaques? The researchers in China went through 79 embryos implanted into 21 surrogates to get two live births. That set of experiments used foetal cells; another set used adult cells and required 181 embryos and 42 surrogates. Again, two babies were born, but they died shortly afterwards because of the complications of reprogramming the genes of adult cells.

Half the world was in awe when the two baby monkeys were born, while the other half raised questions over the ethics of the situation. The researchers who conducted the experiment in China have clearly stated that they have no intention of moving on to cloning humans, but the public remains sceptical that everyone else will share that restraint. In addition, as seen in Dolly’s case, cloned animals tend to have shorter lifespans, so are we being inhumane by cloning animals knowing they are going to die sooner? The debate goes on.

So what purpose does cloning primates serve? One possible use is to explore how genes behave in conditions such as Alzheimer’s and Parkinson’s disease. Both diseases are still poorly understood, so by manipulating genes in clones we can see how they interact with drugs and how they behave in the body, building a better understanding of the two conditions. Some researchers are also looking into cloning as a way to create stem cells that avoid the danger of immune rejection. Other purposes could include reviving endangered or extinct species, reproducing a deceased pet, cloning livestock, and drug production, where scientists insert into the DNA of cells a gene that codes for a drug or a vaccine, which the animal could then produce, for example, in its milk.

There is a lot of controversy and concern over cloning, but that should not sway people away from the myriad of useful possibilities, especially when it comes to understanding diseases and helping those who suffer from them. The very fact that the pair of macaques could be cloned at all shows how much good could come out of this research. Who knows what we might come up with in the future?

The Science of Spice – Ciara Barrett

We, as Brits, love spicy food. Studies show that, as a nation, Chinese and Indian are our all-round favourite cuisines, ranking even ahead of our own classics like fish and chips or the humble traditional roast. This leads us to ask why so many people who did not grow up eating spicy, rich food love it with such passion. These foods originated in other parts of the world, and even the growing influence of multicultural communities doesn’t fully explain why our food choices have become so diverse. Another question is why there is such a big divide between chilli lovers and haters. There are those who have stuck to the British stereotype of preferring milder food, which is perfectly fine, but what determines this difference?

Diving into the chemistry, capsaicin is the molecule most responsible for giving chillies their heat. It belongs to the capsaicinoids, a class of compounds found in almost all hot peppers. Capsaicin stimulates the pain receptors in your mouth, which explains the burning felt when eating spicy food; this is the same receptor that responds when you touch hot objects by transmitting pain signals to the brain. (As a side note, capsaicin is hydrophobic, meaning it does not dissolve in water, which explains why drinking water really won’t help wash it down when you accidentally put too much chilli sauce on your food.)

There are a number of relatively untested theories as to why we love spicy food (spicy food here meaning food containing chillies and not food with actual spices like ginger, paprika, etc.) and why some people can tolerate higher levels of heat than others who prefer none at all.

One possible explanation is that spice tolerance is genetic: some people have less responsive receptors, which gives them a higher tolerance. In the same way that some people have a naturally high pain tolerance (read: fire eaters), people with a high spice tolerance feel less pain from spicy food and can eat hotter chillies to get the same effect as someone with more sensitive receptors. Similarly, a fondness for strong flavours has been shown to have a genetic component in some people, so some people really are born with it.

The next theory is that tolerance of spice is an environmentally influenced trait: people who eat spicy food regularly and/or from a young age gain a greater tolerance and don’t feel as much burn from it. It was once thought that people who ate lots of hot chillies from a young age had damaged the nerve endings in their taste buds, but taste buds only have a lifespan of 10-14 days, so, as with burning your tongue on a hot drink, any damage would be temporary even if chillies did burn off your taste buds. With that myth busted, the more likely explanation is that eating spice from a young age simply accustoms you to a certain level of heat, so you can tolerate hotter food later in life. This redefines your threshold of what counts as ‘very spicy’ relative to others, and is known as desensitisation.

The final, least tested theory is that for some people, eating spicy food is a “thrill-seeking activity”. As with touching a hot surface, the pain receptors warn the brain that this food is possibly dangerous, but the logical side of the brain knows it isn’t. It is a similar situation to coming down a rollercoaster: once the body realises this seemingly dangerous activity isn’t dangerous at all, it delivers a thrilling rush of endorphins.

These theories all aim to explain why some people go for the Extra Hot option at Nando’s and how this is possible in an infamously “bland” society like Britain. They’re all relatively uncharted so for now it will remain a mystery as to why we love spicy food; maybe it’s because it just feels like a rollercoaster.

Denying the evidence – Why do people stick to their beliefs in the face of so much evidence? Emma Hazelwood

For almost twenty years it has been accepted in the scientific community that climate change is a result of human activity. However, a study in 2016 found that less than half of U.S. adults believed that global climate change is due to human activity. In 2012, Trump tweeted that “The concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive”. With overwhelming evidence to the contrary, how can people continue to believe that global warming doesn’t exist?

Once people believe an argument, it is very hard to persuade them otherwise, even if they are told that the information they based their opinion on is incorrect. In a study conducted at Stanford University, two groups of students were given information about a firefighter named Frank. One group were told that Frank was a good firefighter; the other that Frank was a poor firefighter. Participants were then told that the information they’d been given was fake. Afterwards, they were asked to give their own opinion on how Frank would respond to a high-risk situation. Those who had initially been told that Frank was a good firefighter thought that he would stay away from risks, but those who had been told that he was a poor firefighter thought that he would take risks. This study shows that, even though they were then told it was fabricated, the initial information influenced participants’ opinions.

Confirmation bias is the tendency to believe facts which support an opinion we already hold, rather than evidence to the contrary. A study at Stanford in 1979 involved two groups of students: one group was for capital punishment, the other against. Both groups were shown two fabricated articles, one containing data that supported capital punishment and one containing data that opposed it (the statistics were designed to be equally strong in each article). Both groups rated the source which supported their argument as more reliable. Furthermore, when asked to express their opinions on capital punishment after the study, both groups supported their standpoint even more strongly than before. This demonstrates the human tendency to selectively believe what we want to be true.

It is believed that humans act this way because it was beneficial in early hunter-gatherer societies. Confirmation bias not only encouraged humans to collaborate, but being seen to be right was also important for social status. One theory for why seemingly rational humans continue to think irrationally is that we get a rush of dopamine when we see evidence that validates our opinion.

However, early human societies were not teeming with “fake news” and fabricated studies in the way ours is now. It is increasingly clear how dangerous a public swayed by confirmation bias can be to modern society.

We also live under an illusion of knowledge, thinking we know more than we actually do. For instance, one study told people about the (fictitious) discovery of a rock that glowed. Participants who were told that the scientists who discovered it understood why it glowed claimed to know more about the rock than those who were told the scientists did not, even though neither group was given any information about why the rock glowed. This phenomenon of people thinking they understand more than they do is common, and has actually been advantageous for scientific progress: as scientists, we do not need to understand every scientific discovery there has ever been – we rely on the knowledge of our ancestors and those around us.

Humans are programmed to be influenced by information which they are then told is fake, and to think of sources which support their pre-existing opinion as more reliable than those which question it. However, this can be dangerous in areas such as politics. For example, if people around an individual claim to know why Brexit would be economically beneficial to the country, then even when presented with evidence to the contrary the individual is less likely to believe it. Likewise, if a person believes that global warming is a conspiracy, they are more likely to believe Trump when he says it was created by the Chinese than ecologists who say we are pushing our planet to critical levels. In a world where we are bombarded with clickbait and fake news, it is more important than ever to think rationally and critically about every piece of information.

The Teenage Brain – Charlie Delilkan

We’ve all been there. “I’m leaving home and I’m never coming back!” “It’s not just a phase, Mum.” Slammed doors. Smashed plates. My Chemical Romance t-shirts and “bold” eyeliner. If you haven’t guessed already, I’m referring to those golden teenage years. Whilst we may have given our parents a hard time, we may not be completely responsible for that increased phone bill.

When we’re born, our brains aren’t fully formed, so the first few years of our existence involve an expansion of the connections – synapses – between brain cells. By the time you are six years old, each of the hundred billion or so brain cells you were born with has formed roughly 10,000 connections with its neighbours!

But during our teenage years these numerous connections are trimmed down: the brain decides which connections are important enough to keep, and which can be let go, depending on how frequently each neural link is used. This process is called synaptic pruning. It actually continues well after we stop calling people “teenagers” – some researchers believe it only ceases in our mid-twenties, sometimes later! But sometimes pruning can go wrong, and important connections are lost, which may contribute to psychiatric disorders such as schizophrenia.

The connections that are kept are then strengthened through a process called myelination, in which the axons carrying their signals are wrapped in a fatty sheath that helps them transmit signals more quickly. That is why the teenage years are so critical to your future development: skills and habits laid down at this point are likely to stay in the long run.

Interestingly, the prefrontal cortex is the last part of the brain to fully mature (or finish pruning). However, this is the part that allows us to behave like adults – it controls our emotions and helps us to empathise with others. If your prefrontal cortex isn’t functioning fully, you therefore tend to be impulsive and insensitive to other people’s feelings. Sound familiar? Don’t worry though – as teenagers mature, the prefrontal cortex is engaged far more when they make decisions, showing that they start to consider others when making choices.

What about the stereotype that teenagers are “hormonal”? Well, stereotypes usually contain some truth! Teenagers are hypersensitive to pleasure: release of the reward neurotransmitter dopamine is at its peak during adolescence. Any action that causes dopamine release is positively reinforced, and the actions that cause the most dopamine release are usually the ones associated with a stereotypical teenager – reckless driving, drug taking and other risk taking. Or, in my case, 7 hours of Dungeons and Dragons on a Friday night – please don’t judge. This reward system is also closely tied to the brain’s social network, which uses oxytocin, a neurotransmitter that strengthens bonding between mammals. This causes teenagers to strongly associate social interactions with happiness, and so to constantly seek out social situations. It explains why we usually see a shift from children being closest to their parents to teenagers making friends their emotional centre.

So the next time the teenager in your life is threatening to throw a chair at you, just remember that parts of their brain are literally being destroyed. Cut them some slack, bro.

What Causes Alzheimer’s? Emma Pallen

Alzheimer’s disease is a chronic neurodegenerative disorder with a wide range of emotional, behavioural and cognitive symptoms. It is the most common cause of dementia, accounting for around 60-70% of cases, and is primarily associated with older age: around 6% of the global population over 65 is affected, with risk increasing with age. This is especially concerning given our ageing population; by 2040, it is expected that there will be 81.1 million people with Alzheimer’s worldwide. It is also one of the costliest conditions to society, costing the US $259 billion in 2017.

Symptoms of Alzheimer’s can be grouped into three categories. Perhaps the most recognisable is cognitive dysfunction, which includes symptoms such as memory loss, difficulties with language and executive dysfunction. Another category is disruption to activities of daily living (ADLs): initially difficulty performing complex tasks such as driving and shopping, later developing into a need for assistance with basic tasks such as dressing oneself and eating. A third category relates to emotional and behavioural disturbances, ranging from depression and agitation in the earlier stages of the disease to hallucinations and delusions as it progresses.

What causes Alzheimer’s Disease?

We know that the symptoms of Alzheimer’s are caused by a gross loss of brain volume, also known as atrophy, across a number of regions, and that this loss progresses as the disease develops. As brain tissue is lost, symptoms associated with the function of the lost area emerge; for example, personality changes develop as tissue is lost in the prefrontal cortex.

We also know that this brain atrophy is caused by a loss of neurons and synapses in the brain. However, what we don’t know is exactly why this neuronal loss occurs. One way to attempt to solve this question is to compare the brains of Alzheimer’s patients to normally ageing brains. This has led to the observation that the brains of Alzheimer’s patients have two distinct biochemical markers: amyloid plaques and neurofibrillary tangles, which are both abnormal bundles of proteins. While these features are often present to some degree in normal ageing and are not always observed in Alzheimer’s, they are often more associated with specific brain regions, such as the temporal lobe, in Alzheimer’s than in regular ageing. There are a number of theories as to how these biochemical markers may be linked to neuronal and synaptic loss; however, none is fully conclusive.

One such theory is the amyloid cascade hypothesis. This suggests that amyloid plaques, which are made up of a protein known as amyloid beta, are the primary cause of the disease, and that all other pathological features of Alzheimer’s follow as a consequence. The accumulation of amyloid beta into plaques is thought to disrupt calcium homeostasis in cells, which can lead to excitotoxicity and ultimately cell death. Evidence in support of this theory comes from Down’s Syndrome, a condition in which almost all sufferers display some degree of Alzheimer’s disease by age 40. Down’s Syndrome is caused by an extra copy of chromosome 21, which is also the location of the gene coding for amyloid precursor protein (APP), the protein from which amyloid beta is formed.

However, if the build-up of amyloid plaques is the cause of cell death in Alzheimer’s disease, it stands to reason that removing these plaques should at the very least stop the progression of the disease, and this has not been found to be the case. Furthermore, whilst transgenic mice that over-express APP do end up with more amyloid beta and amyloid plaques, this does not lead to other features of the disease such as neurofibrillary tangles and, most importantly, produces no neuronal loss. This suggests that there may be some other cause for the neuronal loss seen in Alzheimer’s.

Another theory about the cause of neuronal loss in Alzheimer’s focuses on hyperphosphorylated tau, a protein that is the main component of neurofibrillary tangles. The tau hypothesis suggests that hyperphosphorylation of tau leads to the formation of these tangles, which can impair axonal transport, a potential cause of cell death. This idea is supported by the fact that the number of neurofibrillary tangles is linked to the degree of observed cognitive impairment. Additionally, the pattern in which tangles spread through the brain is similar to the known progression of atrophy in Alzheimer’s. Dysfunction of tau is also known to be linked to another type of dementia, frontotemporal dementia, so it seems plausible that similar mechanisms may be at work in Alzheimer’s.

Whilst these are two of the most prominent explanations for neuronal death in Alzheimer’s, there are a multitude of other potential explanations, and it is likely that no single one will capture all facets of the disease. Rather, there is probably a complex interplay of biochemical reactions along multiple pathways that leads to the clinical features we see in Alzheimer’s disease, modulated by other risk factors such as genetics, or environmental factors such as smoking or head trauma.

A, T, C, G… and more? Adding Letters to Life’s Genetic Code – Alex Marks

Scientists have created bacteria that carry two extra synthetic ‘letters’ of the genetic code.

The genetic code is made from four bases, more commonly known as the ‘letters’ A, T, C and G. It is the order of these ‘letters’ that creates the genetic blueprint for all life: DNA. Scientists have now modified the bacterium E. coli so that it can carry two unnatural ‘letters’ in its DNA.

By adding the extra two ‘letters’, which are named X and Y, scientists have increased the number of combinations that the ‘letters’ can make. These additional combinations could potentially increase the number of biological functions the bacterium can perform. The international team of scientists hopes that this could lead to the creation of new classes of drugs to treat diseases.

In a standard cell, the four ‘letters’ of the genetic code tell the cell how to make proteins. Proteins are responsible for almost every function and structure within a cell. They repair and maintain the cell; they transport atoms and small molecules; and they make up an important part of your immune system.

By expanding the genetic alphabet from four to six ‘letters’, the potential number of proteins that could be synthesised dramatically increases, allowing for semisynthetic organisms with new qualities not found anywhere in nature.
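
A quick way to appreciate the increase is to count codons – the three-letter ‘words’ a cell reads when building proteins. The sketch below simply counts combinations; it says nothing about which of the extra codons a semisynthetic cell could actually put to use.

```python
# Counting three-letter codons with the natural four-letter alphabet versus
# a six-letter alphabet that includes the synthetic bases X and Y.
from itertools import product

natural = "ATCG"
expanded = natural + "XY"  # X and Y stand in for the two synthetic 'letters'

natural_codons = [''.join(c) for c in product(natural, repeat=3)]
expanded_codons = [''.join(c) for c in product(expanded, repeat=3)]

print(len(natural_codons))   # 64 possible codons with four letters
print(len(expanded_codons))  # 216 possible codons with six letters
```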

It had already been shown that semisynthetic organisms could be created. However, the ones that had been made were slow to replicate and regularly lost their unnatural ‘letters’. The new study has “made this semisynthetic organism more life-like,” according to Prof Romesberg, senior author of the study.

By modifying the existing version of the genetic ‘letter’ Y, the team created a semisynthetic organism that could hold on to the unnatural ‘letters’ X and Y for 60 generations. The scientists believe that the bacterium will keep the letters indefinitely, meaning the DNA remains stable even with the extra ‘letters’ in it.

“Your genome isn’t just stable for a day,” said Prof Romesberg. “Your genome has to be stable for the scale of your lifetime. If the semisynthetic organism is going to really be an organism, it has to be able to stably maintain that information.”

They managed to keep the DNA stable by destroying the bacteria that lost the unnatural ‘letters’. Using the CRISPR-Cas9 genome-editing tool, the scientists could check whether the bacteria had retained X and Y. This tool can read specific parts of the DNA and can also add tags; if a bacterium had not kept X and Y, CRISPR-Cas9 marked it for destruction.

Because the unstable bacteria were destroyed, only the stable bacteria could go on to replicate, increasing the chance that each new generation retained the synthetic letters.
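
In spirit, that selection step works like a filter over the population: only cells whose DNA still contains X and Y are allowed to keep replicating. The toy sketch below illustrates the logic only – the sequences are invented, and the real screening was done with CRISPR-Cas9 inside living cells, not by string matching.

```python
# Toy illustration of the selection logic: keep only colonies whose DNA
# still contains both synthetic bases. Sequences here are invented examples.

colonies = {
    "colony_1": "ATCGXTACGYAT",  # retained X and Y
    "colony_2": "ATCGATACGAAT",  # lost the synthetic letters
    "colony_3": "ATXGCTYACGAT",  # retained X and Y
}

def retains_synthetic_bases(sequence):
    """True if the sequence still contains both unnatural 'letters'."""
    return "X" in sequence and "Y" in sequence

survivors = {name: seq for name, seq in colonies.items()
             if retains_synthetic_bases(seq)}
print(sorted(survivors))  # ['colony_1', 'colony_3']
```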

“This science suggests that all of life’s processes can be subject to manipulation,” said Prof Romesberg.

Being able to manipulate processes within cells will help us understand those processes and might help cure diseases.

Are Puppy Dog Eyes A Real Thing? Emily Farrell

I went to a wedding this weekend and met a family friend’s dog. When she was happy, as well as wagging her tail, she pulled a face that showed her teeth. This was definitely not an aggressive face; it was far too derpy for that. Her owners claimed that she was smiling.

She made this face when she was happy, but was it an innate action, not under her control, or was it a deliberate way of telling us that she was enjoying being fussed? Facial expressions in non-human animals are hard to interpret without anthropomorphising, but researchers have been working on better understanding what our pets are telling us.

They found that dogs make “puppy eyes” in order to communicate with their humans. It doesn’t matter whether or not the people are holding food; all that matters is that we are watching them, and they will raise their eyebrows to make their eyes look bigger and sadder.

Wolves and dogs make some facial expressions, such as snarling as a precursor to biting, whether they are in the company of humans or not. But it is currently unknown whether this raised-eyebrow look is used among themselves or reserved just for us. If it is only used for our benefit, then it means that dogs have adapted to communicate effectively with us, something which may have occurred sometime after domestication around 14,000 years ago.

Humans are biologically predisposed to find things with big eyes cute: big eyes make a face look younger and more vulnerable, so we want to look after them and smush their little faces. Dogs tap into this weakness when they raise their eyebrows, making their eyes seem bigger and more expressive.

It is well established that dogs are more excitable when food is near, but the fact that they respond the same way to us with or without food led researchers to suspect that these faces are not a purely innate, emotional response.

Dogs seem to do this only as a way of connecting. They don’t actually feel sad, which is what a lot of owners think (much as people think their dogs or cats look guilty when they’ve pooped in the shoe of an unsuspecting owner; that look is more likely a way of appeasing the owner and acknowledging the deed than a display of genuine guilt). In the same way, puppy dog eyes don’t reflect actual sadness: dogs have simply learnt to make this signal because it creates a desirable reaction in the closest available human.

This is a good way of communicating between the species and was previously thought to only occur in apes. Apes and other primates are well known for making faces to communicate with each other, but they have also been found to use a selection of these to convey information to humans.

Dogs are one of the only animals that respond to and follow a human’s gaze, implying they can understand what people are trying to communicate. Horses can also do this to an extent, which suggests it is a by-product of domestication and a willingness to understand.

So there you go. Your dog is probably trying to communicate with you, just don’t be fooled into thinking that they’re sad and give them your last chicken nugget.

Natural Cycles – Rhiannon Lyon

Contraception can be a pain. From the long list of side-effects associated with hormonal pills, to the painful and invasive nature of implants and IUDs, women put up with a lot to avoid getting pregnant. And with the search for a male contraceptive pill that lacks undesirable side-effects (the type that women have put up with for decades) still unfruitful, things look set to stay this way for a while.

Or do they? As the first and only app to become certified as a contraceptive in Europe, Natural Cycles promises a hormone-free, non-invasive alternative to traditional forms of birth control.

Natural Cycles was developed by physicist Dr Elina Berglund, who works at CERN and was part of the team responsible for confirming the existence of the Higgs boson. The app started out as an algorithm Berglund developed after deciding to stop taking hormonal contraceptives. She started looking into the biology of the menstrual cycle and found that ovulation can be accurately predicted from small changes in body temperature, and that this data can be used to calculate when an individual is and is not fertile. Berglund began to monitor her own cycle using the algorithm, along with some of her colleagues at CERN. This worked so well that Berglund and her husband decided to develop the algorithm into an app, so that more people could benefit from it. The latest study shows that the app is 99% effective when used perfectly, or 93% effective with typical use (for comparison, the pill is 91% effective with typical use).

So how does a simple fertility awareness method manage to have such success in preventing pregnancy? To answer this, we first need to understand a bit of the biology of the menstrual cycle.

[Diagram: hormone levels across the menstrual cycle. Source: https://www.naturalcycles.com/en/science/menstrual-cycle]

The menstrual cycle can be roughly divided into three stages: the follicular (pre-ovulatory) phase, ovulation, and the luteal (post-ovulatory) phase. The levels of the hormones oestrogen, progesterone and luteinising hormone (LH) vary over these stages, as shown in the diagram above, and the basal body temperature (temperature at rest) changes as a result of these different levels. This is how Natural Cycles detects where the user is in their menstrual cycle: from a temperature reading taken each morning with a two-decimal-place thermometer.

During the follicular phase oestrogen levels are high, and progesterone levels low, leading to a lower body temperature. At the end of the follicular phase is the fertile window. This is approximately six days long – starting five days before ovulation occurs. This is because sperm can survive in the uterus and fallopian tubes for up to five days waiting for an egg to fertilise.

At ovulation an egg is released by one of the ovaries, and travels through the fallopian tube, where it can be fertilised if it encounters a sperm (which could have been hanging around in the tube for several days).

After ovulation the luteal phase starts. Progesterone levels increase in order to prepare the uterus in case fertilisation has occurred. The rise in progesterone causes the basal body temperature to go up by an average of 0.3°C. If fertilisation has not occurred, progesterone levels then fall again and the uterine wall begins to shed, with the onset of menstruation starting a new cycle.

From this we can see that there is actually only a window of around six days each cycle in which fertilisation could occur; on all the other days of the cycle, intercourse will not result in a pregnancy. The Natural Cycles app uses this logic to assign ‘red’ and ‘green’ days – those on which you do and do not need to use protection, respectively. Of course, an app that accurately tracks fertility can also be used to increase the chances of pregnancy, and around 20% of Natural Cycles users are in fact using it to aid in becoming pregnant.
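
To make the red/green logic concrete, here is a deliberately simplified sketch of how a temperature-based method might flag days. Natural Cycles’ actual algorithm is proprietary and far more sophisticated – it accounts for cycle history, measurement noise and buffer days – so the thresholds and window below are illustrative assumptions only.

```python
# Simplified illustration of temperature-based fertility awareness.
# Thresholds and the ovulation prediction are assumed, illustrative values.

def classify_day(day_of_cycle, predicted_ovulation_day):
    """Return 'red' (use protection) or 'green' for a given cycle day,
    using the roughly six-day fertile window described above."""
    fertile_start = predicted_ovulation_day - 5  # sperm can survive ~5 days
    fertile_end = predicted_ovulation_day        # up to ovulation itself
    return "red" if fertile_start <= day_of_cycle <= fertile_end else "green"

def ovulation_detected(morning_temps_celsius):
    """Crude check: has basal temperature risen ~0.3 C above the baseline
    (here taken as the mean of the first six readings of the cycle)?"""
    baseline = sum(morning_temps_celsius[:6]) / 6
    return morning_temps_celsius[-1] >= baseline + 0.3

print(classify_day(12, predicted_ovulation_day=14))  # 'red' (fertile window)
print(classify_day(22, predicted_ovulation_day=14))  # 'green'
print(ovulation_detected([36.4, 36.5, 36.4, 36.5, 36.4, 36.5, 36.8]))  # True
```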

However, the app may not be for everyone. Success depends on users strictly abstaining or using barrier protection such as condoms on red days, and making sure to take their temperature each morning, having had a decent amount of sleep (as sleep deprivation can cause fluctuations in the basal body temperature). Those who have irregular menstrual cycles, such as people with PCOS (polycystic ovarian syndrome), which affects around 10% of women, may not benefit so much from Natural Cycles, as the algorithm is likely to give them many more red days per cycle. A subscription to the app also costs around £40 per year, which is pretty pricey considering that all other birth control is free on the NHS (although you do get a thermometer thrown in). Whether that is value for money for a side-effect-free form of contraception is down to the individual.

 

Sources:

https://www.naturalcycles.com/en

http://nordic.businessinsider.com/birth-control-app-as-effective-as-the-pill-2017-2/

http://www.vogue.co.uk/article/natural-cycles-app-hormone-free-non-intrusive-contraceptive-method

http://www.wired.co.uk/article/natural-cycles-as-effective-as-traditional-contraceptives

https://www.theguardian.com/lifeandstyle/2016/nov/07/natural-cycles-fertility-app-algorithm-replace-pill-contraception

 

The Northern Lights – Naomi Brown

At the beginning of November, residents of Scotland and Northern England were able to view a dazzling light show in the sky: the Northern Lights. But what causes them, and how can we predict when it will happen again?

The Northern Lights are a natural phenomenon in which brightly coloured lights are seen across the night sky in the form of sheets or bands. They are generally seen close to the magnetic poles, in an area called the ‘auroral zone’. The best time to spot the aurora is when the Earth’s magnetic pole is between the sun and the location of the observer; this is called magnetic midnight.

The Northern Lights are caused by gaseous particles in the Earth’s atmosphere colliding with charged particles released from the sun’s atmosphere. The charged particles are carried towards Earth by solar winds and are mostly deflected by the Earth’s magnetic field. At the poles, however, the field is weaker, allowing a few particles to enter the atmosphere. This is why auroras are more likely to be seen close to the magnetic poles, making Iceland and Northern Scandinavia common destinations for travellers searching for the Northern Lights.

The colours of the Northern Lights depend on the type of gas molecule involved in the collisions. Green is one of the most common colours and is caused by collisions with oxygen molecules, whereas blue or purple auroras are caused by nitrogen molecules.

Why can the Northern Lights sometimes be seen in places further from the Earth’s poles, such as the UK? The answer is the spreading of the auroral oval during a geomagnetic storm. Geomagnetic storms are more common after the maximum in the solar cycle, a repeating 11-year cycle; the most recent solar maximum was in 2013.

The Northern Lights are notoriously unpredictable. There are many forecast apps available, such as “My Aurora Forecast”. One of the best websites for checking when the aurora will be visible from where you are is the Aurora Service (www.aurora-service.eu/aurora-forecast/). The site gives the Kp value predicted for the next hour, using solar activity data obtained from ACE, a NASA spacecraft. ACE orbits 1.5 million kilometres from Earth: a prime position from which to monitor the solar winds.

A common way to represent geomagnetic activity is the Kp index. Magnetic observatories located all over the world use instruments to measure the largest magnetic change every three hours. The recorded data from all these observatories is averaged to generate Kp values, which range from 0 to 9. The larger the value, the more active the Earth’s magnetic field is due to geomagnetic storms, and the further the auroral oval spreads. A Kp value above 4 indicates storm-level geomagnetic activity. These Kp values are useful for predicting when auroras will be visible: to see the aurora from the UK, the Kp value would have to be at least 6.
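
Putting those rules of thumb together, a small helper like the one below could turn a forecast Kp value into a plain-language summary; the thresholds are simply the ones quoted above, not an official forecasting model.

```python
# Interpret a forecast Kp value using the thresholds mentioned above:
# Kp runs from 0 to 9, values above 4 indicate storm-level activity, and
# a Kp of about 6 or more is needed for the aurora to reach the UK.

def describe_kp(kp):
    if not 0 <= kp <= 9:
        raise ValueError("Kp index is defined on a 0-9 scale")
    return {
        "kp": kp,
        "storm_level_activity": kp > 4,
        "possibly_visible_from_uk": kp >= 6,
    }

print(describe_kp(3))  # quiet conditions, no UK sighting expected
print(describe_kp(7))  # storm-level activity; aurora may reach the UK
```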

To get a great show, the conditions are important. Clear nights with no clouds are best. It is also worth checking the moon cycle: the brightness of a full moon drowns out the light of the aurora.