Santa Science – Merry Christmas from pH7!

Sintija Jurkevica

On the night of Christmas Eve, Santa Claus delivers presents to the homes of all the good kids – but how does he do it? In the joyous spirit of Christmas, Santa’s gift-delivery system is worthy of scientific investigation.

How does Santa separate the good from the naughty?

With around 2,000,000,000 children worldwide, and approximately 35% of their parents being Christian, Santa faces the problem of observing an impressive 700,000,000 children to make the big decision: whether each child deserves a present or not. Santa may take his cue from the surveillance systems currently used to track human behaviour. He could employ Orwellian, high-tech electronic surveillance, monitoring each child remotely with tools like 24-hour cameras and face-recognition systems. Alternatively, he could use a less high-tech method that depends on direct observation of children’s behaviour by human (or elf!) agents.
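The back-of-the-envelope sum above can be checked in a couple of lines (the two billion children and the 35% share are the article’s own estimates):

```python
# Number of children Santa must watch (figures from the article).
total_children = 2_000_000_000      # rough number of children worldwide
christian_percent = 35              # estimated share whose parents are Christian

# Integer arithmetic avoids floating-point rounding on the percentage.
children_to_watch = total_children * christian_percent // 100
print(children_to_watch)  # 700000000
```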

How many elves does Santa employ to make all the gifts?

Assuming every kid receives an appropriate gift on Christmas Eve, Santa has to organise the manufacture and production of approximately 700,000,000 objects, meaning that on average, 1,923,076 gifts would have to be assembled and packaged every day for 364 days. Compare this with LEGO™, which employs almost 14,000 workers producing around 220,000 Lego sets per day: Santa is the CEO of an intensive toy company resting on the shoulders of thousands of elf employees. Scaling from the workforce LEGO™ needs to meet demand for Lego sets, Santa would have to employ around 128,205 elves to meet the requirements of 700,000,000 children.
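A minimal sketch of that workforce estimate. All figures are the article’s; note that LEGO’s exact productivity works out at about 15.7 sets per worker per day, and the article’s 128,205 elves follows from rounding that down to 15:

```python
# Elf-workforce estimate from the LEGO comparison.
gifts_needed = 700_000_000
working_days = 364

gifts_per_day = gifts_needed // working_days       # 1,923,076 gifts per day
sets_per_worker_per_day = 220_000 / 14_000         # ~15.7 at LEGO; article uses 15

elves = round(gifts_per_day / 15)                  # article's rounded productivity
print(gifts_per_day, elves)  # 1923076 128205
```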

How powerful are Santa’s reindeer?

If the average kid’s present weighs around 700 g, a filled Santa’s bag would weigh a massive 490,000 tonnes. At this point, the weight of Santa and his sleigh is almost negligible compared with the presents. Assuming Santa is the only person delivering presents, his delivery system is fully dependent on nine reindeer: each one has to pull over 54,444 tonnes of presents and generate enough thrust to lift that mass off the ground. Generating that kind of thrust requires one of two things: direct ejection of hot gas, or air streaming over fixed wings. But reindeer have neither wings nor jetpacks – so they must just be magical!
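The payload arithmetic, using the article’s figures:

```python
# Payload mass carried by the sleigh, and the share per reindeer.
n_gifts = 700_000_000
gift_mass_g = 700          # average present, grams
n_reindeer = 9

total_tonnes = n_gifts * gift_mass_g // 1_000_000   # grams -> tonnes
per_reindeer_tonnes = total_tonnes / n_reindeer

print(total_tonnes, round(per_reindeer_tonnes))  # 490000 54444
```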

How fast is Santa’s present delivery?

During Christmas Eve, Santa has to cover the total surface area of the world – more or less 510,000,000 km² – and, thanks to time zones, he has 32 hours to complete his Christmas gift delivery. This means the reindeer have to fly at a velocity of 4,427,083 m/s, around 1.5% of the speed of light. Although this is not fast enough to cause any relativistic increase in mass, Santa’s sleigh remains vastly faster than the X-15, the aircraft holding the official record for the highest speed reached by a manned aircraft, at 2,020 m/s.
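The article’s speed follows from treating the 510,000,000 km² surface area as a flight path of 510,000,000 km covered in 32 hours – a generous dimensional shortcut, but it reproduces the numbers exactly:

```python
# Sleigh speed under the article's assumption of a 510,000,000 km path.
path_m = 510_000_000 * 1_000        # assumed path length, metres
seconds = 32 * 3600                 # 32-hour delivery window
c = 299_792_458                     # speed of light, m/s

speed = path_m / seconds
print(round(speed))                 # 4427083  (m/s)
print(round(100 * speed / c, 1))    # 1.5  (% of the speed of light)
```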

If there is anything that can be learnt from Santa, it is that he disregards ethics and challenges the laws of physics. In other words, Santa truly does make sure we have a magical Christmas!

Should Vaccination Be Compulsory?

Jamie Hakham

This is a discussion about how vaccination should occur, rather than the concept of it. That is, we’re not going to discuss autism, mercury poisoning, or any of the other discredited, unscientific, or just plain wrong arguments so-called anti-vaxxers use when dismissing vaccines. Got that? Good!

Let’s instead boil this argument down to its simplest form. Should my choice, as an individual, not to vaccinate my child be considered more important than the government’s advice to the contrary? Or, to put that into plain English: is personal choice more important than common welfare? In case you run out of topics to discuss on Christmas day, we’re going to give a brief overview of both sides of the vaccine argument, and some links to further reading. Won’t that be fun!

Arguments for

There are two main arguments here: a scientific one, and a legal one.

First off, there are always going to be people who, for whatever medical reason, can’t be vaccinated. This is usually due to advanced age, congenital illness, or some other medical condition, but it also includes those who are simply too young to have received their vaccines yet. These people rely on what is known as ‘herd immunity’: a state in which an overwhelming majority of people (around 95%) are vaccinated against a disease, so that the disease can’t travel through the population to reach those who aren’t protected against it.
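The ~95% figure comes from the standard herd-immunity threshold, 1 − 1/R₀, where R₀ is the average number of people one infected person would go on to infect in a fully susceptible population. A quick sketch – the R₀ values below are textbook ballpark figures, not from this article:

```python
# Herd-immunity threshold: once this fraction of the population is immune,
# each case infects fewer than one new person on average, so outbreaks die out.
def herd_immunity_threshold(r0: float) -> float:
    return 1 - 1 / r0

# Illustrative basic reproduction numbers (assumed, disease-dependent):
for disease, r0 in [("measles", 15), ("mumps", 5), ("seasonal flu", 1.5)]:
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} must be immune")
```

Highly contagious diseases like measles sit at the top of that range, which is why coverage has to stay in the mid-90s to protect the unvaccinated.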


The benefits of herd immunity (Image Credit: wikimedia)

Therefore, if vaccination isn’t compulsory, it’s nigh-on impossible to make sure that these vulnerable members of society are protected by the rest – think back to the recent outbreaks of measles in the US and UK that claimed the lives of people who were unable to be vaccinated due to health reasons.

Secondly – and here it’s going to get a bit legalese – we have to deal with what’s known as personal autonomy, and how that relates to parental consent. That is, is it right for you to refuse vaccination not only for yourself, but also for your children? Refusing vaccination for yourself is your personal choice – you become a ‘free rider’ on society’s healthcare, benefiting from it without participating. While ethically dubious, it’s well within your rights over your body.

However, if the question concerns your child, it all gets a little bit more complex. That’s because your kid can’t legally consent to anything – it’s not their choice. Therefore, you, as the parent, are legally capable of consenting for them in matters like vaccination. But, this child also has the right to a healthy life, up to and including preventative measures – like vaccination. If your personal views mean that your child doesn’t receive such preventative measures, it conflicts with that right. Therefore, if vaccination isn’t compulsory, how can society be sure that the rights of that child are upheld?

Arguments against

There are a couple of reasons here, and they’re both relatively practical:

Firstly, this discussion does not claim that vaccination is a bad thing. Rather, compulsory vaccination has been shown, counterintuitively, to be less effective at increasing vaccination coverage – due to exemptions, as in the US, or to its simple unenforceability, as in Italy. It’s suggested that this is because people are more likely to do something if they are empowered to do it rather than forced. And with trust in government at all-time lows, compulsory vaccination is likely to lead to even greater levels of non-compliance.

A better system is the one currently in use in the UK, where vaccination is free, and strongly recommended. The result is a vaccine coverage of roughly 94%, which is directly comparable to the US for the same vaccines, for significantly less bureaucratic effort.

Secondly, studies have shown time and time again that people are poor assessors of risk, and that most opposition to vaccines is linked to a poor understanding of the facts and risks involved. Many people misunderstand the chemistry and biology of vaccines, lack an appreciation of the statistics involved, or have issues outside the science of vaccines altogether. Forcing these people to do something they don’t understand is a sure-fire way to make martyrs of them, gaining them publicity and popularity – take Jenny McCarthy, a now well-known anti-vaxxer who has become one of the louder voices linking vaccination with autism.

Surely, as a modern society, we should focus on educating those less knowledgeable, so they can appreciate the consequences of their actions. When Italy moved from a compulsory to a more liberalised system, it found that simply providing and disseminating correct information was hugely important in increasing vaccine uptake. Furthermore, increasing scientific literacy should be a general goal for all of us, and this is perfect ground on which to educate those who need it.


Vaccination is, by scientific consensus, an ethical imperative. It’s our first line of defence against some of the worst diseases in the world, and in almost every case the risk of side effects is significantly lower than the risks posed by the disease vaccinated against. For the good of the whole community, every single eligible individual should be vaccinated.

In practice, however, forcing the issue seems to have unwanted negatives, either because people don’t understand the science or because they mistrust its source. In this modern age of information, though, is it a government’s prerogative to supply the facts, or should people come to conclusions on their own?

Food for thought

Jacobson v. Massachusetts

Herd Immunity

Cultural Perspectives

We Might Be Able To Save Predators And Livestock At The Same Time

Katy Drake

The debate is ongoing: should lethal or nonlethal predator control methods be used to protect livestock? Logic suggests that if predators are killing livestock, removing those predators should reduce livestock losses. However, with legal, ethical and ecological risks at stake, common sense may no longer pass as sound justification.


Image credit: wallpapercave

Research led by Associate Professor Adrian Treves of the University of Wisconsin, Madison, and published in Frontiers in Ecology and the Environment, examined more than one hundred peer-reviewed studies of predator control methods and livestock in Europe and North America. Yet of those studies, only twelve met the academic standards from which scientific inference could be drawn, with two reaching a ‘gold standard’ and the other ten a lesser ‘silver standard’.

The results from these twelve studies suggest that nonlethal methods of predator control are generally more effective and do not lead to counterproductive consequences.

A variety of predator control methods are used by livestock owners. Lethal methods include hunting, poisoning, kill traps and destroying the litters of young. Nonlethal methods include livestock-guarding animals, fladry (visual deterrents), other types of repellents, fences, diversionary feeding and sterilisation.

Of the dozen studies analysed, seven examined lethal methods of predator control. Two of these concluded a decrease in livestock loss, but only to a minor degree and on a short-term basis: in one case, less than one lamb was saved per lynx killed – a negligible practical benefit.

The remaining five lethal-method studies found either no effect or, in two cases, an actual increase in predation. A study published in 2013 determined that killing cougars was detrimental to livestock numbers: older male cougars keep the younger, more aggressive males at bay, so hunting the older males led to an influx of younger males and increased livestock loss.

By contrast, not one of the nonlethal method studies showed an increase in predation. Of the twelve studies examined by Treves and his colleagues, the only two that met ‘gold standards’ examined nonlethal methods which effectively decreased livestock losses through the use of livestock-guarding dogs and fladry, although fladry may be limited to deterring wolves.

One long term and in-depth study, conducted in France, concluded that a combination of mobile electric fences at night and at least five livestock-guarding dogs prevented almost all wolf predation on sheep.

Treves’ critics have suggested that his own study may not live up to his standards, as no independent experts were asked to review the validity of his research. They also suggest that Treves’ expectations of academic standards in predator control research may be impractical, as the complexities of field biology preclude most ‘gold standard’ experiments.

So, what does this mean for the future?

Treves and colleagues have called for a suspension of predator control programs that lack strong evidence of efficacy. They suggest that, like the EU Directive and many U.S. federal policies, decision-making should be based on clear evidence, and that until ‘gold standard’ tests have been completed, evidence-based policy should focus on nonlethal methods. However, a major culture shift will also be required amongst ranchers and livestock owners to turn from quick and easy lethal methods to nonlethal predator control.

Humans Are Still Evolving. Here Is How…

Dan Bennison

In his 1895 novel The Time Machine, H.G. Wells describes a world in which subterranean and forest-dwelling subspecies of humans coexist. To a distant ancestor of Homo sapiens, the concept of modern humans would likely have been as foreign as Wells’ world is to us. Still, there is no doubt in the minds of scientists that humans are still evolving.


Image Credit: Pixabay

The main driving forces of evolution are known as selection pressures. How well an individual can overcome these pressures, like predators and diseases, will determine how long it survives and whether it reproduces. By evolving, and passing on genes considered ‘successful’, a species can develop features to help overcome these selection pressures and live longer, reproduce more and ultimately become better suited to the environment in which it lives.

The major advantage we humans possess is the ability and the intelligence to alter our environment, while other species must adapt to better survive within theirs. Farming, healthcare and a lack of predators remove most selection pressures on humans, and can lead to changes that may well be mistaken for evolution. Is the increasing human lifespan due to evolution, or to healthcare? We can’t say for sure.

That said, there are certain physiological changes that are definitely evolutionary. If wisdom teeth have been as painful a problem for you as they have for me, you might be glad to know that humans are predicted to lose them altogether. Our jaws are becoming smaller and more bullet-shaped, as cooking and utensils have reduced the need for large, strong jaws with extra teeth for chewing tough food.

In addition, our brains are getting smaller, but are being ‘rewired’ to become faster and more efficient. The number of blue-eyed individuals worldwide is increasing. This is thought to be because blue eyes, fair hair and pale skin are linked, and allow greater vitamin D production in low-light environments such as northern Europe, while dark eyes, hair and skin provide more protection from harmful UV rays in hotter regions such as Africa and the Caribbean.

The environment in which human populations live is a major factor when considering evolution, predominantly due to food availability and disease. Seeing as different diseases are more or less common in different areas in the world, humans have evolved different mechanisms to survive.

A huge selective pressure on humans is malaria, a disease transmitted by mosquitoes between the tropics of Cancer and Capricorn. In these regions, multiple types of resistance have evolved (and are evolving) separately, such as the sickle cell trait and a type of anaemia called thalassaemia, both of which alter the red blood cells in which the malaria parasite grows. Some mechanisms that have evolved, such as pyruvate kinase deficiency (PKD), would not be beneficial in malaria-free regions: PKD means your cells cannot make enough energy, leading to many serious health problems. But these health problems are less severe than malaria, so the mutation is still beneficial there (a type of bet-hedging).

In remote Papua New Guinea, the Fore tribe are the only human population commonly affected by kuru, a rare neurodegenerative disease caused by misfolded prion proteins in the brain. Within this tribe, a change in the prion protein grants resistance to kuru, and is found nowhere else in nature. Studying human resistance to disease shows how selection pressure has directed mutations within an isolated population.


A map of where different heritable genetic resistances to malaria can be found, the majority of which coincide with the malaria belt. Image Credit: Wikicommons

In terms of how far evolution can go: if you’re hoping that humans will suddenly sprout wings and take to the skies, I’m sorry to disappoint. It is far more likely that subtle changes will take place to help us better suit the environment in which we live. In rural, developing countries, this may mean resistance to diseases to compensate for less effective medical options. In more technology-driven communities, eyes are likely to become larger and more efficient in low light, and facial features are likely to become more objectively appealing, with perfect left–right symmetry.

On a worldwide scale, gene flow as a result of genetic mixing between ethnicities may lead to a future in which humans look extremely similar, with many of the differences between races being lost – a human ‘standard’ of darker skin, large heads and large eyes. The global nature of the modern world, ease of transportation and even social media make this convergence increasingly likely.

Another, more alarming possibility is an increase in less favourable traits that become widespread as a result of the modern lifestyle. For example, celiac disease (gluten intolerance) is becoming increasingly common: 50 years ago an estimated 0.2% of people in the U.S. had the disease, compared to one in 100 today. With global trade, humans are becoming less reliant on staples such as wheat and other grains, so celiac disease is less of a selective disadvantage and sufferers can still eat enough to survive – increasing the number of people with the disease. In developing countries where such staples are still essential, celiac disease is much rarer.

The same is happening with type one diabetes: insulin therapy is so effective that patients can live near-normal lives, with normal lifespans, and potentially pass the condition on to their children. Furthermore, the battering of our well-adapted gut bacteria with prescription antibiotics reduces much of our innate resistance to disease, and may also affect individual traits such as weight gain and digestive health.

Of course, editing human embryos is still a possibility, though it remains to be seen whether the technology will ever be routinely used in practice. All in all, the impact of selection on humans has been reduced by our own intelligence. Evolution will never cease; it will simply continue adapting humanity to the ever-changing conditions in which we live, with each major change or event providing more ground for evolution to work with.

Bird Flu Is Back

Naomi Brown

Recently there have been numerous reports of avian flu spreading rapidly across Europe. With memories of previous outbreaks that caused widespread devastation across poultry farms and fatalities in humans, there are concerns that this time it could be worse.


Image Credit: Flickr

What is avian flu?

Avian flu is a type of influenza virus adapted to live in birds. A virus is made up of genetic material, such as DNA or RNA, covered by a protective protein coat. Flu viruses are constantly changing, which means they can adapt to become better at infecting hosts. This is why they have the potential to cause pandemics – outbreaks of infectious disease that spread through human populations across large regions.

Upon the discovery of the influenza virus, the first parts to be identified were two proteins on the virus surface called hemagglutinin and neuraminidase. This led to the naming system still used today: ‘H’ for hemagglutinin and ‘N’ for neuraminidase, with types numbered as they were discovered – the first virus identified was H1. However, flu viruses carry six other genes, so two strains with the same name can still differ in those six genes. That is why two viruses with the same name can behave very differently, one causing mild symptoms and the other being highly contagious.

It is worth noting that most types of avian flu do not infect humans; however, a number of those that do cause serious infection. The strains that cause fatalities in poultry are the H5 and H7 subtypes.

The Last Outbreak

The current strain, H5N8, evolved from H5N1, which was first recorded in a goose on a Chinese farm in 1996. H5N1 is highly pathogenic, meaning it is contagious and spreads quickly. It raced across Asia, Europe and Africa; millions of birds died or were culled, significantly impacting poultry markets. The disease also spread to humans through contact with infected birds, causing 452 deaths.

This Time

The H5N1 virus has had the opportunity to hybridise with other flu viruses because migrating birds congregate in north-central Asia during the warm summer months before dispersing across Africa, Europe and Asia. This is the first time the resulting H5N8 strain has killed wild birds, because it has picked up new genes from flu viruses circulating in them. Due to bird migration, there is a high likelihood of more H5N8 outbreaks in both wild bird populations – such as geese, ducks and gulls – and farmed animals.

The first case of infection reported in Europe was on a farm in Germany, where there was a swift response: a 3 km² quarantine zone was set up and 30,000 chickens were culled. There have been further reports of infected birds from Austria, Lake Geneva in Switzerland, and Romania.

So far, no humans have been affected. A report from the World Health Organisation has concluded that the risk of human infection is low but cannot be excluded. H5 flu viruses rarely infect humans; however, one similar strain, H5N6, has caused 6 fatalities out of 14 reported cases of human infection. The disease has only been transmitted to humans through contact with infected poultry, and there is no evidence that eating properly cooked infected meat can transfer the disease.

If you’re worried about avian flu, the WHO’s advice is to avoid contact with dead or sick birds, wash your hands thoroughly after any contact with livestock, and make sure to cook poultry thoroughly.


Why Do I Hate Marmite?

Helen Alford

Ah, Marmite. The notorious dark brown, gloopy sludge (I’m hard-pushed to call it a food) is a byproduct of beer brewing. The yeast extract left over from brewing lager, bitter and ale is mixed with vegetable and spice extracts, along with some other ingredients that are ‘trade secrets’. The manufacturing process sounds just as unappetising as the end product tastes.

The spread is behind one of the UK’s most divisive advertising campaigns – ‘Love it or hate it’. Polling agency YouGov asked 2,500 British adults in 2011 whether they loved, hated, or had no opinion on Marmite. The result was 33% each for love and hate, with 27% remaining neutral – a pretty even split. Personally, I just can’t understand how anybody could willingly eat this vile excuse for a condiment. But clearly, people do. So what makes our palates so different?


Love it or hate it? (Image Credit: Helen Alford)

Taste and smell have long been evolutionary survival tactics. If something doesn’t taste right, we know to avoid it. Bitter tastes usually mean poison, while a sulphurous smell can be associated with something harbouring dangerous bacteria. The ‘survival’ role of these senses has declined as we have developed ways to keep food safe and the availability of food has grown. Even so, we still use these senses to judge food. If milk smells bad, we don’t put it in our tea. Evolution can explain the general aversion to bitter-tasting foods like grapefruit and broccoli, but what about more personal preferences?

Babies may inherit food preferences from their mothers: flavours are transferred from mum to child through the amniotic fluid. One study found that babies whose mothers ate carrots during the last stage of pregnancy were more likely to eat carrot-flavoured food than babies whose mothers did not. The same principle could potentially apply to other foods.

Another biological factor that can account for differing tastes is the number of taste buds on an individual’s tongue. Taste buds detect the five tastes: sweet, sour, salty, bitter and umami (savoury). ‘Supertasters’ have more fungiform papillae – the projections holding taste buds – and so taste things with much more intensity. They also have an increased sensitivity to bitterness. Research has shown supertasters have reduced preferences for foods including coffee, mushrooms, gin, tequila, green tea and cabbage. Average tasters usually have a more accepting palate.

To try to understand why supertasters react negatively to certain foods, scientists are studying a gene named TAS2R38, which encodes a bitter taste receptor protein. People who carry a highly active ‘taster’ version of the gene (as opposed to a non-tasting or subdued version) may be supertasters. Fussy eating could well be the result of genetic factors like this.

A person’s taste can also evolve through various psychological influences. For example, associating a food with a feeling or emotion can affect the way that food is perceived in the future. If a person eats a food which makes them ill, chances are they won’t like that food anymore – its appeal across all five senses is diminished. In contrast, if a person’s associations with a food are exceptionally positive, they’re probably more likely to keep eating it.

Interestingly, recent studies have shown that people who enjoy bitter foods like gin may have psychopathic traits. One experiment showed that a person’s ‘agreeableness’ is negatively correlated with a liking for bitter foods.

Societal influences can also play a role in determining what people like and don’t like. As children, we naturally fear trying new things. It’s possible that people whose parents encouraged them to try new foods regularly grow up to be less fussy than those who stuck to a more restricted diet. Adults are expected to be less fussy than children, and so may feel forced to consume foods they don’t like in order to ‘fit in’ – olives seem to be the prime example. Luckily, research shows that the more times you eat a food, the more you grow to like it.

Food preferences come down to the interaction of numerous factors: gene variation, upbringing, number of taste buds, psychology, society, experience… Call me narrow-minded, but none of them could ever explain to me the love people have for Marmite.


Bringing Lost Coral Back to Life

Jonathan Cooke

If you weren’t aware of just how badly coral reefs have been faring due to global warming: it was announced in late November that just over two-thirds of the Great Barrier Reef’s coral has been killed off by warming seas. That is over 435 miles of coral – nearly three times the distance between Sheffield and London. Coral reefs are among the largest reservoirs of biodiversity in the ocean and are considered some of the most diverse habitats on the globe.


Image Credit: Wikimedia

Losing these areas of biodiversity is a disaster for many marine species. Coral reefs act as both a nursery and a shelter for smaller fish, helping to shield them from the ocean’s larger predators. Indeed, many species are specialised to live on or around reefs – creatures such as everyone’s favourite clownfish, Nemo. Without their symbiotic partners, the anemones, these clownfish face likely extinction.

The main problem facing coral reefs is that the rate at which they are dying massively outpaces the rate at which they grow. The gradual heating of our oceans is intolerable to many coral species and causes them to eject the algae, or zooxanthellae, with which they have formed symbiotic relationships. These algae are responsible for the brilliant and vivid colours we are so familiar with, and provide the coral with nutrients, via photosynthesis, which are key to its survival.

‘Coral bleaching’ is the direct result of this ejection, and whilst it does not spell the immediate death of the coral itself, it leaves the coral considerably more stressed and prone to disease. As such, mass coral bleaching is seen as a warning sign of imminent coral death.

Can anything be done to counteract such effects, however? Well, that really depends on which reef we are talking about. Most coral species are ‘endemic’ – typically found within one particular reef system. You are unlikely to find many shared species between the Great Barrier Reef and the Florida Keys reef, for example. This means our approach to fixing the reefs must be adapted in each instance to better tackle the issues each coral species faces.

However, some techniques can be transplanted. Ruth Gates, a biologist at the Hawaii Institute of Marine Biology, has become something of a coral gardener, growing and cultivating baby corals in controlled environments away from the sea. Since polyps (the organisms that build reefs) grow at a tediously slow rate – mere millimetres over the course of years – we must find another, quicker way to replace destroyed coral. It turns out that breaking the polyps up increases the rate at which they grow, as they work quickly to repair the damage.

This finding has fuelled our ability to seed the water with millions upon millions of ‘microfragments’, which can be planted on the surface of dead or dying coral and use it as a structure from which to grow. The principle works for other types of reef as well: in Britain, the shells of farmed oysters are being used as bases for young oysters to grow from. Having a ready-made skeleton saves the new coral the energy it would otherwise expend anchoring itself to the ground, energy it can then invest in growth.

Planting juvenile corals is not the only way to reseed an area, and it does have its difficulties. Growing coral takes a long time, as previously mentioned: you could be waiting almost half a decade before the corals are ready to be planted out, and even then there’s no guarantee they will survive. Instead, some scientists have taken to using larvae to reseed reefs.

Coral larvae are free-floating by nature, thanks to corals’ method of reproduction: adult corals simply release eggs or sperm into the sea at one or two points in the year, in mass spawning events. Most coral species do this at the same time, which could lead to some confusing parentage! The eggs are fertilised in the water column, and the larvae develop there until they are big enough to anchor themselves to the seabed. For a sedentary animal this makes sense: releasing your offspring to be carried by the current limits the chances they will settle down next to you and compete for resources.

However, on its own this method would be ineffective for reseeding efforts, as you never know how many larvae will actually settle. To combat this, research teams have trapped coral larvae underneath mesh enclosures so that they stay at the sites that need to be rebuilt. Whilst many of the coral settlers will die, enough will survive to sexual maturity to begin the reef’s next life cycle; in the world of coral breeding, it’s a numbers game.

Now, these methods are all well and good, but they miss one key problem: all of these corals will be brothers and sisters. They’ll share the same strengths, but also the same weaknesses, so any environmental trigger that induces bleaching will cause them all to bleach, landing us back where we started. However, by collecting larvae from nearby surviving reefs, and by breeding different genetic varieties of coral, we can make sure we build a genetically diverse reef that can survive into the future.

Rebuilding the reefs we have destroyed will by no means be an easy task, but it is one we must accomplish. The Great Barrier Reef existed for almost half a million years before we arrived; it is our duty to ensure it is there after we are gone.