The Science of Iron Man – Ciara Barrett

With the shadow of Infinity War looming over our heads, and the role model status I currently place in Iron Man’s hands, I wanted to find out if our (my) favourite genius billionaire playboy philanthropist could actually exist. The issues discussed below are drawn mostly from the films, with occasional references to the comics.

Arc reactor-

From the comics and films, we know the arc reactor in Tony’s chest is a miniature fusion reactor built into his body that runs on palladium, with the particular isotope evidently mattering because of its fusion properties. Moving past the fact that the continuous collision of particles and the resulting emission of beta and gamma rays would cause serious damage to Tony’s health, if this power source could produce the same amount of energy as a full-sized reactor then it is entirely possible that it could power the suit. However, the physics of building a small-scale nuclear reactor – complete with a ring of electromagnets, heat containment and electron recovery – into someone’s body is still far on the horizon. The arc reactor would also need to store its power, because the amount needed when Tony is wearing the suit is vastly different from when he isn’t. In Iron Man 2, Tony is seen burning through his arc reactors quickly, which suggests their power output can be adjusted depending on whether he’s fighting or just going about daily life, even though it’s usually the former.

Flying suit-

Iron Man has four visible thrusters, one on each hand and foot. However, when he flies horizontally there is nothing to oppose the downward pull of gravity, so he should keep moving forwards while steadily losing altitude. He usually combats this by flying in a parabolic or slightly ascending path, or with his arms outstretched to mimic a plane’s wings, where the pressure difference above and below his arms would generate lift. Next, the four thrusters would need to generate enough lift not only for Tony’s weight but also for heavy objects like cars or an aircraft carrier, which he has been known to lift while flying (see: The Avengers (2012)). This means the thrusters must be able to lift up to 100,000 tonnes (100 million kg), based on the Nimitz class, the world’s largest aircraft carrier – not impossible, given that we’re already assuming the arc reactor can power the suit.
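As a rough sense check (using the 100,000-tonne figure above and standard gravity), the hover thrust required works out as follows:

```python
# Back-of-envelope thrust needed to hover while holding a Nimitz-class carrier.
g = 9.81                   # gravitational acceleration, m/s^2
carrier_mass = 100_000e3   # ~100,000 tonnes expressed in kg

total_thrust = carrier_mass * g    # force needed just to cancel gravity
per_thruster = total_thrust / 4    # shared across both hands and both feet

print(f"Total thrust:  {total_thrust:.2e} N")   # ~9.8e8 N
print(f"Per thruster:  {per_thruster:.2e} N")   # ~2.5e8 N, several times the thrust of a whole Saturn V first stage
```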

Jarvis (or Friday, his badass Irish counterpart)-

Considering Jarvis is basically a better version of Alexa, this is completely within reach. He can do facial recognition, use the Power of The Internet, listen to all your problems and do all the heavy brainpower lifting, like calculating the outcomes of different scenarios that could occur. He is peak AI technology, which he proves when he is able to hide from Ultron by uploading himself to the internet. Again, this is possible, certainly in the near future.

Ammo firing-

The chest beam the suit is capable of firing is made up of photons from the reactor, so it is also plausible: the arc reactor can produce photons during the collision process and, as mentioned, the only issue is storing that energy. He also has guns built into his arms and shoulders, like many fighter planes. One issue is how he could fire these while flying without the recoil – conservation of momentum – pushing him backwards, but the guns themselves are reasonably realistic in terms of being built into a suit.
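To put a number on the recoil issue, here is a rough momentum-conservation estimate; the bullet mass, muzzle velocity and the combined mass of Tony and the suit are all assumed figures rather than anything from the films.

```python
# Rough recoil estimate for the shoulder guns (illustrative numbers, not from the films).
bullet_mass = 0.01      # kg, a typical ~10 g round (assumption)
muzzle_speed = 900.0    # m/s, typical rifle muzzle velocity (assumption)
suit_plus_tony = 300.0  # kg, guessed combined mass of Tony and the armour (assumption)

recoil_speed = bullet_mass * muzzle_speed / suit_plus_tony  # conservation of momentum
print(f"Backward kick per shot: {recoil_speed*100:.1f} cm/s")  # ~3 cm/s per round
```

A few centimetres per second per shot is easily correctable by the thrusters, which is presumably the hand-wave the films rely on.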

Suit features: calling from afar, self-fitting, freeze prevention-

In Iron Man 3, Tony develops the suit so that he can call it from miles away to fly to him, and it then fits onto his body by itself. Breaking this down, the first part of the problem is calling the suit, which obviously isn’t possible via Bluetooth or even radio over the distances involved. A reasonable solution, since the films never explain how he manages it, could be satellite signalling, much like how a phone call works. He also manages to fix the freezing problem from Iron Man 2 by using a “gold-titanium alloy from an earlier satellite design”, as mentioned in the comics. The next issue is the self-fitting suit, which is entirely possible through the use of an algorithm, since the suit is tuned to his shape: put simply, the suit knows to first secure the feet, then clasp the legs, and so on, as sketched below.
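A minimal sketch of that idea, with entirely hypothetical segment names and a stand-in for whatever hardware call actually clamps a piece of armour on:

```python
# A minimal sketch of the "secure the feet, then the legs, and so on" idea.
# Everything here is hypothetical; it only illustrates a fixed assembly order.
ASSEMBLY_ORDER = ["feet", "legs", "torso", "arms", "hands", "helmet"]

def fit_suit(attach_segment):
    """Attach each suit segment in order, stopping if one fails to lock."""
    for segment in ASSEMBLY_ORDER:
        if not attach_segment(segment):   # attach_segment stands in for the hardware call
            raise RuntimeError(f"{segment} failed to lock")
    return "suit sealed"

# Example: pretend every segment locks successfully.
print(fit_suit(lambda segment: True))
```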

Depending on how seriously you weigh up these issues, a suit like Iron Man’s isn’t completely out of reach: the parts that don’t defy the laws of physics would be amazing feats of engineering, even if for now they’re just special effects.

Further Reading:

https://www.wired.com/2008/04/iron-mans-suit-defies-physics-mostly/

http://marvelcinematicuniverse.wikia.com/wiki/Iron_Man_Armor:_Mark_III

Shoot for the Moon: Would the USA’s Cold War plan to blow it out of our night sky really work? Fiona McBride

In 1958 – the year after the Soviet Union’s Sputnik became the first object launched into space by humankind, and sixty years ago now – the government of the USA began work on a secret plan to assert its dominance on the stage of world power: by blowing up the moon. Known covertly as “Project A119”, the intention was to make the military might of the USA abundantly clear to everyone on earth.

 Of course, the first question this raises is: would such a show of force actually be possible? Though it may look small from down here, and is supposedly made of green cheese, the moon is actually a seventy trillion megaton rock located four hundred thousand kilometers away. That’s quite a big thing to blow up, and a significant distance to send explosives. The explosion would have to have enough energy to not only break the moon into pieces, but also send them far enough away from one another that their gravitational fields – the attractive forces that act between all objects – wouldn’t be able to pull them back together. Otherwise, the single lump of geological matter we call our moon would simply be replaced by a pile of lunar rubble. It is estimated that such an explosion would be equivalent to the detonation of thirty trillion megatons of TNT; given that the Tsar Bomba – the most powerful nuclear bomb ever built – had an explosive power of fifty megatons, blowing up the moon would require six hundred billion of these. Humanity has neither the uranium supplies to build such a bomb, nor the rocket technology to get it there.
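The thirty-trillion-megaton figure can be roughly recovered from the moon’s gravitational binding energy – the energy needed to scatter its material so far apart that gravity can’t pull it back. A quick sketch using standard values:

```python
# Rough check of the "thirty trillion megatons" figure via the moon's gravitational binding energy.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 7.35e22          # mass of the moon, kg
R = 1.737e6          # radius of the moon, m
MEGATON = 4.184e15   # joules per megaton of TNT

binding_energy = 3 * G * M**2 / (5 * R)   # energy needed to disperse a uniform sphere
megatons = binding_energy / MEGATON
tsar_bombas = megatons / 50               # Tsar Bomba ~50 megatons

print(f"{binding_energy:.2e} J ~ {megatons:.1e} megatons ~ {tsar_bombas:.1e} Tsar Bombas")
# ~1.2e29 J, i.e. ~3e13 megatons, i.e. ~6e11 (600 billion) Tsar Bombas
```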

Other options include creating a “moonquake” to split apart the internal structure of the rock; this would need to be the equivalent of a 16.5 on the Richter scale. The most violent earthquake ever recorded read just 9.5 on the Richter scale, so it’s unlikely that such a quake could be artificially produced on the moon. Alternatively, the moon could be zapped with a giant laser; however, this would need to deliver, instantaneously, the same amount of energy as the sun outputs every six minutes. Humans don’t really have the resources to power such a thing.

It seems, therefore, that blowing up the moon to assert their dominance over the space and nuclear spheres wasn’t really an option for the USA in 1958 – or even sixty years later – due to a lack of both technology and resources. However, the idea of blasting a large crater in the moon – producing a giant explosion to demonstrate the might of the USA, and leaving behind a scar visible from earth to remind the world of it forevermore – was also considered. This, too, was dismissed in 1959; the reasons are not clear, but perhaps those in charge of the project realised how utterly ridiculous their own idea sounded.

 But let’s just take a step back for a moment, and imagine if exploding the moon were possible: what would the consequences be here on earth? Would lumps of moonrock kill us all? What would life be like on a moonless planet?

So the moon has exploded. The first thing most humans notice is a big, bright cloud spreading out through the sky where the moon used to be: the light of the explosion illuminating the lunar debris. Dust then covers the sky for a while, making daylight dimmer and air travel impossible for a few months. Our seas and lakes are still tidal – the sun also exerts a gravitational pull on the earth – but with no moon there will be no spring or neap tides: the water will simply rise to around one-quarter the height of a spring tide and return to the same lower level each day. Fragments of moon start to fall to earth; some burn up as they enter our atmosphere; others hit the ground and wreak havoc where they land, though it is unlikely that this would be catastrophic for humanity, as they would be moving slowly in comparison to other astronomical objects that fall to earth, such as asteroids.

Once the dust clouds have cleared, the next noticeable thing is a lot more stars. The moon is by far the brightest object in the night sky, so with it out of the way, nighttime will be darker and the stars much brighter by comparison. One – or more – smaller ‘moon replacements’ may also appear in the sky, if the explosion leaves some larger chunks of rock as well as debris and dust. Of course, this debris and dust continues to rain down on the earth whenever a piece falls out of orbit.

Only after the majority of this debris has cleared – in perhaps a few thousand years – does the next major effect become noticeable to humans: the earth will tip over. Gravitational interaction between the earth and the moon is what currently keeps the earth’s tilt stable; without it, the earth will tip on its axis, causing the poles to melt and an ice age to occur every few thousand years on whichever part of the planet ends up tilted furthest from the sun.

 So, although exploding the moon isn’t really possible – and certainly wasn’t in the 1950s – it wouldn’t have utterly catastrophic consequences for the earth, just bring significant change. However, as a show of force, it still seems somewhat excessive.

 

From Rulers of Countries to Rulers of Length – Chloe McCole

From the watch around your wrist, to the speedometer in your car, our lives are filled with measuring devices. Many believe we can trace our love of measuring to the ancient obsession over measuring the length of a certain body part – No, I don’t mean that one.

In Ancient Egypt the ‘cubit’ was an important measurement and was the length of the arm, from elbow to outstretched fingertips. This simple measurement was used to realise the design and construction of arguably their greatest achievement, the great pyramids.

However, like us, the Ancient Egyptians weren’t all the same size or shape, so how did they cut all their stones to an accurate and consistent size? They managed this by standardising the length in the form of the Royal Cubit, a piece of black granite cut to a fixed length. This was used as a guide for the production of the wooden cubits that were then used on building sites throughout Egypt.

Jump forward 4,000 years and head 3,000 kilometres to the west, to the people of France, who weren’t so much building pyramids as storming palaces. It’s the 18th century, the French Revolution is in full swing, and while a certain monarch purportedly mocked the starving peasants with taunts of “let them eat cake”, the Academy of Sciences decided the time had come for a total overhaul of the measurement system: they wanted to be able to measure the exact amount of said brioche in standardised units.

Whilst the timing of this may seem a little strange, in this time of great confusion the development of the metric system provided much-needed order. France, like many other countries, had already defined units of measure; the problem was that even though some units shared a name across many countries, their magnitudes varied, sometimes even from town to town – imagine Rotherham measuring things differently to Sheffield. So the Academy wanted to define a set of base units of measurement from which all other measurements could be derived. First, though, they had to agree on how to determine the unit for distance.

There were initially two front runners in solving this problem. The first, using pendulums, was dismissed because subtle differences in the force of gravity across the world would affect the pendulum; the second involved a much trickier proposition. Pierre Méchain and Jean Baptiste Joseph Delambre were assigned the task of working out the distance from the Equator to the North Pole along the invisible meridian through Paris. The Academy then decided that the base unit of length would be set as one ten-millionth of this calculated distance. This unit was to be known as the metre.
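As a quick sense check of that definition, using the modern value for the Equator-to-pole arc:

```python
# Sense check: one ten-millionth of the Equator-to-pole distance.
quarter_meridian_km = 10_002    # modern value for the Equator-to-pole arc, ~10,002 km
metre = quarter_meridian_km * 1000 / 10_000_000
print(f"1 metre ~ {metre:.4f} m by this definition")   # ~1.0002 m: the original survey was remarkably close
```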

As you can imagine this assignment took a while, and the Academy quickly grew impatient, so whilst these calculations were going ahead they had a number of platinum rods commissioned, and just like the Royal Cubit, these bars were used as the calibration standard of all measurements.

It actually took six long years for Méchain and Delambre to finally report their findings, and the platinum rod that most closely corresponded to their resulting value gained a spot in the National Archives.

The Academy didn’t stop there, with a total of seven Système International d’Unités (SI) base units having been established since the French Revolution. Up next was the kilogram or, as it was known at the time, le grave. Using the newly defined metre, a base unit of mass was determined, set as the mass of one decimetre cubed of water at a temperature of 4˚C. Engineers then created a platinum cylinder corresponding to this mass and sent it to sit next to the metre rod in the National Archives.

Three of the remaining five units define the everyday quantities of electrical current (ampere), temperature (kelvin) and time (second), while the other two are more specialist and refer to amount of substance (mole) and luminous intensity (candela).

Interestingly, the kilogram is the only base unit still defined by the mass of a physical object, although even this is set to change, with a push to define units in terms of measurable natural constants rather than the properties of manufactured objects. For example, the metre is now defined as the length of the path travelled by light in a vacuum during a time interval of 1/299,792,458 of a second. That second is no longer defined by the archaic value set as a fraction of a day, but rather as the time taken for 9,192,631,770 cycles of the radiation emitted when a caesium atom transitions between two defined energy states – simple, right?

New Glasses Let Wearer See Dead People – Charlie Delilkan

Do you remember one of those old icebreaker questions that asked which five people either living or dead you would take to dinner? Well it could be possible in a few years for this dinner to become a reality! Kind of…

 Turns out, Samsung have discovered a way to detect a new type of particle called fizons, which only exist in an alternate universe where deceased people go. It’s basically a purgatory – we still don’t know anything about a proper afterlife, unfortunately. However, they also predominantly constitute the “people” there so when we visualise these particles, we can see people who have passed on! Cool or creepy? You decide.

With Fizoptic glasses, we would be able to do this! However, we wouldn’t be able to interact with anyone, as they wouldn’t be able to see us since they’re still in another universe, remember. But wouldn’t it be fascinating to see what Stephen Hawking does in his spare time now that he’s left us? Or see whether Adolf Hitler is being consumed by his own guilt or if he’s at peace? Or if you’re a lonely widower, you may be able to put on Fizoptics and see that your deceased partner is actually sat right next to you, watching the same television show!

The glasses have a tiny transmitter in them which broadcasts a beam of extremely high frequency waves (out of reach for human detection). They are so small that they are picked up by fizons in the alternate dimension. When fizons detect these waves, they get highly excited and start to vibrate. After a few milliseconds, they relax again, which transmits energy back to the glasses to allow us to visualise the particles.

Think of it like that old Doctor Who episode (spoiler alert) where everyone was interacting with ghosts only to find out they were really Cybermen and then Rose died and the world ended. Minus the last part.

Now, I understand that there aren’t many advantages to having these glasses other than maybe curing some loneliness or quenching some curiosity, and they are still in their early stages of testing, so they may not even make it to the market. But perhaps, given time, the deceased people on the other side will create these glasses too, and maybe one day we’ll be able to communicate! The possibilities are endless. Besides, the other side has Stephen Hawking AND David Bowie, so it’s probably a lot better over there, anyway.

So, get your dinner tables ready, because in a few short years, your imaginary dinner parties may just become a reality.

**DISCLAIMER: This was an article written for April Fool’s Day, 2018. The above article was intended for entertainment purposes only and may include completely fabricated facts.**

The Science of The Flash – Naomi Brown

Barry Allen, aka The Flash, is a crime scene investigator who developed superhuman abilities allowing him to travel extraordinarily fast, dodge bullets and save the world. Whilst watching the first series, I wondered: could The Flash actually exist? Of course, we would have to ignore some physics-bending facts – such as that nothing can move faster than the speed of light – but let’s say for a minute that someone did have these powers. How feasible would it be? What forces would this superhuman be subjected to? What other powers would he need to have?

Firstly, it is worth considering how The Flash got his powers in the first place. In the original comic book, his powers were gained when he inhaled the fumes of ‘hard water’ – well, it doesn’t sound like the most likely story! However, the reason in the current TV show is a little more plausible: Barry Allen is working in a forensic science lab with a particle accelerator when lightning strikes.

Next, we need to consider what other traits The Flash would need to travel superfast. Firstly, travelling very fast requires superstrength, because a large force is needed to create a large acceleration (you might remember from physics class that force = mass × acceleration). Therefore, if The Flash were to throw a punch at you, it would be fatal!

The Flash travels at such speed that he would need some other upgrades to his anatomy. For example, his eyesight would need to be highly superior in order to see, and avoid, anything coming towards him. He would also require an adapted brain in order to process sensory information quickly enough to react at the same speed as his movement.

The Flash also has superhealing powers. It is essential that he has this ability, as he can accelerate and decelerate almost instantaneously from speeds of up to 200 mph. Without it, the huge forces involved would crush his bones and organs every time.
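Some rough numbers show why: assuming an “almost instantaneous” stop takes about a tenth of a second (an assumed figure), the deceleration is enormous.

```python
# Rough estimate of the deceleration from a near-instant stop (the stopping time is an assumption).
mph_to_ms = 0.44704
v = 200 * mph_to_ms    # ~89 m/s
stop_time = 0.1        # seconds, assumed "almost instantaneous" stop
g = 9.81

deceleration = v / stop_time
print(f"Deceleration: {deceleration:.0f} m/s^2 = {deceleration/g:.0f} g")  # ~890 m/s^2, roughly 90 times gravity
```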

Intriguingly, at superspeed The Flash would turn into a magnet! The friction between the ground and his feet whilst running would build up so much static charge that a magnetic field would form around his body. This field would mean anything magnetic in the surrounding area would be attracted to him; can you imagine the destruction that would cause?

 We also should consider the suit that the Flash wears: presumably spandex and/or latex. These materials would need to be very heat resistant, as the movement of the Flash’s limbs would generate lots of friction leading to blistering heat.

 Maybe the moral to this story is to sit back and enjoy the entertainment provided by these superhero stories. I imagine any comic book hero that was scientifically accurate would be very boring to watch!

 

 

The Northern Lights – Naomi Brown

At the beginning of November, residents of Scotland and Northern England were able to view a dazzling light show in the sky: the Northern Lights. But what causes them, and how can we predict when it will happen again?

The Northern Lights are a natural phenomenon in which brightly coloured lights are seen across the night sky in the form of sheets or bands. They are generally seen close to the magnetic poles, in an area called the ‘auroral zone’. The best time to spot the auroras is when the Earth’s magnetic pole is between the sun and the location of the person observing. This is called magnetic midnight.

The Northern Lights are caused by gaseous particles in the Earth’s atmosphere colliding with charged particles released from the sun’s atmosphere. The charged particles are carried towards Earth by solar winds, and most are deflected by the Earth’s magnetic field. At the poles, however, the field is weaker, allowing a few particles to enter the atmosphere. This is why auroras are more likely to be seen close to the magnetic poles, making Iceland and Northern Scandinavia common destinations for travellers searching for the Northern Lights.

The colours of the Northern Lights depend on the type of gas molecule involved in the collisions. Green is one of the most common colours seen and is caused by collisions with oxygen molecules, whereas blue or purple auroras are caused by nitrogen molecules.

Why can the Northern Lights sometimes be seen in places further from the Earth’s poles, such as the UK? The answer is the spread of the auroral oval due to a geomagnetic storm. Geomagnetic storms are more common after the maximum in the solar cycle, a repeating 11-year cycle. The most recent solar maximum was in 2013.

The Northern Lights are notoriously unpredictable. There are many forecast apps available, such as “My Aurora Forecast”. One of the best websites to check when the auroras will be visible from where you are is the Aurora Service (www.aurora-service.eu/aurora-forecast/). The site gives the Kp value predicted for the next hour, using solar activity data obtained from a NASA spacecraft, ACE, which orbits 1.5 million kilometres from Earth: the prime position from which to view the solar winds.

A common way to represent geomagnetic activity is the Kp index. Magnetic observatories located all over the world use instruments to measure the largest magnetic change every three hours. The recorded data from all these observatories is averaged to generate Kp values, which range from 0 to 9. The larger the value, the more active the Earth’s magnetic field is due to geomagnetic storms, and the further the auroral oval spreads. If the Kp value is above 4, it is storm-level geomagnetic activity. These Kp values are useful in predicting when auroras will be visible: to see the aurora from the UK, the Kp value would have to be at least 6.
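Those thresholds can be turned into a trivial check (a hypothetical helper, not something taken from the forecast apps or the Aurora Service site):

```python
# A trivial check using the Kp thresholds quoted above (hypothetical helper).
def aurora_outlook(kp):
    """Classify a forecast Kp value using the figures in the article."""
    storm = kp > 4               # storm-level geomagnetic activity
    visible_from_uk = kp >= 6    # rough threshold for a UK sighting
    return {"kp": kp, "storm_level": storm, "possible_from_uk": visible_from_uk}

print(aurora_outlook(3))   # quiet night
print(aurora_outlook(7))   # strong storm, worth looking north from the UK
```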

To get a great show, the conditions are important. Clear nights with no clouds are best. It is also worth checking the moon cycle: the brightness of a full moon drowns out the light of the aurora.

Sheffield’s Giant Battery


Kirsty Broughton

A major step towards greener energy in the UK was taken last month with the opening of an industrial-scale ‘mega-battery’ site owned by E.ON in Sheffield.

The Sheffield site, located in Blackburn Meadows, is being hailed as the first of its kind in the UK. It can take in or release power at up to 10MW – storing the equivalent of half a million phone batteries – and is contained in four 40-foot-long shipping containers. The batteries are from the next generation of battery energy storage and can respond in less than a second to changes in energy output – ten times faster than previous models.

Such promising technology has naturally led to further investment, and the Sheffield site will soon be dwarfed by significantly larger plants. Centrica (the owner of British Gas) and EDF Energy are both in the process of creating 49MW facilities, in Cumbria and Nottinghamshire respectively.

When more energy is being put out into the national grid than is being used by consumers, the batteries will take in the excess power and store it. Then, during periods when consumers are using more energy than the grid can provide, the batteries can release this excess energy into the grid, to ensure that everyone has access to power.
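That balancing logic boils down to a very simple rule. Here is a minimal sketch; the 10MW rating comes from the article, while the storage capacity and all names are illustrative assumptions.

```python
# A minimal sketch of the balancing rule described above (all names and figures are illustrative).
CAPACITY_MWH = 5.0     # assumed storage capacity; the article only quotes the 10 MW power rating
MAX_POWER_MW = 10.0    # maximum charge/discharge rate, from the article

def balance(generation_mw, demand_mw, stored_mwh, hours=0.5):
    """Charge when the grid has surplus power, discharge when it is short."""
    surplus = generation_mw - demand_mw
    power = max(-MAX_POWER_MW, min(MAX_POWER_MW, surplus))   # clip to the battery's rating
    stored_mwh += power * hours
    return max(0.0, min(CAPACITY_MWH, stored_mwh))           # can't go below empty or above full

store = balance(generation_mw=105, demand_mw=100, stored_mwh=1.0)   # surplus: battery charges
store = balance(generation_mw=95, demand_mw=100, stored_mwh=store)  # shortfall: battery discharges
```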

This is especially important considering that the UK energy mix contains an ever-increasing proportion of intermittent sources, such as wind and solar power. June this year saw 70% of the electricity produced from nuclear, wind and solar sources. For the government to hit its legally binding carbon-cutting targets this needs to become the standard for electricity production, but storage is likely to be necessary to balance the intermittency of renewable supplies.

To meet these targets the government introduced a ‘capacity market’ – a subsidy scheme integral to the shake-up of the electricity market. It is designed to ensure energy security, particularly during times of high demand such as the winter months. The scheme has a pot of £65.9 million, which it divides between energy suppliers that can guarantee a constant energy supply. It may sound surprising that, in the age of austerity, a government so interested in penny-pinching is willing to hand out money. However, it is estimated that the Sheffield site alone could save £200 million over the next four years by increasing energy efficiency, which certainly makes the £3.89 million awarded to E.ON a worthy investment.

E.ON has seen its share price in Germany fall dramatically as it is undercut by abundant, cheaper renewable energy from other suppliers. Germany is often hailed as a world leader in renewable energy production, and during one weekend in May this year 85% of its energy production came from renewable sources. E.ON in the UK was following the same path: in recent years UK profits have stagnated and trade has fallen by up to 9%. It was only in March this year that profits began to pick up again, thanks to the company shifting away from fossil fuels and towards green energy production. The battery site in Sheffield is an excellent next step in this major shift.

Black Holes and Gravitational Waves


Alexander Marks

On 14th August 2017, the fourth gravitational-wave signal was detected. Although the first detection was announced in early 2016 by LIGO (the Laser Interferometer Gravitational-Wave Observatory), this time three different observatories detected the gravitational waves. The waves were caused by a pair of black holes violently merging together.

Three scientists behind LIGO – Rainer Weiss, Kip Thorne and Barry Barish – have just been awarded the Nobel Prize in Physics for the first detection of gravitational waves, and it was these three who designed and ran the two LIGO observatories, situated in Washington and Louisiana. In the most recent detection, a new observatory in Italy called Virgo also measured the same waves.

Why are three detectors better than two? Three detectors allow scientists to pinpoint the origin of a signal around 20 times more precisely than two, which is key for follow-up observations. They also provide more information about the objects that made the waves, such as the angle they are tilted at compared to Earth.

Gravitational waves were first predicted by Einstein’s theory of relativity back in 1916. This theory was groundbreaking, combining space and time into the space-time continuum. It states that any object with mass warps the space-time continuum – the more massive the object, the bigger the warp – and it is these warps in space-time that cause gravity.

The famous equation of general relativity is incredibly hard to solve, and finding solutions requires supercomputers. One class of solutions predicts gravitational waves.
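For reference, the “famous equation” is really a set of ten coupled equations, usually written compactly as

\[ G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu} \]

where the left-hand side describes how space-time is curved and the right-hand side describes the matter and energy doing the curving.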

Gravitational waves are caused by all objects as they move through the space-time continuum. Every object makes gravitational waves, meaning that even a tiny snail moving through the grass produces them. They ripple through space-time much like the ripples caused by throwing a stone into a still pond. Gravitational waves were the last part of Einstein’s theory to be proven.

The equations predict that gravitational waves travel at the speed of light and carry information about the objects causing them. But most gravitational waves are far too weak to be measured; it takes an extremely massive object to create ripples in space-time large enough to detect.

Enter black holes and neutron stars. Black holes are among the most massive and most compact objects in the known universe. Their gravity is so strong that not even light can escape. When two black holes orbit each other very quickly and eventually merge, they create immense distortions in space-time that can be measured on Earth.

By measuring the gravitational waves and using Einstein’s theory of relativity, scientists can learn a lot about the darkest parts of our universe. They can work out the masses and rotation of the objects involved, and how powerful the event was.

Neutron stars are the remains of stars that have collapsed in on themselves; they are also very massive, so gravitational waves from them could theoretically be detected as well. So far there has been no such detection, but there is promise that these will soon be found too.

Even the largest ripples in space-time are very difficult to measure, and LIGO and Virgo are carefully designed to detect them. Each observatory is shaped like an L, with each arm of the L being a long, vacuum-sealed tunnel. At the end of each tunnel there is a mirror, and where the two arms meet there is a beam splitter, which splits laser light in two and sends it in different directions.

Laser beams are sent down both tunnels at the same time; without gravitational waves present, both return at the same time. When a gravitational wave passes, space-time is warped in such a way that one mirror gets slightly closer and the other slightly further away. This results in the beams returning at fractionally different times, allowing scientists to measure how much the arms were stretched and squeezed. The change being measured is tiny – around 1,000 times smaller than the width of a proton.

This means the bigger the gravitational waves, the larger the time gap between the lasers returning. As the gaps are so small, only very massive objects can produce waves big enough to be detected.
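To get a feel for the numbers (taking LIGO’s 4 km arms and a typical published strain of around 10⁻²¹ for a black hole merger):

```python
# Rough size of the length change LIGO looks for.
arm_length = 4_000.0        # metres, each LIGO arm
strain = 1e-21              # typical strain of a detectable black hole merger
proton_diameter = 1.7e-15   # metres, approximate

delta_L = strain * arm_length
print(f"Arm length change: {delta_L:.1e} m "
      f"(~{proton_diameter/delta_L:.0f} times smaller than a proton)")
# ~4e-18 m for this strain; weaker signals push the ratio well past 1,000
```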

 

The black holes that created the most recently detected gravitational waves had masses of 25 and 31 times the mass of our sun. They were orbiting each other 1.8 billion light years away and merged into a single black hole of 53 times the mass of our sun, with the missing three solar masses or so radiated away as gravitational waves. The resulting black hole is bigger than such stellar remnants were ever expected to be.
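A quick E = mc² estimate puts the energy carried away by those three solar masses at around 5×10⁴⁷ joules:

```python
# Energy radiated in the merger, estimated from the quoted masses via E = m c^2.
M_SUN = 1.989e30   # kg
c = 2.998e8        # m/s

mass_lost = (25 + 31 - 53) * M_SUN    # ~3 solar masses converted to gravitational waves
energy = mass_lost * c**2
print(f"Energy radiated: {energy:.1e} J")   # ~5e47 J
```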

 

This is the third detected black hole to be bigger than expected. Black holes of this size appear to be more common than originally thought, and the rate at which such mergers occur should soon be pinned down.

 

The observatories are currently being upgraded and will become even more sensitive. Scientists hope that when they are turned back on in autumn 2018 they will detect up to ten of these events each year. There is also hope of detecting gravitational waves from neutron stars.

 

With further observatories planned in Japan and India, we can expect to find new phenomena in the universe that might once have been thought impossible.

One ticket for the Enterprise please! Has China successfully created a sustainable EM Drive?

 

Shannon Greaves

Space is awesome. So awesome that it has had the global powers stuck in a space race since even before America took that one giant leap for mankind. Everyone is eager to explore new planets and solar systems, and to travel faster than light (FTL) with their very own warp-drive Enterprise. Well, to all us astronauts at heart, that day may be coming sooner rather than later, now that China has released a video claiming not only to have a working EM Drive, but also to have one already in space on their space laboratory ‘Tiangong-2’! Prior to this news, China was reported to have only been studying the EM Drive, with no reports of successful functionality. Both the UK and NASA have also been working on EM Drives, with a mixture of breakthroughs and problems. But before we get into the thick of things, let’s have a quick review of some important information about the EM Drive.

The electromagnetic drive (EM Drive), known scientifically as a radio frequency resonant cavity thruster, bounces microwaves around inside an asymmetric, cone-shaped cavity, which supposedly produces thrust and builds up momentum – a bit like being able to push on the inside of a box and have the box accelerate. Put simply, an EM Drive would create thrust without the need for a propellant. Sadly, what an EM Drive isn’t is a warp drive as seen in Star Trek. Unlike the EM Drive, a warp drive would enable FTL travel by warping the fabric of space and time around a ship, effectively shortening the distance it has to travel.

Still, a working EM Drive would mean a whole bunch of good things for us, including a much faster way of travelling through space (just maybe not at FTL level). A fully functional EM Drive would remove the need for heavy propellants such as rocket fuel on board, and could cut a trip to Mars to between 70 and 72 days, compared with the average of 270 days it takes today. What’s even more impressive is that, according to NASA, with an EM Drive it would take only 92 years to travel to our nearest solar system! In addition to faster space travel, the EM Drive could mean cheaper space travel, solar power stations with cheap solar-harvesting satellites beaming power back to earth, and generally a greener, more convenient energy source for travel.
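As a quick sanity check of that 92-year figure (assuming “our nearest solar system” means Alpha Centauri at roughly 4.4 light years), the implied average speed is a few per cent of the speed of light:

```python
# Implied average speed for the quoted 92-year trip (assuming Alpha Centauri at ~4.4 light years).
LIGHT_YEAR_KM = 9.461e12
distance_km = 4.4 * LIGHT_YEAR_KM
seconds = 92 * 365.25 * 24 * 3600

speed_km_s = distance_km / seconds
print(f"~{speed_km_s:.0f} km/s, about {100*speed_km_s/299_792:.0f}% of the speed of light")
```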

So, what are we waiting for!? Well, before you go and buy your space suit and a ticket to China, there is a lot of discussion about whether China’s claims and experiments with the EM Drive are true. So far, all China has given us is a press conference announcement and a story in a government-sponsored Chinese newspaper (and China doesn’t have the best track record for trustworthy research). In the press conference they also admitted that further experiments are needed to increase the amount of thrust being produced. What we need is a peer-reviewed paper, which would not only provide proper evidence for their results with the EM Drive, but also confirm the reliability of their claim to have tested it in space. China’s claims do have some support, however, as they report producing results similar to those of NASA’s EM Drive experiments. NASA has been working equally hard on the EM Drive, has produced several models that appear to generate thrust, and has finally managed to publish a peer-reviewed paper reporting an EM Drive producing small amounts of thrust inside a vacuum. This gives a little backup to China’s claim of an EM Drive in space.

Arguably the EM Drive’s biggest potential contribution to science is also its biggest problem, and the reason many experts contest it: a drive that requires no propellant violates Newton’s third law of motion, “for every action there is an equal and opposite reaction”. So, while a working EM Drive would change the basis of how we understand physics, it also means that no one can currently explain how it works – and without such an explanation, the consensus is that we can’t possibly use and sustain the EM Drive.

So, what happens now? Well, we are going to have to wait and see if China releases that peer-reviewed paper, but even without it a lot of progress has been made towards this kind of space travel. The combined efforts of China, NASA and other national institutions have brought the EM Drive closer out of the theoretical and into the possible. There have even been theories proposed to explain how the EM Drive might work, with “quantised inertia” suggested as being responsible for creating the thrust. If true, this would mean that the EM Drive would not completely violate the conservation of momentum, but adapt it. If you’re interested in the application of quantised inertia to the EM Drive, look up the work of Dr Mike McCulloch. Furthermore, for those of you wanting that FTL warp drive, there is some hope! NASA engineers have reported on forums that when they fired lasers into the EM Drive’s resonance chamber, some of the beams appeared to travel faster than the speed of light, suggesting the EM Drive may be able to produce the “warp bubbles” needed for a warp drive. NASA has even been designing a concept warp-drive ship, if you want to check that out too! Now, I’m off to watch some Star Trek, but keep an eye out for the announcement of a reality TV version!

Moore’s Law: Will it stop?


Harpreet Thandi

In 1965, Gordon E. Moore, an American electrical engineer, wrote an article in Electronics magazine. It suggested that the number of transistors that could be fitted onto a chip would double every year; his prediction was later revised to a doubling of processing power roughly every two years, and is now known as Moore’s Law. He went on to co-found Intel, one of the biggest makers of the microprocessors that determine the speed of our laptops and PCs.
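To see what a two-year doubling actually means, here is a short illustrative projection; the doubling rule is the article’s, and the 1971 starting point is the Intel 4004, the first commercial microprocessor, with about 2,300 transistors.

```python
# Illustrative Moore's Law projection: transistor count doubling every two years.
start_year, start_count = 1971, 2_300   # Intel 4004, ~2,300 transistors

for year in (1981, 1991, 2001, 2011, 2021):
    doublings = (year - start_year) / 2
    print(year, f"{start_count * 2**doublings:,.0f}")
# By 2021 this naive doubling predicts roughly 80 billion transistors,
# the same order of magnitude as today's largest chips.
```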


This law has wider implications than simple processing power. Devices have become smaller and smaller: we went from large mainframes to smartphones and embedded processors. Shrinking chips ever further has, however, made manufacturing them an increasingly expensive process.

In the larger scheme of things, this two-year cycle has become the underlying model for technology. It has given us better phones, more lifelike computer games and quicker computers that we use every day. Maybe this came from goal-setting – we must double processing power every two years – or maybe it was just natural progression. Either way, Brian Krzanich, chief executive of Intel, has suggested this growth could be coming to an end, although he still backs the target: “we’ll always strive to get back to two years”. The firm disputes that Moore’s Law is dead, even though future processors won’t arrive so quickly. Technology users may start to notice that their new phone or laptop is only slightly better than the previous model. There is real pressure for Moore’s Law to be met again, as this pace of development delivers more effective processors and saves money through efficiency.

To keep up with Moore’s Law there have been some major compromises, and now we are at a crossroads: microprocessors are getting smaller and smaller, but they are reaching a fundamental limit set by their size. Below a certain transistor size, quantum effects start to take over. “The number of transistors you can get into one space is limited, because you’re getting down to things that are approaching the size of an atom.”

A problem that started in the early 2000s is overheating. As devices have shrunk, the electrons are more confined and resistance in the circuits goes up dramatically, creating the heating problems seen in phones and laptops. To counteract this, ‘clock rates’ – the speed at which microprocessors run – have barely increased since 2004. The second issue is that we are reaching the size limit of a single chip. The solution is to have multiple processors instead of one, which means rewriting various programs and software to accommodate the change. As components get smaller, they must also become much more robust.

Four and eight cores are now standard in our laptops. “You can have the same output with four cores going at 250 megahertz as one going at 1 gigahertz,” said Paolo Gargini, chair of the industry’s road-mapping organisation. This lowers the clock speed of the processors, tackling both problems at once. There are more innovations being explored, but many of them are simply too expensive to be practical.

According to the International Technology Roadmap for Semiconductors (ITRS), which has been predicting the future of computing since 1993, transistors will stop getting smaller by 2021. After the hype around graphene and carbon nanotubes in 2011, the ITRS suggested it would take 10–15 years before these could be combined with logic devices and chips; germanium and III–V semiconductors are 5–10 years away. The new reality is that transistors will stop shrinking and the industry will move away from Moore’s Law.

Intel is struggling to make new breakthroughs, and risks falling off the two-year doubling target if these problems are not resolved. There will also be strong competition: IBM has started challenging Intel with a processor built on a seven-nanometre process, carrying 20 billion transistors and roughly four times today’s power, expected to be available in 2017. “It’s a bit like oil exploration: we’ve had all the stuff that’s easy to get at, and now it’s getting harder, … we may find that there’s some discovery in the next decade that takes us in a completely different direction,” said Andrew Herbert, who is leading a reconstruction of early British computers at the National Museum of Computing.

There is also a possible future in quantum computing. This works with qubits – quantum bits that, thanks to the nature of quantum mechanics, can exist in a combination of the states 0 and 1 at the same time. A quantum computer could therefore work on multiple possibilities at once and come up with answers in days to problems that would take a traditional machine millions of years.
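The reason a quantum machine could explore so many possibilities at once is that a register of n qubits can hold a superposition over 2^n states, a number that grows absurdly quickly:

```python
# Number of basis states an n-qubit register can hold in superposition.
for n in (1, 10, 50, 300):
    print(n, "qubits ->", 2**n, "states")
# 300 qubits already gives more states than there are atoms in the observable universe.
```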

In May 2015, Moore spoke in San Francisco at an event celebrating the 50th anniversary of his article. He said: “the original prediction was to look at 10 years… The fact that something similar is going on for 50 years is truly amazing… someday it has to stop. No exponential like this goes on forever.” At the time of the original article, it was far from certain that the number of transistors in a computer chip really would keep doubling. It has continued for far longer than expected and is now a fixture of popular culture: Moore’s Law has become the underlying standard for the future that society has lived up to, and has driven itself to meet.