Space Travel & Exploration in Popular Culture

Nicholas Folidis explores space travel in popular culture: yesterday, today, and tomorrow

Space travel and exploration is a subject that never ceases to amaze, and that fascination is apparent in popular culture. The birth of the Space Age, following the launch of the Soviet satellite Sputnik 1, kickstarted the science fiction industry and subsequently led to the production of hit space-related movies and TV series.

Neil Armstrong shook the world by taking a “giant leap for mankind”, setting foot on the Moon for the first time on July 20, 1969. Yet it was visionary filmmakers and directors who managed to travel farther into space.

Director Stanley Kubrick’s profound and futuristic film ‘2001: A Space Odyssey’ is a prelude to the Apollo 11 mission. It is a film about space travel and the discovery of extra-terrestrial intelligence that predicted many of the technologies currently in use. In the movie, Kubrick foresees the Moon landing and the creation of a space station in constant orbit around the Earth.

One could even say that, in a way, he envisioned NASA’s New Frontiers Program, a series of space exploration missions within the Solar System. In Kubrick’s world, the nuclear-powered spacecraft Discovery One (XD-1) is sent on a mission to Jupiter, crewed by five astronauts and an intelligent AI computer, HAL 9000.

In ‘Star Wars’, smuggler Han Solo and his co-pilot Chewbacca travel and fight through space on their Corellian light freighter, the Millennium Falcon. Similarly, in ‘Star Trek’, space explorer Captain Kirk and his crew go on interstellar adventures, travelling at faster-than-light speeds aboard the starship USS Enterprise to places “where no man has gone before”.

Even before all that, in the BBC’s perennial show ‘Doctor Who’, a renegade Time Lord from the planet Gallifrey, known simply as “The Doctor”, travels through time and space in the TARDIS (Time And Relative Dimension In Space) to defend the Universe together with his companions.

In more recent years, following the birth of Elon Musk’s SpaceX, new interest has been sparked in space travel and exploration. Musk’s audacious plan involves sending the first humans to Mars as early as 2024, with the intention of colonising and terraforming Earth’s neighbouring planet, i.e. engineering its environment by deliberately modifying its climate and surface to make the planet hospitable to humans.

Christopher Nolan depicts a similar idea in his film ‘Interstellar’, where a team of astronauts and scientists, along with the robots CASE and TARS, embark as part of NASA’s Project Endurance on a voyage through a wormhole (the passage to a distant galaxy near the black hole Gargantua) in order to identify a planet that can sustain human life and ultimately establish a colony there, ensuring humanity’s survival.

Nevertheless, it was Ridley Scott’s ‘The Martian’ that took NASA’s Mars Exploration Program one step further by imagining a series of manned exploratory missions to Mars. In the movie, marooned astronaut Mark Watney, of the Ares III mission to Mars, has to survive on the inhospitable red planet, relying solely on his intelligence and creativity to signal to Earth that he is alive.

When it comes to space travel and space exploration, popular culture not only entertains but also inspires, at times when inspiration was needed. It has reflected, and keeps reflecting, the ever-growing public interest in space, which motivates and drives the politics behind space exploration.

That same interest has stimulated the imagination of the scientists and engineers who made spaceflight possible and who continue to advance aerospace science and technology at a pace that could even turn Elon Musk’s vision of “making humans a multi-planetary species” into a reality within our lifetimes. One thing is certain: the future of space travel and exploration is an exciting one!

Diagnosing Edward Cullen

Haneef Akram explores the biomedical mechanism
underlying vampirism


The Twilight movies offer exposure to the supernatural world of vampires and werewolves. The idea of vampires has been around for centuries across many cultures, particularly in the Balkans and Eastern Europe. Vampire superstition in Europe led to mass hysteria in the 18th century, which resulted in corpses being staked and people being falsely accused of vampirism. However, was this superstition the result of misunderstanding a debilitating genetic disease? And if so, was Edward Cullen a sufferer?

The disease in question is called congenital erythropoietic porphyria, also known as Gunther’s disease. It is caused by a genetic mutation that leads to a faulty enzyme. The defective enzyme cannot efficiently catalyse its step in haem synthesis, leading to the accumulation of toxic metabolites known as porphyrins.

This causes symptoms that are characteristically vampire-like, and could perhaps explain the hysteria surrounding vampires in 18th-century Europe. One of the symptoms of Gunther’s disease is photosensitivity. While Edward’s skin sparkles upon exposure to sunlight, sufferers of Gunther’s disease exhibit scarring on exposed skin, such as on the face and hands.

This is because porphyrins absorb UV light, which causes damage to the skin. In the long term, structures such as the eyes, ears and fingers undergo progressive mutilation. Could this explain why vampires tend to avoid sunlight, and prefer the cover of the dark?

Edward, like other vampires, has an ill-looking, greyish complexion. This could be explained by the damage porphyrins cause to the haemoglobin in our red blood cells—the reddish pigment responsible for our skin’s vibrant complexion. In combination with a rational fear of sunlight, perhaps
this is the reason for a vampire’s iconic lifeless appearance.

“Gunther’s disease sufferers are known to be hypersensitive to strong
smells, which could explain why vampires are warded off by garlic.”

Those with Gunther’s disease often display red or reddish-brown teeth. This could give the impression that the ‘vampires’ drink blood, when in reality it is due to the build-up of red-coloured porphyrins. Despite this, Edward does, in fact, need to consume blood to sustain himself. The movies show how his eyes darken, and bruises appear under his eyes, after long periods without feeding.

These symptoms are similar to, but significantly milder than, the ulcers and corneal scarring that sufferers of Gunther’s disease experience from exposure to sunlight. Furthermore, as is typical of most vampires, Edward has a heightened sense of smell. Gunther’s disease sufferers are known to be hypersensitive to strong smells, which could explain why vampires are warded off by garlic.

On the other hand, Edward’s superhuman strength calls into question whether he was truly a sufferer of Gunther’s disease. His ability to crush cars, with muscular tissue equivalent in strength to granite, would be severely compromised if he were a sufferer.

Gunther’s disease causes osteoporosis, a condition which weakens the bones over time and can cause severe fatigue even from minimal physical exertion. The Twilight movies featured Edward in some pretty dramatic fight scenes, which is completely atypical of Gunther’s disease sufferers.

Nervous manifestations are also common in sufferers of Gunther’s disease. These range from mild hysteria to manic-depressive psychoses and delirium, and were a major factor in instilling fear in the superstitious communities of Europe. Edward, however, is known to be quick-witted, and has maintained a relatively stable mental state throughout the Twilight movies, despite the fighting he has endured for love.

It seems that Edward does display some of the symptoms of Gunther’s disease, such as the pale complexion, heightened sense of smell and sensitivity to light. He does not, however, exhibit the more debilitating symptoms such as fatigue, disfigured body parts and osteoporosis.

It could be proposed that he suffers from a milder form of the disease, along with inherited supernatural capabilities. It would also be fair to say that perhaps he suffers from vitamin D deficiency due to inadequate exposure to sunlight. The confirmation of this diagnosis would, however, require him to provide a blood sample—something Edward would be unlikely to agree to.

 

How slow can you go?

Federico Abatecola unravels the mystery behind the
slo-mo phenomenon

The slow-motion effect is a film technique frequently used in modern cinema. It is based on the concept that, by increasing the frame rate at which a film is recorded while maintaining the same playback speed, the viewer will perceive time to slow down. The technique lends itself to a variety of uses, from comic effect in the early years of cinema to Scorsese’s alienating scenes, in which slow motion serves to detach a character’s view from the world surrounding them.
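As a rough illustration of the arithmetic involved (the frame rates here are assumed for the example, not taken from any particular film): footage captured at 120 frames per second but played back at the cinema standard of 24 frames per second unfolds at

\[ \frac{24\ \text{fps (playback)}}{120\ \text{fps (capture)}} = \frac{1}{5} \]

of its real speed, so the action appears five times slower.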

A prime example is Taxi Driver, where De Niro’s alienation and inability to reintegrate into society are visually represented in the opening credits. In Raging Bull, the slow-motion effect serves two purposes: combined with point-of-view (POV) shots, it helps to communicate a character’s altered emotional state (sometimes paranoia or a heightened state of awareness), while in the boxing scenes it highlights, and immerses the viewer in, the sport’s violence and intensity.

The latter is probably the most common use of the technique nowadays and, since the release of The Matrix, slow-motion has become extremely popular in action films to show impactful, yet rapid, moments in greater detail. However, can these ‘slowed down moments’ be experienced in real life?

Reports of slow-motion-like experiences are actually quite common.

“While in films the viewer is given more frames to observe and process, in real life there can be situations in which the brain, driven by the danger of death, processes more information than it normally would, creating a similar effect.”

Noyes and Kletti (1976, 1977) concluded in two separate studies that 75% and 72% of participants, respectively, experienced external time slowing down during an accident. These situations were characterised by two key factors: the element of surprise and the threat of imminent danger. In addition to an altered perception of external time, the participants’ experiences were accompanied by increased mental sharpness and clarity.

These abilities, however, only emerge in life-threatening scenarios because, even though they can be vital for survival, they are highly energy-consuming. From a neuroscientific point of view, studies have revealed that this enhancement of cognitive processes originates in the locus coeruleus, where noradrenaline is synthesised, making us faster and more attentive. The anomalous influx of new information therefore leads to an anomalous temporal experience.

There are also rarer cases in which time slows down or freezes in unthreatening situations.

Injury to the V5 and V1 regions (two of the over thirty specialised processing areas of the visual cortex) can lead to Akinetopsia: a disorder which causes patients to see objects but not their motion. This fascinating disorder is
extremely rare, as such an injury would most likely interfere with more than
one visual function.

In conclusion, the brain’s view of the world is similar to films in some respects. It has been demonstrated that the brain does not observe the world continuously but rather in rapid snapshots, like frames in a film. Slow-motion perception can be considered a circumstance in which our brains input a higher number of snapshots.

Moreover, it is possible to see a loose similarity between these real-life
situations and the use of slow-motion in some films. Its use in Raging Bull
and hundreds of other motion pictures depicting violence, shootings, car
crashes and other terribly dangerous scenes, might portray what we would
actually experience in real life.

Genetic Determinism in GATTACA

Jayde Martin challenges the deceptive depiction of genetic determinism in GATTACA

GATTACA portrays a dystopian society governed by scientific truths; however, its genetic science is largely fictive speculation. The only truth one can find in the film is that G, A, T and C are the letters used for the bases that make up DNA. In the world where Vincent (our protagonist, conceived without the influence of genetic engineering) is raised, all genetically inherited conditions can be determined with a simple heel prick test at birth.

This test, also known as the Guthrie test (after its pioneer, the American microbiologist Robert Guthrie), has a real-life medical basis; it is used to detect inherited diseases such as phenylketonuria (PKU), MCADD, congenital hypothyroidism, sickle cell disorder (SCD) and cystic fibrosis (CF).

However, GATTACA’s rendition of the Guthrie test is manipulated into a catch-all test, capable of detecting every probable (and somewhat vaguely defined) genetic disorder and disease; for Vincent, neurological conditions, manic depression and congenital conditions apparently occur at likelihoods of 60%, 80% and 99% respectively.

In reality, congenital and neurological conditions are far too vague to test for. Additionally, the likelihood percentages given (especially that of 99%) are ridiculously misleading; this becomes obvious when looking at the ways in which geneticists study inheritance. Gene expression is extremely dependent on environmental factors.

The politics of understanding the causation of some conditions, such as manic depression, means that physical susceptibility to the illness cannot be the sole consideration. A person who develops a genetically determined condition has both a genetic predisposition, through the defective alleles they possess, and exposure to an environment that caters to its genetic expression.

An array of tests accounting for these multiple factors would be required for this speculative genetic science to bear any remote resemblance to actual post-natal diagnostic procedures.

The fundamental construct of the society within GATTACA relies heavily
on what two science philosophers, Resnik and Vorhaus, have debunked as
falsely assumed “strong genetic determinism”. This is the belief that
genetic science can predict the expression of genes within a person to
95% or above.

When Vincent’s parents create their second child through the intervention of their local geneticist, the geneticist describes the “removal” of any probabilities of obesity and other apparent genetic conditions the child is likely to have. This assumes that such probabilities are high enough to warrant the interference of an engineer, something that genetic science cannot currently do and, given the nature of genetic inheritance, most likely never will be able to predict.

“…critical analysis of the genetic science in GATTACA should rule it out as irresponsibly fictitious. It communicates bad science to a wider audience, one that may not be specifically trained in genetics to fully understand the potential of post-natal gene therapy.”

On the basis that it is science fiction, should we let this representation of genetic science slide? I would argue that no, we should not. Unlike string theory in Rick and Morty or time travel in Doctor Who, genetic science is used as a therapy to help disabled people. It touches the very life experiences and identities of individuals, and the film possibly creates misconceptions around gene therapy treatments that could ease serious illnesses.

Therefore, critical analysis of the genetic science in GATTACA should rule it out as irresponsibly fictitious. It communicates bad science to a wider audience, one that may not be specifically trained in genetics to fully understand the potential of post-natal gene therapy.

GATTACA does, however, pick up on the eugenic ethics currently underpinning the practice of genetic counselling for those carrying affected alleles and the selective abortion of potentially disabled children.

These issues have given rise to what are termed “designer babies”. GATTACA is a good example of the worst-case scenario if such an “art” of discrimination were made a science, although its portrayal of medical diagnostics has questionable real-life implications for how medical treatments for genetically disabled people are represented.

 

The Blue Planet II Effect

Anna Pitts rides along the waves of impact following David Attenborough’s latest documentary series

In recent years, there has been a resurgence in the popularity of TV programmes focussing on conservation and the natural world; one of the most noteworthy of these shows in 2017 was Blue Planet II, which attracted an average audience of 10.9 million per episode. However, arguably the greatest success of Blue Planet II is not the stunning cinematography, nor the high viewing numbers, but instead the impact of the show on its audience’s consciousness. Presented by the legendary Sir David Attenborough, the show has perhaps unsurprisingly achieved great success among viewers.

Blue Planet II took viewers on a breath-taking journey to explore the world’s oceans, weaving scientific understanding with storytelling to engage new audiences and defy the preconception that you need to be knowledgeable about science to have an interest in conservation. The show goes one step further than other wildlife documentaries to tackle political issues head-on and explicitly challenge perspectives relating to conservation and the impact that humans, as a global society, have on the planet.

“Blue Planet II took viewers on a breath-taking journey to explore the world’s
oceans, weaving scientific understanding with storytelling to engage new audiences and defy the preconception that you need to be knowledgeable about science to have an interest in conservation.”

One of the stand-out topics that the Blue Planet II team focused on was the impact of plastic waste on ocean wildlife. The significance of plastic pollution in oceanic environments cannot be overlooked: an estimated 8 million tonnes of plastic waste enters the ocean every year, and scientists have predicted that by 2050 there will be more plastic in the ocean than fish. The creators of the show highlighted the issue in various ways, from seabirds mistaking tiny pieces of plastic for food to the shocking prevalence of microplastics in marine organisms.

Microplastics include polymers such as polyethylene and polyester, which are present in plastic shopping bags and clothing, and, as demonstrated in Blue Planet II, they can prove fatal for marine wildlife. Attenborough ended the final episode of the series with this poignant message: “Never before have we had such an awareness of what we are doing to the planet, and never before have we had the power to do something about that. Surely, we all have a responsibility to care for our blue planet. The future of humanity and indeed all life on earth now depends on us.”

By making scientific shows about the importance of conservation part of mainstream viewing, TV companies have already begun to have an impact on various areas of society, particularly in tackling plastic waste. In recent months, more businesses have been considering their environmental impact and reflecting their clientele’s growing interest in reducing plastic waste. This can be seen in major coffee chains giving discounted prices to customers using reusable cups, cafes offering free refills of water bottles, pubs using biodegradable straw alternatives, cosmetic brands creating packaging-free products and even Iceland aiming to be the first major supermarket chain to be plastic-free by 2023. Moreover, the issue of damaging microplastics has received further recognition in politics courtesy of Blue Planet II.

The current UK Government has been discussing the show in debates, pledging to ban the manufacture of microbeads and proposing plastic-free supermarket aisles as part of a 25-year environmental plan.

Further evidence of the influence of Blue Planet II can be seen in the show winning the ‘Impact’ Trophy at the National Television Awards. Yet even if TV shows like Blue Planet II simply promote discussion, making audiences think more deeply about their impact on the environment and thus putting pressure on businesses and government to do the same, their importance in driving conservation forward cannot be dismissed.

Blood Flowers

Chyi Chung traces the roots of cut flowers unearthed in Ton van Zantvoort’s documentary film, A Blooming Business

Lake Naivasha sits atop the Great Kenyan Rift Valley. It is unusual in having freshwater (the only other freshwater lake in the valley is Lake Baringo to the north) and in its elevation of 1,884 metres, the highest point in the valley. The sun beats down, reflecting off clusters of greenhouses surrounding the lake. Within them are trimmed gardens where flowers bloom to perfection, like clockwork.

Lilies. Roses. Carnations. Bound for Europe by sunset on the very same day they are cut. Arrays of attractive bouquets spill off wire racks in supermarkets, beckoning to be bought and admired. Lilies. Roses. Carnations. Spoilt for choice year-round, in a world seemingly far away from Lake Naivasha…

Agriculture contributes a quarter of Kenya’s GDP (Gross Domestic Product). Hence, it comes as no surprise that the country has the third largest cut flower industry in the world, with a global trading value of £502 million in 2016. Kenya produces more than 1 in 3 cut flowers sold in the EU [1], most of which are flown 6,000 km from Lake Naivasha, where the country’s flower farms are concentrated. Its higher altitude gives Lake Naivasha a cooler climate, particularly suited to growing roses, yet one still warm enough not to require excessive heating; and its proximity to the capital, Nairobi, is vital for connectivity to Europe and beyond. In 2009, Ton van Zantvoort released A Blooming Business, a harrowing documentary on the industry based on the anecdotes of its (former) labourers.

“…half a million livelihoods are entwined with the cut flowers of Lake Naivasha…”

The Kenya Flower Council (2013) estimates that half a million livelihoods are entwined with the cut flowers of Lake Naivasha; 20% of them are low-wage flower farm labourers. The usual sexist dynamics pervade: around two-thirds of labourers are women, yet few reach senior managerial levels. They are subject to long working hours (up to 15 hours during the peak Valentine’s season), sexual harassment and frequent direct exposure to toxic pesticides; the latter can potentially lead to seizures, blindness and infertility. As with the golden Californian fruit orchards in The Grapes of Wrath, most labourers are migrants lured by the promise of a better living. Instead, they find themselves trapped in a cycle of low wages and temporary contracts that allow for quick dismissal. “You can’t even go back home…because the salary is too low, you just end up staying here”, said Agnes, who swore never to return to a flower farm even in the face of unemployment, after sustaining chemical injuries and losing her job as a consequence.

Interestingly, a flower is 90% water. To quench the thirst of flowers growing in greenhouses, 20 million litres of water [2] are syphoned from the lake daily, at a greater rate than can be replenished. Worse still, the water returned to the lake is contaminated by chemical fertilisers and pesticides, compounded by direct leaching from poor or non-existent irrigation in the greenhouses. Today’s population of 300,000 in the surrounding area is ten times that listed in the 1969 census. Lack of investment in infrastructure means poor housing with no access to clean running water (as that available from the lake is polluted), and native Acacia and Euphorbia trees being felled rapidly for firewood and to clear land for agriculture and poultry farming.

The Maasai people, indigenous to the Great Rift Valley, increasingly find
their pastoral lifestyle stifled by dwindling resources and land around
the lake, which has fallen by 75% since the 1970s. Local fishermen lament the
loss of their catch due to poisonous waters and overfishing to feed the
growing population. ‘A Blooming Business’ portrays Lake Naivasha as an unsustainable development, fuelled by corruption and exploitation.

Nine years on from its release, numerous awareness campaigns have followed, raising the pressing need to invest in people and infrastructure. In 2013, the NGO Friends of Naivasha opened a women’s health centre, with 60% of its funds donated by Fair Trade flower farms. It provides for 600 mothers and their new-borns every month; as a result, birth mortality rates from asphyxia have halved. A requisite in the science of a good documentary lies in its ability to challenge the audience to think beyond face value: with A Blooming Business, beyond the aesthetic beauty of cut flowers, and down to their bloody roots in Lake Naivasha.

 

SOCIETY SPOTLIGHT: Beyond My Ethnicity Magazine

 

 

 

 

 

SATNAV and BME Magazine have collaborated for an article swap! Here, Ayesha Hashim discusses the Human Taxonomy and the Consequences of Subjective Science

Most would agree that taxonomy, the practice of classifying living creatures based on shared characteristics, is not inherently a morally objectionable, or otherwise ‘bad’, thing. In biology, it forms the organisational basis for the detailed study and analysis of organisms. Taxonomy isn’t by any means a new practice: evidence of wall paintings from circa 1500 B.C. depicting plants implies the use of basic taxonomies; the creation of animal groupings such as vertebrates/invertebrates and sharks/cetaceans can be traced back to Aristotle; the Middle Ages oversaw the infusion of logical and philosophical thought into organism categorisation (re: The Great Chain of Being). Thus, it wouldn’t be incorrect to see taxonomy as the inevitable manifestation of the human need to rationalise, to analyse, to dissect, our environment.

Carl Linnaeus’ system of biological classification is an invaluable hierarchical system, whose categories form the basis of classification today. However, its perversion lent itself to the creation of a distinct hierarchical system for humans across Britain and North America, one that is certainly morally objectionable.

During the 18th century, thinkers such as Voltaire and Blumenbach matched phenotypical characteristics (facial features, build, skin colour) with unscientific descriptions of intelligence and character in order to justify the institution of white supremacy.

Furthermore, Philadelphian physician Samuel Morton went on to make unfounded conclusions about cognitive capacity based on skull measurements across different ethnic groups—this capitalised on the misconception that brain size correlates with intelligence. Meanwhile, anthropometry, which involved meticulously collecting the body measurements of military conscripts and those in the navy and marines, was used to calculate effectively meaningless averages. These relied on the false assumptions that such measurements do not vary from generation to generation, and are not subject to factors such as diet and wealth.

Arguably the most profoundly devastating consequence of the human
taxonomy system was the establishment of eugenics. This went on
to give segregation a toxic ‘scientific’ legitimacy via the Nuremberg Laws in Germany, and the South African apartheid era.

The inexcusability of these developments is obvious, but the rationale behind them is disturbingly coherent: it appeared to be the only way to justify a prosperous western society built on the labours of African slaves, and to continue to reap the benefits of slave ownership. Genetic studies in the ‘70s were partly conducted in the spirit of proving racial superiority/inferiority against a backdrop of liberal empowerment, minorities gaining political rights and socio-economic status, and a redefinition of race rooted in cultural, geographic and linguistic similarity—factors that were perceived as destabilising.

Today, the ‘alt-right’ demographic in the west largely consists of poorly educated, socio-economically marginalised males who cling onto the ‘science’ of white supremacy. The pattern of self-preservation is clear, and the tragedy rooted in the sacrifices made for it does not go unnoticed.

The abuse of scientific practice involved in the creation of the human taxonomy system helped to underpin the mistreatment of racial minorities that darkens the pages of our history books today. Whilst the crux of our focus tends to be on social issues of injustice and discrimination (and rightly so), their horrific roots in scientific practice placed in the wrong hands, under the wrong circumstances and with the wrong mindset shouldn’t be dismissed. Perhaps in the current climate, where falsified information of any kind can generate profit and incendiary ideas become social media trailblazers, it deserves particular attention.

 

 

The day, the week, the month, or even the year for neurotechnology

Hannah Stephens discovers that the absurd futuristic
tropes suggested in Friends aren’t far from reality after all


With the prospect of the looming summer exams, the news that the American hit sitcom Friends has finally appeared on UK Netflix is one of very few things to be thankful for. Having been broadcast around the turn of the millennium, the show inevitably has some dated aspects, but one feature that seems eerily ahead of its time is the set of ideas that crop up about futuristic technological advances.

In S6E7, Ross’ description of uploading the human consciousness onto a computer to enable immortality seems ludicrous, and yet it now looks like a realistic possibility. Take myoelectric prosthetic limbs, for instance; although certainly not on a par with brain emulation, they are a very real example of scientists both linguistically and physically attempting to bridge the gap between the nervous system and sophisticated computers – and they are succeeding.

A paper published in September last year [1] described the successful attachment of prosthetic hands to amputees, which not only allowed limb mobility but also provided some sensory feedback, including pressure sensations and paraesthesia (“pins and needles”). This bidirectional communication between the organism and what is essentially a computer is a huge breakthrough. Examples can also be found slightly further from the everyday eye: a paper published in Japan has described a novel neural decoding approach whereby a computer uses the magnitude and sub-organ location of brain activity at night to compile an image, essentially computationally reproducing and visualising your dreams.

It detailed how individuals were exposed to multiple stimuli while their brain activity was monitored for responses. This allowed the locations of various scenarios and emotions to be mapped at a cerebral level, and then the translation to be reversed. These examples highlight that training computers to work in a similar syntax to the human brain is no longer uncommon.
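For readers curious what “learning the mapping and then reversing it” might look like computationally, the sketch below trains a simple classifier to predict which stimulus category evoked a given pattern of (simulated) brain activity and then applies it to unseen activity. It is only a minimal illustration of the general decoding idea, not the method used in the Japanese study; the arrays, category labels and use of scikit-learn’s logistic regression are assumptions made purely for the example.

```python
# Minimal decoding sketch (illustrative only, not the published method).
# 'activity' stands in for measured brain-activity features (trials x voxels),
# 'labels' for the stimulus category shown on each trial.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
activity = rng.normal(size=(200, 500))   # simulated activity patterns
labels = rng.integers(0, 4, size=200)    # simulated stimulus categories

# Step 1: learn the mapping from brain activity to the stimulus that evoked it.
X_train, X_test, y_train, y_test = train_test_split(
    activity, labels, test_size=0.25, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Step 2: "reverse the translation" - given new, unseen activity (in the study,
# activity recorded during sleep), predict which learned category it resembles.
predicted = decoder.predict(X_test)
print("Held-out decoding accuracy:", decoder.score(X_test, y_test))
# With random simulated data this accuracy sits near chance (~0.25); real
# experiments rely on genuine structure in the recorded activity.
```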

“…scientific ideas written into TV shows 20 years ago under the premise of being completely and utterly ridiculous are suddenly very real possibilities. It begs the question of how much of today’s futuristic dystopia in TV…will be surrounding us by 2040.”

Another unnervingly, and at the time unknowingly, futuristic event occurs in S7E15, when Joey’s Days of Our Lives character is seemingly brought back from a deep coma following a brain transplant. Whilst this was almost certainly written in as a completely ridiculous and impossible event, it is not a far cry from the human head transplants that the Italian neurosurgeon Sergio Canavero has devoted almost his entire career towards.

Until very recently, this had only been successfully carried out on mice, but as of just two months ago, Canavero successfully performed a head transplant on a human cadaver, including successful fusion of the two cleaved spinal cords [2]. This potentially presents a new treatment for otherwise incurable nervous system conditions. Of course, the 36-year mark is not the finishing point for Canavero and his team; a major problem with transplants is the immune response to the physical attachment of a foreign body, a consideration that was not necessary in corpses.

As one can imagine, there are also huge social and ethical
complications with the procedure. The key point is how scientific ideas
written into TV shows 20 years ago, under the premise of being completely and utterly ridiculous, are suddenly very real possibilities. It begs the question of how much of today’s futuristic dystopia in TV (for instance Black Mirror, famous for terrifying viewers with a society commanded by utterly inhumane technology) will be surrounding us by 2040. Or perhaps in another 20 years we’ll all be eating beef trifles and having our brother’s babies.

Who knows?

[1] E. D’Anna, F. Petrini, F. Artoni, I. Popovic, I. Simanić, S. Raspopovic, S. Micera (2017). Sci. Rep., 7, 10930.
[2] X. Ren, M. Li, X. Zhao, Z. Liu, S. Ren, Y. Zhang, S. Zhang, S. Canavero (2017). Surg. Neurol. Int., 8, 276.

Can carbon capture and storage solve climate change?

Ronan Dubois investigates carbon capture and its potential to revolutionise the fuel industry

The 2016 Paris climate accord fixed a threshold of 450 ppm (parts per million) for the atmospheric CO2 concentration in order to limit the global temperature rise to 2°C by the end of the century. The International Energy Agency (IEA) has forecast that reaching these targets will be 140% more expensive without carbon capture and storage (CCS). So, what exactly are they talking about?

CCS was first used on American oil rigs in the 1970s. In short, it involves extracting carbon dioxide gas at polluting power plants or industrial sites, transporting it to a storage facility and injecting it deep underground into a suitable geological formation. Today, 17 large-scale projects operate around the world, storing 40 million tonnes of CO2 underground annually. CCS has two main purposes: enhanced oil recovery (EOR), whereby CO2 gas is injected into an oil well to increase the reservoir pressure and extract more petroleum; and the permanent sequestration of CO2 in deep saline formations. It is estimated that the latter represent 95% of the global CO2 storage resource, which could amount to several centuries of global present-day emissions.

Why, then, has CCS not yet been massively implemented? The answer is that the practical obstacles to its wide-scale implementation have, so far, proved more substantial than its reported benefits. One of the major challenges is reducing costs, with the largest projects amounting to billions of dollars in investment and operation.

This is compounded by the efficiency penalty suffered by power plants equipped with CCS, which is often too significant to justify without financial incentives. In addition, public acceptance has been shown to be a crucial factor in the success or failure of CCS pilot projects. Concerns have been expressed over the environmental impact of CO2 injection and the risks associated with induced seismic activity and leaks. Some may remember the 1986 ‘Lake Nyos disaster’ in Cameroon, in which the sudden discharge of a natural carbon dioxide cloud caused thousands to suffocate. Furthermore, there are ongoing legislative disputes in Europe to have CO2 reclassified as a commodity rather than a pollutant in order to enable its transport across borders.

In spite of this, recent developments seem to signal a renewed momentum for CCS. The governments of India and Scotland have pledged to fund it, with others set to follow their lead. Three projects entered the operational phase this year, one being Australia’s Gorgon project, the world’s largest to date. China, whose power generation sector is largely reliant on coal, is leading the way in a new wave of projects and began construction of its first plant in 2017; meanwhile, Norway plans to turn CCS into a new pan-European industry within the next five years by collecting and storing European emissions below the North Sea.

The challenge humanity now faces is to store 4 billion tonnes of CO2 annually by 2040, one hundred times what is stored today. For now, in the IEA’s words, CCS remains “way off target”.
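To put that scale-up in perspective, using only the figures quoted above:

\[ \frac{4\ \text{billion tonnes per year (2040 target)}}{40\ \text{million tonnes per year (stored today)}} = 100, \]

meaning annual storage would need to grow roughly one-hundred-fold in a little over two decades.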

Biomimetics: A marriage of science, nature and… philosophy?

Ayesha Hashim explores how the interplay of science and nature might accommodate a philosophical dimension

The application of optimised, interdisciplinary and evolved systems, ideas and principles found in nature to facilitate the creation of products and materials: a textbook definition of biomimetics (also known as biomimicry). Whilst this suffices as a description of the general methodology, the broader essence of the term is better encapsulated as follows: a design and engineering philosophy that seeks to capitalise upon, and in some respects influence, the human powers of observation and perception.

Biomimetics, as a field, came to fruition in the context of the study of nerve propagation (the way nerves transmit electrical impulses) in squid, at the hands of the American biophysicist Otto Schmitt during the 1950s; this research engineered the ‘Schmitt trigger’, a device that converts an analogue input into a digital output. However, the very first product attributable to biomimetics is possibly the aircraft developed by the Wright Brothers in 1903. It remains uncertain to what extent the technology was due to observing eagles in flight, but the connection is certainly compelling: four centuries earlier, da Vinci had produced illustrations of ‘flying machines’ modelled on bird anatomy and flight.

Past applications of biomimetic principles have been as transformative as they are fascinating: the creation of Velcro fasteners was inspired by the hook-like arrangements in cocklebur seed casings; certain bacteria-repelling materials used in hospitals and restaurant kitchens mimic patterns found on shark skin; the UV-reflecting property of silk spun by spiders to protect ensnared prey features on the exterior of certain buildings, helping to reduce bird injury. However, perhaps more notable is the evolution of biomimetics, from the imitation of desirable isolated features to utilising advancing science to create entire systems and structures. Current research within localised contexts (primarily drug delivery, tissue regeneration and medical imaging) is suitably advanced, reflecting this evolution, but the expansion of biomimetics into architecture and the management of environmental resources is a relatively new occurrence (consider Japanese bullet trains modelled on the kingfisher’s aerodynamic beak, the Helix Bridge mimicking DNA structure, and artificial photosynthesis). This is a product of cumulated progress across the sciences and engineering.

As biomimetics weaves ever more intricately into the everyday and into pragmatic conceptualisations of the future, emphasising the gold standard of functionality and efficiency set by natural selection (a standard unsurpassable by human intelligence), pertinent philosophical questions arise: how far does the appropriation of nature’s design principles increase our duties of earthly stewardship? If our anthropocentric approach to life (humans as the most important creatures of evolution) is no longer justifiable, does it threaten the right to self-determine? Discussion of such questions in Freya Mathews’ intriguing paper, ‘Towards a Deeper Philosophy of Biomimicry’, results in a proposal for achieving ‘biosynergy’ that urges a reconfiguration of our fundamental desires to align with those of the environment. No doubt a radical suggestion, but it is one that highlights the part incredible, part devastating potential of biomimetics to supervene on human society and thought.