Space Travel & Exploration in Popular Culture

Nicholas Folidis explores space travel in popular culture: yesterday, today, and tomorrow

Space travel and exploration is a subject that never ceases to amaze, and that is apparent in popular culture. The birth of the Space Age, after the launch of the Soviet satellite Sputnik 1, kickstarted the science fiction industry and subsequently led to the production of hit space-related movies and TV series.

Neil Armstrong shook the world when he took a “giant leap for mankind” by setting foot on the Moon for the first time on July 20, 1969. Yet it was visionary filmmakers and directors who managed to travel farther into space.

Director Stanley Kubrick’s profound and futuristic film ‘2001: A Space Odyssey’ is a prelude to the Apollo 11 mission. It is a film of space travel and the discovery of extra-terrestrial intelligence that predicted many of the technologies currently in use. In the movie, Kubrick foresees the Moon landing and the creation of a space station that constantly orbits the Earth.

One could even say that he, in a way, envisioned NASA’s New Frontiers Program – a series of space exploration missions within the Solar System. In Kubrick’s world, the nuclear-powered spacecraft Discovery One (XD-1) is sent on a mission to Jupiter, manned by five astronauts and an intelligent computer, HAL 9000.

In ‘Star Wars’, smuggler Han Solo and his mate Chewbacca travel and fight through space on their Corellian light freighter, the Millennium Falcon. Similarly, in ‘Star Trek’, space explorer Captain Kirk and his crew go on interstellar adventures, travelling at faster-than-light speeds aboard the starship USS Enterprise to places “where no man has gone before”.

Even before all that, in the BBC’s perennial show ‘Doctor Who’, a renegade Time Lord from the planet Gallifrey, known simply as “The Doctor”, travels through time and space in the TARDIS (Time And Relative Dimension In Space), defending the Universe together with his companions.

In more recent years, following the birth of Elon Musk’s SpaceX, new interest has been sparked around space travel and exploration. Musk’s audacious plan involves sending the first humans to Mars as early as 2024, with the intention of colonising and terraforming Earth’s neighbouring planet – engineering its environment by deliberately modifying its climate and surface, thus making the planet hospitable to humans.

Christopher Nolan depicts a similar idea in his film ‘Interstellar’, where a team of astronauts and scientists, along with the robots CASE and TARS, embark aboard NASA’s spacecraft Endurance on a voyage through a space wormhole – the passage to a distant galaxy near the black hole Gargantua – in order to identify a planet that can sustain human life and ultimately establish a colony there, ensuring humanity’s survival.

Nevertheless, it was Ridley Scott’s ‘The Martian’ that took NASA’s Mars Exploration Program one step further by imagining a series of manned exploratory missions to Mars. In the movie, marooned astronaut Mark Watney, of the Ares III mission to Mars, has to survive on the inhospitable Red Planet, relying solely on his intelligence and creativity in order to signal to Earth that he is alive.

When it comes to space travel and space exploration, popular culture not only entertains but has also helped to inspire at times when inspiration was needed. It has reflected – and keeps reflecting – the ever-growing public interest in space, which motivates and drives the politics behind space exploration.

That same interest has stimulated the imagination of the scientists and engineers who made spaceflight possible and who continue to advance aerospace science and technology at a pace that could turn Elon Musk’s vision of “making humans a multi-planetary species” into a reality within our lifetimes. One thing is for certain: the future of space travel and exploration is an exciting one!

How slow can you go?

Federico Abatecola unravels the mystery behind the slo-mo phenomenon

The slow-motion effect is a film technique that often occurs in modern cinema. It is based on the idea that, by increasing the frame rate at which a film is recorded while maintaining the same playback speed, the viewer will perceive time slowing down. The technique lends itself to a variety of uses: from comical purposes in the early years of cinema, to Scorsese’s alienating scenes in which slow motion serves to detach the characters’ view from the world surrounding them.
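The relationship above is simple arithmetic: the slowdown factor is just the ratio of the capture frame rate to the playback frame rate. A minimal sketch (the frame rates are illustrative, not tied to any particular film):

```python
def slowdown_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the action appears on screen."""
    return capture_fps / playback_fps

def screen_duration(real_seconds: float, capture_fps: float, playback_fps: float) -> float:
    """On-screen duration of an event captured for slow-motion playback."""
    return real_seconds * slowdown_factor(capture_fps, playback_fps)

# A one-second punch captured at 120 fps and played back at the
# standard 24 fps stretches to five seconds on screen.
print(slowdown_factor(120, 24))       # 5.0
print(screen_duration(1.0, 120, 24))  # 5.0 seconds
```

The same ratio explains why high-speed cameras are specified in thousands of frames per second: the higher the capture rate, the more the action can be stretched without the playback looking choppy.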

A prime example is Taxi Driver, where De Niro’s alienation and inability to reintegrate into society are visually represented in the opening credits. In Raging Bull, the slow-motion effect has two meanings: combined with point-of-view (POV) shots, it helps to communicate a character’s altered emotional state (sometimes paranoia or a heightened state of awareness), while in the boxing scenes it highlights the sport’s violence and intensity and immerses the viewer in it.

The latter is probably the most common use of the technique nowadays and, since the release of The Matrix, slow-motion has become extremely popular in action films to show impactful, yet rapid, moments in greater detail. However, can these ‘slowed down moments’ be experienced in real life?

Reports of slow-motion-like experiences are actually quite common.

“While in films the viewer is given more frames to observe and process, similarly in real life there can be situations in which the brain, driven by the danger of death, processes more information than it normally would, creating a similar effect.”

Noyes and Kletti (1976, 1977) concluded in two different studies that 75% and 72% of participants experienced external time slowing down during an accident. These situations were characterised by two key factors: the element of surprise; and the threat of imminent danger. In addition to altered perception of external time, the participants’ experiences were accompanied by increased mental sharpness and clarity.

These abilities, however, are only useful in a life-threatening scenario because, even if in some cases they can be vital for survival, they are highly energy-consuming. From a neuroscientific point of view, studies have revealed that this enhancement of cognitive processes originates in the locus coeruleus, where noradrenaline is synthesised, making us faster and more attentive. The resulting anomalous intake of new information then leads to an anomalous temporal experience.

There are also rarer cases in which time slows down or freezes in unthreatening situations.

Injury to the V5 and V1 regions (two of the more than thirty specialised processing areas of the visual cortex) can lead to akinetopsia: a disorder which causes patients to see objects but not their motion. This fascinating disorder is extremely rare, as such an injury would most likely interfere with more than one visual function.

In conclusion, the brain’s view of the world is similar to films in some respects. It has been demonstrated that the brain does not observe the world continuously but rather in rapid snapshots, like frames in a film. Slow-motion perception can be considered a circumstance in which our brains input a higher number of snapshots.

Moreover, it is possible to see a loose similarity between these real-life
situations and the use of slow-motion in some films. Its use in Raging Bull
and hundreds of other motion pictures depicting violence, shootings, car
crashes and other terribly dangerous scenes, might portray what we would
actually experience in real life.

How deep is your learning?

Jahan Hadidimoud explores the possibility of artificial intelligence as the next digital revolution.

The movie ‘Her’ stars Joaquin Phoenix as Theodore Twombly; a man living in the near future, who purchases an advanced Artificial Intelligence (AI) as easily as purchasing any other electrical item. The AI system (voiced by Scarlett Johansson) names herself Samantha, and quickly learns how to communicate with Theodore. As the film progresses, Theodore finds himself falling in love with Samantha, as she provides a nurturing presence that his life has recently lacked. The film questions the possibility of AI that is so human-like that the line between real and virtual becomes blurred.

Interest in AI has recently surged thanks to speech assistants such as Apple’s Siri, Google’s Assistant and Amazon’s Alexa, which provide information in a much more casual way than conventional web searches.

Progress was made early on when assistants could reply to questions based on information previously given, creating a sort of short-term memory that improved customer satisfaction. These systems, however, are built purely for consumer use. Much more advanced AI systems have been developed that have beaten world champions at their own games; for instance, Google’s ‘AlphaGo’ defeated Ke Jie at the ancient Chinese board game Go.

As with many things, there are people for and against AI.

Arguments in support range from economic reasons to pure scientific curiosity. The reasons against are just as obvious: AI could take over entire sectors and leave millions jobless, or perhaps we, as mere mortals, shouldn’t be playing god.

Will AI take over the world? Will it ever pull the plug on mankind in the same way some of mankind hopes it can pull the plug on it? Maybe it’s the branding of AI as an “artificial brain” that scares people when they hear of super-advanced systems that can make decisions. Will these potentially evil, soulless entities travel down through their power cables, into the national grid, and onto our computers, where they’ll hack the mainframe (whatever that means) and destroy the planet? No. I don’t think so.

The future possibilities of AI’s use are endless, but one particularly hyped venture is producing intelligent driverless cars. Although driverless cars are already in production, with a current system from market leader Tesla proving very successful, there are still improvements to be made. Advanced AI car systems will go further than simply determining when to turn or apply the brakes; they will also have the ability to make crucial decisions for the driver – a well-debated topic that raises philosophical questions about how we programme AI to think. For example, if a pedestrian jumps onto the road, who should the car prioritise?

“AI could take over entire sectors and leave millions jobless”

You or the pedestrian? These questions have no objective right or wrong answer, but seeing how companies face these problems will be interesting.

Possibly the scariest form of AI we’re familiar with is that of Hanson Robotics’ Sophia – the human mask covering a metal skeleton, with the brain area left exposed, can leave us confused about how to feel. Sophia has appeared on TV shows and conducted interviews where she’s asked questions you would only ask a non-human. Maybe it’s her 8-bit facial expressions, or her awkwardly long pauses before the punchlines to her “dad jokes”, but Sophia certainly doesn’t feel like the finished article in AI droids, particularly after hearing Scarlett Johansson’s AI character speak so fluently.

It looks like AI’s going to be very popular in the near future; the more it
learns, the better it’ll get and the more popular it will become. Even if AI isn’t
the ‘Ex-Machina’-looking droids we see in movies, it will certainly have uses in
our everyday lives more practical than stabbing us, or making us fall in love
with its voice.


The day, the week, the month, or even the year for neurotechnology

Hannah Stephens discovers that the absurd futuristic
tropes suggested in Friends aren’t far from reality after all


With the prospect of the looming summer exams, the news of the American hit sitcom Friends finally appearing on UK Netflix is one of very few things to be thankful for. The show was broadcast around the turn of the millennium, so there is no getting away from some of its more dated aspects, but one feature that seems eerily ahead of its time is some of the ideas that crop up about futuristic technological advances.

In S6E7, Ross’ description of uploading the human consciousness onto a computer to enable immortality seems ludicrous, and yet it is now a very realistic possibility. Take myoelectric prosthetic limbs, for instance; although certainly not on par with brain emulation, they are a very real example of scientists both linguistically and physically attempting to bridge the gap between the nervous system and sophisticated computers – and they are succeeding.

A paper published in September last year [1] described the successful attachment of prosthetic hands to amputees, which not only allowed limb mobility but also provided some sensory feedback, including pressure sensations and paraesthesia (“pins and needles”). This bidirectional communication between the organism and what is essentially a computer is a huge breakthrough. Examples can also be found slightly further from the everyday eye; a paper published in Japan has described a novel neural decoding approach whereby a computer uses the magnitude and sub-organ location of brain activity at night to compile an image: essentially computationally reproducing and visualising your dreams.

It detailed how individuals were exposed to multiple stimuli whilst their brain activity was monitored for responses. This allowed the locations of various scenarios and emotions to be mapped at a cerebral level, and then this translation to be reversed. These examples highlight that training computers to work in a syntax similar to the human brain’s is no longer uncommon.

“…scientific ideas written into TV shows 20 years ago under the premise of being completely and utterly ridiculous are suddenly very real possibilities. It begs the question of how much of today’s futuristic dystopia in TV…will be surrounding us by 2040.”

Another unnervingly, and at the time unknowingly, futuristic event occurs in S7E15, when Joey’s Days of Our Lives character is seemingly brought back from a deep coma following a brain transplant. Whilst this was almost certainly written in as a completely ridiculous and impossible event, it is not a far cry from the human head transplants that the Italian neurosurgeon Sergio Canavero has devoted almost his entire career towards.

Until very recently, this had only been successfully carried out on mice, but as of just two months ago, Canavero successfully performed a head transplant on a human cadaver, including successful fusion of the two cleaved spinal cords [2]. This potentially presents a new treatment for otherwise incurable peripheral nervous system conditions. Of course, the cadaver work is not the finishing point for Canavero and his team; a major problem with transplants is the immune considerations that come with the physical attachment of a foreign body – considerations that were not necessary in corpses.

As one can imagine, there are also huge social and ethical
complications with the procedure. The key point is how scientific ideas
written into TV shows 20 years ago, under the premise of being completely and utterly ridiculous, are suddenly very real possibilities. It begs the question of how much of today’s futuristic dystopia in TV (for instance Black Mirror, famous for terrifying viewers with a society commanded by utterly inhumane technology) will be surrounding us by 2040. Or perhaps in another 20 years we’ll all be eating beef trifles and having our brother’s babies.

Who knows?

[1] E. D’Anna, F. Petrini, F. Artoni, I. Popovic, I. Simanić, S. Raspopovic, S. Micera (2017). Sci. Rep., 7, 10930.
[2] X. Ren, M. Li, X. Zhao, Z. Liu, S. Ren, Y. Zhang, S. Zhang, S. Canavero (2017). Surg. Neurol. Int., 8, 276.

Osman Kent: An Improbable Journey

Interview and article by Phillipa Jefferies, Joanna Chustecki and Sara Jebril, with thanks to the EPS Community and Alumni Relations Office.

On Wednesday 8th March Osman Kent, computer science and electronic engineering alumnus, returned to the University of Birmingham to inspire a whole new generation of technologists and entrepreneurs. He was cited by Business Insider magazine as one of the top 15 technologists in the world in 2012. However, as he discusses in his EPS distinguished lecture, it hasn’t always been plain sailing.
Continue reading “Osman Kent: An Improbable Journey”

How Safe is Your Biometric Data?

Recently, Japanese researchers at the National Institute of Informatics (NII) have managed to recreate fingerprints based on photos taken up to three metres away from the subject. High-profile members of the public, such as celebrities, would likely be at greatest risk of their biometric data being stolen this way; however, NII researcher Isao Echizen suggested that anyone’s fingerprints could be made widely available “just by casually making a peace sign in front of a camera”. As mainstream camera technology becomes more advanced, the practice of uploading pictures to social media will make more people susceptible to biometric data theft.
Continue reading “How Safe is Your Biometric Data?”

The Father of Molecular Machinery: An Evening with Professor Sir J. Fraser Stoddart

An interview with Professor Sir J. Fraser Stoddart by Joanna Chustecki and Mel Jack, with thanks to the EPS Community and Alumni Relations Office.

A cold autumnal night on campus and something incredible is happening in the Haworth building. Hundreds of students, postgrads, old friends, colleagues and members of the public have flocked to this well-established house of chemistry to hear one of the greatest chemists of our time talk. Professor Sir J. Fraser Stoddart to be exact. Within this huge crowd bustling to access the main lecture theatre stands a man who has published over 1,000 scientific papers, is one of the most cited chemists in the world, and has, on the 5th of October 2016, been awarded the Nobel Prize in Chemistry ‘for the design and synthesis of molecular machines’.
Continue reading “The Father of Molecular Machinery: An Evening with Professor Sir J. Fraser Stoddart”

Birmingham and the RADAR Revolution

Patrick McCarthy uncovers the link between your microwave, your uni and World War Two.

How does a microwave oven work? Finely tuned electromagnetic (EM) waves form standing waves inside the oven’s chamber, making the polar water molecules in the food rotate rapidly; this molecular agitation heats the contents as they spin on the plate. The source of these microwaves, the cavity magnetron, has a history directly linked to the University of Birmingham.
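The “finely tuned” part can be made concrete with one formula: an EM wave’s wavelength is the speed of light divided by its frequency, and adjacent nodes of a standing wave sit half a wavelength apart. A quick sketch for a typical domestic oven (2.45 GHz is the usual consumer frequency; the function names are my own):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength of an EM wave at the given frequency, in metres."""
    return C / freq_hz

def node_spacing_m(freq_hz: float) -> float:
    """Distance between adjacent standing-wave nodes: half a wavelength."""
    return wavelength_m(freq_hz) / 2

oven_freq = 2.45e9  # Hz, the usual domestic microwave frequency
print(f"wavelength:   {wavelength_m(oven_freq) * 100:.1f} cm")    # ~12.2 cm
print(f"node spacing: {node_spacing_m(oven_freq) * 100:.1f} cm")  # ~6.1 cm
```

Those nodes are the oven’s cold spots, spaced roughly 6 cm apart, which is precisely why the turntable keeps the food moving.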
Continue reading “Birmingham and the RADAR Revolution”

From Snorlax to Science

Augmented reality is fast becoming a technology of the everyday for millions around the world. Pokémon Go may have seemed like a simple mobile game, yet it signalled the arrival of AR into mainstream public consciousness. The ability to conjure and overlay virtual objects onto the real world is not just a defining advancement in the gaming industry, but also in how we live and operate on a day to day basis.
Continue reading “From Snorlax to Science”