The growth of augmented reality (AR) will almost certainly change the way we visually experience the everyday world. And, as discussed previously on The Conversation, it’s likely to be Google’s Project Glass leading the way on this new frontier.
But other technologies on the horizon will profoundly alter our interactions with computational technologies. More important than the eye-candy value of AR will be the applications for those who are physically or economically disadvantaged.
Virtual reality has been a staple of science fiction since the 1950s. In the 1956 film Forbidden Planet a race of aliens – the Krell – build a computer the size of a city in order to leave their physical beings behind and transmute to a virtual world.
As so often happens in science fiction, that virtual world turned out to be doomed to catastrophe.
More recently, in the Wachowskis' Matrix series, humans are enslaved in a virtual world created by machines. The ensuing battle does not occur in a physical reality but primarily in the artificial reality of the Matrix – a digitally manufactured reality experienced by oblivious humans as a dream.
In The Matrix, the enemy is software – programs known as “agents” that were designed to stabilise the virtual world.
What these and many other science fiction films have in common is a direct, technically advanced connection between the human mind and computational technology. The connection fuses the functions of the brain with software, the two co-existing as one.
Biology + technology = the future
In a TED talk from 2010, Vietnamese-Australian entrepreneur Tan Le demonstrated the basic functionality of this fusion through headset technology developed by Emotiv Lifesciences – the company she founded.
The technology is neither expensive (less than US$300) nor complicated. It comprises a headset and software that runs on standard Mac or PC laptops. When you’re wearing the wireless headset, sensors identify your brainwave activity.
Each individual has unique brainwave cues for certain thought patterns and actions. The software can be trained to identify brainwave patterns that become triggers for actions on the computer.
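The train-then-trigger loop described above can be sketched in a few lines of code. This is purely illustrative – all names are hypothetical and real EEG classification (including Emotiv's own software) is far more sophisticated – but it shows the basic idea: record a few brainwave feature samples while the user rehearses a mental action, store their average, then match live readings against the stored patterns.

```python
def centroid(samples):
    """Average a list of equal-length feature vectors element-wise."""
    n = len(samples)
    return [sum(vals) / n for vals in zip(*samples)]

class ThoughtTrigger:
    """Hypothetical sketch: map brainwave feature vectors to actions."""

    def __init__(self):
        self.patterns = {}  # action name -> trained average pattern

    def train(self, action, samples):
        # Record several feature vectors while the user rehearses one
        # mental action, then store their average as that action's cue.
        self.patterns[action] = centroid(samples)

    def classify(self, reading):
        # Trigger the trained action whose stored pattern is closest
        # to the live reading (Euclidean distance).
        def dist(pattern):
            return sum((a - b) ** 2 for a, b in zip(reading, pattern)) ** 0.5
        return min(self.patterns, key=lambda a: dist(self.patterns[a]))

trigger = ThoughtTrigger()
trigger.train("push", [[0.9, 0.1], [1.0, 0.2]])
trigger.train("lift", [[0.1, 0.8], [0.2, 0.9]])
print(trigger.classify([0.95, 0.15]))  # matches the "push" pattern
```

In practice the per-user training step is what makes this work at all: because each individual's brainwave cues are unique, the software learns that user's patterns rather than relying on a universal model.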
Imagine writing essays or emails merely by thinking them out, or imagining a painting that could then be rendered and printed using appropriate software.
This is a revolutionary shift in the interaction between humans, data and computers. With such technology, it will no longer be necessary to use a keyboard and mouse to enter data into a computer.
Voice-activated technology has already given us a taste of the potential of direct human interfaces with computers. Brainwave-reading headsets, however, will take it to a new level, giving us the ability to have a real two-way connection with computers.
Combine this brainwave-reading technology with the visual potential of AR – seen in developments such as Google’s Project Glass – and we could soon realise the potential envisaged by designer Michaël Harboun in his video, Transcendenz.
You read my mind
Headsets developed by Emotiv Lifesciences can also be used to monitor mental functions and cognitive skills, all by tracking the status of your brainwaves during daily activities.
Just as an electrocardiogram (ECG) machine tracks the electrical activity of the heart to predict future health issues, Emotiv could achieve similar results for your brain.
While tracking one’s heart seems safe and practical, recording brainwave activity raises profound ethical issues.
In the science fiction film Minority Report, based on a short story by Philip K. Dick, predictive behavioural modelling based on recorded brainwave activity allows police to predict murders before they have occurred.
The question is, could the brainwave data collected via a device such as Emotiv be used to forecast our own potential thoughts and actions? It’s certainly a troubling thought.
It’s not all bad. The integration of AR and immersive technologies with our lives will provide endless opportunities.
In February we saw researchers creating a direct interface between brainwaves and a prosthetic arm, allowing people to restore the function of a missing limb using only their mind.
AR and other immersive technologies are also being used as an educational tool for children with autism spectrum disorders (ASD).
Research has shown computer interaction can benefit children with ASD, but computer interfaces can be challenging for some children to master. AR, on the other hand, is visually appealing, easy to understand and gives immediate feedback to the user.
This feedback loop between biology and technology (biotechnology) will also allow two-way interaction with videogame environments.
Players will be able to interact with both their surrounding physical environment and the virtual gaming platform, merging the two seamlessly.
Currently gamers use the internet to connect to a virtual space where they interact with each other using an avatar (the player’s character). The avatar’s motion and voice are controlled by game controllers and microphones, respectively.
With the combined technology of a brainwave-reading headset and AR, games could take place in a suburban park. Participants could either physically join the park (just walk there) or inhabit the park virtually, as it exists in Google Earth.
The Unreal gaming engine, developed by Epic Games, is an example of a system well-placed to integrate multiplayer games into the technologies mentioned above. The game engine creates a 3D virtual environment that allows access for many players.
The visualisation could be delivered through AR glasses, while the actions normally performed with game controllers could be achieved using the headset.
It will be a while before we’re all wandering around with AR glasses or interacting with biotechnology on a daily basis, but one thing’s clear: these technologies will spawn a new era of interactivity that will further extend the capacity of our bodies and minds into the virtual space.
And that’s an exciting concept.