In a lab in Boston, graduate student Arnav Kapur wonders what time it is.
Instantly, a robotic voice tells him it’s a quarter past five. Kapur didn’t audibly ask the hour; he simply thought it, and AlterEgo, a wearable headset capable of listening to the voice inside his head, produced the answer.
The telepathic wearable, developed by Kapur and researchers at the Massachusetts Institute of Technology’s Media Lab, connects the human mind with a machine.
Sitting on the wearer’s jaw and face, AlterEgo uses electrodes to pick up ‘neuromuscular signals’ triggered when you say words in your head. These signals – known as subvocalisation – are automatically sent from your brain to your mouth every time you think about speaking, but don’t actually say anything aloud. AlterEgo picks up the signals and sends them to an artificial intelligence (AI) that has been trained to correlate particular signals with particular words, ultimately allowing for silent communication with the in-built AI assistant.
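The mapping from signal to word is, at heart, a pattern-matching problem. AlterEgo’s actual system uses neural networks trained on real electrode recordings; as a loose illustration only, the sketch below invents four-number “neuromuscular signals” for three words and classifies new signals by finding the closest average (the signal shapes, word list, and nearest-centroid rule are all made up for the example).

```python
# Illustrative sketch only: AlterEgo's real pipeline uses neural networks
# trained on electrode data. Here we fake "neuromuscular signals" as short
# numeric vectors and classify them with a simple nearest-centroid rule.
import math
import random

random.seed(0)

WORDS = ["time", "yes", "no"]

def fake_signal(word, noise=0.1):
    """Return a made-up 4-dimensional 'electrode reading' for a word."""
    base = {"time": [1.0, 0.2, 0.1, 0.5],
            "yes":  [0.1, 1.0, 0.4, 0.2],
            "no":   [0.3, 0.1, 1.0, 0.8]}[word]
    return [v + random.gauss(0, noise) for v in base]

def train(samples):
    """Average the training signals per word to get one centroid each."""
    centroids = {}
    for word in WORDS:
        vecs = [s for w, s in samples if w == word]
        centroids[word] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return centroids

def classify(centroids, signal):
    """Return the word whose centroid lies closest to the signal."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda w: dist(centroids[w], signal))

# Train on 20 noisy samples per word, then decode a fresh signal.
training = [(w, fake_signal(w)) for w in WORDS for _ in range(20)]
model = train(training)
print(classify(model, fake_signal("time")))
```

The real system faces far messier input, of course: electrode readings vary between sessions and wearers, which is why a trained neural network, rather than a fixed rule like this one, does the decoding.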
If AlterEgo needs to talk back to the wearer, to answer questions or give instructions, for example, an AI voice similar to Siri or Alexa speaks audibly via bone conduction headphones. Because bone conduction transmits sound waves through the jaw, AlterEgo’s headset doesn’t obstruct the ear canal like traditional earbuds would, meaning the interaction takes place without interfering with the user’s awareness of, or ability to engage with, the outside world. As a result, AlterEgo augments its wearer’s cognitive abilities seamlessly and silently, which is the ultimate goal for neurotechnology, the field of study dedicated to developing a mind-machine link.
“If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself,” explains MIT professor Patti Maes. “So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present.”
So far, AlterEgo has helped researchers with tasks such as controlling the TV and winning a game of chess, but one day it could be used to help carry out operations in noisy environments, such as controlling airplanes on a tarmac or providing information to soldiers in war zones.
AlterEgo could even help people with speech or cognitive disabilities interact and communicate with the world around them. A project called BrainGate, supported by the U.S. National Institute of Neurological Disorders and Stroke, used similar brain-computer interface technology to help paralyzed people move robotic limbs via their thoughts.
In the future, ambitious entrepreneurs hope to transition neurotechnologies from wearable, external devices such as AlterEgo to implants embedded directly into the body.
A $100 million neuroscience start-up called Kernel is looking to enhance human intelligence by developing brain implants capable of linking people’s thoughts to computers, while Facebook has hired neuroscientists for an undisclosed brain-computer interface project.
Elsewhere, entrepreneur Elon Musk has provided backing for Neuralink, a private research company developing an implantable mind-computer interface. Embedded in the brain, it would create a direct communication channel between the mind and the outside world and, if successful, would essentially turn us into cyborgs. In an in-depth interview with Wait But Why, Musk said: “The thing that people, I think, don’t appreciate right now is that they are already a cyborg. […] If you leave your phone behind, it’s like missing limb syndrome. I think people – they’re already kind of merged with their phone and their laptop and their applications and everything.”
While debates around the ethics of augmented human intelligence continue to divide opinion, the future will almost certainly see the development of linked-up, mind-computer interfaces.