Neil Harbisson was born with achromatopsia, also known as total color-blindness; for the first two decades of his life, he didn’t know color and lived in a grayscale world. At the age of 21, he began to hear color. In 2003, the computer scientist Adam Montandon began the ‘electronic eye’ project, which aimed to overcome Harbisson’s achromatopsia by playing audio frequencies corresponding to particular colors. This skull-implanted antenna, which Neil calls his ‘Eyeborg’, acts as an audio-visual aid. In 2004, he was officially recognized as a cyborg by the government.
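The core idea of the electronic eye can be sketched in a few lines: assign each hue an audible frequency. The mapping below is purely illustrative (the frequency range and logarithmic scale are assumptions, not Harbisson’s actual scale):

```python
# Illustrative sketch: map a color's hue angle to an audible frequency.
# The range (120-960 Hz) and logarithmic spacing are assumptions for
# demonstration, not the mapping Harbisson's device actually uses.
def hue_to_frequency(hue_degrees, f_low=120.0, f_high=960.0):
    """Map a hue angle in [0, 360] onto a logarithmic audible range."""
    if not 0.0 <= hue_degrees <= 360.0:
        raise ValueError("hue must be in [0, 360]")
    ratio = hue_degrees / 360.0
    # Logarithmic interpolation, so equal hue steps sound like equal musical intervals.
    return f_low * (f_high / f_low) ** ratio

print(hue_to_frequency(0))    # red end maps to the low frequency -> 120.0
print(hue_to_frequency(360))  # far end maps to the high frequency -> 960.0
```

A logarithmic scale is the natural choice here, since human pitch perception is itself logarithmic: doubling the frequency is heard as one octave.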
He has been hearing colors for eight years, and has had to memorize the notes and the names of the colors they match. This information gradually became a perception, and then progressed into emotional ‘feelings’; Harbisson developed his own favorite colors because of the more appealing sounds they produce. He soon began to dream in color, and it was at this point that he felt that the software and his brain had united. As an extension of his senses, the cyber device had become part of his body, and even features in his passport photo.
He compares visits to art galleries to attending concerts, where he can ‘listen’ to masterpieces by Picasso and Monet. Visits to the supermarket are like visits to nightclubs, and he describes each aisle as being ‘full of different melodies’. He used to dress in a way that looked good; now he likes to dress to ‘sound good’, flaunting hues that are more appealing to the ear. Even his eating habits have changed, as he regularly rearranges his plate so that it sounds better. Harbisson has created ‘Sound Portraits’, including those of Leonardo DiCaprio and Prince Charles, who, surprisingly, sounds a lot like Nicole Kidman!
An unexpected secondary effect of his electronic ear is the reverse: normal sounds started to take on a colorful form in his mind; a ringing telephone is a largely green experience, and a piece by Mozart is associated with yellowy hues. Harbisson has even begun to hear colors the human eye cannot perceive: he can detect infra-red and ultra-violet waves, as well as movement, with his ears. Research by Oliver Sacks has found that blind people can have hallucinations of things they have never previously experienced or witnessed. Hallucinations, unlike products of the imagination, are not our own creation, nor under our control; they mimic our perceptions in completely random ways.
Sacks has found that around 10% of visually impaired people experience visual hallucinations. He claims that when the brain receives no visual input, the visual parts of the brain tend to become hyperactive and excitable, firing spontaneously so that the person ‘sees things’. These are known as Charles Bonnet hallucinations; they have no obvious connection with memory or emotion, yet they are part of the integrated stream of perception and imagination.
Harbisson makes the interesting point that if we can extend our senses, then we can extend our knowledge. He feels that we would gain far richer experiences if we ceased focusing on creating applications for our mobile phones and started creating applications for our own bodies. Neil Harbisson has proved that technology is already capable of augmenting, and even supplying, a sense. Scientific advances enable us to push our limits and enjoy a better quality of life regardless of the limitations we are born with.
Although Harbisson feels that scientists should stop focusing on mobile phone applications, a new app, EyeMusic, could give the visually impaired access to their own electronic eye, in a far more accessible form. ‘A woman who has been blind since birth sits at a table with a bowl of mostly green apples in front of her. When asked to find the single red one, she plucks it from the bowl without hesitation and holds it up to applause from the audience. It’s not a magic act but a demonstration of a new app that enables the visually impaired to hear information usually perceived through sight,’ says Roni Jacobson of National Geographic.
Amir Amedi developed EyeMusic, a sensory substitution device that uses a computer algorithm to construct a ‘soundscape’. Like Neil Harbisson’s electronic eye, EyeMusic conveys visual information through musical notes. After a period of training, the user simply holds up their smartphone to their surroundings, and EyeMusic builds the scene pixel by pixel in the form of notes played through headphones. The sound starts at the left-hand side of the scene; the height of objects is conveyed through the pitch of the notes, their color through the choice of instrument, and their proximity through volume.
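The left-to-right scan described above can be sketched as follows. The specific pitch range, volume rule, and event format are illustrative assumptions for a grayscale grid, not EyeMusic’s actual encoding (which, as noted, also uses different instruments for different colors):

```python
# Schematic sketch of a left-to-right soundscape scan over a brightness grid.
# Column index plays the role of time, row height maps to pitch, and
# brightness maps to volume. All specific numbers are illustrative.
def scene_to_notes(image):
    """image: 2D grid of brightness values in [0, 1], rows from top to bottom.
    Returns one (time_step, pitch_hz, volume) event per lit pixel."""
    n_rows = len(image)
    events = []
    for col in range(len(image[0])):      # scan columns left to right = time
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness > 0:
                # Higher objects (smaller row index) get higher pitch.
                pitch = 880 - row * (660 / max(n_rows - 1, 1))
                events.append((col, round(pitch), brightness))
    return events

# A tiny 3x3 "scene": a bright pixel at top-left, a dim pixel at bottom-right.
scene = [[1.0, 0.0, 0.0],
         [0.0, 0.0, 0.0],
         [0.0, 0.0, 0.4]]
print(scene_to_notes(scene))  # [(0, 880, 1.0), (2, 220, 0.4)]
```

The bright top-left pixel sounds first, high and loud; the dim bottom-right pixel sounds last, low and quiet, which is the essence of the pixel-by-pixel translation the article describes.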
The application activates the same category-dependent processing areas of the brain as in sighted people. However, rather than traveling via the visual cortex, the signal enters the brain through the auditory cortex and is then rerouted. Amedi claims that the brain is far more flexible than we realize, and that we simply have to find alternative routes into areas previously blocked by impairment. Could this open up a whole new array of sensory experiences for the rest of the population? For those of us who pay to trick our senses and experience ‘dining in the dark’, could ‘the taste of color’ be the next culinary craze?
The future sounds bright for the blind, as cyborgism evolves and surpasses expectations in unexpected ways. The road to transhumanism is not only scientifically exciting but also culturally inspiring. Technological developments are blurring the lines between sight and sound, art and music, beauty and melody.