Why stop at five senses? Tech is letting us hack our brains to give us 'superpowers'

Tech is helping blind people see using sound, letting deaf people hear via taps on the back, and teaching anyone to feel magnetic north – and that’s just the beginning

Many people claim to have a sixth sense, whether it's an ability to predict the weather or a feeling that something bad is about to happen.

And while science may scoff at these claims, expanding beyond our five senses and changing how they behave may not be as bizarre as it sounds. Technology is giving blind people the chance to see using sound, deaf people the ability to hear via taps on the back, and teaching anyone to feel magnetic north – and that’s only the beginning.

We don’t directly experience reality; our brains are locked inside our skulls and can only interpret the limited signals that our eyes, ears and so on are capable of picking up. Our eyes, for example, can only see a small slice of the electromagnetic spectrum – we can’t see infrared or X-rays.

Neuroscientist David Eagleman points to the idea of “umwelt” in biology: animals only pick up the environmental signals that are useful to them. The slice of the world a creature can detect is its “umwelt”, and it assumes there’s nothing else out there.

But thanks to our brain’s remarkable neuroplasticity, we can smash through that: artist Neil Harbisson uses an intriguing head-piece called an “eyeborg” not only to get around his colour blindness via sound, but to perceive light on parts of the spectrum that aren’t usually visible to humans.

Researchers have long examined the idea of sensory substitution, particularly to help those missing a sense, with Paul Bach-y-Rita in 1969 translating camera images into patterns of vibration on the backs of blind people.

Fast forward to now, and Dr Giles Hamilton-Fletcher, a researcher at the University of Sussex, is working on a sensory substitution device that turns images into sound, representing bright objects as tones. “If you have a white object, it’s almost like it has a speaker playing on it,” he said.
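That brightness-to-tone mapping can be sketched in miniature. Below is a toy Python sonifier – an illustrative assumption, not Hamilton-Fletcher’s actual algorithm – that scans a greyscale image column by column, turning each row into a sine tone whose pitch rises with height in the image and whose loudness follows pixel brightness:

```python
import numpy as np

def sonify_image(image, duration=1.0, sample_rate=22050,
                 f_min=200.0, f_max=4000.0):
    """Turn a 2D greyscale image (values 0..1) into a mono audio signal.

    A toy left-to-right scan: each column becomes a short time slice,
    and each row contributes a sine tone whose pitch rises with height
    in the image and whose loudness follows pixel brightness.
    """
    rows, cols = image.shape
    samples_per_col = int(duration * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    # Higher rows (smaller index) get higher pitch, log-spaced
    # to roughly match how human hearing perceives pitch.
    freqs = np.geomspace(f_max, f_min, rows)
    audio = []
    for c in range(cols):
        column = image[:, c]
        # Sum one sine per row, weighted by that pixel's brightness.
        tone = sum(b * np.sin(2 * np.pi * f * t)
                   for b, f in zip(column, freqs))
        audio.append(tone)
    signal = np.concatenate(audio)
    peak = np.max(np.abs(signal))
    return signal / peak if peak > 0 else signal
```

With this scheme, a single bright pixel in the top-left corner comes out as a brief high tone at the very start of the clip – which is the intuition behind a white object seeming to have “a speaker playing on it”.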

The brain’s neural plasticity is so remarkable that the sounds start being understood directly via the visual cortex. One triallist – who has used variations of the system for 12 years, as it takes careful training to master – went from seeing 2D black-and-white sketches to picturing colourful, 3D scenes. The system conveys neither colour nor depth; her brain filled in those details.

Advancements in technology mean more of us could start taking advantage of neural plasticity to augment or expand our senses – maker hardware such as Arduino boards mean such sci-fi goals are now achievable in weekend hackathons.
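To make that concrete, here is a minimal, hypothetical sketch of the feel-magnetic-north idea in Python (the same logic would run on an Arduino in C++): read a compass heading, and fire a vibration motor whenever the wearer faces roughly north. The function names and the 15-degree tolerance are illustrative assumptions, not any product’s actual design:

```python
def facing_north(heading_degrees, tolerance=15.0):
    """True when a compass heading is within `tolerance` degrees of north.

    Headings wrap at 360, so 355 degrees is only 5 degrees from north.
    """
    # Angular distance to north (0 degrees), folded into the 0..180 range.
    error = abs((heading_degrees % 360 + 180) % 360 - 180)
    return error <= tolerance

def buzz_if_north(heading_degrees, vibrate):
    """One polling step for a hypothetical wearable: call the
    vibration callback whenever the wearer faces (roughly) north."""
    if facing_north(heading_degrees):
        vibrate()
```

On real hardware, `heading_degrees` would come from a magnetometer read in a loop, and `vibrate` would pulse a small motor – a weekend-hackathon project rather than a research programme.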

Eagleman last year spun off his university interests into a start-up called NeoSensory, which is working on giving sensory feedback to people who use prosthetic limbs.

“You don’t get any sensory feedback like you do with your normal leg,” he said. The technology will use sensors on the bottom of a prosthetic foot to measure pressure and angle, delivering the feedback to the torso – “through skin that is displaced slightly by a few feet”.

Another NeoSensory project is a vest with vibration engines that translates spoken word into patterns. With a bit of training, users can understand what word is being said, with their brains subconsciously translating the complicated pattern into language.

While the focus is on helping deaf people, the input need not be sound – it could be anything from Twitter feeds to stock market data or weather patterns. “Anyone who wants to buy a vest can try whatever data stream we haven’t even thought of yet,” Eagleman added, as it has an open API. “People can feed in whatever data stream they want to.”
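As a toy illustration of how an arbitrary data stream might be encoded for the skin, here is a hypothetical Python mapping – an assumption for illustration, not NeoSensory’s real API or encoding – that turns a single reading into per-motor vibration strengths:

```python
def value_to_pattern(value, low, high, motors=32):
    """Map one reading from an arbitrary data stream (a stock price,
    a pollution level, a tweet rate) to per-motor vibration strengths.

    A hypothetical scheme: the value's position between `low` and
    `high` decides how many of the vest's motors switch on, so the
    buzz sweeps across the torso as the value rises.
    """
    # Clamp and normalise the reading to the 0..1 range.
    x = min(max((value - low) / (high - low), 0.0), 1.0)
    active = round(x * motors)
    # Full strength for the active motors, silence for the rest.
    return [1.0] * active + [0.0] * (motors - active)
```

The interesting part isn’t the encoding itself but the training: with enough exposure, the brain stops perceiving the pattern and starts perceiving the quantity behind it.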

Such sensory augmentation could be available to all of us in a few months via start-up Cyborg Nest, co-founded in part by Harbisson along with another cyborg artist, Moon Ribas, who has a pair of implants in her arms that let her feel the seismic activity of the planet.

The start-up’s North Sense is a small patch that gently vibrates when you’re facing north, encouraging your brain to build up a better spatial orientation model. Two of its other co-founders, Liviu Babitz and Steve Haworth, spoke to WIRED from Las Vegas, where they were attending the Association of Professional Piercers conference to show off North Sense to the public for the first time.

There’s a reason the demo is at a piercing conference: the North Sense widget is held against the skin via two bar piercings, making it not quite an implant but not an easily removable wearable, either, said Babitz.

“The permanency of it is crucial here because otherwise it will be just a tool,” he said – like a compass or GPS. Embedding it in your skin means it’s with you everywhere. “You cannot leave your eyes at home if you’ve decided you don’t need them today,” said Babitz.

It’ll take a few months for the piercing to heal before the device can start vibrating, and then another few months before you stop noticing the buzz and simply start knowing that you’re facing north. “It takes about six months for the brain to create new neural networks to understand what you’re experiencing,” Haworth said.

He knows, because he’s been through it before. A decade ago, Haworth implanted magnets in his fingertips, letting him feel magnetic fields – a WIRED writer tried the same with his assistance. Cyborg Nest is also working on ways to sense the environment, such as pollution levels or crowd density, as well as a technology to sense what’s happening behind you – Babitz wouldn’t reveal many details, though he promised a prototype already exists.

In the meantime, Cyborg Nest is already taking preorders for North Sense, with delivery expected in the next few months – the company doesn’t want to rush manufacturing, noting that getting it wrong could hurt trust down the line and put people off the sense-enhancing cyborg lifestyle.

“We see this as a first, but a very serious step, towards the future that is about to come,” Babitz said. “I think one important thing at this stage is to create trust...so even those who don’t like cyborgs will accept that cyborgs and technology are going to be part of us.”

Whether a better mental map constitutes a new sense is up for debate, however. Hamilton-Fletcher describes systems that translate data into patterns on our skin as “sense-esque” rather than true senses. “It’s no longer like a sense, it’s more like reading data,” he said.


Eagleman believes his data-interpreting vest does represent an extra sense: to your brain, what you see, smell and taste is nothing other than electrochemical signals passing between neurons.

“What a sense feels like has to do with the structure of the data coming in. If you have data coming in of a particular sort...then you interpret it as vision. But if it’s a different structure, you’d say ‘that smells like something’.”

What happens if we add too many senses – will we overwhelm our brains? “That’s unknown,” said Eagleman, “but I feel like we’re probably nowhere near the limit...[the brain] is enormous and it has plenty of room to take in all kinds of sensor data.”

This article was originally published by WIRED UK