Do New Hearing Aids Represent Coming Wave Of Mind-Controlled Wearables?

Scientists at Columbia University’s Mind Brain Behavior Institute have developed a prototype AI-powered hearing aid that allows its wearer to select the sounds they want to amplify by tuning into them, much as our natural hearing system does. The listener’s neural signals indicate a favoured voice, conversation or other sound source. The hearing aid detects those signals and amplifies the mind’s favoured source of audio at the expense of other background noise.

The development is a significant leap forward on current hearing aid technology, which is good at favouring voices while suppressing other background noise but cannot identify and tune into particular voices. That is a problem in crowded spaces where many people are talking at the same time. Hearing aids amplify all of the voices simultaneously, often leaving the wearer with the choice of either being lost in a noisy cacophony or switching off or turning down their aid. Either way, the wearer becomes isolated from the conversations happening around them.

The area of our brains that processes sound is far more sophisticated. Nima Mesgarani, part of the Columbia University team that developed the prototype hearing aid, explains that our brains “can amplify one voice over others, seemingly effortlessly.” The new technology harnesses that ability by tapping into the same neural signals.

The device’s AI first separates the incoming audio into its individual sources. It then monitors the listener’s brainwaves, which provide the vital clue: their patterns match those of the voice the listener is trying to concentrate on. The hearing aid then amplifies that voice (or voices).
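In signal-processing terms this approach is often described as auditory attention decoding: the audio mixture is split into candidate streams, each stream’s loudness envelope is compared with an envelope decoded from the listener’s neural activity, and the best-matching stream is boosted. The sketch below is a minimal illustration of that idea, assuming the separated streams and a decoded neural envelope are already available (the separation and neural decoding stages are hypothetical inputs here); it is not the Columbia team’s actual implementation.

```python
import numpy as np

def remix_by_attention(sources, neural_envelope, boost=4.0):
    """Boost the audio stream whose envelope best matches the envelope
    decoded from the listener's brainwaves.

    sources         : list of 1-D numpy arrays, the separated audio streams
    neural_envelope : 1-D numpy array, an envelope reconstructed from the
                      listener's neural signals (hypothetical decoder output)
    boost           : gain applied to the attended stream relative to the rest
    """
    scores = []
    for source in sources:
        # Crude amplitude envelope of the audio stream.
        env = np.abs(source)
        # Resample the audio envelope to the (slower) neural sampling rate.
        env = np.interp(
            np.linspace(0, len(env) - 1, len(neural_envelope)),
            np.arange(len(env)),
            env,
        )
        # Correlation between the audio envelope and the neural envelope.
        scores.append(np.corrcoef(env, neural_envelope)[0, 1])

    attended = int(np.argmax(scores))  # stream the listener is focusing on
    # Re-mix: amplify the attended stream, keep the others quieter.
    remix = sum(
        (boost if i == attended else 1.0) * s for i, s in enumerate(sources)
    )
    return remix, attended
```

In a real device, the separation and neural decoding would each be handled by continuously running neural networks, but the final step would follow the same attend-and-boost logic as this toy remixer.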

Earlier versions of the technology needed to ‘learn’ the voices it would amplify, so it worked with family and friends but not with new people. The latest version’s AI can adapt to voices it has never heard before. For now, the technology requires direct access to the brain, so it has only been tested on volunteers who were already undergoing brain surgery. The next task for the team, whose research has been published in the journal Science Advances, is to develop a device that can be worn around the ear or on the scalp.

Technology able to interpret brainwaves and take its instructions from them, in the same way our bodies do subconsciously, opens the door to other exciting future developments. The most obvious, and most important, are likely to be in medicine. New generations of prosthetics that genuinely offer dexterity and functionality comparable to biological limbs are already starting to come through.

What odds that voice assistants such as Alexa will relatively quickly become mind-controlled assistants? And further into the future, brain-controlled operating systems, worn or implanted, could feasibly allow us to replace television remote controls, light switches and pretty much any other tech we currently operate by manually selecting options through a physical interface.
