Best AI Hearing Aids
You may be wondering what all the fuss is about AI hearing aids. The buzz around AI hearing aid technology continues to grow, and you will see most of the leading manufacturers showcasing their latest AI hearing aid settings and features. Alongside all the hype, much of the content out there on AI in hearing aid technology is confusing and sometimes misleading. Hearing aid manufacturers themselves are often responsible for much of this confusion, and finding clear information about what AI can actually do in hearing aids, such as how it helps manage background noise, is difficult. In this article, we aim to address some common questions and misconceptions about how AI is currently utilised in hearing aids. However, first, we will need to examine some background on AI in general.
What is Artificial Intelligence (AI)?
AI is an umbrella term that encompasses any technology designed to simulate human learning, comprehension, problem-solving, decision-making, creativity, and autonomy. Examples we often come across include ChatGPT, which allows us to ask questions, provide data for analysis and generally treat the AI system as a font of knowledge to assist us in our daily activities. These AI systems genuinely feel ‘intelligent’ as they answer complex questions and can very quickly analyse anything we input.
Ok, so how does AI technology relate to how we use hearing aids?
We are obviously not asking our hearing devices questions, so how might AI be used in hearing aids? First, we need to take a step back to look at the different levels of AI which have developed over the last fifty years.
Machine learning (one component of AI)
Machine learning (ML) has been used in hearing aids to improve the user’s listening experience for around twenty years.
Essentially, machine learning allows computer chips to make predictions or decisions based on data. It encompasses a broad range of techniques that enable computers to learn from and make inferences based on previous data.
Consider a digital ‘Music Library’ that has thousands of songs categorised in various ways (genres, language, etc.). If the ‘Music Library’ is fed a new song (data), we can expect a machine learning system to identify the musical category to which the song should be assigned. Is it rock, pop, classical or contemporary? In other words, it is categorising the song based on previous learning.
The more data used in this learning phase, the better the system will be at classifying and analysing new data.
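The ‘Music Library’ analogy can be sketched in a few lines of code. This is a deliberately simplified nearest-centroid classifier, and the song features (tempo in beats per minute, energy on a 0–1 scale) and training values are entirely invented for illustration; a real music classifier would use far richer features and far more data.

```python
# A toy version of the 'Music Library' analogy: learn a "prototype"
# (average features) for each genre from labelled songs, then assign
# a new song to whichever prototype it sits closest to.

from statistics import mean

# Hypothetical training data: (tempo, energy) pairs per genre.
training_songs = {
    "rock":      [(140, 0.90), (150, 0.80), (135, 0.85)],
    "classical": [(70, 0.30), (60, 0.20), (80, 0.35)],
    "pop":       [(120, 0.70), (115, 0.65), (125, 0.75)],
}

def learn_prototypes(songs_by_genre):
    """The 'learning phase': average the features of each genre."""
    return {
        genre: (mean(t for t, _ in songs), mean(e for _, e in songs))
        for genre, songs in songs_by_genre.items()
    }

def classify(song, prototypes):
    """Assign a new song to the genre whose prototype is closest."""
    tempo, energy = song

    def distance(proto):
        proto_tempo, proto_energy = proto
        # Scale energy so both features contribute comparably.
        return (tempo - proto_tempo) ** 2 + (100 * (energy - proto_energy)) ** 2

    return min(prototypes, key=lambda genre: distance(prototypes[genre]))

prototypes = learn_prototypes(training_songs)
print(classify((145, 0.88), prototypes))  # a fast, energetic song → "rock"
```

The key point the analogy makes carries over directly: the more labelled examples feed the learning phase, the better the learned prototypes represent each category, and the more reliably new data is classified.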
So, are we saying that AI powered hearing aids are using pattern recognition?
For the most part, yes; however, there are some other aspects of AI being used. Let’s continue our discussion of advanced AI technology.
I keep reading about deep neural networks (DNN). What are they?
Essentially, deep neural networks (DNN) are a component of machine learning inspired by how the human brain works. It’s called deep because it has many layers of interconnected “neurons” (artificial nodes) stacked between the input and the output.
Don’t worry, understanding nodes and how computer chips work is not necessary! These ‘nodes’ have been designed to mimic how our brains function – processing a large amount of information very quickly. Our brains also learn as we process more and more information. Think of it this way. You have a shiny, new computer chip and want to use it in a hearing aid. Your aim is for the chip to be able to analyse many different sound environments and make decisions based on its analysis.
Deep Neural Networks enable the chip to learn from initial training, allowing it to recognise not only the sound environments fed into the system but also generalise to new ones that a hearing aid user may encounter.
By using a large dataset with plenty of variability, the DNN will eventually be able to generalise to other environments to handle data accurately that it has never seen in training. So, just like a human brain, the more information the DNN has been trained on, the better it can recognise patterns and make decisions.
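To make “layers of interconnected neurons” concrete, here is a minimal feed-forward network sketch. The weights are hand-picked toy values rather than trained, and the input features (speech-band energy versus broadband noise energy) are invented for illustration; a hearing aid DNN would have many more layers and weights learned from millions of sound samples.

```python
# A tiny feed-forward network: input -> hidden layer of 2 neurons ->
# one output neuron. Each "neuron" is a weighted sum passed through
# an activation function, mimicking (very loosely) a brain cell.

import math

def neuron(inputs, weights, bias):
    """One artificial node: weighted sum squashed through a sigmoid."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

def tiny_network(features):
    """Rough 'probability this sound frame contains speech' (toy weights)."""
    hidden_1 = neuron(features, [2.0, -1.0], 0.0)
    hidden_2 = neuron(features, [-1.5, 2.5], 0.5)
    return neuron([hidden_1, hidden_2], [3.0, -3.0], 0.0)

# Two made-up frames: (speech-band energy, broadband noise energy)
print(tiny_network([0.9, 0.1]))  # speech-like frame: output close to 1
print(tiny_network([0.1, 0.9]))  # noise-like frame: output close to 0
```

Training adjusts those weights automatically from labelled examples, which is why the size and variety of the training dataset matters so much: the weights end up encoding patterns general enough to cope with environments never seen during training.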
What does this mean for hearing aid users when it comes to AI?
There are currently three main ways AI-enabled hearing aids use Deep Neural Networks (DNNs).
1. AI Machine Learning Algorithms
For these hearing aids, artificial intelligence has been used in the development phase to train noise reduction and speech clarity enhancement algorithms.
AI driven hearing aid manufacturers typically feed millions of audio samples into their computer chips in these training phases.
The hearing aids try to recognise the listening environment the user is in and make automatic sound adjustments based on pattern recognition.
In other words, the AI or Machine Learning used in the development of these ‘AI Hearing aids’ determines how they process sound when in use.
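The “classify the environment, then adjust automatically” loop described above can be sketched as follows. The environment names, input features, and preset settings here are all invented for illustration; each manufacturer’s actual classifier and parameters are proprietary and far more sophisticated.

```python
# A sketch of pattern-recognition-driven adjustment: a stand-in
# classifier labels the current environment, and the hearing aid
# applies a matching preset. All names and values are hypothetical.

PRESETS = {
    "quiet":           {"noise_reduction": 0, "directionality": "omni"},
    "speech_in_noise": {"noise_reduction": 8, "directionality": "front"},
    "music":           {"noise_reduction": 0, "directionality": "omni"},
}

def classify_environment(speech_level, noise_level, music_detected):
    """Stand-in for the trained classifier running on the chip."""
    if music_detected:
        return "music"
    if speech_level > 0.5 and noise_level > 0.4:
        return "speech_in_noise"
    return "quiet"

def auto_adjust(speech_level, noise_level, music_detected=False):
    """Classify the scene, then return the environment and its preset."""
    environment = classify_environment(speech_level, noise_level, music_detected)
    return environment, PRESETS[environment]

print(auto_adjust(0.8, 0.6))  # busy-restaurant-like input
```

The crucial point is that the “intelligence” was baked in during development: at fitting time the aid is only running a fixed, trained decision rule, not learning anything new.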
2. AI-Powered User Adjustments
A few hearing aid brands have introduced interactive AI features that let you adjust your hearing aids through app-based assistants. For example:
Starkey Edge Mode records a short sample of your environment and uses AI to automatically adjust your settings.
Widex MySound lets you choose between A/B sound profiles to help the system “learn” your preferences.
Signia Assistant allows you to chat with an AI within the app to fine-tune your sound.
Whilst helpful, these tools are more about convenience and user interface than AI sound processing.
3. AI Hearing Aid Processing in Real-time
More recent trends in AI hearing aids involve using a dedicated chip inside the hearing aid that runs AI algorithms in real-time to separate speech from background noise.
This is still in its infancy, with one hearing aid manufacturer (Phonak) making bold claims about its ability to improve speech understanding in noise.
For the Phonak Audeo Sphere Infinio, the addition of a second chip has increased the size of the hearing aid. Battery consumption is also higher, and time will tell if this extra chip gives real-world benefits.
Anecdotally, clients I have fitted with these devices definitely report a change in sound quality when the real-time processing is active. Some report clear benefits in noise, while others simply notice a difference in sound quality. Phonak claims its chip performs 7.7 billion operations per second, which gives an idea of the complexity of the deep neural networks being used.
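One common way to picture real-time speech-from-noise separation is as a per-frequency-band gain “mask”: bands the network judges to be mostly noise are turned down, speech-dominated bands are left alone. The sketch below uses invented band levels and probabilities purely to show the principle; a real system would predict the mask with a trained network, frame by frame, thousands of times per second.

```python
# Sketch of mask-based noise reduction: scale each frequency band by
# how speech-like the (hypothetical) network believes it is, keeping a
# small floor so noise is attenuated rather than muted entirely.

def apply_mask(band_levels, speech_probabilities, floor=0.1):
    """Return per-band output levels after applying the speech mask."""
    return [
        level * max(probability, floor)
        for level, probability in zip(band_levels, speech_probabilities)
    ]

# Four made-up frequency bands and the network's speech probabilities:
bands = [1.0, 0.8, 0.6, 0.9]
speech_prob = [0.9, 0.2, 0.05, 0.8]
print(apply_mask(bands, speech_prob))  # noisy bands are attenuated
```

The engineering challenge hinted at in the article is doing this within a few milliseconds on a hearing-aid-sized battery, which is exactly why dedicated DNN chips have appeared.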
How are the manufacturers adding this real-time processing?
Most hearing aids use a single integrated chip responsible for DNN processing, with Phonak and ReSound using two chips.
Both camps claim advantages. The single-chip brands claim faster processing, smaller size and better battery life. The dual-chip camp claims more DNN processing power due to the extra chip. Interestingly, Phonak has increased the size of its dual-chip aid, while ReSound has managed a relatively small dual-chip device. I suspect the Phonak is performing more real-time DNN processing than the ReSound devices.
The future of modern hearing aids with artificial intelligence
The bold claims by traditional hearing aid manufacturers are often just that: claims. Yes, machine learning hearing aids have been available for some time, using algorithms that adapt to the user’s listening environment, and some recent models do employ real-time ‘deep’ processing to refine the audio signal, making it easier for individuals to hear speech.
However, we mustn’t lose sight of the fact that hearing loss remains a significant challenge for many people. Hearing loss makes it harder to hear, especially in the presence of background noise. The challenge for hearing aids is to make it as easy as possible for the wearer to hear well in noisy environments, improve speech recognition and compensate for the effects of the hearing loss.
This new technology has numerous benefits: it is changing and improving the listening experience of people with hearing loss. What is equally crucial is the ability of your audiologist to first make sense of all the technology available and, secondly, to guide you through the entire process of hearing aid fitting and follow-up.
Our Clinical Director, Paul, has a wealth of experience and expertise in hearing aid fitting, having previously taught the topic at the University of Leeds. Constantly updating his knowledge and staying current with research into hearing aid developments, he is ideally positioned to guide you through the process of selecting and optimising the best hearing aids for your hearing health. Book your Hearing Evaluation online or call 0113 8730444 for more information.
Book an appointment with our audiologist

