AudiologyOnline Phone: 800-753-2160



The Future of Hearing: Designed for People, Powered by AI

Andrew Dittberner, PhD

February 1, 2026

The Future of Hearing: Designed for People, Powered by AI - An interview with GN Chief Scientific Officer, Dr. Andrew Dittberner.

 

In this interview, GN Chief Scientific Officer, Dr. Andrew Dittberner, explains how "augmented intelligence" and dual-chip technology allow hearing aids to work in partnership with the human brain. He details a future where devices move beyond simple settings to intuitively learn a user’s personal preferences and intent in real time.

 

AudiologyOnline: Artificial Intelligence (AI) can be a little confusing in its application to hearing aids. From GN’s perspective, what is “AI” in hearing aids?

Dr. Andrew Dittberner: When we talk about AI in hearing aids, we mean technology that can do things that usually require human intelligence, like recognizing patterns and making decisions in real time. A traditional hearing aid senses sound and reacts according to fixed rules. An AI‑enabled hearing aid senses, tries to understand what’s happening, and then decides how to respond.

At GN, we focus on what we call “augmented intelligence.” We don’t want the device to replace the user’s brain; we want it to work with the brain. The hearing aid brings computing power, but the user’s own behavior and perception guide what is important. That’s our Organic Hearing philosophy: combining human intelligence and machine intelligence.

AudiologyOnline: While AI has been used in hearing aids for several years, its adoption has accelerated significantly in the past year with the introduction of dual-chip Deep Neural Network hearing aids. How has this technology enhanced the hearing experience for users compared to previous applications of AI?

Dr. Andrew Dittberner: Historically, AI was mostly a tool for engineers behind the scenes. The big step with our dual‑chip hearing aids, such as ReSound Vivia, is that advanced AI now runs inside the hearing aid to help with hearing in everyday life.

The clearest benefit is in noisy situations. Instead of simply turning everything down when it gets loud, our AI is trained to recognize and reduce specific unwanted sounds while protecting speech and important sound details.

We do this by combining classic digital signal processing with deep neural networks that are very good at recognizing complex sound patterns. The dual‑chip design gives enough computing power to run both at once, in real time, while still protecting battery life.
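As a rough illustration of this hybrid idea, the sketch below chains a conventional DSP stage (simple frame smoothing) into a tiny neural-style gain mask. This is a toy stand-in, not GN's actual implementation: the function names, the one-layer "network," and all parameter values are invented for illustration.

```python
import numpy as np

def dsp_stage(frame, prev_frame, alpha=0.7):
    """Classic-DSP stand-in: smooth the current spectral frame
    against the previous one to reduce frame-to-frame noise."""
    return alpha * frame + (1 - alpha) * prev_frame

def dnn_mask(spectrum, w, b):
    """Toy one-layer 'network': a per-band sigmoid gain in [0, 1],
    standing in for a trained deep-neural-network noise mask."""
    return 1.0 / (1.0 + np.exp(-(w * spectrum + b)))

def process_frame(frame, prev_frame, w, b):
    """Hybrid pipeline: DSP cleans the signal, then the learned
    mask attenuates bands it judges to be unwanted sound."""
    smoothed = dsp_stage(frame, prev_frame)
    mask = dnn_mask(smoothed, w, b)
    return mask * smoothed

# Example: a 4-band frame passed through the toy pipeline.
frame = np.ones(4)
out = process_frame(frame, np.zeros(4), np.ones(4), np.zeros(4))
```

In a real dual-chip device the two stages would run on separate processors in real time; here they are just two sequential function calls.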

AudiologyOnline:  How does GN deploy AI technology into its hearing aids?

Dr. Andrew Dittberner:  We share the work between the hearing aid, the AI algorithms, and the user’s brain. That’s how our Organic Hearing philosophy shows up in practice.

Imagine you’re in a noisy restaurant. First, the hearing aid’s traditional processing cleans up the signal using directional microphones. Then the AI identifies and turns down known unwanted sounds while preserving speech and spatial cues. Finally, your brain decides what to focus on.

[Image: ReSound AI hearing aids]

AudiologyOnline: What differentiates GN’s approach to AI?

Dr. Andrew Dittberner:  Our main difference is that we deliberately use AI to support the user, not to take over. In some systems, the hearing aid tries to decide on its own what you want to hear and what you don’t. Sometimes it gets it right, but when it doesn’t, the result can be jarring for the user.

In our design, the hearing aid is guided by user behavior. It looks at how you interact with sound and uses that to protect likely wanted sounds and reduce likely unwanted ones. If it isn’t sure what the main sound of interest is, it avoids making an aggressive cut. Instead, it keeps more of the soundscape available so your brain can decide.
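One way to read this "avoid aggressive cuts when unsure" behavior is as confidence-gated attenuation: reduce a sound only when the classifier is confident it is unwanted, and otherwise leave the soundscape intact for the brain to sort out. The sketch below is a hypothetical illustration, not GN's algorithm; the 0.8 threshold and 12 dB maximum cut are invented numbers.

```python
def attenuation_db(p_unwanted, threshold=0.8, max_cut_db=12.0):
    """Confidence-gated noise reduction (illustrative values only).

    p_unwanted: classifier's probability that a sound is unwanted.
    Below the confidence threshold, the sound passes through
    untouched so the listener's own brain decides what to ignore.
    """
    if p_unwanted < threshold:
        return 0.0  # uncertain: preserve the soundscape
    # Confident it's unwanted: scale the cut with how far the
    # probability sits past the threshold.
    frac = (p_unwanted - threshold) / (1.0 - threshold)
    return max_cut_db * frac
```

With these toy numbers, a sound the classifier is only 50% sure about passes through untouched, while a sound it is certain is unwanted receives the full 12 dB cut.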

I often compare this to an electric bike. On flat ground, the motor doesn’t need to help. When you go uphill, it quietly kicks in and supports you. It never replaces you; it just makes things easier.

[Image: ReSound Vivia in navy blue]

AudiologyOnline:  Are there any particular challenges or opportunities with AI and its application in hearing aids and for people with hearing loss?

Dr. Andrew Dittberner:  The opportunity is to make hearing much more personal. AI lets us shape sound more closely to each person’s needs, environments, and preferences. It opens new ways to handle complex, changing listening situations.

The challenge is that advanced AI usually needs a lot of computing power, which can mean larger devices and higher battery use. At GN, we added AI hardware in ways that keep the form factor essentially the same and protect battery life.        

AudiologyOnline:  What can people expect from AI hearing aids in the near future? What’s the next big innovation?

Dr. Andrew Dittberner:  You can expect AI hearing aids to become more personalized and more adaptive. They won’t just be set to your audiogram; they’ll increasingly learn from how you live, where you go, and what you tend to listen to.

We want this to feel like an invisible partnership between you and the device. The AI should quietly learn what you consider important and what you consider noise, instead of constantly asking you to change settings. As that improves, we’ll be able to reduce listening effort, improve understanding in difficult environments, and keep everything sounding natural.

AudiologyOnline: How will future AI‑enabled hearing aids get better at understanding a user’s intent and listening goals in real time?

Dr. Andrew Dittberner:  To do that well, we must look at how humans and machines interact. The hearing aid needs to interpret the world in a way that lines up with how the user experiences it. You can think of it as a kind of digital twin that tries to understand which sounds are important at any moment.

That means using smarter data. We may use information like head movements, gaze direction, or other sensors that help identify sound sources. Then we need to translate that into signals the AI can use in a way that matches human perception.

AudiologyOnline: You mention “personalization,” which is a big topic in AI. How is GN using AI to tailor the hearing experience to each individual’s lifestyle, and not just their audiogram?

Dr. Andrew Dittberner:  An audiogram is important, but it only tells part of the story. Real personalization means understanding what each person considers a “signal of interest” and what they consider noise, and that can change from moment to moment.

If the hearing aid tried to make all those decisions alone, it would inevitably get some of them wrong. At GN, we start by training our AI on large sets of sounds that are very likely to be unwanted in many situations. That gives us a strong base for noise management. For wanted sounds, we rely on understanding user behavior: how people naturally act when they are engaged with something they want to hear.

When the system can’t be sure if a sound is wanted or unwanted, we avoid forcing a hard decision. We preserve the sound so that the user, with their own binaural hearing and cognitive abilities, can decide what to attend to and what to ignore. That keeps the experience natural and user‑centric.

AudiologyOnline:  What do hearing care professionals need to know about AI hearing aids to best support them?

Dr. Andrew Dittberner:  While hearing care professionals don’t need to become AI engineers, it helps to have a clear picture of where AI is operating and what it does.

In the hearing aid itself, AI provides real‑time benefits like better hearing in noise, scene analysis, and intelligent automation of settings, always working alongside proven digital signal processing. On connected devices like phones, AI appears as tools that help with fitting and personalization, such as checking how well the device is seated or guiding user adjustments.

In the cloud, AI can help professionals monitor how devices are used in the real world, how well they are performing, and when there may be issues that need attention. In that sense, AI becomes a support partner for the clinician, offering insights and suggestions without replacing their judgment.

AudiologyOnline:  What is the long‑term potential benefit of AI for hearing loss?

Dr. Andrew Dittberner:  Today’s hearing aids are mostly reactive. They sense what is happening and respond based on rules we designed earlier. With AI, we can create systems that try to comprehend the listening situation and adapt far more flexibly.

The number‑one complaint we still hear is difficulty hearing in noise. That is a tough problem because what counts as “noise” changes constantly with context and intent. AI is uniquely suited to tackle this kind of dynamic challenge. By combining powerful AI with the strengths of the human brain, we believe we’re finally close to solving it.

 

 

For more information on this topic, please visit ReSound.com




Andrew Dittberner, PhD

Chief Scientific Officer, GN

Andrew Dittberner, PhD, is Vice President of Research at GN Hearing. He received his Master of Science degree in 1998 from the University of Arizona, earned his PhD in Speech and Hearing Science in 2002 from the University of Iowa, and completed professional graduate work in communication engineering at UCLA. He serves on a number of ANSI working groups and continues to consult on and support NIH-funded initiatives. He holds adjunct professor positions at three universities and served as Chair of the Hearing Aid Industry Research Consortium, an organization whose purpose is to develop and direct a non-competitive, mutually agreed-upon research agenda that benefits the hearing instrument industry, its customers, and end users. At GN Hearing, he leads an international group of research teams located in both the USA and Europe, including partnerships with academic centers that are global leaders in their respective disciplines. These teams of highly skilled scientists and engineers in neuroscience, advanced signal processing, acoustics, sensory engineering, and machine learning create and prototype new audio solutions for people with everyday hearing disabilities, people who need hearing protection, and anyone seeking to enhance or augment their overall hearing experience.