
Phonak Harnesses AI Technology in Hearing Solutions

Alyssa Ricevuto, AuD, MBA

July 15, 2024

Dr. Alyssa Ricevuto discusses everything you need to know about AI in current hearing technology, from Phonak AutoSense OS and programming to AI’s role in improving speech understanding and its potential impact on human audiologists.

 

AudiologyOnline: What is Artificial Intelligence (AI)?


Alyssa Ricevuto, AuD, MBA: Artificial Intelligence, or AI, as we commonly refer to it, is a “branch of computer science dedicated to creating intelligent machines that work and react like humans1.”

AudiologyOnline: What is machine learning and how is it different from deep learning or deep neural network (DNN)?

Alyssa Ricevuto, AuD, MBA: Machine learning is the process that gives AI the “ability to learn1.” Deep learning and deep neural network (DNN) are interchangeable terms for a subcategory of machine learning that “provides AI with the ability to mimic a human brain’s neural network1.” According to a recent study, harnessing the capabilities of DNN has the potential to improve the signal-to-noise ratio in hearing aids by up to 12 dB3.
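
To put a decibel figure like that in perspective, the short sketch below shows how a signal-to-noise ratio is computed and what a 12 dB improvement means numerically. It is a purely illustrative Python example, not code from Phonak or the cited study; the tone, noise level, and 16 kHz sample rate are assumptions chosen for the demonstration.

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels: 10 * log10(signal power / noise power)."""
    return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

# Toy "speech" (a 440 Hz tone) and white noise, 1 second at 16 kHz (assumed values)
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16_000, endpoint=False)
speech = np.sin(2 * np.pi * 440 * t)
noise = rng.normal(scale=0.5, size=t.shape)

before = snr_db(speech, noise)
# A 12 dB SNR improvement is equivalent to cutting noise power by 10**(12/10), roughly 16x
after = snr_db(speech, noise * 10 ** (-12 / 20))
print(f"SNR before: {before:.1f} dB, after a 12 dB improvement: {after:.1f} dB")
```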

AudiologyOnline: What are the benefits of AI in hearing aids?

Alyssa Ricevuto, AuD, MBA: Hearing technology companies are in the early stages of applying AI and DNN to enhance hearing aid performance, and those use cases will continue to evolve. There are many possible ways this new technology can help patients. AI and DNN can potentially be used to make speech clearer in a noisy room, assist with hearing speech from any direction, enhance battery efficiency, improve fitting flow, optimize the mixing and personalization of custom settings, make adjustments, and improve connectivity with phones, televisions, and computers.


AudiologyOnline: How is Phonak using AI?

Alyssa Ricevuto, AuD, MBA: Phonak has been using AI in hearing technology since 2000. Our first AI-based machine learning system was SoundFlow on the CORE Platform. CORE was the first to utilize a form of AI within its groundbreaking automatic system and was available in the Phonak Exelia hearing aid.


For the last 24 years, Phonak hearing devices have been using machine learning to identify and then classify the environment. The technology has continued to evolve as sound steering has improved through additional training with real-world audio recordings. The brain of the current system, called AutoSense OS, is programmed to automatically scan the patient’s surroundings and make changes in real time to provide clear, rich sound to the hearing aid wearer.
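
As a rough illustration of that classify-then-adjust loop, the sketch below shows the general shape of an automatic scene classifier in Python. It is not Phonak’s AutoSense OS implementation; the scene labels, feature extraction, threshold values, and settings are all hypothetical stand-ins for a model that would, in practice, be trained on real-world recordings.

```python
import numpy as np

# Hypothetical acoustic scenes and per-scene settings (illustrative only)
SETTINGS = {
    "quiet": {"gain_db": 0, "directionality": "omni"},
    "speech_in_noise": {"gain_db": 3, "directionality": "beamformer"},
    "music": {"gain_db": 0, "directionality": "omni"},
}

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Crude per-frame features: overall level and a rough high-frequency balance."""
    level = np.sqrt(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    high_ratio = spectrum[len(spectrum) // 2:].sum() / (spectrum.sum() + 1e-9)
    return np.array([level, high_ratio])

def classify(features: np.ndarray) -> str:
    """Stand-in for a trained classifier; thresholds here are arbitrary placeholders."""
    level, high_ratio = features
    if level < 0.01:
        return "quiet"
    return "speech_in_noise" if high_ratio > 0.3 else "music"

def process_stream(frames):
    """Classify each incoming audio frame and select the matching settings in real time."""
    for frame in frames:
        scene = classify(extract_features(frame))
        yield scene, SETTINGS[scene]
```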


With Audeo Lumity, we introduced SmartSpeech Technology in 2022 within AutoSense OS. This feature focuses on the primary reason patients seek out hearing aids: improvement in speech understanding in noisy environments. The Lumity platform is the most sophisticated application of AI/DNN in Phonak history, and it is still only the beginning of how this technology will be used to enhance the hearing experience in real-life situations, particularly in challenging environments with background noise.

AudiologyOnline: What is SmartSpeech technology and what can it do?

Alyssa Ricevuto, AuD, MBA: We know that the number one complaint of patients is hearing in noise, and that is where SmartSpeech technology shines. SmartSpeech can improve speech recognition in noisy conditions by 16% when speech is coming from in front of the wearer. When speech is coming from the side or from behind, this feature can provide 15% better speech understanding with an 11% reduction in listening effort in that same noisy setting. This shows that Phonak products not only provide the clarity your patients are looking for but are also proven to reduce listening effort. Every bit of reduction in cognitive load throughout the day allows patients to use that energy in other aspects of their lives2.

AudiologyOnline: Will AI replace the human audiologist?

Alyssa Ricevuto, AuD, MBA: Your patients still want and need counseling along with personalized support, so AI does not replace the valuable role of the hearing care professional (HCP) in the patient journey.

The greatest use of this ever-evolving technology is when we combine professional audiological care with AI tools and technologies to provide the best possible outcomes for patients.

At Phonak, we are dedicated to helping our hearing care professionals navigate the ever-evolving AI landscape. We recognize your crucial role in the patient journey. Please visit our website to see how we are always evolving to help you, and reach out to your dedicated Phonak representative with any questions.

References

Simplilearn Content Team. (2024, April 2). How Does Artificial Intelligence (AI) Work and Its Applications. Simplilearn. https://www.simplilearn.com/tutorials/artificial-intelligence-tutorial/how-does-ai-work

Woodward, J., & Latzel, M. (2022). New implementation of directional beamforming configurations show improved speech understanding and reduced listening effort. Phonak Field Study News. Available in the phonak.com Evidence Library.

Diehl, P. U., Singer, Y., Zilly, H., et al. (2023). Restoring speech intelligibility for hearing aid users with deep learning. Scientific Reports, 13, 2719.

 




Alyssa Ricevuto, AuD, MBA

Marketing Manager - Marketing Programs

Dr. Alyssa Ricevuto has been with Phonak since 2017, serving in various training roles as Clinical Trainer, Key Account Clinical Trainer, and Senior Clinical Trainer. She is currently the Program Marketing Manager. Prior to joining Phonak, Alyssa spent ten years working as a clinician in both private practice and ENT settings. Alyssa received her BA from the University of Connecticut, her Doctor of Audiology from the University of Oklahoma Health Sciences Center, and her MBA from Southern New Hampshire University.


