
20Q: Hearing Aids, Hearables and the Future of Hearing Technology

Dave Kemp
August 8, 2022


From the Desk of Gus Mueller


An often-used saying of the past 100 years or so goes something like this: “If it looks like a duck, walks like a duck, and quacks like a duck, then it’s probably a duck.” So what if it looks like a hearing aid...?

Alternative products and distribution systems for hearing aids have been around for a long time. Some of you might remember the ubiquitous Lloyds Hearing Aid ad in the Sunday newspaper, going back to the 1970s. Products like this historically were called “mail-order hearing aids,” but now typically are dubbed direct-to-consumer (DTC) products. And there appear to be a lot of them.

Somewhere along the way, the FDA got involved and coined the term Personal Sound Amplification Product (PSAP), defined as an electronic instrument for non-hearing-impaired consumers to amplify sounds in certain environments, such as for hunting or other recreational activities, and not intended to aid persons with or compensate for impaired hearing.  Now we all know that many PSAPs look, walk and talk like a duck, but technically, they’re not a duck.  But what if they are advertised as being a duck or very duck-like?  Does the FDA or FTC step in?  Well, as my colleague audiologist Tom Powers often says: “Last I checked, they don’t have hearing-aid-sniffing dogs in the Miami airport.”

And then of course we have the more modern term “wearables,” which have a sub-category called “hearables”—some of which look nothing at all like a duck.  Confusing?  Quite often, yes.  Fortunately, we have a handful of people who understand all this, and one of them is our 20Q guest author this month.

Dave Kemp is the Director of Business Development & Marketing at Oaktree Products, which truly is a Mom and Pop business, founded by his parents, Margy and Bob in the basement of their home in 1992.  He perhaps is best known for his online presence, which includes his popular blog, FutureEar.co, and his Future Ear Radio podcast, where he and his guests discuss emerging technology pertaining to hearing aids and consumer hearables—even a few things on the Aural Attention Economy.

Dave writes frequently for the prominent voice technology website, Voicebot.ai, and also has been featured on NPR’s Marketplace. He is a popular workshop lecturer, co-authored the book “Voice Technology in Healthcare,” and has numerous publications, including in the prestigious Harvard Business Review.

True to his blog “FutureEar,”  in this 20Q Dave fills us in on what we can expect in the emerging world of ear-worn computers, and yes, some might at least walk like a duck.

Gus Mueller, PhD
Contributing Editor

Browse the complete collection of 20Q with Gus Mueller CEU articles at www.audiologyonline.com/20Q

20Q: Hearing Aids, Hearables and the Future of Hearing Technology

Learning Outcomes 

After reading this article, professionals will be able to:

  • Identify qualities that differentiate consumer hearables from hearing aids.
  • Describe various hearing augmentation solutions.
  • Discuss new developments in hearables, such as sensors and machine learning algorithms.
Dave Kemp

1. I understand that if I want to know about the future of hearing aids, you’re the guy to go to! True?

Well, I don’t know about that, but it certainly is an interest of mine, and I do just so happen to have a blog named “Future Ear.” When I joined Oaktree in 2016, it was right around the same time that made-for-iPhone (MFi) hearing aids seemed to be catching on in the market. I felt that MFi hearing aids would prove to be as transformative to hearing aids as the iPhone was to phones. So, I started Future Ear to start unpacking all the ways that hearing aids were becoming more computerized and what kinds of new features might materialize as a result.

It started as a series of long-form essays, followed by a period where I was doing short-form daily updates. In 2019, I started the Future Ear Radio podcast, where I interviewed experts on the various technologies and trends that I had been writing about.

The tagline of Future Ear reads, “connecting the trends converging around the ear.” I’m largely just educating myself about things I find interesting and sharing what I am learning.

2. Well, I’m here to listen. Exactly what types of trends are converging around the ear?

The first place I’d like to start with is the theme, “The Peace Dividends of the Smartphone Wars,” from Chris Anderson’s 2012 book, Makers. Chris is the CEO of a company called 3D Robotics and a former editor-in-chief of the tech magazine Wired. In his book, Chris describes how all of the consumer technology that has followed on the heels of mass smartphone proliferation is effectively a “dividend” yielded by the gigantic economies of scale ushered in by the companies that produce smartphones.

All the components inside the smartphone – the GPS, cameras, computer chips, memory cards, batteries, microphones, speakers, sensors, and so on – are the same components that populate pretty much all consumer technology products from drones to smart speakers to AirPods and yes, even hearing aids.

So, if you look across the last 15 years or so, it’s not just that we’ve had a mass proliferation of smartphone technology across the globe. We’ve also seen mass proliferation and innovation of every component within the smartphone supply chain as well. Every component within the supply chain has become dramatically more affordable, allowing the broader consumer technology space to flourish.

3. You mentioned “dividends.” What is a typical example of a “dividend” that relates to hearing aids?

I would point to what’s going on with computer chips broadly speaking with the migration toward “systems-on-a-chip” (SoC). Companies like Qualcomm, Broadcom, and Apple have been battling to design these really sophisticated computer chip architectures that integrate and stack together the various components that power many of our consumer devices.

Without geeking out too hard here, the main point is that when we’re considering teeny, tiny in-the-ear devices, the real estate inside those devices is at a really high premium. There’s only so much space to go around, and the battery is something that’s very hard to scale down in size.

Fortunately, the engineers and designers who are creating today’s in-the-ear devices are solving the “real estate” issue by utilizing SoCs and integrating the microphones, sensors, Bluetooth antenna, central processing unit (CPU), etc. together onto a single chip.

We in the hearing health industry are among the biggest beneficiaries of the trend toward the miniaturization of computers, since we’re dealing with some of the most micro-sized computers out there.

4. That makes sense. It would seem that these “dividends” also would apply to the other products we are seeing that technically are not hearing aids?

Most certainly. It sounds like you’re referring to “hearables.” Wearables are defined as “body-worn computers,” and Nick Hunn, the chairman of the Bluetooth Special Interest Group (SIG), coined the term “hearable” as a sub-set of the wearable product category in his seminal 2014 post, “Hearables – the new Wearables” (Hunn, 2014).

The fact of the matter is that everything we wear in our ears today is increasingly becoming computerized and therefore transformed into a hearable. That’s why I think this term is kind of confusing: hearing aids are the most sophisticated ear-worn computers that exist, so wouldn’t hearing aids be considered hearables as well?

5. I see your point, but how then should we describe these computerized in-ear devices that are not hearing aids?

The term “hearable” represents all “smart” ear-worn devices. There are $100 Skullcandy headphones that contain hearing augmentation features, just as there are earbuds like the Jabra Enhance Plus that have full hearing aid functionality. It’s a product category that encompasses a broad spectrum of devices with varying degrees of sophistication.

To distinguish what I’m referring to during our discussion, I’ll refer to hearing aids as hearing aids and “hearables” as consumer hearables. The term hearables (without “consumer” preceding it) would describe the product category.

There are a lot of terms that are tossed around today, so it does get a little confusing. Everyone of course is talking about OTCs, and you may have heard the terms Personal Sound Amplification Product (PSAP) and “Direct-to-Consumer” (DTC) used to describe categories of products (many of which are not hearing aids) or delivery systems. For this discussion, I’m not specifically using those terms because the bigger picture I’m trying to paint is that, without an FDA-grade classification, PSAPs and OTC devices are really just hearables that focus on hearing amplification features and are packaged in form factors that might look like hearing aids.

In the US, the big distinction between hearing aids and consumer hearables is really hearing aids’ regulatory status as Class II medical devices and what that status affords device manufacturers in terms of how the devices are legally allowed to be programmed to perform.

The regulatory status has played an influential role in the types of hardware and technology used to power the devices, as the whole internal component composition of hearing aids is designed specifically for the parameters outlined within the regulatory framework.

6. I don’t quite follow how the regulatory framework would influence the design of hearing aids?

A clear example of this is the use of application-specific integrated circuits (ASICs), which is what nearly all hearing aids run on. As the name implies, this type of chip is used for specific applications (i.e., highly sophisticated audio and speech processing that runs on low power) that are “hard-coded” within the chip. The downside to ASICs is that they’re expensive to manufacture, as they require substantial engineering costs that must be absorbed into the cost of the device by the original equipment manufacturer (OEM). Therefore, you need some economies of scale to bring those costs down, which has served as a barrier to entry for new hearing aid OEMs.

For non-hearing aids, ASICs are overkill and too rigid in the types of functionality they can support (remember, application-specific integrated circuits). In the consumer hearables realm, the majority of devices run on digital signal processors (DSPs), which provide more flexibility in the types of applications they can support, as they’re much more generalized in their uses. In addition, DSPs are used in a vast swath of consumer electronics devices, making them much more financially palatable for consumer hearable OEMs to build their devices on.

Regulatory status aside, there are and will continue to be key differences from a hardware and technology standpoint between FDA-grade hearing aids and non-hearing aids, but certain differences are becoming less pronounced. Take battery life, for example. Breakthroughs on the wireless technology side, with the advent of Bluetooth Low Energy (BLE) and lower-power DSPs, allow for better battery preservation. Not to mention all the pocket chargers that are so pervasive now, allowing people to fast-charge their earbuds. There’s still quite a bit of disparity around battery life, but the gap is shrinking. That’s true for multiple facets of hearing aids vs. consumer hearables.

7. Consumer hearables and hearing aids share an increasing number of qualities, but there are key differences and specific roles that each type of device will play into the future?

I think you’ve got it! There’s a popular 2010 Steve Jobs quote about the role that laptop and desktop computers took on as mobile became the dominant method of computing:

“When we were an agrarian nation, all cars were trucks, because that’s what you needed on the farm. But as vehicles started to be used in the urban centers, cars got more popular … PCs are going to be like trucks. They’re still going to be around, they’re still going to have a lot of value, but they’re going to be used by one out of X people.”

As hearables of all types broadly proliferate, I believe hearing aids will take on the same kind of specialty “truck” role that Jobs was describing. Hearing aids have always taken on a specialty-type role; what’s changed is the emergence of their more generalized counterpart.

8. What you are saying is that hearing aids always will be well-suited for some tasks, but consumer hearables are increasingly becoming capable of handling a wider range of tasks?

Exactly. This is where it might be helpful for us all to draw from our own personal experiences with how we compute today. How much of your daily computing is done on a phone? A tablet? A laptop? A desktop? Which tasks have migrated from one device to another in the past few years? Why?

This is why I fixated earlier on the definition of hearables. An increasing number of people are wearing mini ear-computers, and some ear-computers might be more optimal than others for specific use cases, just as one might prefer using an iPad over an iPhone for certain things. For example, if a consumer frequently video conferences for work, what does their setup look like? Would they prefer those Zoom calls to be streamed through their hearing aids or through over-the-ear hearables that provide personalized audio augmentation?

I would argue that the rising sophistication of consumer hearables will ultimately serve as a big net-positive as they’re going to be one of the most powerful tools we’ll use to combat the “7-10 year gap” between when a consumer realizes they have some level of hearing loss and when they choose to take action.

9. I’m certainly very familiar with “the gap” with my patients, but not quite sure how hearables are going to fix this?

The question that the 7–10-year gap presents is, “why are people waiting so long to pursue hearing correction?” Price, access to hearing care, the stigma surrounding hearing aids, and denial are the most frequently cited objections to treating one’s hearing loss. Those four objections detract from hearing aids’ value proposition.

For the majority of prospective customers, the high price point, the process of seeing a hearing professional, and/or the negative connotations that the customer might have with hearing instruments, tend to outweigh the perceived upside of using hearing aids.

What’s interesting, though, is that people who do ultimately take the plunge and get professionally fitted with hearing aids report high satisfaction rates. According to MarkeTrak 2022, 83% of hearing aid owners are satisfied overall with their hearing aids (Powers & Carr, 2022). In fact, owners’ satisfaction with their hearing aids has continued to rise across MarkeTrak surveys over the years (e.g., 58% in 1989, increasing to 83% in 2022; Kochkin, 1990; Powers & Carr, 2022).

So, the data would suggest that the persistently low adoption rates of hearing aids are less of a testament to the quality of the devices or the professional service, and more closely tied to compelling people to give the devices a fair shot in the first place.

Which brings us back to a not-so-exciting value proposition. 

OTC and DTC hearing aids might be attractive options to some prospective customers whose primary objections are price and/or access. But these device offerings will still need to address the stigmatization that deters people from seeking a solution or represent a giant leap for those who deny having an issue in the first place.

Perhaps the key to increasing overall adoption of hearing correction lies with exposure to hearing augmentation, even if that exposure is to a basic level of augmentation.

10. If we expose more people to hearing augmentation, they might come to realize what they’re missing?

Maybe. As the proverbial saying goes, “You can lead a horse to water, but you can’t make him drink.” While those of us working within the hearing healthcare space feel strongly about the importance of hearing health, there seems to be a disconnect in how the public evaluates its options. They’re not really drinking, per se, and the 7-10-year gap is the result.

Now contrast that with what’s transpiring with the hearing health features within the consumer hearables space.

Take the partnership between Qualcomm and Jacoti, for example, where Qualcomm has licensed Jacoti’s hearing enhancement technology to be embedded within Qualcomm’s flagship consumer hearable chip, the QCC5100. This is a chip that’s widely used in an array of consumer hearable devices and through this partnership, Qualcomm is providing manufacturers with an out-of-the-box hearing enhancement solution.

Even if the Jacoti hearing health features are rudimentary, they still help provide people with exposure to hearing augmentation. And the fact that we’re seeing hearing augmentation and sound personalization features embedded at the chip level implies that these kinds of features will spread far and wide.

This sets up the “Hearing Health Trojan Horse.”

11. I’m a little weak on Greek mythology, but keep going…

As hearing augmentation features proliferate, it creates a scenario where consumers might be completely unaware that the next set of consumer hearables that they purchase includes a hearing augmentation feature. They’re not made aware of this type of feature until post-purchase.

Imagine the people with gradual hearing loss who can suddenly hear the birds chirp again with a quick toggle of a setting. Would this not start the gears in motion within the consumer’s mind? “What other sounds have I been missing?” (You can lead a horse to water…)

Consumer hearables with hearing health functionality help increase the surface area for these types of epiphanies to occur. Hearing professionals see these kinds of revelations play out in real-time within their clinics, but there are so many steps required to reach that point. In today’s paradigm, the patient must jump through a series of hoops before they begin to understand and appreciate the value proposition of hearing correction.

In essence, more people proactively choosing to treat their hearing loss sooner translates to a larger population of people who might eventually want to graduate to professional hearing care and higher levels of technology.

12. What are some other ways we might increase exposure to hearing augmentation?

The workplace is ripe for all kinds of hearing augmentation solutions. There are lots of new types of software that can be used for jobs that involve using a computer or phone for an extended period of time.

Applications like Sonic Cloud and Mimi can be used to personalize the audio coming from one’s computer or phone. For example, Sonic Cloud allows you to adjust the frequencies of all the sound coming out of your laptop, so that you can tailor that sound to your liking, rather than just increasing the volume across the board. So, for workers who video conference a lot at work, these apps can really help to optimize those sound environments for them.
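To make the idea concrete, here is a minimal sketch of frequency-shaped amplification: splitting a signal into bands and applying a separate gain to each, rather than raising overall volume. This is an illustrative toy using SciPy, not Sonic Cloud’s or Mimi’s actual processing, and the band edges and gain values are invented for the example.

```python
# A minimal sketch (not SonicCloud's or Mimi's implementation) of per-band
# gain shaping: split a signal into frequency bands and boost each band
# independently instead of raising overall volume. Values are illustrative.
import numpy as np
from scipy.signal import butter, lfilter

def personalize(audio, fs, band_gains_db):
    """audio: 1-D float array; band_gains_db: {(low_hz, high_hz): gain_db}."""
    shaped = np.zeros_like(audio)
    for (low, high), gain_db in band_gains_db.items():
        b, a = butter(4, [low, high], btype="band", fs=fs)   # band-pass filter
        shaped += lfilter(b, a, audio) * 10 ** (gain_db / 20)  # apply band gain
    return shaped

# Example: leave the lows alone, gently boost the highs where a mild sloping
# loss typically sits (illustrative values only).
fs = 16000
tone = np.random.randn(fs)  # stand-in for one second of audio
out = personalize(tone, fs, {(100, 1000): 0, (1000, 4000): 6, (4000, 7900): 12})
```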

Hearing professionals might want to consider adding business owners and employers to their lists of potential prospects to target with their marketing, as audio augmentation becomes more widely available via third-party applications, or with hardware that has audio augmentation embedded into the office headset itself.

13. Circling back to the trends converging around the ear, what else should we be on the lookout for?

Let’s talk about the rise of (internet) spoken word audio, which is an umbrella term that includes podcasts, audiobooks, sports talk, and news.

NPR and Edison Research recently published their annual “Spoken Word Audio Report.” In this report, they found that 45% of the US population now listens to spoken word audio daily. On average, these daily listeners tune into their spoken word audio for 2 hours and 6 minutes per day.

Bluetooth-connected hearing aids can serve as the medium for daily listeners to consume their 126 minutes of spoken word audio. How many hearing aid wearers would cite streaming audiobooks or podcasts as one of their favorite things to do with their hearing aids today?

This use case has only existed for as long as Bluetooth-enabled hearing aids have, which is less than ten years. The things that matter to prospective hearing aid wearers are changing.

Remember, people migrate use cases between computing modalities and those use cases flow in different directions. For example, a discreet RIC-style device might be quite compelling for podcast fanatics. The types of use cases that matter to people with consumer hearables will increasingly matter for people considering hearing aids too.

14. What else is happening “around the ear?”

What I’m really excited about is hearables beginning to utilize the combination of sensors (to collect data) and machine learning algorithms (to make sense of that data).

There are a variety of use cases that will spawn from this combination. For example, we’re already seeing hearing aids that utilize machine learning for the signal classification of incoming inputs such as speech in quiet, speech in noise, wind, reverberation, and even the patient’s own voice, and then automatically alter signal processing features, gain, and output to optimize performance for that listening situation.
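As a rough illustration of that classify-then-adapt loop, here is a hedged sketch using scikit-learn as a stand-in for the proprietary on-chip models. The feature set, scene labels, and processing presets are invented for the example and do not reflect any manufacturer’s implementation.

```python
# Toy version of the classify-then-adapt loop: classify the acoustic scene
# from a few frame-level features, then look up a processing preset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

SCENES = ["speech_in_quiet", "speech_in_noise", "wind", "reverberation"]
# Hypothetical per-scene presets: (noise_reduction_dB, directional_mic_on)
PRESETS = {
    "speech_in_quiet": (0, False),
    "speech_in_noise": (8, True),
    "wind":            (12, True),
    "reverberation":   (6, False),
}

# Pretend training data: rows of [level, spectral_tilt, modulation_depth]
X = np.random.rand(400, 3)
y = np.random.choice(SCENES, size=400)
clf = RandomForestClassifier(n_estimators=50).fit(X, y)

def adapt(frame_features):
    """Classify the current acoustic frame and pick a processing preset."""
    scene = clf.predict([frame_features])[0]
    return scene, PRESETS[scene]

print(adapt([0.4, 0.2, 0.7]))
```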

We’re going to see a lot more use cases where the devices are constantly monitoring their environment and adjusting their settings based on the data they register through the sensors they’re laden with.

Sensor fusion will be a big theme across the next five years.

15. Sensor fusion?

Just as we’re seeing a migration toward systems-on-a-chip, we’ll see similar design architectures used to create networks of the sensors inside hearables. Sensor fusion is how these sensors will all communicate and operate together as a network.

There are a variety of sensors that are already being implemented into hearing aids and consumer hearables, such as inertial and motion sensors. Inertial sensors, such as accelerometers and gyroscopes, are used to detect the orientation of the person wearing the device. In some products, signal processing is automatically altered when the person is moving. Hearing aids like Starkey’s Livio AI use inertial sensors for their fall detection feature.
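As a toy illustration of what an inertial sensor makes possible, here is a simple threshold-based fall detector over an accelerometer stream. The thresholds are illustrative only; commercial features such as Starkey’s combine far more signals and machine learning.

```python
# Minimal sketch of threshold-based fall detection from accelerometer data:
# flag a near-free-fall dip followed shortly by an impact spike.
import numpy as np

def detect_fall(accel_g, free_fall_g=0.4, impact_g=2.5, window=25):
    """accel_g: acceleration magnitude in g, sampled at ~50 Hz."""
    accel_g = np.asarray(accel_g)
    dips = np.where(accel_g < free_fall_g)[0]          # candidate free-fall samples
    for i in dips:
        if np.any(accel_g[i:i + window] > impact_g):   # impact within ~0.5 s
            return True
    return False

# ~1 g at rest, a brief free-fall period, then an impact spike
stream = [1.0] * 50 + [0.2] * 10 + [3.1] + [1.0] * 50
print(detect_fall(stream))  # True
```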

We’re now seeing a new series of sensors beginning to be integrated into hearables, called biometric sensors.

16. Biometric monitoring? Like an Apple Watch?

Exactly, and since hearables are really a subset of the broader wearable product category, we should probably look to devices like the Apple Watch as being a precursor of what’s to come at the ear.

Wearable companies struggled initially to figure out what specific use cases body-worn computers excelled at. Initially, most of the product positioning was around step count and other rudimentary fitness metrics that could be gathered from the inertial sensors embedded in our earliest wearables.

As time has gone on, more sophisticated sensors, such as heart-rate monitors, ECGs, and blood oxygen sensors, have been introduced into our smart watches and rings. Simultaneously, the data science methods used to analyze the data via machine learning have gotten exponentially better.

In a matter of only a few years, we went from smartwatches that count steps to ones that can detect atrial fibrillation. That’s less a testament to hardware improvements and more about the progress that’s been made in the data science that allows for higher levels of understanding of what the data mean.

We’re now seeing the same types of biometric sensors slowly migrate up to the ear with new hearables offerings. From a data monitoring standpoint, there are a lot of advantages to collecting data with hearables.

17. Really? Such as?

Accuracy – The way optical photoplethysmography (PPG) sensors gather and measure data (emitting LED light and reading what reflects back) is very sensitive to light scatter from bone, muscle, and other tissue. Since the ear is primarily cartilage and blood vessels, it allows for much more accurate data collection and readings than most other body sites (a rough sketch of how a PPG trace becomes a heart-rate estimate follows this list).

New Metrics – Most of us working with the ear know that our tympanic membranes radiate our body temperature. That’s a tough metric to capture at the wrist, but would be quite easy with hearables outfitted with a temperature sensor to read those data.

New Metrics, Part Deux – the microphones on hearables will also collect a variety of interesting information, from the acoustic environment to the user’s own speech.

Stability – one of the biggest issues with collecting data at the wrist is that motion can distort the readouts. Even if our bodies are in motion, our ear canals remain pretty stable (unless you’re going to be doing somersaults and cartwheels).
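Here is that sketch: a minimal, hedged example of turning a raw PPG waveform into a heart-rate estimate by detecting pulse peaks. Real in-ear implementations add motion-artifact rejection and calibration, which the accuracy and stability points above make easier at the ear; this only shows the basic signal path, using a synthetic pulse.

```python
# Minimal sketch: estimate heart rate from a PPG waveform by finding pulse
# peaks and converting the average peak interval to beats per minute.
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg, fs):
    """ppg: raw optical signal; fs: sample rate in Hz."""
    ppg = ppg - np.mean(ppg)                        # remove DC offset
    peaks, _ = find_peaks(ppg, distance=fs * 0.4)   # peaks >= 0.4 s apart (<150 bpm)
    intervals = np.diff(peaks) / fs                 # seconds between beats
    return 60.0 / np.mean(intervals)

# Synthetic ~72-bpm pulse for illustration
fs, beat_hz = 100, 1.2
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * beat_hz * t) + 0.1 * np.random.randn(t.size)
print(round(heart_rate_bpm(ppg, fs)))  # ~72
```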

18. Good points. Hearing aids laden with new-age sensors would make for great data collection tools?

Yes, an all-day, sensor-equipped device that is worn inside the ear canal is a pretty ideal combination to build out a personal longitudinal health data set.

Traditionally, we’ve been capturing this type of health data, such as our blood pressure, maybe once per year at the doctor’s office. With wearables, we can collect and log this information minute by minute, hour after hour.

Logging all these data and creating longitudinal data sets is meaningless without the ability to make sense of the data, which necessitates machine learning to generate actionable insights.

Wearable companies like Whoop and Oura are out in front here, providing their users with deeper-level insights around diet, sleep, exercise, and overall health. For example, Whoop users are given a “recovery score” each morning that is based on the user’s heart-rate variability, resting heart rate, sleep, and respiratory rate. The “recovery score” is intended to help quantify the user’s readiness to work out or rest, based on the aggregate data.
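To make the notion of an aggregate score concrete, here is a hypothetical sketch of how several biometric inputs might be normalized against personal baselines and blended into a single 0-100 readiness-style number. The weights and baseline values are invented for illustration; this is not Whoop’s actual formula.

```python
# Hypothetical blend of daily biometrics into one 0-100 "readiness" score.
BASELINE = {"hrv_ms": 65, "resting_hr": 58, "sleep_hours": 7.5, "resp_rate": 15}

def readiness_score(hrv_ms, resting_hr, sleep_hours, resp_rate):
    def ratio(value, base, higher_is_better=True):
        # Compare today's value to the personal baseline; cap any single
        # metric's contribution so one outlier can't dominate the score.
        r = value / base if higher_is_better else base / value
        return max(0.0, min(r, 1.2))

    score = (0.40 * ratio(hrv_ms, BASELINE["hrv_ms"]) +
             0.25 * ratio(resting_hr, BASELINE["resting_hr"], False) +
             0.25 * ratio(sleep_hours, BASELINE["sleep_hours"]) +
             0.10 * ratio(resp_rate, BASELINE["resp_rate"], False))
    return round(100 * min(score, 1.0))

print(readiness_score(hrv_ms=80, resting_hr=55, sleep_hours=8, resp_rate=14))  # 100
```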

So, for me, the question I think about is, “what will be analogous to Whoop’s recovery score when it comes to the insights that can be gleaned from the data being collected by hearables?”

One of the areas I’m watching closely is what types of metrics hearable OEMs focus on from a data collection standpoint, to better understand the possibilities for actionable insights. Will these OEMs move toward spaces like memory and cognition? Speech deterioration measured against the user’s longitudinal benchmark?

So, going back to sensor fusion, I’m thinking about the totality of data that can be gathered and combined to allow for deeper and more insightful levels of understanding that are generated by the algorithms running on top of the data.

19. Say I wear an Apple Watch, and also a next-generation hearable that has all kinds of sensors on it. Do all those data get pooled together?

That certainly could be a reality. Apple already dominates wearable device sales, but I think the company’s much bigger wearable aspiration revolves around health data. In a 2019 interview, Apple CEO Tim Cook was quoted as saying:

“I believe, if you zoom out into the future, and you look back, and you ask the question, ‘What was Apple’s greatest contribution to mankind?’ it will be about health.”

For most of us today, our electronic medical records (EMRs), diagnostic history, and general health data are stored in several fragmented silos (EPIC, Cerner, Praxis). There is very little data synthesis happening across each of our personal health data sets, which is part of the reason why our healthcare system is so reactive.

Apple Health is uniquely positioned to bring all of our data together into one big data repository, including third-party wearable data. Again, if we look toward the wrist as a precursor of what’s coming at the ear, then look no further than the integrations of companies like Whoop and Oura into Apple Health.

It’s possible that in the not-too-distant future, we’ll see hearing aid and consumer hearable companies provide the consumer with the ability to integrate their device’s data into centralized repositories like Apple Health. This all amounts to more proactive types of insight, such as potential health risks being flagged within the data set.
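As one hedged illustration of what “pooling” might look like in practice, the sketch below groups heart-rate records from an Apple Health export by the device or app that contributed them. It assumes the <Record> element layout of Apple Health’s export.xml (attributes such as type, sourceName, value, startDate), which can vary by iOS version, and it treats a hearable’s companion app as just another hypothetical source.

```python
# Minimal sketch of pooling ear-worn and wrist-worn readings once they land
# in one repository. Assumes Apple Health's export.xml <Record> layout
# (illustrative; the schema may differ across iOS versions).
import xml.etree.ElementTree as ET
from collections import defaultdict

def heart_rate_by_source(export_xml_path):
    """Group heart-rate records by the device/app that contributed them."""
    pooled = defaultdict(list)
    for _, record in ET.iterparse(export_xml_path):
        if (record.tag == "Record"
                and record.get("type") == "HKQuantityTypeIdentifierHeartRate"):
            pooled[record.get("sourceName")].append(
                (record.get("startDate"), float(record.get("value"))))
    return pooled

# e.g. {"Apple Watch": [...], "Hypothetical Hearable App": [...]}
# readings = heart_rate_by_source("export.xml")
```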

20. That’s a tantalizing future you’re painting! You might want to consider doing a podcast called something like “Future Ear!” Any last thoughts?

We in the hearing health world are living through a pretty exciting period. As I’ve outlined throughout our chat, there are a lot of macro trends that bode well for the future of hearing tech, the role of the hearing professional, and also for prospective customers who are being served a variety of new device types and use cases to choose from as they begin their hearing health journey.

My hope is that all of these trends converging around the ear will combine in a myriad of ways to help increase the number of people becoming more cognizant of the state of their hearing health, which hopefully would help to shrink the 7-10 year adoption gap.

References

Hunn, N. (2014). Hearables – the new Wearables. Wearable Technologies. https://www.wearable-technologies.com/2014/04/hearables-the-new-wearables

Kochkin, S. (1990). Introducing MarkeTrak: A Consumer Tracking Survey of the Hearing Instrument Market. The Hearing Journal, 43(5), 1–5.

Powers, T., & Carr, K. (2022). MarkeTrak: Navigating the changing landscape of hearing healthcare. Hearing Review, 29(5), 12-17.

National Public Radio & Edison Research. (2022). The Spoken Word Audio Report. Retrieved May 20, 2022.

Citation 

Kemp, D. (2022). 20Q: Hearing aids, hearables and the future of hearing technology. AudiologyOnline, Article 28290. Available at www.audiologyonline.com



Dave Kemp

Dave Kemp is the Director of Business Development & Marketing at Oaktree Products and the Founder & Editor of Future Ear. In 2017, Dave launched his blog, FutureEar.co, where he writes about what’s happening at the intersection of voice technology, wearables and hearing healthcare. In 2019, Dave started the Future Ear Radio podcast, where he and his guests discuss emerging technology pertaining to hearing aids and consumer hearables.

He has been published in the Harvard Business Review, co-authored the book “Voice Technology in Healthcare,” writes frequently for the prominent voice technology website, Voicebot.ai, and has been featured on NPR’s Marketplace. In addition, Dave travels the country giving talks to audiologists on the technological evolution that the hearing aid is currently experiencing.



