This is What Would Happen if Tesla Made Hearing Aids

Oticon, a Denmark-based hearing aid manufacturer, just released its newest hearing aid - and it sounds a lot like they took a page from Tesla's artificial intelligence playbook in designing it.

In this article, I'm going to take a deep dive into this new hearing technology and explain why I think this new generation of hearing aids is the best news in years for people with hearing loss.

I have to admit, I'm a huge Tesla geek. I'm in the camp of people who believe this technology is the future of multiple industries, including transportation, energy, and machine learning.

Even if you’re not a Tesla fan, you’ve probably heard of their self-driving car technology.

What’s fascinating about this technology is the way that Tesla is teaching its cars to drive like humans, only safer.

Tesla has close to a million cars on the road with what they call "Full Self-Driving" software. While this software doesn't yet let the cars drive from point A to point B with no human intervention, the amazing thing is that the cars are always learning.

Each Tesla car is equipped with a series of cameras and an onboard, internet-connected computer.

The footage from the cameras, combined with driving data, is what Tesla engineers use to analyze what is happening around the car.

Thanks to machine learning, Tesla is able to gather information from its huge fleet of cars. Back at their headquarters in California, a skilled team of data scientists tags different images to train the deep neural network - teaching the computer in each car what it's seeing.

Tesla's Full Self-Driving display

I drive a Tesla Model 3 with Full Self-Driving. When I go on a trip that requires me to travel on highways, I can plug the destination into my GPS, and the car will drive on the highway, pass other cars, change lanes for safety as needed, and take the correct exit for me. It does all of that without me steering, accelerating, or braking - my car is doing the driving for me.

The car is able to use its cameras to analyze what's happening around me, and then turn that two-dimensional camera information into a three-dimensional scene.

That’s pretty incredible if you really think about it.

Not only can it tell that a car is a car, a truck is a truck, and a bike is a bike - it can see a bike mounted on the back of a car and correctly determine what it's looking at.

As you can imagine, there are hundreds of millions of possible situations that the car could find itself in, and often these are situations where a human would have to improvise to drive their car.

As humans, we have the benefit of a very powerful onboard computer - our brain - to understand what's actually happening around us and to respond correctly.

Tesla has developed a computer brain, or "deep neural network," back at their headquarters that is able to analyze these "edge cases" - situations where what the car is actually seeing is not so cut and dried.

Think about a stop sign that was hit by a car and is now leaning sideways. What about a car that's being towed backwards by a truck? Or a construction crew flipping a sign back and forth between "Stop" and "Slow"? The car has to be able to analyze what it's seeing in all of these situations and react like a human.

So that's machine learning with deep neural networks: human engineers show computers how a human would act, so that the computer can learn to act like a human on its own.

Sound creepy?

Maybe...but this technology is going to allow us to use machines in ways that can help humanity in the future.

And, not to be too corny, but the future is now.

Ok, so you probably thought this was an article about hearing aids.

Here’s the tie-in.

Oticon, one of the leading global hearing aid brands, has just released their newest product, the Oticon More.

The Oticon More is the first hearing aid with an onboard Deep Neural Network.

Oticon Calls Their Machine Learning Technology "BrainHearing"

In developing this product, the engineers at Oticon trained the computer processor that drives these hearing aids with over 12 million different sounds and sound scenes.

The problem with hearing aids, historically, has been that while they could amplify sounds and even help focus on speech while damping down surrounding environmental noise, ultimately you were hearing an artificially modified signal.

This could result in you hearing sounds that you couldn't previously hear, but not totally understanding what is happening around you - or hearing some sounds to the detriment of others.

With the new Oticon More, hearing aids are able to come closer than ever before to reproducing what natural hearing in a healthy auditory system is able to do.

The onboard neural network in these hearing aids, called "MoreSound Intelligence," constantly analyzes and processes the sound environment with great precision, providing clear contrast and balance among the sounds all around you.

Because of the training the engineers have done with the deep neural network, the hearing aids are able to seamlessly handle virtually any sound scene you could ever encounter, and then send it to your ears in a detailed way - customized to your hearing profile in a way that optimally supports your brain.

Much like Tesla cars turn 2D camera images into a 3D visual scene, these hearing aids are able to use onboard microphones to properly identify and classify sounds, and then place them in a three-dimensional sound scene that allows the listener to focus on the sounds that are important and ignore the rest - just like normal hearing.

So with this new technology we’re closer than ever before to hearing aids that reproduce what healthy, natural hearing and auditory processing is able to do.

Are the robots taking over? Maybe so...and maybe that's not such a bad thing after all.

__________________________________________________________________________


If you’d like to schedule a free consultation to discuss your options with Dr. Brad Stewart or another of our skilled audiologists at ClearLife Hearing Care at our Allen or Lewisville TX offices, click here!


Dr. Brad Stewart, audiologist, is the founder and owner of ClearLife Hearing Care in Allen and Lewisville Texas. Dr. Stewart is also the creator of the NeuroHearing™ and NeuroTinnitus™ treatment programs.

Brought to you by ClearLife Hearing Care with locations in Allen, TX and Lewisville, TX.
4.9 stars on Google - over 100 reviews
See our reviews

Home Of The NeuroHearing™ Treatment Program

We have developed an industry-leading brain retraining program that results in 96% patient success with hearing loss treatment.

  • Automatic, custom-tailored brain retraining program
  • Gradual adaptation to hearing technology
  • Structured follow-up program, ensuring consistent hearing technology use
Learn More