As autonomous technology becomes more common in our cars, how we interact with our vehicles is bound to change. Today, we turn steering wheels, press pedals and flip buttons to control the way a car drives, choose the music we want to hear or open a gas tank. In the future, our voice may handle many of those tasks.
Car owners today can already access Alexa, Siri and Google Assistant in their vehicles, helping them play music, make a phone call or get directions to the closest grocery store. But that level of interaction, and the sophistication of voice assistants themselves, is expected to grow.
Nuance Communications is working on exactly this: how we use speech recognition and artificial intelligence today and in the future, particularly how we'll control and, in essence, drive our autonomous cars.
GearBrain will talk with Nuance's Adam Emfield, the company's principal user experience manager, who will help us understand what we can expect from our voice assistants in the coming years. We'll also ask him which features that we use in our cars today are likely to stay, which may disappear, and whether there are other ways of interacting with our cars that we should expect down the road.
Please tune in on GearBrain's Facebook page when we speak live with Emfield next week, on Tuesday, October 9th at 2 pm. Bring questions so we can get answers during the half-hour show.