Self-Driving Cars
Cameras, sensors and artificial intelligence determine when the driver is able to take control
Artificial intelligence-powered driver monitoring systems will be a key stepping stone for the transition from driver assistance to fully autonomous driving.
That was a central message from Bosch at CES 2020, where the automotive component supplier explained how its technology will be crucial in the development of self-driving cars of the future.
Bosch's system makes use of cameras and other sensors to understand a vehicle's occupants and whether the driver is paying attention. The driver's line of sight is monitored, along with their head position and blink rate.
For now, this data can be used to issue alerts in a regular car if the system thinks the driver is falling asleep or otherwise distracted - by looking down at their phone, for example. But in the near future, partially autonomous cars will use this data to know when the task of driving can be handed back to the driver, and how much notice they need to be given.
This is crucial as the automotive industry heads further up the SAE scale of autonomous driving, which runs from level zero to level five. Levels three and four refer to autonomous systems which can drive vehicles most of the time, but may occasionally need to ask for help, and hand over to a human.
By constantly monitoring the attention of the person behind the wheel, Bosch says future vehicles can work out how much notice the driver needs. If they are looking away from the road or have their eyes shut, the system will alert them sooner to upcoming roadworks which may require their help to navigate. If the driver is paying as much attention as if they were in control themselves, less notice time is required.
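Bosch has not published how its system maps driver attention to warning time, but the idea described above can be sketched in a few lines. Everything here - the attention cues, the thresholds, and the notice-time range - is invented purely for illustration:

```python
# Illustrative sketch only: the cues, thresholds, and timings below are
# hypothetical, not Bosch's actual algorithm.

def attention_score(eyes_on_road: bool, eyes_open: bool, blink_rate_hz: float) -> float:
    """Combine simple gaze cues into a 0-1 attention estimate (hypothetical)."""
    if not eyes_open:
        return 0.0                 # eyes shut: treat as fully inattentive
    score = 1.0
    if not eyes_on_road:
        score -= 0.5               # looking away, e.g. down at a phone
    if blink_rate_hz > 0.5:
        score -= 0.3               # frequent blinking can indicate drowsiness
    return max(score, 0.0)

def handover_notice_seconds(score: float) -> float:
    """Give a less attentive driver more warning before handover (hypothetical)."""
    min_notice, max_notice = 4.0, 15.0
    return max_notice - score * (max_notice - min_notice)
```

Under this toy mapping, a fully attentive driver gets the minimum warning (4 seconds here), while a driver with their eyes shut gets the maximum (15 seconds), matching the behaviour the article describes.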
AI will monitor driver attention and position in the car (Getty Images/iStockphoto)
Such driver monitoring systems will be a legal requirement for all new vehicles sold in the European Union from 2022 onwards.
Beyond keeping an eye on the driver, these same monitoring systems can be used to understand which passengers are in the vehicle. In turn, this helps the car's safety systems arm the airbags and seatbelt tensioners accordingly.
Instead of doing so purely based on the weight of the passenger, as is the case in vehicles today, Bosch's future system will gain a better understanding of the age and size of a passenger, and their position in the seat; all of this data will help the vehicle adjust how its airbags and seatbelt tensioners work in a crash.
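The shift from weight-only sensing to occupant-aware restraints can be illustrated with a simple decision rule. The categories, cut-offs, and mode names below are hypothetical, chosen only to show the kind of logic involved:

```python
# Hypothetical illustration: choose an airbag deployment mode from occupant
# cues rather than seat weight alone. All rules and thresholds are invented.

def airbag_mode(estimated_age: int, height_cm: int, seated_upright: bool) -> str:
    if estimated_age < 12 or height_cm < 140:
        return "suppressed"        # small occupants: a full airbag can do more harm
    if not seated_upright:
        return "low-force"         # out-of-position occupant: reduce deployment force
    return "full-force"            # adult, properly seated
```

A weight sensor alone cannot distinguish a small adult from a child, or tell whether an occupant is leaning out of position; camera-based classification, as described above, makes these distinctions possible.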
Furthermore, Bosch says this system could also alert parents if they have left a child in the car for a dangerous amount of time - on a hot day, for example - and it could also tell the emergency services exactly how many adults and children were in a car when it crashed, helping any extraction efforts.
When applied to ride-share vehicles like cars operated by Uber and Lyft, the cameras and sensors could notify a passenger if they get out and leave something behind, like a bag or coat.
Central to all of this, Bosch says, is the goal of building and maintaining consumer trust in artificial intelligence. In these examples, this means convincing buyers of semi-autonomous vehicles that the car will hand over control in a calm way, giving plenty of notice and reassuring drivers that everything is working as intended.
This certainly won't be easy, and is without doubt the toughest stage yet in the evolution of the autonomous car. Right now, we have hands-on-the-wheel driver assistance with the likes of Tesla Autopilot. One day we will have sleep-in-the-back full automation. But between these is a stage where humans and machines will have to work together, helping each other to reach a shared goal. Whether legislation will make this a smooth ride is another issue entirely.