The intricacies of Advanced Driver Assistance Systems (ADAS) are fascinating. In a recent episode of the HARMAN Experiences Per Mile podcast, I talk all-things in-cabin monitoring systems, including how these new analytical techniques can be used to create a safer environment for drivers and riders. This blog post highlights parts of the discussion, but you can listen to the entire podcast here.
In-cabin monitoring is an umbrella term for Driver and Occupant Monitoring Systems (DMS and OMS). The basic configuration is that we put a camera or cameras into the cabin of a vehicle. Then we can observe the cabin, the people and the other things that are in it to create a safer, more enjoyable environment.
DMS is specific to the driver. Usually, DMS is more robust than OMS because the driver is, of course, the person in control of the vehicle, so he or she is central to the safety of the vehicle and its occupants. OMS generally analyzes the other occupants and the things in the cabin. Both are important depending on whether you are guarding the safety of the vehicle or enhancing the feeling of well-being inside of it.
Different hardware pieces are used to enable these two types of systems, but there's a common theme that runs through both, and that's machine learning. We rely heavily on machine learning algorithms to extract very nuanced and small pieces of detail that a human observer might miss while looking at somebody's face. These details are important in understanding some of the things that affect the driver or passenger.
It's quite astounding how much information is contained just in the movement of someone's facial features. These movements are not just monitored but measured too. We measure the pupils, the eyes, eyelid movement, the frequency of eyelid movement, and more. Two interesting, distinct signals here are gaze direction and head direction. You don't always look in the direction your head is pointing, and that tells you something else about what you're doing. All this information is then used to create high-level safety features such as drowsiness detection.
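To make this concrete, here is a minimal sketch of how eyelid measurements can feed a drowsiness score. It uses the well-known PERCLOS idea (percentage of time the eyes are closed); the thresholds and the per-frame eye-openness values are hypothetical, and a production DMS would derive them from camera landmarks with machine learning rather than receive them ready-made.

```python
def perclos(eyelid_apertures, closed_threshold=0.2):
    """Fraction of frames in which the eye is mostly closed.

    eyelid_apertures: per-frame eye-openness values in [0, 1],
    where 1.0 is fully open. Values below `closed_threshold`
    count as closed. Both thresholds here are illustrative.
    """
    if not eyelid_apertures:
        return 0.0
    closed = sum(1 for a in eyelid_apertures if a < closed_threshold)
    return closed / len(eyelid_apertures)


def is_drowsy(eyelid_apertures, perclos_limit=0.15):
    """Flag drowsiness when the eyes are closed too large a fraction of the time."""
    return perclos(eyelid_apertures) > perclos_limit
```

An alert driver produces a low PERCLOS value (brief blinks only), while long eyelid closures push the score past the limit and raise the drowsiness flag.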
For L2 and L3 automated driving applications, you need to know where the driver's hands are in order to turn control back over to the driver as needed. Being able to accurately monitor and measure where the hands are is another basic function of DMS.
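As a toy illustration of the hand-monitoring function described above, the sketch below checks whether detected hand keypoints lie on the steering-wheel rim before a takeover is offered. The coordinate frame, wheel geometry, and tolerance are all hypothetical; a real DMS would get hand positions from a trained detector.

```python
import math

def hands_on_wheel(hand_positions, wheel_center, wheel_radius, tolerance=0.05):
    """Count hands whose detected keypoint lies on the wheel rim.

    hand_positions: list of (x, y) keypoints in the same (hypothetical)
    metric frame as wheel_center; wheel_radius and tolerance in meters.
    """
    on_wheel = 0
    for x, y in hand_positions:
        dist = math.hypot(x - wheel_center[0], y - wheel_center[1])
        if abs(dist - wheel_radius) <= tolerance:
            on_wheel += 1
    return on_wheel
```

An L2/L3 handover policy could then require, say, `hands_on_wheel(...) >= 1` before returning control to the driver.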
To accurately gauge the driver's state of mind, we can estimate the driver's cognitive load. If you know how busy the driver's mind is, it's possible to adapt the inside of the vehicle as well as how the vehicle interacts with the outside world. With this technique, you monitor a combination of facial features and movements and analyze them together with machine learning algorithms.
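The "combine several facial signals" step can be sketched as a simple logistic blend of normalized features. The feature names, weights, and bias below are purely illustrative; an actual system would learn such a mapping from data rather than hand-pick it.

```python
import math

def cognitive_load(pupil_dilation, blink_rate, gaze_dispersion,
                   weights=(2.0, -1.5, 1.0), bias=-0.5):
    """Combine normalized facial-feature signals into a 0-1 load score.

    Inputs are assumed pre-normalized to [0, 1]; the weights and bias
    are hypothetical stand-ins for learned model parameters.
    """
    z = (weights[0] * pupil_dilation
         + weights[1] * blink_rate
         + weights[2] * gaze_dispersion
         + bias)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash into (0, 1)
```

The score rises with pupil dilation and gaze dispersion and falls with blink rate in this toy weighting; the point is only that several weak facial cues are fused into one actionable number.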
With OMS, there are various safety systems that can benefit from understanding what the occupants are doing. For example, by monitoring the major limbs, the shoulders, and the torso, each occupant's position can be identified. This is very important for preparing the inside of the cabin and the occupant for an imminent crash. A lot of today's externally facing ADAS systems, such as automatic emergency braking, will know a few hundred milliseconds before impact that there's going to be an accident. This is where active safety and passive safety diverge.
Passive safety is about surviving an accident after the impact, whereas active safety systems allow us to prepare the cabin beforehand. Knowing where the occupants are in advance allows us to send signals to the OEM system that controls airbag deployment. The OEM can then make sure the airbags deploy safely and at the right rates or in the right sequence to cocoon the various occupants, providing a higher chance of surviving the accident.
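A minimal sketch of the kind of per-seat signal this could produce, assuming hypothetical seat IDs, distance measurements, and command names (the real interface between an OMS and an OEM's restraint controller is proprietary):

```python
def airbag_commands(seat_occupied, occupant_distance_m, min_safe_distance_m=0.25):
    """Per-seat deployment decision based on occupancy and position.

    seat_occupied: dict of seat id -> bool.
    occupant_distance_m: dict of seat id -> estimated distance (meters)
    between the occupant and the airbag module. All names and the
    0.25 m threshold are illustrative, not a real OEM specification.
    """
    commands = {}
    for seat, occupied in seat_occupied.items():
        if not occupied:
            commands[seat] = "suppress"    # empty seat: no deployment needed
        elif occupant_distance_m[seat] < min_safe_distance_m:
            commands[seat] = "low_power"   # out-of-position: reduced inflation
        else:
            commands[seat] = "full"        # normal seating position
    return commands
```

The design point is that the decision is made per seat and ahead of impact, using the position estimates the OMS already maintains.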
It's a sobering statistic that, worldwide, 2.5 people die every minute in road accidents. Whether it's a pedestrian or a person inside the car, that is an extremely high number of fatalities, and ADAS, including DMS and OMS, can help save lives in several ways.
To hear more from me on this topic, listen to this episode of the Experiences Per Mile podcast.
Dr. Alan Jenkins
Senior Director for Global ADAS Product Development at HARMAN International