
Safety Reimagined: How ADAS Empowers a Collision-Free Future

It only takes a second: a message on your phone or a child distracting you from the back seat, and, before you know it, you have crashed into the back of the vehicle that suddenly stopped in front of you.

According to the WHO, road traffic crashes cause approximately 1.3 million deaths worldwide each year, and human error is behind roughly 95% of them.

While fully self-driving cars are still a thing of the future, the automotive industry has successfully invested billions into automation systems over the past decade.

Automotive safety features such as adaptive cruise control and blind-spot monitoring are becoming standard as ADAS technology gains traction across vehicle models. But…

What is ADAS?  

Advanced Driver-Assistance Systems (ADAS) are electronic systems in a vehicle that use advanced technologies to assist the driver, including:  

  • Pedestrian detection/avoidance  
  • Lane departure warning/correction  
  • Traffic sign recognition  
  • Automatic emergency braking  
  • Blind-spot detection
  • Auto Park/Smart Park
  • Driver Monitoring & Occupancy Monitoring Systems (DMS & OMS)

Powered mostly by cameras, LiDAR (light detection and ranging), radar, and infrared sensors, ADAS features are becoming increasingly available in many vehicles to improve safety and comfort. The automotive industry is also investing considerable funds to reach Level 4 and Level 5 of autonomous driving. Since we are still far from fully autonomous vehicles, however, current ADAS features focus primarily on active safety (accident prediction and prevention) and driver assistance.

The fact that start-ups and established OEMs have invested $106 billion in autonomous-driving capabilities since 2010, most of which has gone toward enhancing ADAS, says a lot about how impactful this technology is. According to a National Safety Council report published in 2019, ADAS technology can prevent 62 percent of total traffic deaths, primarily thanks to pedestrian automatic braking and lane-keeping features. ADAS features also have the potential to prevent or mitigate about 60 percent of total traffic injuries.

Now let’s discuss the top three ADAS features that will make a difference in the automotive industry and beyond.   

Driver monitoring & occupancy monitoring systems  

Among the major causes of traffic accidents today are drowsy driving and distracted driving, e.g., paying more attention to a mobile device or passengers than to the road.

Sophisticated driver monitoring can help prevent traffic accidents by alerting the driver when the system detects the individual is distracted or falling asleep at the wheel.  

Very soon, a driver monitoring system will be a crucial feature for OEMs that want the highest safety rating from Euro NCAP. The European New Car Assessment Programme is a voluntary car safety performance assessment program. A high rating from this organization is a well-established mark of quality and safety, which encourages car manufacturers to adapt their new models to the standards Euro NCAP sets. This rating aims to substantially reduce driver errors caused by distraction, drowsiness, or other abnormal vital conditions, especially during night driving.

How it works  

Driver Monitoring technology is complex and requires many years of experience and R&D across different domains, such as deep learning, image processing, computer vision, camera technology, and embedded device design and development.

What is DMS?  

The idea behind the Driver Monitoring System (DMS) is to accurately detect the driver’s face and eyes with an interior camera, track them, and monitor the driver’s state. This should help eliminate one of the leading causes of accidents by identifying all types of distracted driving (i.e., visual, manual, cognitive). Any distraction, drowsiness, or sleep/microsleep state should be detected immediately, and an appropriate reaction taken before an accident happens.

In perfect conditions, all these unwanted situations can be detected easily. In real life, however, complex scenarios will happen: lighting conditions can reduce face visibility, or the driver might be wearing sunglasses that block the eyes. Advanced near-infrared (NIR) cameras, specially designed for driver state monitoring, must be used to overcome these issues. They capture the high-quality images required for tracking the driver’s facial features and head, while the NIR image sensors minimize the influence of environmental lighting on face images. This technology enables complete face visibility at night or in other difficult lighting conditions.

Additionally, a DMS powered by AI can efficiently detect whether the driver is eating, drinking, smoking, using the phone while driving, or not wearing a seatbelt.
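To make the eye-tracking idea more concrete, here is a minimal sketch of one common drowsiness check: computing the eye aspect ratio (EAR) from eye landmarks and flagging a possible microsleep when the eyes stay closed across many consecutive frames. The landmark source, thresholds, and frame rate are illustrative assumptions, not a description of any particular production DMS.

```python
import numpy as np

# Eye aspect ratio (EAR): ratio of vertical to horizontal eye opening.
# `eye` is a (6, 2) array of 2D landmarks around one eye, ordered as in
# the common 68-point face-landmark convention. The landmark detector
# itself (e.g. running on the NIR camera stream) is assumed to exist upstream.
def eye_aspect_ratio(eye: np.ndarray) -> float:
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

class DrowsinessMonitor:
    """Flags a possible microsleep when the EAR stays below a threshold
    for a given number of consecutive frames (illustrative values)."""

    def __init__(self, ear_threshold: float = 0.21, closed_frames: int = 45):
        self.ear_threshold = ear_threshold   # eyes considered closed below this
        self.closed_frames = closed_frames   # roughly 1.5 s at 30 fps
        self.counter = 0

    def update(self, left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        self.counter = self.counter + 1 if ear < self.ear_threshold else 0
        return self.counter >= self.closed_frames  # True -> raise a driver alert
```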

What is OMS?  

The Occupancy Monitoring System (OMS) is a newer technology than DMS. It collects information about all passengers in the vehicle to increase their safety and security and to enable ambient personalization (e.g., biometrics/health monitoring, forgotten-child detection, intrusion detection).

Together with DMS, OMS provides complete cabin monitoring, called Interior Sensing, and offers additional levels of safety for everyone in the car. Interior sensing can monitor the state of all passengers as well as pets inside the vehicle. OMS detects all occupants and enables automatic management of some safety features (e.g., enabling the child lock if a child is present).
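As a rough illustration of how occupant detections could drive such automatic safety features, the rule-based sketch below maps hypothetical per-seat detections to actions such as enabling the child lock or raising a forgotten-occupant alert. The labels and actions are assumptions made for the example, not an actual OMS interface.

```python
from dataclasses import dataclass

@dataclass
class Occupant:
    seat: str                 # e.g. "rear_left" (illustrative naming)
    category: str             # "adult", "child", "pet" (illustrative labels)
    seatbelt_fastened: bool

def cabin_safety_actions(occupants: list[Occupant], engine_off: bool) -> list[str]:
    """Derive simple safety actions from OMS detections (rule-based sketch)."""
    actions = []
    if any(o.category == "child" and o.seat.startswith("rear") for o in occupants):
        actions.append("enable_child_lock")
    if any(not o.seatbelt_fastened for o in occupants if o.category != "pet"):
        actions.append("seatbelt_reminder")
    # "Forgotten occupant" check: child or pet still detected after the car is parked.
    if engine_off and any(o.category in ("child", "pet") for o in occupants):
        actions.append("forgotten_occupant_alert")
    return actions

# Example: a child in the rear seat of a parked car triggers both
# the child lock and the forgotten-occupant alert.
print(cabin_safety_actions(
    [Occupant("rear_left", "child", seatbelt_fastened=True)], engine_off=True))
```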

Considering that DMS and OMS technology is already present in some premium cars with basic functionality, we can soon expect even more sophisticated systems that can detect the driver’s emotions or even decide that the driver is not capable of driving. For example, ADAS could detect whether the driver is drunk by looking for pupil dilation and monitoring behavior during the drive. It will also be possible to automatically manage interior features of the vehicle (e.g., volume, lighting, and temperature settings) by identifying and interpreting the emotions of all passengers on board.

Thanks to Euro NCAP, Driver Monitoring and Occupancy Monitoring Systems will practically be a requirement for any new car model launched on the European market very soon. This means DMS and OMS technology will be one of the priorities for many car manufacturers, AI companies, and camera manufacturers worldwide.  

A vision-based system   

With modern cars, there is another blind spot that most people don’t even know exists: the A-pillar blind spot. On a car, pillars are the parts that connect the roof to the body. The A-pillars are on both sides of the front windshield, the B-pillars are behind the front doors, and the C-pillars are on both sides of the rear window. As engineers have designed cars to protect passengers better, the A-pillars have grown much wider; the airbag housed in the A-pillar also adds significantly to its size. The result can be a serious reduction in visibility. The problem usually manifests itself on the driver’s side of the car, often when making a left turn, where the A-pillar blind spot can block the driver’s view of a pedestrian or a cyclist.

A-pillar ADAS

Currently, there are no off-the-shelf solutions to this problem on the market, but given the tremendous progress of ADAS, we can expect assistance systems that address it very soon.

A practical solution would be a vision-based system with a wide-angle front camera and a flexible display embedded in the A-pillar. The video stream from the camera mounted ahead of the A-pillar would be projected onto the pillar display. Additionally, a DMS camera would detect the position and movement of the driver’s head, so the real-time video stream on the A-pillar display can be adjusted to the driver’s point of view.
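A minimal sketch of how the display image could follow the driver’s head, assuming the DMS head tracker reports a head offset in camera pixels: the crop window taken from the wide-angle frame shifts with head motion to approximate the view hidden by the pillar. The resolutions and gains below are invented for illustration.

```python
import cv2
import numpy as np

DISPLAY_W, DISPLAY_H = 320, 960   # assumed A-pillar display resolution
PARALLAX_GAIN = 1.5               # crop shift (px) per pixel of head motion

def pillar_view(frame: np.ndarray, head_offset_xy: tuple[float, float]) -> np.ndarray:
    """Return the A-pillar display image for the given driver head offset.

    `head_offset_xy` is the head displacement from the nominal driving
    position, as reported by the DMS head tracker (assumed interface).
    """
    h, w = frame.shape[:2]
    crop_w, crop_h = w // 4, h // 2   # nominal region hidden by the pillar
    # Shift the crop opposite to head motion to mimic looking "around" the pillar.
    cx = int(w * 0.25 - PARALLAX_GAIN * head_offset_xy[0])
    cy = int(h * 0.50 - PARALLAX_GAIN * head_offset_xy[1])
    x0 = np.clip(cx - crop_w // 2, 0, w - crop_w)
    y0 = np.clip(cy - crop_h // 2, 0, h - crop_h)
    crop = frame[y0:y0 + crop_h, x0:x0 + crop_w]
    return cv2.resize(crop, (DISPLAY_W, DISPLAY_H))
```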

Considering that all new cars will have front and interior cameras installed, this solution seems very practical and easy to achieve.   

Automatic emergency braking  

As we already described, DMS technology can detect any driver distraction lasting more than a few seconds and alert the driver. However, emergency braking or obstacle avoidance should be performed only in critical situations. Other drivers can create these urgent situations even if you are not distracted, e.g., a vehicle that unexpectedly stops in front of you.

To equip a car with such advanced systems, it is necessary to fuse data from many sensors, mostly radars, LiDARs, and different cameras. A high-frequency radar sensor is mandatory to detect the presence of an obstacle and calculate the distance to it. To distinguish shapes and colors and quickly identify the type of object, on the other hand, high-resolution cameras are needed. The radar detects that there is an obstacle on the road, but the camera is required to determine what it is (e.g., a pedestrian or another vehicle). Hence, fusing data from multiple sensors is mandatory for reliable ADAS applications.
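A toy illustration of this fusion logic: radar supplies range and closing speed, the camera supplies the object class, and a simple time-to-collision (TTC) check decides whether to warn or brake. The thresholds and class labels are illustrative assumptions, not values from a real AEB system.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    distance_m: float          # range to the obstacle reported by radar
    closing_speed_mps: float   # positive when the gap is shrinking

@dataclass
class CameraObject:
    label: str                 # e.g. "pedestrian", "vehicle" (illustrative classes)
    confidence: float

def aeb_decision(radar: RadarTrack, camera: CameraObject,
                 warn_ttc_s: float = 2.5, brake_ttc_s: float = 1.2) -> str:
    """Fuse radar range/speed with camera classification into an AEB action."""
    if radar.closing_speed_mps <= 0:
        return "none"                      # obstacle is not getting closer
    ttc = radar.distance_m / radar.closing_speed_mps
    # Vulnerable road users get an earlier intervention (illustrative policy).
    if camera.label == "pedestrian" and camera.confidence > 0.5:
        warn_ttc_s, brake_ttc_s = warn_ttc_s + 0.5, brake_ttc_s + 0.3
    if ttc < brake_ttc_s:
        return "emergency_brake"
    if ttc < warn_ttc_s:
        return "forward_collision_warning"
    return "none"

# Example: a pedestrian 18 m ahead closing at 10 m/s (TTC = 1.8 s) triggers a warning.
print(aeb_decision(RadarTrack(18.0, 10.0), CameraObject("pedestrian", 0.9)))
```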

For these systems, reliability is critical: they should work properly in any conditions, whether day, night, rain, or fog. But since every sensor has limitations, it is recommended to combine as many different sensors as possible to overcome them. For example, foggy or rainy weather reduces visibility, making conventional cameras and LiDARs unreliable. Unlike cameras and LiDAR, radars are largely unaffected by weather conditions, so they can detect almost any obstacle in bad weather. However, radar’s relatively long wavelength limits its ability to resolve small objects. So, to detect a small animal on the road in terrible weather, we can introduce additional sensors such as thermal cameras, which are extremely valuable to driver-assistance systems.
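One simple way to express this redundancy in code is to down-weight sensors whose reliability drops in the current conditions before combining their detections. The weights below are invented for illustration; a real system would derive them from extensive validation data.

```python
# Illustrative reliability weights per sensor and condition (assumed values).
SENSOR_WEIGHTS = {
    "clear_day":   {"camera": 1.0, "lidar": 1.0, "radar": 0.9, "thermal": 0.6},
    "night":       {"camera": 0.4, "lidar": 0.9, "radar": 0.9, "thermal": 1.0},
    "fog_or_rain": {"camera": 0.3, "lidar": 0.4, "radar": 1.0, "thermal": 0.8},
}

def fused_confidence(detections: dict[str, float], condition: str) -> float:
    """Weighted average of per-sensor detection confidences (0..1)."""
    weights = SENSOR_WEIGHTS[condition]
    used = {s: c for s, c in detections.items() if s in weights}
    total = sum(weights[s] for s in used)
    return sum(weights[s] * c for s, c in used.items()) / total if total else 0.0

# Example: in fog, a strong radar return outweighs a weak camera detection.
print(fused_confidence({"camera": 0.2, "radar": 0.9, "thermal": 0.7}, "fog_or_rain"))
```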

Go beyond the wheel   

The global ADAS market is projected to grow to USD 83.0 billion by 2030, and demand for advanced driver-assistance systems will grow rapidly over the next 10 years. There are a few reasons for this trend:

  • Car manufacturers want a good safety reputation.
  • ADAS features can significantly raise a vehicle’s price.
  • Some features, e.g., driver monitoring, will most probably be mandated by regulation very soon.

These megatrends, namely advances in autonomous technology, the shift from hardware-defined to software-defined vehicles, and ever-changing customer expectations, are raising fundamental questions about the purpose of a vehicle. While they create challenges for traditional automakers, they also present a world of exciting new opportunities.

In this new digital era, where the automotive industry plays a key part, companies must get creative about reshaping their products, embracing change, and collaborating beyond industry lines to find new ways to innovate.

To learn how we can help you power up your future with different ADAS technologies, reach out to us. Our tech expertise, creativity, and ability to solve complex issues and build innovative solutions from scratch are our biggest advantage.

