What Is ADAS?

3 things you need to know

Advanced driver-assistance systems (ADAS) are the hardware and software components that automate or assist with a driver’s responsibilities. Examples of ADAS in vehicles today include adaptive cruise control, blind spot detection, lane change detection, automatic lane following, and automatic emergency braking.

Why Does ADAS Matter?

ADAS can make roads safer by minimizing human error. Some ADAS features encourage safe driving habits by alerting drivers to unsafe road scenarios, such as when a car in the driver’s blind spot would make changing lanes dangerous. Other ADAS features automate driving behaviors, such as collision avoidance with autonomous emergency braking.

In fact, a study by Boston Consulting Group estimates that ADAS could prevent 28% of all crashes and roughly 9,900 fatalities annually in the U.S.

Levels of ADAS

There are six levels of driving automation (Level 0 through Level 5) as defined by the Society of Automotive Engineers. Most cars on the road today have ADAS features between Level 0 and Level 3. Companies at the forefront of automated driving are pursuing Levels 4 and 5.

SAE J3016™ Levels of Driving Automation™

Fully autonomous vehicles may become a reality once the remaining safety, cybersecurity, and policy issues are worked out.

How Are ADAS Features Designed?

To understand how ADAS features are designed, let’s use adaptive cruise control as an example. When using this ADAS feature, the car slows down as it approaches a vehicle in front and accelerates to cruising speed if the vehicle in front moves a safe distance away.

The first step in designing adaptive cruise control (ACC) is to collect data from sensors mounted on the car. For adaptive cruise control, we need a camera and a radar sensor. The camera detects objects in the frame (vehicles, pedestrians, trees, etc.), and the radar measures the distance from our car to each object.

After collecting data from our sensors, we turn our focus to ADAS algorithm development. Adaptive cruise control can be broken down into three steps:

  1. A perception algorithm to detect whether there is a vehicle in front of us
  2. A radar processing algorithm to calculate our distance from that vehicle
  3. A control algorithm to adjust the speed of our car based on the distance measurement (see the sketch after this list)
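
To make step 3 concrete, here is a minimal MATLAB sketch of a proportional gap controller. The function name, gains, and spacing policy are illustrative assumptions rather than a production ACC design:

    function vCmd = accSpeedCommand(vEgo, vSet, gap, rangeRate)
    % Toy ACC control law (illustrative only). Inputs: ego speed vEgo and
    % driver-set speed vSet (m/s), radar-measured gap (m), and rangeRate
    % (m/s, negative when closing in on the lead vehicle).
    timeGap = 1.8;                           % desired time gap (s), assumed
    minGap  = 5;                             % standstill spacing (m), assumed
    kGap    = 0.4;                           % gain on gap error, assumed
    kRate   = 0.8;                           % gain on range rate, assumed
    gapDesired = minGap + timeGap*vEgo;      % spacing policy
    if gap >= gapDesired
        vCmd = vSet;                         % road ahead is clear: cruise
    else
        % Slow down in proportion to the gap error and the closing rate
        vCmd = vEgo + kGap*(gap - gapDesired) + kRate*rangeRate;
        vCmd = max(0, min(vCmd, vSet));      % stay within [0, set speed]
    end
    end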

We used ACC as an ADAS example, but the general methodology of choosing the right sensors and designing algorithms based on the sensor data applies to all ADAS features.

The Importance of Sensors

The three most popular sensor types used for ADAS features are camera, radar, and lidar.

Cameras

Cameras are used for detection-related ADAS tasks. Cameras on the side of a vehicle can monitor blind spots, while cameras in the front can detect lanes, vehicles, signs, pedestrians, and cyclists. The associated detection algorithms are generally built using conventional computer vision and deep learning techniques. Cameras have several advantages:

  • They provide excellent data for object detection
  • They are relatively inexpensive – the low price makes testing many types of cameras affordable for manufacturers
  • They come in many varieties – you can test and select from camera types such as fisheye, monocular, and pinhole
  • They are the most extensively researched – the camera is the oldest of the three sensor types and has been studied the most

The downside of cameras is that their data are less suited for estimating the distance to an object than data from other sensor types. For this reason, ADAS developers often use cameras in conjunction with radar.
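
As a sketch of the perception step, the snippet below runs a pretrained vehicle detector on a single camera frame. It assumes Automated Driving Toolbox is installed, and the image file name is a placeholder:

    % Detect vehicles in one camera frame (sketch).
    I = imread('highway.png');                  % placeholder image file
    detector = vehicleDetectorACF;              % pretrained ACF vehicle detector
    [bboxes, scores] = detect(detector, I);     % bounding boxes and confidences
    I = insertObjectAnnotation(I, 'rectangle', bboxes, scores);
    imshow(I)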

Radar

Radar sensors emit high-frequency radio waves and record when these waves bounce back from objects in the environment. The recorded data can be used to calculate the distance to an object. In ADAS, radar sensors are usually mounted on the front of the vehicle.
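
The distance calculation itself is simple physics: an echo’s round-trip time gives range, and its Doppler shift gives closing speed. A worked example in MATLAB, with made-up measurement values:

    % Range and radial speed from raw radar measurements (textbook formulas;
    % the time-of-flight and Doppler values are made up for illustration).
    c  = 3e8;            % speed of light (m/s)
    t  = 1.2e-6;         % round-trip time of an echo (s)
    fd = 2.5e3;          % measured Doppler shift (Hz)
    fc = 77e9;           % typical automotive radar carrier frequency (Hz)

    range = c*t/2                    % wave travels out and back: 180 m
    radialSpeed = fd*c/(2*fc)        % Doppler relation: about 4.9 m/s closing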

Radar works in varying weather conditions, which makes it a practical sensor choice for ADAS features like automatic emergency braking and adaptive cruise control.

Although radar sensor data are well-suited for distance detection algorithms, these data are less useful in algorithms for classifying the detected objects. For this reason, ADAS developers often use radar in conjunction with cameras.

Lidar

Lidar (light detection and ranging) sensors emit a laser into the environment and record when the signal returns. The returned signals are reconstructed to create a 3D point cloud that shows the lidar’s surrounding environment. Lidar data can be used to calculate the sensor’s distance from the objects in the 3D point cloud.

There are two types of lidar sensors used for ADAS applications:

  1. Electromechanical (spinning) lidar – Mounted on top of a car, this type rotates while collecting data to produce a 3D point cloud map of the environment.
  2. Solid-state lidar – This is a newer type of lidar that has no moving parts. In the long term, solid-state lidar promises to be faster, cheaper, and more accurate than electromechanical lidar. However, designing a commercially viable sensor poses engineering problems related to the safety and range of the sensor.

You can use lidar data to perform both the distance detection and object classification functions in ADAS. However, lidar data processing requires more computational power compared to camera and radar data, and poses some challenging problems for ADAS algorithm developers.
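
The distance computation itself is straightforward: each lidar return is a 3D point in the sensor’s frame, so its distance is the Euclidean norm of that point. A minimal sketch, assuming Computer Vision Toolbox and a placeholder point cloud file:

    % Distance from the sensor to every lidar return (sketch).
    ptCloud = pcread('scan.ply');           % load a recorded point cloud
    xyz = ptCloud.Location;                 % N-by-3 points in the sensor frame
    distances = sqrt(sum(xyz.^2, 2));       % Euclidean distance of each return
    nearestReturn = min(distances)          % closest detected surface (m)
    pcshow(ptCloud)                         % visualize the 3D point cloud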

Developing ADAS Algorithms with Simulation

Testing on hardware is expensive, so engineers first test their ADAS solutions using virtual simulation. Simulation environments can be 2D or 3D.

You can use 2D simulation to develop and test ADAS algorithms for camera and radar. Start by creating virtual scenes with roads, pedestrians, cyclists, and other vehicles. Then place your vehicle in the scene and mount virtual cameras and radar sensors onto it. You can then program the movement of the car to generate synthetic sensor data for ADAS algorithm development and testing.
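
A minimal cuboid-style scenario might look like the sketch below (this assumes Automated Driving Toolbox; the road layout and speeds are arbitrary example values):

    % Two-vehicle scenario on a straight road (illustrative values only).
    scenario = drivingScenario;
    road(scenario, [0 0; 200 0]);            % 200 m straight road
    ego  = vehicle(scenario, 'ClassID', 1);
    lead = vehicle(scenario, 'ClassID', 1);
    trajectory(ego,  [1 0; 150 0], 25);      % ego cruises at 25 m/s
    trajectory(lead, [40 0; 200 0], 20);     % slower lead vehicle ahead
    while advance(scenario)                  % step the simulation forward
        poses = targetPoses(ego);            % targets as seen from the ego car
    end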

3D simulation builds on 2D simulation and allows us to test lidar in addition to cameras and radar. 3D environments require more computational power because of their relative complexity.

Once you have developed ADAS algorithms in simulation environments, the next development stage is hardware-in-the-loop (HIL) testing. This involves testing ADAS algorithms with real hardware from cars, such as a real braking system, by connecting them to a simulation environment. HIL testing provides a good sense of how an ADAS component of a car will operate in the real world.

There are other ADAS tests such as driver-in-the-loop, but they all lead to in-vehicle tests to understand how the vehicle will perform when all the parts come together. This is the most expensive type of ADAS testing but also the most accurate and is required before moving a vehicle to production.

ADAS with MATLAB and Simulink

MATLAB® and Simulink® support ADAS development for each stage of the workflow:

  1. Analyzing data
  2. Synthesizing driving scenarios
  3. Designing ADAS planning and control algorithms
  4. Designing perception algorithms
  5. Deploying algorithms
  6. Integrating and testing

Analyzing Data

MATLAB enables you to access, visualize, and label live and recorded driving data for ADAS development. MATLAB also supports geographic map data via HERE HD Live Maps, OpenStreetMap, and Zenrin Japan Maps. These data are often used for ADAS algorithm development and verification.

Ground Truth Labeler app for interactively labeling ground truth data in a video, image sequence, or lidar point cloud.
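
The labeling apps can also be opened from the MATLAB command line; for example (the video file name here is a placeholder):

    % Open the Ground Truth Labeler app on a recorded driving video
    % (assumes Automated Driving Toolbox).
    groundTruthLabeler('drivingVideo.mp4')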

Synthesizing Driving Scenarios

MATLAB lets you develop and test ADAS algorithms in virtual scenarios using the cuboid simulation environment for controls, sensor fusion, and motion planning, as well as the Unreal Engine environment for perception. You can also design realistic 3D scenes with RoadRunner.

Driving Scenario Designer app for designing scenarios, configuring sensors, and generating synthetic data for ADAS applications.

Designing ADAS Planning and Control Algorithms

MATLAB provides many automated driving reference applications that can serve as starting points for designing your own ADAS planning and control algorithms.

Visualization of possible trajectories (color-coded as optimal, colliding, infeasible, or not evaluated) in a highway driving situation within the bird’s eye plot.

Designing Perception Algorithms

MATLAB provides tools for developing perception algorithms from camera, radar, and lidar data. You can develop algorithms using computer vision, deep learning, radar and lidar processing, and sensor fusion.

Detecting a stop sign using a pretrained R-CNN with MATLAB.
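
Running such a detector follows a detect-and-annotate pattern like the sketch below, which assumes rcnn is an R-CNN object detector you have already trained or loaded, and uses a placeholder image file:

    % Detect a stop sign with a trained R-CNN detector (sketch).
    I = imread('stopSignTest.jpg');                  % placeholder image
    [bboxes, scores, labels] = detect(rcnn, I);      % run the detector
    [~, idx] = max(scores);                          % keep strongest detection
    labelStr = sprintf('%s: (Confidence = %f)', char(labels(idx)), scores(idx));
    I = insertObjectAnnotation(I, 'rectangle', bboxes(idx,:), labelStr);
    imshow(I)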

Deploying ADAS Algorithms

Products such as MATLAB Coder™, Embedded Coder®, and GPU Coder™ let you automatically generate code to deploy your ADAS algorithms onto embedded devices and into service-oriented architectures such as ROS and AUTOSAR.

An NVIDIA Jetson TX2 board. You can generate CUDA code for it with GPU Coder.
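
For example, CUDA code generation is driven by the codegen command. In this sketch, detectVehicles is a hypothetical entry-point function that takes a 480-by-640 RGB image, and GPU Coder is assumed to be installed:

    % Generate CUDA code for a hypothetical entry-point function
    % detectVehicles.m (assumes GPU Coder and a CUDA-capable GPU).
    cfg = coder.gpuConfig('lib');            % target a static library
    codegen -config cfg detectVehicles -args {ones(480,640,3,'uint8')}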

Integrating and Testing

You can integrate and test your perception, planning, and control systems with Simulink tools. Using Requirements Toolbox™, you can capture and manage your ADAS requirements. You can also use Simulink Test™ to run and automate test cases in parallel.

Requirements testing for the highway lane following reference application.
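
Test files can also be executed programmatically. A minimal sketch, assuming Simulink Test and a placeholder test file name:

    % Load and run a test file in the Simulink Test manager.
    sltest.testmanager.load('LaneFollowingTests.mldatx');
    results = sltest.testmanager.run;        % run all loaded test cases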