Automated Driving Toolbox provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird’s-eye-view plot and scope for sensor coverage, detections and tracks, and displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and OpenDRIVE® road networks.
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) testing and desktop simulation of perception, sensor fusion, path planning, and control logic, you can generate and simulate driving scenarios. You can simulate camera, radar, and lidar sensor output in a photorealistic 3D environment and sensor detections of objects and lane boundaries in a 2.5-D simulation environment.
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning, autonomous emergency braking, adaptive cruise control, lane keeping assist, and parking valet. The toolbox supports C/C++ code generation for rapid prototyping and HIL testing, with support for sensor fusion, tracking, path planning, and vehicle controller algorithms.
Reference Applications
Reference applications provide a basis for designing and testing ADAS and automated driving features.
Product Highlights
Scenario Simulation
Simulation with realistic driving scenarios and sensor models is a crucial part of testing automated driving algorithms. Automated Driving Toolbox provides several options for testing these algorithms, including a cuboid simulation environment, an Unreal Engine® simulation environment, and integration with RoadRunner Scenario. The toolbox supports importing and exporting scenes and scenarios in the ASAM OpenDRIVE® and ASAM OpenSCENARIO® formats.
Generate Scenes and Scenarios from Recorded Sensor Data
Create virtual driving scenarios from vehicle data recorded using various sensors, such as a global positioning system (GPS) receiver, an inertial measurement unit (IMU), a camera, and a lidar sensor. Use raw sensor data, recorded actor track lists, or lane detections.
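A first step when turning a recorded GPS trace into scenario waypoints is projecting geodetic coordinates into a local Cartesian frame. The sketch below is a minimal, hypothetical illustration using an equirectangular approximation around a reference point; it is not a toolbox API, and the coordinates in the trace are made up.

```python
import math

def geodetic_to_local(lat, lon, ref_lat, ref_lon):
    """Convert GPS latitude/longitude (degrees) to local east-north
    coordinates (meters) using an equirectangular approximation around
    a reference point. Adequate for short recorded drives."""
    R = 6378137.0  # WGS-84 equatorial radius in meters
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = R * d_lon * math.cos(math.radians(ref_lat))
    north = R * d_lat
    return east, north

# Turn a recorded GPS trace into local waypoints, with the first
# sample as the origin of the scenario's coordinate frame.
trace = [(48.1000, 11.5000), (48.1001, 11.5002), (48.1002, 11.5004)]
ref = trace[0]
waypoints = [geodetic_to_local(lat, lon, *ref) for lat, lon in trace]
```

The resulting waypoints can then serve as the trajectory of an ego vehicle or another actor in a virtual scenario.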
Test Suite for Euro NCAP Protocols
Automatically generate a seed scenario and its variants for assessing various Euro NCAP test protocols. Visualize the generated variants, or export them to the ASAM OpenSCENARIO® file format. Using a test bench, run simulations and obtain Euro NCAP test metrics.
Planning and Control
Plan driving paths with vehicle costmaps and motion-planning algorithms. Use lateral and longitudinal controllers to follow a planned trajectory.
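To make the lateral-control idea concrete, the following sketch implements pure pursuit, a common geometric path-following controller for a kinematic bicycle model. This is a generic illustration under assumed inputs (pose, lookahead waypoint, wheelbase), not the toolbox's controller implementation.

```python
import math

def pure_pursuit_steer(pose, waypoint, wheelbase):
    """Compute a steering angle (radians) that drives a kinematic
    bicycle model toward a lookahead waypoint.
    `pose` is (x, y, heading); `waypoint` is (x, y) in the same frame."""
    x, y, theta = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    # Transform the lookahead point into the vehicle frame.
    local_x = math.cos(-theta) * dx - math.sin(-theta) * dy
    local_y = math.sin(-theta) * dx + math.cos(-theta) * dy
    ld2 = local_x ** 2 + local_y ** 2  # squared lookahead distance
    # Curvature of the arc passing through the waypoint.
    curvature = 2.0 * local_y / ld2
    return math.atan(wheelbase * curvature)

# Vehicle at the origin heading along +x; a waypoint ahead and to the
# left yields a positive (left) steering command.
steer = pure_pursuit_steer((0.0, 0.0, 0.0), (10.0, 2.0), 2.8)
```

A longitudinal controller (for example, a PI loop on speed error) would typically run alongside this to track the planned velocity profile.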
Detection, Tracking, and Ground Truth Labeling
Develop and test vision and lidar processing algorithms for automated driving. Use a multi-sensor fusion and multi-object tracking framework with Kalman filters. Automate the labeling of ground truth data and compare it with the output of an algorithm under test. Using the Ground Truth Labeler app, label multiple signals representing the same scene, such as videos, image sequences, and lidar point clouds.
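The core of Kalman-filter-based tracking can be shown with a minimal constant-velocity filter for a single object. This is a hand-rolled sketch with simplified process noise, written with scalar algebra for the 2-state case; it illustrates the predict/update cycle, not the toolbox's tracking framework.

```python
def cv_kalman_track(measurements, dt, q=0.01, r=0.25):
    """Track one object with a constant-velocity Kalman filter.
    State is (position, velocity); measurements are noisy positions.
    Scalar algebra for the 2-state filter, so no matrix library is needed."""
    px, vx = 0.0, 0.0                 # state estimate
    p11, p12, p22 = 1.0, 0.0, 1.0     # covariance entries (symmetric 2x2)
    for z in measurements:
        # Predict: propagate state and covariance through the motion model.
        px += vx * dt
        p11 += 2 * dt * p12 + dt * dt * p22 + q
        p12 += dt * p22
        p22 += q
        # Update: blend in the position measurement via the Kalman gain.
        s = p11 + r                   # innovation covariance
        k1, k2 = p11 / s, p12 / s     # gains for position and velocity
        innov = z - px
        px += k1 * innov
        vx += k2 * innov
        p11, p12, p22 = (1 - k1) * p11, (1 - k1) * p12, p22 - k2 * p12
    return px, vx

# Feed detections from a target moving at 1 m/s; the filter should
# converge toward the true position and velocity.
dt = 0.1
est_px, est_vx = cv_kalman_track([k * dt for k in range(1, 31)], dt)
```

A multi-object tracker wraps filters like this with data association, assigning each new detection to an existing track or spawning a new one.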
Localization and Mapping
Use simultaneous localization and mapping (SLAM) algorithms to build maps surrounding the ego vehicle based on visual or lidar data. Access and visualize high-definition map data from the HERE HD Live Map service. Display vehicle and object locations on streaming map viewers.
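The mapping half of SLAM can be illustrated by accumulating registered lidar returns into a 2D occupancy grid. The sketch below assumes localization is already solved (each scan comes with a known pose), which is the simplest case; it is a generic illustration, not a toolbox SLAM algorithm, and the scan data is invented.

```python
import math

def build_occupancy_grid(scans_with_poses, cell_size, half_extent):
    """Accumulate 2D lidar returns into an occupancy grid.
    `scans_with_poses` pairs a pose (x, y, heading) with a scan,
    where a scan is a list of (range, bearing) returns.
    Returns the set of occupied cell indices."""
    grid = set()
    for (x, y, theta), scan in scans_with_poses:
        for rng, bearing in scan:
            # Project each return into world coordinates.
            wx = x + rng * math.cos(theta + bearing)
            wy = y + rng * math.sin(theta + bearing)
            i, j = int(wx // cell_size), int(wy // cell_size)
            if abs(i) <= half_extent and abs(j) <= half_extent:
                grid.add((i, j))
    return grid

# A wall seen from two different poses lands in the same cells,
# showing how registered scans accumulate into a single map.
scan = [(5.0, 0.0), (5.2, 0.1)]
grid = build_occupancy_grid(
    [((0.0, 0.0, 0.0), scan), ((0.5, 0.0, 0.0), [(4.5, 0.0)])],
    cell_size=0.5, half_extent=100)
```

Full SLAM closes the loop by also estimating the poses themselves, typically via scan matching and pose-graph optimization.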