
## Build a Driving Scenario and Generate Synthetic Detections

This example shows you how to build a driving scenario and generate vision and radar sensor detections from it by using the Driving Scenario Designer app. You can use these detections to test your controllers or sensor fusion algorithms.

This example covers the entire workflow for creating a scenario and generating synthetic detections. Alternatively, you can generate detections from prebuilt scenarios. For more details, see Generate Synthetic Detections from a Prebuilt Driving Scenario.

### Create a New Driving Scenario

To open the app, at the MATLAB® command prompt, enter `drivingScenarioDesigner`.

### Add a Road

Add a curved road to the scenario canvas. From the app toolstrip, click Add Road. Then click one corner of the canvas, extend the road to the opposite corner, and double-click to create the road.

To make the road curve, add a road center around which to curve it. Right-click the middle of the road and select Add Road Center. Then drag the added road center to one of the empty corners of the canvas.

To adjust the road further, you can click and drag any of the road centers. To create more complex curves, add more road centers.
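The same curved road can be built programmatically with a `drivingScenario` object. The road centers below are illustrative, not the exact points you create in the app:

```matlab
% Sketch of the equivalent programmatic workflow. A curve needs at least
% three road centers; the app interpolates a smooth road through them.
scenario = drivingScenario;
roadCenters = [0 0; 25 20; 50 50];   % [x y] road centers, in meters
road(scenario, roadCenters);
plot(scenario)                       % visualize the curved road
```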

### Add Lanes

By default, the road is a single lane and has no lane markings. To make the scenario more realistic, convert the road into a two-lane highway. In the left pane, on the Roads tab, expand the Lanes section. Set the Number of lanes to `2` and the Lane Width to `3.6` meters, which is a typical highway lane width.

The road is now one-way and has solid lane markings on either side to indicate the shoulder. Make the road two-way by converting the center lane marking from a single dashed line to a double solid yellow line. From the Marking list, select `2: Dashed`. Then set Type to `DoubleSolid` and set Color to yellow.
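Programmatically, the lane layout corresponds to a `lanespec` object whose markings run from the left edge to the right edge of the road. A minimal sketch, reusing illustrative road centers:

```matlab
% Two-lane, two-way road: 3.6 m lanes with a yellow double-solid center
% marking. A two-lane road has three markings: left edge, center, right edge.
scenario = drivingScenario;
roadCenters = [0 0; 25 20; 50 50];                 % illustrative curve
markings = [laneMarking('Solid'); ...
            laneMarking('DoubleSolid', 'Color', 'yellow'); ...
            laneMarking('Solid')];
ls = lanespec(2, 'Width', 3.6, 'Marking', markings);
road(scenario, roadCenters, 'Lanes', ls);
```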

### Add Vehicles

By default, the first car that you add to a scenario is the ego vehicle, which is the main car in the driving scenario. The ego vehicle contains the sensors that detect lane markings, pedestrians, and other cars in the scenario. Add the ego vehicle, and then add a second car for the ego vehicle to detect.

#### Add Ego Vehicle

To add the ego vehicle, right-click one end of the road, and select Add Car. To specify the trajectory of the car, right-click the car, select Add Waypoints, and add waypoints along the road for the car to pass through. After you add the last waypoint along the road, press Enter. The car autorotates in the direction of the first waypoint. For finer precision over the trajectory, you can adjust the waypoints. You can also right-click the path to add new waypoints.

Now adjust the speed of the car. In the left pane, on the Actors tab, set Constant Speed to `15` m/s. For more control over the speed of the car, clear the Constant Speed check box and set the velocity between waypoints in the Waypoints table.
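In code, adding the ego vehicle and its path maps to the `vehicle` and `trajectory` functions. The waypoints below are illustrative; pass a vector of speeds instead of a scalar to vary the speed between waypoints:

```matlab
% Ego vehicle following waypoints along the road at a constant 15 m/s.
egoVehicle = vehicle(scenario, 'ClassID', 1);   % ClassID 1 = car
waypoints = [1 2; 18 10; 34 28; 48 48];         % illustrative [x y] points
speed = 15;                                     % m/s; or one speed per waypoint
trajectory(egoVehicle, waypoints, speed);
```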

#### Add Second Car

Add a vehicle for the ego vehicle to detect. From the app toolstrip, click Add Actor and select Car. Add the second car with waypoints, driving in the lane opposite from the ego vehicle and on the other end of the road. Leave the speed and other settings of the car unchanged.

### Add a Pedestrian

Add to the scenario a pedestrian crossing the road. Zoom in (Ctrl+Plus) on the middle of the road, right-click one side of the road, and click Add Pedestrian. Then, to set the path of the pedestrian, add a waypoint on the other side of the road.

To test the speed of the cars and the pedestrian, run the simulation. Adjust actor speeds or other properties as needed by selecting the actor from the left pane of the Actors tab.
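A pedestrian is added the same way with the `actor` function. The dimensions, crossing points, and walking speed below are illustrative; ClassID 4 corresponds to the default pedestrian actor profile:

```matlab
% Pedestrian crossing the road at roughly walking speed.
ped = actor(scenario, 'ClassID', 4, ...
    'Length', 0.24, 'Width', 0.45, 'Height', 1.7);
crossing = [25 14; 25 22];       % illustrative points on either side of the road
trajectory(ped, crossing, 1.5);  % ~1.5 m/s walking speed
```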

### Add Sensors

Add front-facing radar and vision (camera) sensors to the ego vehicle. Use these sensors to generate detections of the pedestrian, the lane boundaries, and the other vehicle.

#### Add Camera

From the app toolstrip, click Add Camera. The sensor canvas shows standard locations at which to place sensors. Click the front-most predefined sensor location to add a camera sensor to the front bumper of the ego vehicle. To place sensors more precisely, you can disable snapping options. In the bottom-left corner of the sensor canvas, click the Configure the Sensor Canvas button.

By default, the camera detects only actors and not lanes. To enable lane detections, on the Sensors tab in the left pane, expand the Detection Parameters section and set Detection Type to `Objects & Lanes`. Then expand the Lane Settings section and update the settings as needed.
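The app camera corresponds to a `visionDetectionGenerator` System object™. A minimal sketch; the mounting position and range are illustrative, not the app defaults:

```matlab
% Front-bumper camera that reports both object and lane detections.
camera = visionDetectionGenerator( ...
    'SensorIndex', 1, ...
    'SensorLocation', [3.7 0], ...             % front bumper, on the centerline
    'DetectorOutput', 'Lanes and objects', ... % enable lane detections
    'MaxRange', 150);                          % meters
```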

#### Add Radar

Snap a radar sensor to the front-left wheel. Right-click the predefined sensor location for the wheel and select Add Radar. By default, sensors added to the wheels are short range.

Tilt the radar sensor toward the front of the car. Move your cursor over the coverage area, then click and drag the angle marking.

Add an identical radar sensor to the front-right wheel. Right-click the sensor on the front-left wheel and click Copy. Then right-click the predefined sensor location for the front-right wheel and click Paste. The orientation of the copied sensor mirrors the orientation of the sensor on the opposite wheel.

The camera and radar sensors now provide overlapping coverage of the front of the ego vehicle.
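The wheel-mounted radars correspond to `radarDetectionGenerator` System objects. The positions, yaw angles, and range below are illustrative; note how the mirrored right-side sensor simply negates the yaw:

```matlab
% Short-range radars at the front wheels, tilted toward the front of the car.
radarLeft = radarDetectionGenerator( ...
    'SensorIndex', 2, ...
    'SensorLocation', [2.8 0.9], ...   % front-left wheel, [x y] in meters
    'Yaw', 30, ...                     % tilted toward the front-left
    'MaxRange', 50);
radarRight = radarDetectionGenerator( ...
    'SensorIndex', 3, ...
    'SensorLocation', [2.8 -0.9], ...  % front-right wheel
    'Yaw', -30, ...                    % mirror of the left sensor
    'MaxRange', 50);
```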

### Generate Sensor Detections

#### Run Scenario

To generate detections from the sensors, click Run. As the scenario runs, the Ego-Centric View displays the scenario from the perspective of the ego vehicle. The Bird's-Eye Plot displays the detections.

To turn off certain types of detections, in the bottom-left corner of the bird's-eye plot, click the Configure the Bird's-Eye Plot button.

By default, the scenario ends when the first actor stops. To have the scenario run for a set time instead, from the app toolstrip, click Settings and change the stop condition.
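Programmatically, the simulation loop steps the scenario with `advance` and queries each sensor at every time step. A sketch, assuming the `scenario`, `egoVehicle`, `camera`, and `radarLeft` variables from the earlier snippets:

```matlab
% scenario.StopTime = 10;           % optional: run for a set time instead
% Step the scenario until it ends (advance returns false when the first
% actor stops or StopTime is reached).
while advance(scenario)
    simTime = scenario.SimulationTime;
    targets = targetPoses(egoVehicle);     % other actors, in the ego frame
    lbs = laneBoundaries(egoVehicle);      % lane boundaries, in the ego frame
    [objDets, ~, objValid, laneDets, ~, laneValid] = ...
        camera(targets, lbs, simTime);     % 'Lanes and objects' output
    [radDets, ~, radValid] = radarLeft(targets, simTime);
    % ... buffer or process the detections here
end
```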

#### Export Sensor Detections

To export the detections to the MATLAB workspace, from the app toolstrip, click Export > Export Sensor Data. Name the workspace variable and click OK. The app saves the sensor data as a structure containing the actor poses, object detections, and lane detections at each time step.

To export a MATLAB function that generates the scenario and its detections, click Export > Export MATLAB Function. The scenario is a `drivingScenario` object. The sensor detections are generated by `visionDetectionGenerator` and `radarDetectionGenerator` System objects. To adjust the parameters of the scenario, you can update the code in the exported function directly. To generate new detections, call the exported function.
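Calling the exported function regenerates the scenario and its detections. The function name below is hypothetical (you choose it when exporting), and the exact outputs depend on your release:

```matlab
% myDrivingScenario is a hypothetical name for the exported function.
% Typical outputs: the recorded sensor data, the drivingScenario object,
% and the sensor objects used to generate the detections.
[allData, scenario, sensors] = myDrivingScenario;
```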

### Save Scenario

After you generate the detections, click Save to save the scenario file. In addition, you can save the sensor models separately. You can also save the road and actor models into a separate scenario file.

You can reopen this scenario file from within the app or by using this syntax at the MATLAB command prompt:

`drivingScenarioDesigner(scenarioFileName)`

If you are developing a driving algorithm in Simulink®, you can use a Scenario Reader block to read the roads and actors from this file into your model. However, because the block does not support reading in sensor data, the sensors you created are ignored. You must instead create the sensors within your model, using blocks such as Radar Detection Generator and Vision Detection Generator.

## Related Topics

#### Implementing an Adaptive Cruise Controller with Simulink

Download technical paper