Generate High Definition Scene from Lane Detections

This example shows how to generate a High-Definition (HD) scene containing roads with lanes by using data extracted from OpenStreetMap®, Global Positioning System (GPS) data, and lane detections.

Lanes in the roads of virtual driving scenarios are essential for performing safety assessments of automated driving applications. You can create a real-world scenario by using a Standard-Definition (SD) map imported from OpenStreetMap®. However, SD maps lack the detailed lane information that is essential for navigation in an autonomous system. This example shows a workflow to generate an HD map, with detailed lane information, from an SD map by using recorded lane detections and GPS sensor data.

This figure shows these steps.

Load Sensor Data

Download a ZIP file containing the GPS data, lane detections, and camera information, and then unzip the file. In this example, you use the camera data for visual validation of the generated scene.

dataFolder = tempdir;
dataFilename = "PolysyncSensorData.zip";
url = "https://ssd.mathworks.com/supportfiles/driving/data/"+dataFilename;
filePath = fullfile(dataFolder, dataFilename);
if ~isfile(filePath)
    websave(filePath,url);
end
unzip(filePath, dataFolder);
dataset = fullfile(dataFolder,"PolysyncSensorData");
data = load(fullfile(dataset,"sensorData.mat"));

Load the GPS data.

gpsData = data.GPSData;

gpsData is a table with these columns:

  • timeStamp — Time, in microseconds, at which the GPS data was collected.

  • latitude — Latitude coordinate values of ego waypoints. Units are in degrees.

  • longitude — Longitude coordinate values of ego waypoints. Units are in degrees.

  • altitude — Altitude coordinate values of ego waypoints. Units are in meters.

Display the first five entries of gpsData.

gpsData(1:5,:)
ans=5×4 table
    timeStamp     latitude    longitude    altitude
    __________    ________    _________    ________

    1.4616e+15     45.532      -122.62      34.432 
    1.4616e+15     45.532      -122.62      34.434 
    1.4616e+15     45.532      -122.62      34.395 
    1.4616e+15     45.532      -122.62      34.379 
    1.4616e+15     45.532      -122.62      34.383 
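The timeStamp values are POSIX times recorded in microseconds. As a quick sanity check (an optional aside, not part of the original workflow), you can convert one of these values to a readable date by using the datetime function:

```matlab
% Convert a microsecond POSIX timestamp to a readable datetime (UTC).
ts = gpsData.timeStamp(1);                     % first recorded timestamp
t  = datetime(ts/1e6,"ConvertFrom","posixtime","TimeZone","UTC")
```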

Load the recorded lane detection data. Alternatively, you can generate lane detections by processing raw camera data. For more information on how to generate lane detections from camera data, see the Extract Lane Information from Recorded Camera Data for Scene Generation example.

laneDetections = data.LaneDetections;

laneDetections is a table with these columns. For more information on how to organize lane detections into this format, see the Preprocess Lane Detections for Scenario Generation example.

  • timeStamp — Time, in microseconds, at which lane detection was recorded.

  • left — Left boundary information, specified as a parabolicLaneBoundary model.

  • right — Right boundary information, specified as a parabolicLaneBoundary model.

Display the first five entries of laneDetections.

laneDetections(1:5,:)
ans=5×3 table
    timeStamp               left                         right          
    __________    _________________________    _________________________

    1.4616e+15    1×1 parabolicLaneBoundary    1×1 parabolicLaneBoundary
    1.4616e+15    1×1 parabolicLaneBoundary    1×1 parabolicLaneBoundary
    1.4616e+15    1×1 parabolicLaneBoundary    1×1 parabolicLaneBoundary
    1.4616e+15    1×1 parabolicLaneBoundary    1×1 parabolicLaneBoundary
    1.4616e+15    1×1 parabolicLaneBoundary    1×1 parabolicLaneBoundary
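Each parabolicLaneBoundary object stores the parabola parameters of a lane boundary in vehicle coordinates. As an illustrative sketch (assuming laneDetections is loaded as above), you can evaluate a boundary at a set of longitudinal distances by using its computeBoundaryModel object function:

```matlab
% Evaluate the left boundary of the first detection at several
% longitudinal distances ahead of the ego vehicle.
leftBoundary = laneDetections.left(1);
xAhead = 0:10:40;                                      % query points, in meters
yOffsets = computeBoundaryModel(leftBoundary,xAhead)   % lateral offsets, in meters
```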

Load the camera data recorded from a forward-facing monocular camera mounted on the ego vehicle.

cameraData = data.CameraData;

The camera data is a table with two columns:

  • timeStamp — Time, in microseconds, at which the image data was captured.

  • fileName — Filenames of the images in the data set.

The images are located in the Camera folder under the dataset directory. Create a table that contains the file path of each image with its timestamp by using the helperUpdateTable function.

imageFolder = "Camera";
cameraData  = helperUpdateTable(cameraData,dataset,imageFolder);

Display the first five entries of cameraData.

cameraData(1:5,:)

Remove data from workspace.

clear data;

Crop and Preprocess Sensor Data

Crop the GPS data, lane detections, and camera data to the GPS timestamp range by using the helperCropData function.

startTime = gpsData.timeStamp(1);
endTime = gpsData.timeStamp(end);

% Pack all the tables in a cell array. 
recordedData = {gpsData,laneDetections,cameraData};

% Crop the data.
recordedData = helperCropData(recordedData,startTime,endTime);

The data set timestamps are not in the POSIX® format, which the Scenario Builder functions require. Convert the timestamps to POSIX time, in seconds, by using the helperNormTimeInSecs function, and specify these options:

  • scale — Scale by which to convert the timestamps. Specify 1e-6 to convert microseconds to seconds, or 1 if the recorded timestamps are already in seconds.

  • offset — Offset that sets the simulation start time. The driving scenario requires a start time of 0.

scale = 1e-6;
offset = startTime;
recordedData =  helperNormTimeInSecs(recordedData,offset,scale);

Extract the GPS data, lane detections, and camera data with updated timestamp values from recordedData.

gpsData = recordedData{1,1}; 
laneDetections = recordedData{1,2};
cameraData = recordedData{1,3};

Remove recordedData from workspace.

clear recordedData;

Extract Map Roads Using GPS Data

Create a geographic player by using the geoplayer object, and display the full route from the GPS data.

zoomLevel = 16;
center = mean([gpsData.latitude gpsData.longitude]);
player = geoplayer(center(1),center(2),zoomLevel);
plotRoute(player, gpsData.latitude,gpsData.longitude);

{"String":"Figure Geographic Player contains an axes object. The axes object contains 2 objects of type line, scatter.","Tex":[],"LaTex":[]}

Obtain geographic bounding box coordinates from GPS data by using the getMapROI function.

mapStruct = getMapROI(gpsData.latitude,gpsData.longitude);

Download the map file, required for importing the roads of this region, from the OpenStreetMap (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site.

url = mapStruct.osmUrl;
filename = "drive_map.osm";
websave(filename,url,weboptions(ContentType="xml"));

Extract road properties and the geographic reference coordinates, which you later use to identify the ego roads, by using the roadprops function.

[roadData,geoReference] = roadprops("OpenStreetMap",filename);

Display the first five entries of roadData.

roadData(1:5,:)
ans=5×10 table
    RoadID    JunctionID              RoadName                RoadCenters     RoadWidth      BankAngle         Heading          Lanes         LeftBoundary     RightBoundary 
    ______    __________    _____________________________    _____________    _________    _____________    _____________    ____________    ______________    ______________

      1           0         "5510377-617349836-617349835"    {15×3 double}      7.35       {15×1 double}    {15×1 double}    1×1 lanespec    { 83×3 double}    { 83×3 double}
      2           0         "5510384"                        { 5×3 double}      7.35       { 5×1 double}    { 5×1 double}    1×1 lanespec    { 16×3 double}    { 16×3 double}
      3           0         "5510384"                        {14×3 double}      7.35       {14×1 double}    {14×1 double}    1×1 lanespec    { 73×3 double}    { 73×3 double}
      4           0         "5511492"                        {15×3 double}      7.35       {15×1 double}    {15×1 double}    1×1 lanespec    {161×3 double}    {161×3 double}
      5           0         "5511492"                        { 5×3 double}      7.35       { 5×1 double}    { 5×1 double}    1×1 lanespec    { 93×3 double}    { 93×3 double}

Visualize the imported roads by using the helperPlotRoads function. Notice that the roads do not contain lane information. Define cropWindow of the form [x y width height] to crop the map, zoom in, and display. The x and y elements are the coordinates of the top-left corner of the crop window.

cropWindow = [-30 -30 60 60];
helperPlotRoads(roadData,cropWindow);

Figure contains 2 axes objects. Axes object 1 with title Extracted Roads contains 20 objects of type patch, line, rectangle. Axes object 2 with title Zoomed View contains 20 objects of type patch, line, rectangle.

Select Ego Roads from Road Network

Convert geographic GPS coordinates to local East-North-Up (ENU) coordinates by using the latlon2local function. The transformed coordinates define the trajectory waypoints of the ego vehicle. Units are in meters.

[xEast,yNorth,zUp] = latlon2local(gpsData.latitude,gpsData.longitude,gpsData.altitude,geoReference);
waypoints = [xEast,yNorth,zUp];

Raw GPS data often contains noise. Smooth the GPS waypoints by using the smoothdata function.

window = round(size(waypoints,1)*0.2);
waypoints = smoothdata(waypoints,"rloess",window);

If GPS data also suffers from drift in position and orientation, then you must improve ego vehicle localization to generate accurate ego trajectory. For more information, see the Improve Ego Vehicle Localization example.

Create the ego trajectory from the waypoints and their corresponding times of arrival by using the waypointTrajectory (Sensor Fusion and Tracking Toolbox) System object™. This example does not consider the altitude values of the waypoints, so set all altitude values to 0. You must set the ReferenceFrame property of this System object to ENU, because Scenario Builder for Automated Driving Toolbox® supports the ENU format for local coordinate data.

waypoints = double([waypoints(:,1:2) zeros(size(zUp))]);
egoTrajectory = waypointTrajectory(waypoints,gpsData.timeStamp,ReferenceFrame="ENU");
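The waypointTrajectory object lets you query the interpolated ego pose at arbitrary times within the trajectory duration. For example, this sketch samples the position and orientation at the first few GPS timestamps by using the lookupPose object function:

```matlab
% Query ego position (meters, ENU) and orientation (quaternion) at
% sample times within the trajectory duration.
sampleTimes = gpsData.timeStamp(1:3);
[position,orientation] = lookupPose(egoTrajectory,sampleTimes);
position   % one [x y z] row per sample time
```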

Select road properties on which the ego vehicle is traveling by using the selectActorRoads function.

egoRoadData = selectActorRoads(roadData,egoTrajectory.Waypoints);

Display egoRoadData table.

egoRoadData
egoRoadData=1×10 table
    RoadID    JunctionID         RoadName          RoadCenters      RoadWidth      BankAngle         Heading          Lanes         LeftBoundary     RightBoundary 
    ______    __________    __________________    ______________    _________    _____________    _____________    ____________    ______________    ______________

      75          0         "Banfield Freeway"    {438×3 double}      3.75       {35×1 double}    {35×1 double}    1×1 lanespec    {551×3 double}    {551×3 double}

Visualize the selected roads by using the helperPlotRoads function. Notice that the selected ego roads do not have lane information.

helperPlotRoads(egoRoadData,cropWindow);

Figure contains 2 axes objects. Axes object 1 with title Extracted Roads contains 4 objects of type patch, line, rectangle. Axes object 2 with title Zoomed View contains 4 objects of type patch, line, rectangle.

Generate Lane Specifications for Ego Roads

Visualize lane detections and camera images by using the birdsEyePlot function and helperPlotLanes function.

% Initialize the figure with bird's-eye plot.
currentFigure = figure(Position=[0 0 1400 600]);
hPlot = axes(uipanel(currentFigure,Position=[0 0 0.5 1],Title="Recorded Lanes"));
bep = birdsEyePlot(XLim=[0 60],YLim=[-35 35],Parent=hPlot);
camPlot = axes(uipanel(currentFigure,Position=[0.5 0 0.5 1],Title="Camera View"));
helperPlotLanes(bep,camPlot,laneDetections,cameraData);

{"String":"Figure contains 2 axes objects and other objects of type uipanel. Axes object 1 contains 3 objects of type line, text. These objects represent Left lane marking, Right lane marking. Axes object 2 contains 5 objects of type image, text.","Tex":[],"LaTex":[]}

{"String":"Figure contains 2 axes objects and other objects of type uipanel. Axes object 1 contains 2 objects of type line. These objects represent Left lane marking, Right lane marking. Axes object 2 contains 5 objects of type image, text.","Tex":[],"LaTex":[]}

Create a reference lane specification by using the lanespec function. Specify three lanes, based on visual inspection of the camera data. Note that the reference lane specification does not need to be accurate, because it provides only a global context for populating lanes on the roads.

numOfLanes = 3;
refLaneSpec = lanespec(numOfLanes);

Specify the lane index for the first waypoint of the ego vehicle. You must start numbering the lanes in left-to-right order along the travel direction of the ego vehicle. For this example, the first waypoint of the ego vehicle is on the third lane as shown in the camera image.

firstEgoWaypointLaneIdx = 3;

Update the lane specification with information extracted from the lane detections by using the updateLaneSpec function. The camera image shows that all three lanes of the ego road have nearly the same lane width. To replicate the ego lane width for non-detected lanes, set the ReplicateUpdatedLaneWidth argument to true.

egoRoadsWithUpdatedLanes = updateLaneSpec(laneDetections,egoRoadData,refLaneSpec,...
                                          egoTrajectory,firstEgoWaypointLaneIdx,ReplicateUpdatedLaneWidth=true);

Visualize ego roads with updated lane specifications by using the helperPlotRoads function.

helperPlotRoads(egoRoadsWithUpdatedLanes,cropWindow);

Figure contains 2 axes objects. Axes object 1 with title Extracted Roads contains 4 objects of type patch, line, rectangle. Axes object 2 with title Zoomed View contains 4 objects of type patch, line, rectangle.

Simulate Scenario with Ego Vehicle and Updated Lanes

Define parameters to create a driving scenario. Specify endTime as the last GPS timestamp, and calculate sampleTime as the minimum difference between two consecutive GPS timestamps.

endTime = gpsData.timeStamp(end);
sampleTime = min(diff(gpsData.timeStamp));

Create a driving scenario by using the drivingScenario object. Specify SampleTime and StopTime for the driving scenario.

scenario = drivingScenario(SampleTime=sampleTime,StopTime=endTime);

Add roads with updated lane specifications by using the helperAddRoads2Scene function. This helper function adds roads to the driving scenario by using the road function.

scenario = helperAddRoads2Scene(scenario,egoRoadsWithUpdatedLanes);

Add the ego vehicle waypoints to the scenario by using the helperAddEgo2Scene function.

scenario = helperAddEgo2Scene(scenario,egoTrajectory);

Visualize the generated scenario and compare it with recorded camera data using the helperVisualizeHDSceneWithEgo function.

currentFigure = figure(Name="Generated Scene",Position=[0 0 1400 600]);
helperVisualizeHDSceneWithEgo(currentFigure,scenario,cameraData);

{"String":"Figure Generated Scene contains an axes object and other objects of type uipanel. The axes object contains 2 objects of type image, text.","Tex":[],"LaTex":[]}

Create RoadRunner HD Map

An HD map contains lane information, which is useful for automated driving applications such as sensing, perception, localization, and planning. Create a RoadRunner HD map from the generated driving scenario by using the helperComposeRRHDMap function.

rrHDMap = helperComposeRRHDMap(scenario);

Visualize HD Map roads by using the helperPlotRoads function.

helperPlotRoads(rrHDMap,cropWindow)

Figure contains 2 axes objects. Axes object 1 with title RoadRunner HD Map contains 3 objects of type line, rectangle. Axes object 2 with title Zoomed View contains 3 objects of type line, rectangle. These objects represent Lane Boundaries, Lane Centers.

Write the RoadRunner HD Map to a binary file, which you can import into RoadRunner.

write(rrHDMap,"RoadRunnerHDMapFromLanes.rrhd");

You can import the RoadRunnerHDMapFromLanes.rrhd data into RoadRunner and build the scene by using the Scene Builder Tool. For detailed instructions on importing a RoadRunner HD Map file with the .rrhd extension into RoadRunner, previewing the map, and building the scene, see Build Scene (RoadRunner). This figure shows a scene built using RoadRunner Scene Builder.

Note: Use of the Scene Builder Tool requires a RoadRunner Scene Builder license.

See Also

Functions

Related Topics