Generate RoadRunner Scene Using Aerial Hyperspectral and Lidar Data
This example shows how to generate the digital twin of a scene containing trees and buildings by using aerial hyperspectral and lidar data, and reconstruct roads using OpenStreetMap® data.
You can create a digital twin of a real-world scene, from data recorded from various sensors, and use it to perform verification and validation of automated driving functionalities. This example uses a hyperspectral imaging sensor, point clouds from an aerial lidar sensor, and road network information from OpenStreetMap data to create a RoadRunner scene. The hyperspectral imaging sensor acquires images with narrow and contiguous wavelengths within a specified spectral range, and the rich spectral information in the hyperspectral data enables you to identify materials. For more information on hyperspectral images, see Getting Started with Hyperspectral Image Processing. Meanwhile, aerial lidar point clouds enable you to accurately extract the locations and dimensions of static objects in a scene. Lastly, OpenStreetMap provides access to worldwide, crowd-sourced map data, which you can use to reconstruct a road network. Using the hyperspectral data, lidar data, and the OpenStreetMap road network data, you can create a scene in RoadRunner that contains roads and other objects, such as trees and buildings.
In this example, you perform these steps.
- Load data, and visualize the hyperspectral image and lidar point cloud.
- Detect trees and buildings from hyperspectral data, and refine detections using lidar data.
- Convert geographic coordinates to local coordinates, and create a georeferenced point cloud.
- Extract cuboids of trees and buildings from the labeled, georeferenced point cloud.
- Extract a road network using the OpenStreetMap web service.
- Create a RoadRunner HD Map containing the extracted roads, trees, and buildings.
- Build and import the scene using RoadRunner.
This example requires the Scenario Builder for Automated Driving Toolbox™ and Hyperspectral Imaging Library for Image Processing Toolbox™ support packages. You can install the support package add-ons from the Add-On Explorer. For more information on installing add-ons, see Get and Manage Add-Ons.
Load and Visualize Data
This example uses data from the MUUFL Gulfport Data Set [1][2]. The MUUFL Gulfport Data Set consists of data acquired simultaneously by airborne hyperspectral and lidar sensors mounted on an aircraft over the University of Southern Mississippi – Gulfport in November 2010. The data was acquired using an integrated system that included a hyperspectral sensor to collect hyperspectral data, a lidar sensor to collect lidar data, and a navigation system to collect latitude and longitude data. Because the hyperspectral and lidar data were acquired simultaneously, they are already spatially registered, and there is one-to-one mapping between the pixels of the hyperspectral image and the points in the lidar point cloud. Additionally, the data collected from the navigation system contains latitude and longitude world coordinates for each pixel in the hyperspectral image, and consequently for each point in the aerial point cloud.
Download and unzip the data set.
zipFile = matlab.internal.examples.downloadSupportFile("image","data/gulfport_dataset.zip");
[filepath,folder,~] = fileparts(zipFile);
unzip(zipFile,filepath)
Load the hyperspectral data into the workspace as a hypercube object. Estimate an RGB image of the loaded hyperspectral data.
filename = fullfile(filepath,folder,"gulfport_dataset","hyperspectralData.dat");
hcube = hypercube(filename);
rgbImg = colorize(hcube,Method="rgb",ContrastStretching=true);
rows = hcube.Metadata.Height;
cols = hcube.Metadata.Width;
Load the lidar data into the workspace as a lasFileReader (Lidar Toolbox) object, and read the point cloud from the lidar data.
filename = fullfile(filepath,folder,"gulfport_dataset","lidarData.las");
lasReader = lasFileReader(filename);
ptCloud = readPointCloud(lasReader);
Visualize an RGB image of the hyperspectral data and an image of the elevations of the points in the point cloud in the xy-plane.
figure(Units="normalized",Position=[0 0 0.5 0.5])
tiledlayout(1,2)
nexttile
imshow(rgbImg)
axis equal tight
title("RGB Image of Hyperspectral Data",Color="white")
nexttile
pcshow(ptCloud.Location,ViewPlane="XY")
title("Point Cloud of Lidar Data")
Load the world coordinate data into the workspace. This data contains latitude and longitude values, in degrees, for each pixel of the hyperspectral data, and consequently of the lidar data.
filename = fullfile(filepath,folder,"gulfport_dataset","worldCoordinates.mat");
load(filename,"lat","lon")
Detect Trees and Buildings
The hyperspectral data provides information about the spectral properties of the land cover, whereas the lidar data provides information about the elevation of the land cover from the ground. You can use information from the hyperspectral and lidar data to accurately detect trees and buildings in the scene.
Identify a few pixels that represent trees and buildings in the RGB image. Specify useInteractiveROI as true to select the seed points for the trees and buildings interactively using the helperSelectSeedPoints helper function, attached to this example as a supporting file. To select the seed points interactively, you must first specify the desired number of seed points for both the trees and buildings.
To instead select the seed points programmatically, specify useInteractiveROI as false, and specify the locations of the seed points for the trees and buildings in the treesPoints and buildingsPoints variables, respectively.
useInteractiveROI = false;
numberOfTreePoints = 2;
numberOfBuildingPoints = 2;
if useInteractiveROI
    [treesPoints,buildingsPoints] = helperSelectSeedPoints(numberOfTreePoints,numberOfBuildingPoints,rgbImg);
else
    treesPoints = [225 121; 50 33];
    buildingsPoints = [311 18; 182 84];
end
Compute the average spectra of the identified tree pixels and the identified building pixels, respectively, as reference.
treesRef = [];
buildingsRef = [];
for i = 1:size(treesPoints,1)
    loc = treesPoints(i,:);
    pt = squeeze(hcube.DataCube(loc(1),loc(2),:));
    treesRef = [treesRef pt];
end
for i = 1:size(buildingsPoints,1)
    loc = buildingsPoints(i,:);
    pt = squeeze(hcube.DataCube(loc(1),loc(2),:));
    buildingsRef = [buildingsRef pt];
end
treesRef = mean(treesRef,2);
buildingsRef = mean(buildingsRef,2);
Spectral matching techniques compare individual pixel spectra with a reference spectrum. The spectral angle mapper (SAM) method of spectral matching measures the angle between the reference spectrum and individual pixel spectra. Hence, a lower SAM score is equivalent to a smaller angle between the two spectra, and indicates a higher similarity between the two spectra.
Compute the SAM score between each individual pixel spectrum in the hyperspectral image and the reference spectra for trees and buildings by using the sam function. Threshold the SAM scores using Otsu's method to obtain preliminary masks for trees and buildings.
treesScore = sam(hcube,treesRef);
treesMaskPreliminary = treesScore <= multithresh(treesScore);
buildingsScore = sam(hcube,buildingsRef);
buildingsMaskPreliminary = buildingsScore <= multithresh(buildingsScore);
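For intuition, the SAM score for a single pixel is the angle between its spectrum and the reference spectrum, which makes the measure insensitive to overall illumination scale. This minimal sketch reproduces the per-pixel computation for the first tree seed point; the sam function applies the same computation to every pixel.
% Sketch only: SAM score for one pixel, in radians
loc = treesPoints(1,:);
s = double(squeeze(hcube.DataCube(loc(1),loc(2),:)));   % one pixel spectrum
r = double(treesRef);                                   % reference spectrum
angleScore = acos(dot(s,r)/(norm(s)*norm(r)))           % smaller angle means more similar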
Visualize the preliminary masks. Observe that using only spectral information results in grass being misclassified as trees and roads as buildings.
figure
tiledlayout(2,2)
nexttile
imagesc(treesScore)
axis equal tight
colorbar
clim([0 1])
title({"Spectral Matching Score Map"; "for Trees"})
nexttile
imagesc(treesMaskPreliminary)
axis equal tight
title({"Trees Detected"; "from Hyperspectral Data"})
nexttile
imagesc(buildingsScore)
axis equal tight
colorbar
clim([0 1])
title({"Spectral Matching Score Map"; "for Buildings"})
nexttile
imagesc(buildingsMaskPreliminary)
axis equal tight
title({"Buildings Detected"; "from Hyperspectral Data"})
Use the elevation data from the lidar sensor to differentiate between trees and grass and between buildings and roads to get refined masks for trees and buildings, respectively.
elevationData = reshape(ptCloud.Location(:,3),rows,cols);
elevationMask = elevationData;
elevationMask(~treesMaskPreliminary & ~buildingsMaskPreliminary) = 0;
elevationMask = elevationMask >= multithresh(elevationData);
treesMask = treesMaskPreliminary & elevationMask;
buildingsMask = buildingsMaskPreliminary & elevationMask;
Create labels using the masks for trees and buildings.
predictedLabels = zeros(rows,cols);
predictedLabels(treesMask) = 1;
predictedLabels(buildingsMask) = 2;
Create a dictionary to assign a name for each predicted label.
labelNames = ["unclassified" "trees" "buildings"];
labelMap = dictionary(unique(predictedLabels)',labelNames)
labelMap =

  dictionary (double ⟼ string) with 3 entries:

    0 ⟼ "unclassified"
    1 ⟼ "trees"
    2 ⟼ "buildings"
Assign the label names to the predicted labels.
labels = labelMap(predictedLabels);
Visualize the labels.
figure
imagesc(predictedLabels)
axis equal tight
colorbar(Ticks=unique(predictedLabels),TickLabels=labelNames)
title("Detected Trees and Buildings")
Create Georeferenced Point Cloud
The loaded point cloud is in a projected coordinate system, which you must convert to a local coordinate system to georeference the point cloud. Georeferencing the point cloud is essential because the point cloud locations must align with the locations of the road network you import from the OpenStreetMap (OSM) web service.
The loaded latitude and longitude values are in one-to-one mapping with each point in the point cloud. You can use the latitude and longitude information to georeference the point cloud.
Select the mean values of the latitude, longitude, and altitude to define the georeference point of the point cloud. You can get the altitude from the elevation data in the point cloud. Note that you must use the same georeference origin to download the road network of interest from the OSM web service.
geoReference = [mean(lat(:)) mean(lon(:)) mean(elevationData(:))];
Convert the latitude, longitude, and altitude to the local coordinate system, with geoReference as the origin.
[xE,yN,zU] = latlon2local(lat(:),lon(:),elevationData(:),geoReference);
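For intuition, near the origin this conversion behaves like an east-north-up (ENU) projection. This hedged sketch uses an equirectangular approximation with an assumed WGS84 Earth radius; latlon2local performs the exact geodetic conversion, so expect small differences.
% Sketch only: approximate ENU offsets near the georeference origin
R = 6378137;                                                         % assumed WGS84 semi-major axis, in meters
xEapprox = deg2rad(lon(:)-geoReference(2))*R*cosd(geoReference(1));  % east offset
yNapprox = deg2rad(lat(:)-geoReference(1))*R;                        % north offset
zUapprox = elevationData(:)-geoReference(3);                         % up offset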
Create a dictionary to assign a color to each predicted label name.
labelColors = {uint8([0 0 0]),uint8([0 255 0]),uint8([255 0 0])};
colorMap = dictionary(labelNames,labelColors)
colorMap =

  dictionary (string ⟼ cell) with 3 entries:

    "unclassified" ⟼ {[0 0 0]}
    "trees"        ⟼ {[0 255 0]}
    "buildings"    ⟼ {[255 0 0]}
To fit the cuboids of the trees and buildings in the point cloud, map the predicted labels of all the points of the scene to the Color property of the point cloud, and create a georeferenced point cloud.
pcColor = cell2mat(colorMap(labels(:)));
ptCloud = pointCloud([xE yN zU],Color=pcColor,Intensity=ptCloud.Intensity);
Extract Cuboids of Trees and Buildings from Georeferenced Point Cloud
To generate a RoadRunner Scene containing static objects such as trees and buildings, you must have a cuboid parameter array containing the center, dimension, and orientation for each static object.
Create a parameter structure that contains a field for each type of static object. Specify each static object field as a structure containing these fields:
- minDistance — Minimum distance between the points of two different static objects, specified as a scalar. If static objects of the same type are very close to each other, reduce the minDistance value to identify them individually.
- NumClusterPoints — Minimum and maximum number of points in each cluster, specified as a scalar or a two-element vector of the form [minPoints maxPoints]. When you specify NumClusterPoints as a scalar, the maximum number of points in the cluster is unrestricted. Based on the density of your point cloud and the type of static object, adjust the NumClusterPoints values for efficient segmentation of the desired objects. For example, specify smaller NumClusterPoints values for trees and larger values for buildings.
vegetationParams = struct("minDistance",1.3,"NumClusterPoints",[5 50]);
buildingParams = struct("minDistance",2.5,"NumClusterPoints",50);
parameters = struct(labelNames(2),vegetationParams,labelNames(3),buildingParams);
Obtain the cuboid parameters for each instance of a static object with a specified label in the point cloud by using the helperExtractCuboidsFromLidar helper function. The helper function returns the parameters in a cell array, cuboids.
cuboids = helperExtractCuboidsFromLidar(ptCloud,parameters,labelNames(2:3),colorMap);
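helperExtractCuboidsFromLidar is attached to this example as a supporting file. As a hedged sketch of one plausible approach, you can select the points of one label by color, cluster them by Euclidean distance using the pcsegdist function, and fit a cuboid to each cluster using the pcfitcuboid (Lidar Toolbox) function. The actual helper function may differ in its details.
% Sketch only: extract tree points by color, cluster, and fit one cuboid
treeColor = colorMap{"trees"};                         % [0 255 0]
treeIdx = find(all(ptCloud.Color == treeColor,2));
treePts = select(ptCloud,treeIdx);
clusterLabels = pcsegdist(treePts,parameters.trees.minDistance, ...
    NumClusterPoints=parameters.trees.NumClusterPoints);
model = pcfitcuboid(select(treePts,find(clusterLabels == 1)));   % first cluster
cuboidParams = model.Parameters;    % [xctr yctr zctr xlen ylen zlen rx ry rz]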
The lidar data used in this example has been recorded by an aerial lidar sensor with a bird's-eye field of view, so the majority of the points in the point cloud are on the tops of buildings and trees, with very few points on the sides. As a result, the cuboids extracted from the point cloud might not be accurate, and a scene generated with this data might place some objects in the air. Extend the height of the objects to the ground plane by using the helperExtendCuboidsToGround function, attached to this example as a supporting file.
for i = 1:height(cuboids)
    cuboids{i} = helperExtendCuboidsToGround(cuboids{i},ptCloud);
end
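For intuition, the core adjustment inside the helper might resemble this sketch, which assumes a flat ground plane estimated as the minimum point elevation; the helper's actual ground estimate may be more sophisticated.
% Sketch only: extend one cuboid down to an assumed flat ground plane.
% Each cuboid row is [xctr yctr zctr xlen ylen zlen rx ry rz].
groundZ = min(ptCloud.Location(:,3));   % crude flat-ground estimate
cub = cuboids{1}(1,:);
zTop = cub(3) + cub(6)/2;               % elevation of the top face
cub(6) = zTop - groundZ;                % stretch the height to the ground
cub(3) = (zTop + groundZ)/2;            % move the center down accordingly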
Visualize the corrected cuboid parameters overlaid on the point cloud.
ax = pcshow(ptCloud,ViewPlane="XY");
hold on
showShape("cuboid",cuboids{1},Color="red",Parent=ax,LineWidth=1)
showShape("cuboid",cuboids{2},Color="yellow",Parent=ax,LineWidth=1)
hold off
Extract Roads Using OpenStreetMap Web Service
To add the road network to the RoadRunner scene, first download the map file from the OpenStreetMap (OSM) [3] website for the same area as the downloaded hyperspectral and lidar data.
Import the OSM road network by using the getMapROI function. Use the geoReference point of the point cloud to get the latitude and longitude values for the OSM road network, and use the mean of the limits of the point cloud as the extent of the road network.
osmExtent = mean(abs([ptCloud.XLimits ptCloud.YLimits]));
mapParameters = getMapROI(geoReference(1),geoReference(2),Extent=osmExtent);
osmFileName = websave("RoadScene.osm",mapParameters.osmUrl,weboptions(ContentType="xml"));
Create an empty driving scenario object.
scenario = drivingScenario;
Import the roads obtained from the OpenStreetMap web service into the driving scenario.
roadNetwork(scenario,"OpenStreetMap",osmFileName)
Generate RoadRunner HD Map with Lanes and Static Objects
Create RoadRunner HD Map
Create a RoadRunner HD Map containing the road network imported from the OpenStreetMap web service by using the getRoadRunnerHDMap function.
rrMap = getRoadRunnerHDMap(scenario);
Update Lane Information in RoadRunner HD Map
Because the elevation information for the roads might not be accurate, improve the road elevation with reference to the point cloud by using the helperRoadRunnerElevateLane helper function, attached to this example as a supporting file.
laneInfo = helperRoadRunnerElevateLane(rrMap,ptCloud);
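As a hedged sketch of the underlying idea, you can look up the elevation of the nearest point-cloud point for each lane-center vertex; the helper function likely smooths the result and handles outliers more carefully. This sketch assumes the knnsearch function from Statistics and Machine Learning Toolbox™ and that each lane stores its geometry as an N-by-3 matrix.
% Sketch only: assign each lane-center vertex the elevation of the
% nearest point in the georeferenced cloud (nearest neighbor in xy).
lane = rrMap.Lanes(1);
xy = lane.Geometry(:,1:2);
idx = knnsearch(ptCloud.Location(:,1:2),xy);   % requires Statistics and Machine Learning Toolbox
lane.Geometry(:,3) = ptCloud.Location(idx,3);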
Update the RoadRunner HD Map using the road network with improved elevation data.
rrMap.Lanes = laneInfo.Lanes;
rrMap.LaneBoundaries = laneInfo.LaneBoundaries;
rrMap.LaneGroups = laneInfo.LaneGroups;
Update Static Object Information in RoadRunner HD Map
The hyperspectral data used in this example contains a few trees that appear to be placed on roads. Because the hyperspectral image is aerial data, and roadside plant canopies generally occlude the aerial visibility of roads, some road pixels have been misclassified as trees.
Remove the trees from the roads by using the helperRemoveTreesFromRoads helper function, attached to this example as a supporting file.
treeCuboids = cuboids{1};
statObjs.trees = helperRemoveTreesFromRoads(scenario,treeCuboids,ptCloud,[rows cols]);
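As a hedged sketch of one possible criterion, you can drop tree cuboids whose xy centers fall inside a road boundary polygon, using the roadBoundaries function and inpolygon; the actual helper function may apply different logic, such as using the image-space road mask.
% Sketch only: remove tree cuboids whose xy centers lie inside a road
rb = roadBoundaries(scenario);           % cell array of Nx3 boundary vertices
keep = true(size(treeCuboids,1),1);
for k = 1:numel(rb)
    onRoad = inpolygon(treeCuboids(:,1),treeCuboids(:,2), ...
        rb{k}(:,1),rb{k}(:,2));
    keep = keep & ~onRoad;
end
treeCuboidsKept = treeCuboids(keep,:);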
The data in this example contains buildings of different heights. To generate RoadRunner asset paths for the buildings based on their heights, use the helperGenerateAssetPaths helper function, attached to this example as a supporting file.
buildingCuboids = cuboids{2};
[statObjs.buildings,params.buildings.AssetPath] = helperGenerateAssetPaths("buildings",buildingCuboids);
Generate static object information in the RoadRunner HD Map format by using the roadrunnerStaticObjectInfo function.
objectsInfo = roadrunnerStaticObjectInfo(statObjs,Params=params);
Update the map with the information on the trees and buildings.
rrMap.StaticObjectTypes = objectsInfo.staticObjectTypes;
rrMap.StaticObjects = objectsInfo.staticObjects;
Visualize and Write RoadRunner HD Map
Plot the map, which contains information for the static objects, lane boundaries, and lane centers.
f = figure(Position=[1000 818 700 500]);
ax = axes("Parent",f);
plot(rrMap,ShowStaticObjects=true,Parent=ax)
xlim([-100 100])
ylim([-180 180])
Write the RoadRunner HD Map to a binary file, which you can import into the Scene Builder Tool.
rrMapFileName = "RoadRunnerAerialScene.rrhd";
write(rrMap,rrMapFileName)
Build and Import Scene Using RoadRunner
Open the RoadRunner application by using the roadrunner (RoadRunner) object. Import the RoadRunner HD Map data from your binary file, and build it in the currently open scene.
Specify the path to your local RoadRunner installation folder. This code shows the path for the default installation location on Windows®.
rrAppPath = "C:\Program Files\RoadRunner " + matlabRelease.Release + "\bin\win64";
Specify the path to your RoadRunner project. This code shows the path to a sample project folder on Windows.
rrProjectPath = "C:\RR\MyProject";
Open RoadRunner using the specified path to your project.
rrApp = roadrunner(rrProjectPath,InstallationFolder=rrAppPath);
Copy the RoadRunnerAerialScene.rrhd scene file into the Assets folder of the RoadRunner project, and import it into RoadRunner.
copyfile(rrMapFileName,(rrProjectPath + filesep + "Assets"));
rrhdFile = fullfile(rrProjectPath,"Assets",rrMapFileName);
Clear the overlap groups option to enable RoadRunner to create automatic junctions at the geometric overlaps of the roads.
enableOverlapGroupsOpts = enableOverlapGroupsOptions(IsEnabled=0);
buildOpts = roadrunnerHDMapBuildOptions(EnableOverlapGroupsOptions=enableOverlapGroupsOpts);
importOptions = roadrunnerHDMapImportOptions(BuildOptions=buildOpts);
Import the scene into RoadRunner by using the importScene function.
importScene(rrApp,rrhdFile,"RoadRunner HD Map",importOptions)
Note: Use of the Scene Builder Tool requires a RoadRunner Scene Builder license, and use of RoadRunner Assets requires a RoadRunner Asset Library Add-On license.
This figure shows the RGB image of the hyperspectral data, and the RoadRunner Scene built using RoadRunner Scene Builder.
References
[1] P. Gader, A. Zare, R. Close, J. Aitken, and G. Tuell. “MUUFL Gulfport Hyperspectral and LiDAR Airborne Data Set.” Technical Report REP-2013-570. University of Florida, Gainesville, 2013. https://github.com/GatorSense/MUUFLGulfport/tree/master.
[2] X. Du and A. Zare. “Technical Report: Scene Label Ground Truth Map for MUUFL Gulfport Data Set.” University of Florida, Gainesville, 2017. http://ufdc.ufl.edu/IR00009711/00001.
[3] You can download OpenStreetMap files from https://www.openstreetmap.org, which provides access to crowd-sourced map data all over the world. The data is licensed under the Open Data Commons Open Database License (ODbL), https://opendatacommons.org/licenses/odbl/.
See Also
Functions
hypercube | colorize | spectralMatch | sam | roadrunnerStaticObjectInfo | lasFileReader (Lidar Toolbox) | readPointCloud (Lidar Toolbox) | segmentGroundSMRF (Lidar Toolbox)
Related Topics
- Overview of Scenario Generation from Recorded Sensor Data
- Classify Land Cover Using Hyperspectral and Lidar Data
- Generate RoadRunner Scene Using Aerial Lidar Data
- Generate RoadRunner Scene from Recorded Lidar Data
- Generate RoadRunner Scenario from Recorded Sensor Data
- Generate High Definition Scene from Lane Detections and OpenStreetMap