Code Generation For Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning

This example shows how to generate CUDA® MEX code for a PointNet++ [1] network for lidar semantic segmentation. This example uses a pretrained PointNet++ network that can segment unorganized lidar point clouds belonging to eight classes (buildings, cars, trucks, poles, power lines, fences, ground, and vegetation). For more information on the PointNet++ network, see Getting Started with PointNet++ (Lidar Toolbox).

Third-Party Prerequisites

Required

  • CUDA enabled NVIDIA® GPU and compatible driver.

Optional

For non-MEX builds such as static libraries, dynamic libraries, or executables, this example has additional requirements, such as the NVIDIA CUDA toolkit, the NVIDIA cuDNN library, and the environment variables for the compilers and libraries.

Verify GPU Environment

To verify that the compilers and libraries for running this example are set up correctly, use the coder.checkGpuInstall (GPU Coder) function.

envCfg = coder.gpuEnvConfig('host');
envCfg.DeepLibTarget = 'cudnn';
envCfg.DeepCodegen = 1;
envCfg.Quiet = 1;
coder.checkGpuInstall(envCfg);

Load PointNet++ Network

Use the getPointnetplusNet function, attached as a supporting file to this example, to load the pretrained PointNet++ network. For more information on how to train this network, see the Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning (Lidar Toolbox) example.

net = getPointnetplusNet;

The pretrained network is a DAG network. To display an interactive visualization of the network architecture, use the analyzeNetwork function.
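
For example, the following call displays an interactive visualization of the loaded network:

analyzeNetwork(net)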

The sampling and grouping layer and the interpolation layer are implemented using the functionLayer function. Neither the pointCloudInputLayer function nor the functionLayer function supports code generation. For code generation support, replace the function layers with custom layers and the pointCloudInputLayer with an imageInputLayer by using the helperReplaceInputAndFunctionLayers helper function, attached to this example as a supporting file. This function saves the network as a MAT file named pointnetplusCodegenNet.mat.

net = helperReplaceInputAndFunctionLayers(net);

pointnetplusPredict Entry-Point Function

The pointnetplusPredict entry-point function takes a point cloud data matrix as input and performs prediction on it by using the deep learning network saved in the pointnetplusCodegenNet.mat file. The function loads the network object from the pointnetplusCodegenNet.mat file into a persistent variable mynet and reuses the persistent variable in subsequent prediction calls.

type('pointnetplusPredict.m');
function out = pointnetplusPredict(in)
%#codegen

% A persistent object mynet is used to load the DAG network object. At
% the first call to this function, the persistent object is constructed and
% set up. On subsequent calls, the same object is reused to call predict on
% inputs, which avoids reconstructing and reloading the network object.

% Copyright 2021 The MathWorks, Inc.

persistent mynet;

if isempty(mynet)
    mynet = coder.loadDeepLearningNetwork('pointnetplusCodegenNet.mat');
end

% Pass the input to the network to compute the output predictions.
out = predict(mynet,in);

Generate CUDA MEX Code

To generate CUDA® code for the pointnetplusPredict entry-point function, create a GPU code configuration object for a MEX target and set the target language to C++. Use the coder.DeepLearningConfig (GPU Coder) function to create a CuDNN deep learning configuration object and assign it to the DeepLearningConfig property of the GPU code configuration object. Run the codegen command, specifying the size of the point cloud data expected by the input layer of the network, which in this case is [8192 1 3].

cfg = coder.gpuConfig('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig(TargetLibrary='cudnn');
codegen -config cfg pointnetplusPredict -args {randn(8192,1,3,'single')} -report
Code generation successful: View report

To generate CUDA® code for the TensorRT target, create and use a TensorRT deep learning configuration object instead of the CuDNN configuration object.
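
A minimal sketch of that alternative configuration, assuming the NVIDIA TensorRT libraries are installed on your machine:

cfgTensorRT = coder.gpuConfig('mex');
cfgTensorRT.TargetLang = 'C++';
cfgTensorRT.DeepLearningConfig = coder.DeepLearningConfig(TargetLibrary='tensorrt');
codegen -config cfgTensorRT pointnetplusPredict -args {randn(8192,1,3,'single')} -report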

Segment Aerial Point Cloud Using Generated MEX Code

The network in this example is trained on the DALES data set [2]. Follow the instructions on the DALES website to download the data set to the folder specified by the dataFolder variable. Create a folder to store the test data.

dataFolder = fullfile(tempdir,'DALES');
testDataFolder = fullfile(dataFolder,'dales_las','test');
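
If the test data folder does not already exist on your machine, one way to create it before copying the downloaded test files into it is:

% Create the test data folder if it does not already exist.
if ~exist(testDataFolder,'dir')
    mkdir(testDataFolder);
end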

Each point cloud in the DALES data set covers an area of 500-by-500 meters, which is much larger than the typical area covered by terrestrial lidar point clouds. For efficient memory processing, divide the point cloud into small, non-overlapping blocks by using a blockedPointCloud (Lidar Toolbox) object.

Define the block dimensions using the blockSize parameter. Because the size of each point cloud in the data set varies, set the z-dimension of the block to Inf to avoid block creation along the z-axis.

blockSize = [51 51 Inf];

First, create a blockedPointCloud (Lidar Toolbox) object. Then, create a blockedPointCloudDatastore (Lidar Toolbox) object on the test data using the blockedPointCloud (Lidar Toolbox) object.

tbpc = blockedPointCloud(fullfile(testDataFolder,'5080_54470.las'),blockSize);
tbpcds = blockedPointCloudDatastore(tbpc);

Define the parameters used to train the network. For more details, see the Aerial Lidar Semantic Segmentation Using PointNet++ Deep Learning (Lidar Toolbox) example.

numNearestNeighbors = 20;
radius = 0.05;
numPoints = 8192;
maxLabel = 1;
classNames = [
    "ground"
    "vegetation"
    "cars"
    "trucks"
    "powerlines"
    "fences"
    "poles"
    "buildings"
    ];
numClasses = numel(classNames);

Initialize placeholders for the predicted and target labels.

labelsDensePred = [];
labelsDenseTarget = [];

To apply the same transformation used on the training data to the test data, tbpcds, follow these steps.

  • Extract the point cloud.

  • Downsample the point cloud to a specified number of points, numPoints.

  • Normalize the point clouds to the range [0 1] (see the sketch after this list).

  • Convert the point cloud to make it compatible with the input layer of the network.
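
The normalization step is performed by the helperNormalizePointCloud supporting file attached to this example. As a rough illustration only, a minimal sketch of such a normalization might look like the following hypothetical function (the actual supporting file may differ):

function ptCloudOut = exampleNormalizePointCloud(ptCloudIn)
% Hypothetical sketch: rescale each coordinate of the block to the range
% [0 1] using the spatial limits of the input point cloud. The
% helperNormalizePointCloud supporting file may be implemented differently.
limits = [ptCloudIn.XLimits;ptCloudIn.YLimits;ptCloudIn.ZLimits];
loc = ptCloudIn.Location;
loc = (loc - limits(:,1)')./(limits(:,2)' - limits(:,1)');
ptCloudOut = pointCloud(loc,Color=ptCloudIn.Color, ...
    Intensity=ptCloudIn.Intensity,Normal=ptCloudIn.Normal);
end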

Perform inference on the test point cloud data to compute the prediction labels. Predict the labels of the sparse point cloud using the pointnetplusPredict_mex function. Then interpolate the predicted labels of the sparse point cloud to obtain prediction labels for the dense point cloud, and repeat this process for all of the non-overlapping blocks.
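
The label interpolation itself is performed by the helperInterpolate supporting file attached to this example. As a rough, hypothetical sketch of the idea only (the actual supporting file may differ, for example by also applying the radius and maxLabel parameters), each dense point could be assigned the majority label of its nearest neighbors in the sparse point cloud:

function labelsDense = exampleInterpolateLabels(ptCloudDense,ptCloudSparse, ...
    labelsSparse,numNearestNeighbors)
% Hypothetical sketch: assign to each point of the dense point cloud the most
% frequent predicted label among its nearest neighbors in the sparse point cloud.
labelsDense = zeros(ptCloudDense.Count,1,'uint8');
for n = 1:ptCloudDense.Count
    queryPoint = ptCloudDense.Location(n,:);
    nnIndices = findNearestNeighbors(ptCloudSparse,queryPoint,numNearestNeighbors);
    labelsDense(n) = mode(labelsSparse(nnIndices));
end
end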

while hasdata(tbpcds)
    
    % Read the block along with block information.
    [ptCloudDense,infoDense] = read(tbpcds);

    % Extract the labels from the block information.
    labelsDense = infoDense.PointAttributes.Classification;
    
    % Select only labeled data.
    ptCloudDense = select(ptCloudDense{1},labelsDense~=0);
    labelsDense = labelsDense(labelsDense~=0);

    % Use the helperDownsamplePoints function, attached to this example as a
    % supporting file, to extract a downsampled point cloud from the
    % dense point cloud.
    ptCloudSparse = helperDownsamplePoints(ptCloudDense, ...
        labelsDense,numPoints);

    % Make the spatial extent of the sparse point cloud the same as that of
    % the dense point cloud.
    limits = [ptCloudDense.XLimits;ptCloudDense.YLimits;ptCloudDense.ZLimits];
    ptCloudSparseLocation = ptCloudSparse.Location;
    ptCloudSparseLocation(1:2,:) = limits(:,1:2)';
    ptCloudSparse = pointCloud(ptCloudSparseLocation,Color=ptCloudSparse.Color, ...
        Intensity=ptCloudSparse.Intensity, Normal=ptCloudSparse.Normal);

    % Use the helperNormalizePointCloud function, attached to this example as
    % a supporting file, to normalize the point cloud between 0 and 1.
    ptCloudSparseNormalized = helperNormalizePointCloud(ptCloudSparse);
    ptCloudDenseNormalized = helperNormalizePointCloud(ptCloudDense);

    % Use the helperTransformToTestData function, defined at the end of this
    % example, to convert the point cloud to a cell array and to permute the
    % dimensions of the point cloud to make it compatible with the input layer
    % of the network.
    ptCloudSparseForPrediction = helperTransformToTestData(ptCloudSparseNormalized);

    % Get the output predictions.
    scoresPred = pointnetplusPredict_mex(single(ptCloudSparseForPrediction{1,1}));
    [~,labelsSparsePred] = max(scoresPred,[],3);
    labelsSparsePred = uint8(labelsSparsePred);

    % Use the helperInterpolate function, attached to this example as a
    % supporting file, to calculate labels for the dense point cloud,
    % using the sparse point cloud and labels predicted on the sparse point cloud.
    interpolatedLabels = helperInterpolate(ptCloudDenseNormalized, ...
        ptCloudSparseNormalized,labelsSparsePred,numNearestNeighbors, ...
        radius,maxLabel,numClasses);

    % Concatenate the predicted and target labels from the blocks.
    labelsDensePred = vertcat(labelsDensePred,interpolatedLabels);
    labelsDenseTarget = vertcat(labelsDenseTarget,labelsDense);
end
Starting parallel pool (parpool) using the 'Processes' profile ...
Connected to the parallel pool (number of workers: 6).

For better visualization, display a single block inferred from the point cloud data.

figure;
ax = pcshow(ptCloudDense.Location,interpolatedLabels);
axis off;
helperLabelColorbar(ax,classNames);
title("Point Cloud Overlaid with Detected Semantic Labels");

Supporting Functions

The helperLabelColorbar function adds a colorbar to the current axis. The colorbar is formatted to display the class names with their corresponding colors.

function helperLabelColorbar(ax,classNames)
% Colormap for the original classes.
cmap = [[0,0,255];
    [0,255,0];
    [255,192,203];
    [255,255,0];
    [255,0,255];
    [255,165,0];
    [139,0,150];
    [255,0,0]];
cmap = cmap./255;
cmap = cmap(1:numel(classNames),:);
colormap(ax,cmap);

% Add colorbar to current figure.
c = colorbar(ax);
c.Color = 'w';

% Center tick labels and use class names for tick marks.
numClasses = size(classNames, 1);
c.Ticks = 1:1:numClasses;
c.TickLabels = classNames;

% Remove tick mark.
c.TickLength = 0;
end

The helperTransformToTestData function converts the point cloud into a cell array and permutes the dimensions of the point cloud to make it compatible with the input layer of the network.

function data = helperTransformToTestData(data)
if ~iscell(data)
    data = {data};
end
numObservations = size(data,1);
for i = 1:numObservations
    tmp = data{i,1}.Location;
    data{i,1} = permute(tmp,[1 3 2]);
end
end

References

[1] Qi, Charles R., Li Yi, Hao Su, and Leonidas J. Guibas. "PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space." ArXiv:1706.02413 [Cs], June 7, 2017. https://arxiv.org/abs/1706.02413.

[2] Varney, Nina, Vijayan K. Asari, and Quinn Graehling. "DALES: A Large-Scale Aerial LiDAR Data Set for Semantic Segmentation." ArXiv:2004.11985 [Cs, Stat], April 14, 2020. https://arxiv.org/abs/2004.11985.