
trainRandlanet

Train RandLA-Net network to perform semantic segmentation

Since R2024a

    Description

    Train a Segmenter

    trainedSegmenter = trainRandlanet(trainingData,segmenter,options) trains the RandLA-Net deep learning network specified by segmenter on the training data trainingData with the specified training parameters options, and returns the trained network trainedSegmenter. The input segmenter can be a pretrained or custom RandLA-Net network. You can also use this syntax to fine-tune a trained RandLA-Net network.
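    For reference, a minimal training call might look like this sketch. The datastore trainingData and the option values are illustrative assumptions, and running it requires Deep Learning Toolbox and the RandLA-Net add-on.

        % Sketch: fine-tune a pretrained RandLA-Net segmenter on custom data.
        % trainingData is assumed to be a datastore set up as described in
        % Input Arguments; solver and option values are illustrative only.
        segmenter = randlanet;               % pretrained network from the add-on
        options = trainingOptions("adam", ...
            InitialLearnRate=0.001, ...
            MaxEpochs=10, ...
            MiniBatchSize=8);
        trainedSegmenter = trainRandlanet(trainingData,segmenter,options);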

    Resume Training a Segmenter

    trainedSegmenter = trainRandlanet(trainingData,checkpoint,options) resumes the training from the saved checkpoint specified by checkpoint. You can use this syntax to add more training data and continue training a network, or to improve the training accuracy by increasing the maximum number of iterations.

    Additional Options

    [trainedSegmenter,info] = trainRandlanet(___) returns information on the training progress of the network, using any combination of input arguments from previous syntaxes.

    [___] = trainRandlanet(___,Name=Value) specifies options using one or more name-value arguments in addition to any combination of arguments from previous syntaxes. For example, trainRandlanet(trainingData,segmenter,options,ExperimentMonitor=[]) specifies not to track training progress with Experiment Manager.

    Note

    This functionality requires Deep Learning Toolbox™ and the Lidar Toolbox™ Model for RandLA-Net Semantic Segmentation. You can download and install the Lidar Toolbox Model for RandLA-Net Semantic Segmentation from Add-On Explorer. For more information about installing add-ons, see Get and Manage Add-Ons.

    Input Arguments


    trainingData — Training data

    Training data, specified as a valid datastore object. Set up the datastore so that calling the read function on it returns a 1-by-2 cell array with this data in the two cells.

    • First cell — Unorganized or organized point cloud data, specified as a pointCloud object.

    • Second cell — Segmentation labels, specified as a categorical array.

      • If the corresponding point cloud is an unorganized point cloud with M points, the segmentation labels in the training data must be specified as an M-by-1 categorical vector.

      • If the corresponding point cloud is an organized point cloud with M-by-N points, the segmentation labels in the training data must be specified as an M-by-N categorical matrix.

    You can use the combine function to combine two or more datastores. For more information on creating datastore objects, see the datastore function.
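    For illustration, you can build such a datastore by combining a point cloud datastore with a label datastore. This sketch assumes PCD files in a "pointclouds" folder and per-point label MAT files (containing a labels variable) in a "labels" folder; the folder names, file formats, and class names are all assumptions.

        % Illustrative class names (assumption)
        classNames = ["ground","vegetation","building"];
        % Datastore of pointCloud objects read from PCD files
        pcds = fileDatastore("pointclouds","ReadFcn",@pcread,"FileExtensions",".pcd");
        % Datastore of categorical per-point labels read from MAT files
        lds = fileDatastore("labels", ...
            "ReadFcn",@(f) categorical(load(f).labels,1:numel(classNames),classNames), ...
            "FileExtensions",".mat");
        % read(trainingData) now returns {pointCloud, M-by-1 categorical vector}
        trainingData = combine(pcds,lds);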

    segmenter — RandLA-Net semantic segmentation network

    RandLA-Net semantic segmentation network, specified as a randlanet object. The network can be an untrained or pretrained network.

    checkpoint — Saved checkpoint

    Saved checkpoint, specified as a randlanet object. To periodically save a checkpoint during training, specify the CheckpointPath property of the training options object options. To control how frequently the network saves the checkpoint, tune the CheckpointFrequency and CheckpointFrequencyUnit properties of the training options. The trainRandlanet function saves the checkpoint at the specified path as a MAT file.

    To load a checkpoint for a previously trained network, load the corresponding MAT file from the checkpoint path, and then extract the object from the loaded data. For example, if the CheckpointPath property of your options object is "/checkpath", you can load a checkpoint MAT file by using this code.

    data = load("/checkpath/net_checkpoint__1__2023_12_31__00_00_00.mat");
    checkpoint = data.net;

    The name of the MAT file includes the iteration number and timestamp at which the network saves the checkpoint. The file stores the network in the net variable. To continue training, specify the network extracted from the file to the trainRandlanet function.

    trainedSegmenter = trainRandlanet(trainingData,checkpoint,options);

    options — Training options

    Training options, specified as a TrainingOptionsSGDM, TrainingOptionsRMSProp, or TrainingOptionsADAM object returned by the trainingOptions (Deep Learning Toolbox) function. Use trainingOptions to specify the solver name and other options for network training.

    Note

    The trainRandlanet function supports these training options.

    Name-Value Argument               Supported Value
    ResetInputNormalization           false
    BatchNormalizationStatistics      "moving"
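    For reference, this sketch creates a compatible options object with checkpointing enabled. The solver choice and numeric values are illustrative assumptions.

        % Illustrative training options; only the two normalization settings
        % shown in the table above are required values.
        options = trainingOptions("sgdm", ...
            InitialLearnRate=0.01, ...
            MaxEpochs=30, ...
            MiniBatchSize=8, ...
            ResetInputNormalization=false, ...
            BatchNormalizationStatistics="moving", ...
            CheckpointPath=tempdir, ...           % save checkpoints here
            CheckpointFrequency=5, ...
            CheckpointFrequencyUnit="epoch");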

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Example: trainRandlanet(trainingData,segmenter,options,ExperimentMonitor=[]) specifies not to track training progress with Experiment Manager.

    ExperimentMonitor — Training experiment monitoring

    Training experiment monitoring, specified as an experiments.Monitor (Deep Learning Toolbox) object for use with the Experiment Manager (Deep Learning Toolbox) app. You can use this object to track the progress of training, update information fields in the training results table, record values of the metrics used by the training, and produce training plots.

    The app monitors this information during training.

    • Training loss at each iteration

    • Learning rate at each iteration

    • Validation loss at each iteration, if the options input contains validation data

    The app also monitors the final validation loss at the end of the training, if the options input contains validation data.
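    In Experiment Manager, the app supplies the experiments.Monitor object to your experiment training function; forward it to trainRandlanet. This training function is a sketch: the function name is an assumption, and construction of the training inputs is omitted.

        function trainedSegmenter = trainRandlanetExperiment(params,monitor)
            % params and monitor are supplied by the Experiment Manager app.
            % Building trainingData, segmenter, and options from params is
            % omitted here for brevity.
            trainedSegmenter = trainRandlanet(trainingData,segmenter,options, ...
                ExperimentMonitor=monitor);
        end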

    Output Arguments


    trainedSegmenter — Trained RandLA-Net network

    Trained RandLA-Net semantic segmentation network, returned as a randlanet object.

    info — Training progress information

    Training progress information, returned as a structure with these fields. Each field holds a metric recorded during training.

    • TrainingLoss — Training loss at each iteration. The trainRandlanet function uses weighted cross entropy loss for training.

    • BaseLearnRate — Learning rate at each iteration.

    • OutputNetworkLocation — Iteration number of the returned network.

    • ValidationLoss — Validation loss at each iteration.

    • FinalValidationLoss — Final validation loss at the end of the training.

    Each field of the info structure is a numeric vector with one element per training iteration. If the function does not calculate a metric at a specific iteration, the corresponding element of the vector has a value of NaN. The info structure contains ValidationLoss and FinalValidationLoss only if the options input contains validation data.
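    For example, you can plot the recorded losses after training. This sketch assumes trainingData, segmenter, and options already exist and that options includes validation data.

        % Train and inspect the per-iteration loss curves.
        [trainedSegmenter,info] = trainRandlanet(trainingData,segmenter,options);
        figure
        plot(info.TrainingLoss)
        hold on
        plot(info.ValidationLoss)  % NaN where not computed; absent without validation data
        legend("Training loss","Validation loss")
        xlabel("Iteration")
        ylabel("Loss")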

    Version History

    Introduced in R2024a