Custom Training Loops

Train deep learning networks using custom training loops

If the trainingOptions function does not provide the training options that you need for your task, or if custom output layers do not support the loss functions that you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define a custom network as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
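
The sketch below outlines the basic custom-training-loop pattern. It is illustrative only: the layer sizes, hyperparameters, and the data variables XTrain (a numFeatures-by-numObservations matrix) and TTrain (a 1-by-numObservations categorical row vector) are assumed placeholders that you would replace with your own task.

    % Minimal sketch of a custom training loop (illustrative placeholders;
    % adapt the layers, data, and hyperparameters to your task).
    layers = [
        featureInputLayer(10)
        fullyConnectedLayer(50)
        reluLayer
        fullyConnectedLayer(3)
        softmaxLayer];
    net = dlnetwork(layers);

    numEpochs = 5;
    learnRate = 0.01;
    vel = [];                              % SGDM velocity, initialized empty

    X = dlarray(single(XTrain), "CB");     % "CB" = channel (feature) x batch
    T = onehotencode(TTrain, 1);           % one-hot targets, classes along dim 1

    for epoch = 1:numEpochs
        % Evaluate the loss and gradients using automatic differentiation.
        [loss, gradients] = dlfeval(@modelLoss, net, X, T);

        % Update the learnable parameters (SGDM shown; adamupdate is an alternative).
        [net, vel] = sgdmupdate(net, gradients, vel, learnRate);
    end

    function [loss, gradients] = modelLoss(net, X, T)
        Y = forward(net, X);                          % forward pass
        loss = crossentropy(Y, T);                    % classification loss
        gradients = dlgradient(loss, net.Learnables); % gradients w.r.t. parameters
    end

In practice you would loop over mini-batches within each epoch rather than training on the full data set at once; the functions listed below support that workflow.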

Functions

dlnetwork - Deep learning network for custom training loops (Since R2019b)
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops (Since R2022b)
minibatchqueue - Create mini-batches for deep learning (Since R2020b)
padsequences - Pad or truncate sequence data to same length (Since R2021a)
dlarray - Deep learning array for customization (Since R2019b)
dlgradient - Compute gradients for custom training loops using automatic differentiation (Since R2019b)
dlfeval - Evaluate deep learning model for custom training loops (Since R2019b)
crossentropy - Cross-entropy loss for classification tasks (Since R2019b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
mse - Half mean squared error (Since R2019b)
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
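
A typical loop combines several of the functions listed above, as in the hedged sketch below. Here dsTrain (a training datastore), preprocessMiniBatch (a mini-batch preprocessing function), and net and modelLoss (as in the earlier sketch) are assumed placeholders for your own data and model.

    % Illustrative sketch of a training loop that uses minibatchqueue and
    % trainingProgressMonitor together with an Adam update.
    mbq = minibatchqueue(dsTrain, ...
        MiniBatchSize=128, ...
        MiniBatchFcn=@preprocessMiniBatch, ...
        MiniBatchFormat=["SSCB" "CB"]);      % image input, one-hot targets

    monitor = trainingProgressMonitor( ...
        Metrics="Loss", Info="Epoch", XLabel="Iteration");

    numEpochs = 10;
    epoch = 0;
    iteration = 0;
    avgG = [];                               % Adam state, initialized empty
    avgSqG = [];

    while epoch < numEpochs && ~monitor.Stop
        epoch = epoch + 1;
        shuffle(mbq);

        while hasdata(mbq) && ~monitor.Stop
            iteration = iteration + 1;
            [X, T] = next(mbq);

            % Loss and gradients via automatic differentiation.
            [loss, gradients] = dlfeval(@modelLoss, net, X, T);

            % Adam parameter update.
            [net, avgG, avgSqG] = adamupdate(net, gradients, avgG, avgSqG, iteration);

            % Plot the loss and report progress.
            recordMetrics(monitor, iteration, Loss=loss);
            updateInfo(monitor, Epoch=epoch);
            monitor.Progress = 100 * epoch / numEpochs;
        end
    end

Checking monitor.Stop in both loop conditions lets the Stop button in the training progress window end training early.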

Topics