Deep Learning Tuning

Programmatically tune training options, resume training from a checkpoint, and investigate adversarial examples

To learn how to set options using the trainingOptions function, see Set Up Parameters and Train Convolutional Neural Network. After you identify some good starting options, you can automate hyperparameter sweeps or try Bayesian optimization using Experiment Manager.
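As a minimal sketch, options can be created with trainingOptions and a simple sweep automated in a loop (the learning-rate values below are hypothetical placeholders):

```matlab
% Sweep over several initial learning rates (hypothetical values).
learnRates = [1e-2 1e-3 1e-4];

for i = 1:numel(learnRates)
    % Set training options for stochastic gradient descent with momentum.
    options = trainingOptions("sgdm", ...
        "InitialLearnRate",learnRates(i), ...
        "MaxEpochs",10, ...
        "MiniBatchSize",128, ...
        "Plots","training-progress");

    % Train with each option set; XTrain, YTrain, and layers are assumed
    % to be defined elsewhere.
    % net = trainNetwork(XTrain,YTrain,layers,options);
end
```

Experiment Manager automates this kind of sweep without the manual loop, logging results for each trial.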

Investigate network robustness by generating adversarial examples. You can then use fast gradient sign method (FGSM) adversarial training to train a network robust to adversarial perturbations.
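The FGSM perturbation can be sketched as follows; this is a hedged illustration, assuming net is a dlnetwork, X is a dlarray of inputs, T holds one-hot targets, and epsilon is a hypothetical perturbation size:

```matlab
% Fast gradient sign method (FGSM) sketch: perturb the input in the
% direction of the sign of the loss gradient with respect to the input.
epsilon = 0.1;  % hypothetical perturbation size

[~,gradX] = dlfeval(@modelLoss,net,X,T);
XAdv = X + epsilon*sign(gradX);   % adversarial example

function [loss,gradX] = modelLoss(net,X,T)
    % Forward pass and cross-entropy loss.
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    % Gradient of the loss with respect to the input image.
    gradX = dlgradient(loss,X);
end
```

Training on such perturbed inputs (FGSM adversarial training) encourages the network to remain accurate under small input perturbations.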


Deep Network Designer - Design, visualize, and train deep learning networks


trainingOptions - Options for training deep learning neural network
trainNetwork - Train deep learning neural network


Set Up Parameters and Train Convolutional Neural Network

Learn how to set up training parameters for a convolutional neural network.

Deep Learning Using Bayesian Optimization

This example shows how to apply Bayesian optimization to deep learning and find optimal network hyperparameters and training options for convolutional neural networks.

Train Deep Learning Networks in Parallel

This example shows how to run multiple deep learning experiments on your local machine.

Train Network Using Custom Training Loop

This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.

Compare Activation Layers

This example shows how to compare the accuracy of training networks with ReLU, leaky ReLU, ELU, and swish activation layers.

Adapt Code Generated in Deep Network Designer for Use in Experiment Manager

Use Experiment Manager to tune the hyperparameters of a network trained in Deep Network Designer.

Deep Learning Tips and Tricks

Learn how to improve the accuracy of deep learning networks.

Train Robust Deep Learning Network with Jacobian Regularization

This example shows how to train a neural network that is robust to adversarial examples using a Jacobian regularization scheme [1].

Specify Custom Weight Initialization Function

This example shows how to create a custom He weight initialization function for convolution layers followed by leaky ReLU layers.

Compare Layer Weight Initializers

This example shows how to train deep learning networks with different weight initializers.
