# leakyReluLayer

Leaky Rectified Linear Unit (ReLU) layer

## Description

A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scale factor and any nonnegative input value passes through unchanged.

This operation is equivalent to:

$f(x)=\begin{cases}x, & x\ge 0\\ \mathit{scale}\cdot x, & x<0\end{cases}$
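The piecewise formula can be sketched in a few lines of plain Python (not MATLAB; the function name here is illustrative, and the default of `0.01` mirrors the scale shown in the Layer-array example output):

```python
def leaky_relu(x, scale=0.01):
    """Leaky ReLU: pass nonnegative inputs through, scale negative ones.

    Illustrative sketch of the formula above; `scale` corresponds to the
    layer's scalar multiplier for negative inputs (0.01 by default).
    """
    return x if x >= 0 else scale * x

print(leaky_relu(2.0))        # nonnegative input is unchanged: 2.0
print(leaky_relu(-4.0, 0.5))  # negative input is multiplied by scale: -2.0
```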

## Creation

### Description

`layer = leakyReluLayer` returns a leaky ReLU layer.

`layer = leakyReluLayer(scale)` returns a leaky ReLU layer with a scalar multiplier for negative inputs equal to `scale`.

`layer = leakyReluLayer(___,'Name',Name)` returns a leaky ReLU layer and sets the optional `Name` property.

## Properties

### Leaky ReLU

`Scale` — Scalar multiplier for negative input values, specified as a numeric scalar. The default is `0.01`.

Example: 0.4

### Layer

`Name` — Layer name, specified as a character vector or a string scalar. For `Layer` array input, the `trainNetwork`, `assembleNetwork`, `layerGraph`, and `dlnetwork` functions automatically assign names to layers with the name `''`.

Data Types: char | string

`NumInputs` — Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

`InputNames` — Input names of the layer. This layer accepts a single input only.

Data Types: cell

`NumOutputs` — Number of outputs of the layer. This layer has a single output only.

Data Types: double

`OutputNames` — Output names of the layer. This layer has a single output only.

Data Types: cell

## Examples

Create a leaky ReLU layer with the name 'leaky1' and a scalar multiplier for negative inputs equal to 0.1.

```matlab
layer = leakyReluLayer(0.1,'Name','leaky1')
```

```
layer =
  LeakyReLULayer with properties:

     Name: 'leaky1'

   Hyperparameters
    Scale: 0.1000
```

Include a leaky ReLU layer in a Layer array.

```matlab
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    leakyReluLayer

    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    leakyReluLayer

    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
```

```
layers =
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution         16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     5   ''   2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   2-D Convolution         32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex
```

## References

[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1. 2013.

## Version History

Introduced in R2017b