leakyrelu
Apply leaky rectified linear unit activation
Description
The leaky rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is multiplied by a fixed scale factor.
This operation is equivalent to

f(x) = x,            x >= 0
f(x) = scale * x,    x < 0
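For example, with a scale factor of 0.01, an input value of -4 maps to -0.04, while an input value of 2 is unchanged.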
Note
This function applies the leaky ReLU operation to dlarray data. If you want to apply leaky ReLU activation within a dlnetwork object, use leakyReluLayer.
Examples
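The following sketch illustrates one way the function might be called on formatted dlarray data. The input dimensions and the 0.05 scale factor are illustrative assumptions, not values taken from this page.

% Create a formatted dlarray of random image-like data (illustrative sizes).
X = dlarray(randn(28,28,3,16),"SSCB");

% Apply leaky ReLU with the default scale factor.
Y = leakyrelu(X);

% Apply leaky ReLU with an explicit scale factor of 0.05 for negative inputs.
Y = leakyrelu(X,0.05);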
Input Arguments
Output Arguments
Extended Capabilities
Version History
Introduced in R2019b