relu
Syntax
Y = relu(X)
Description
The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.
This operation is equivalent to

f(x) = \begin{cases} x, & x > 0 \\ 0, & x \le 0 \end{cases}
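For instance, the following minimal sketch (assuming Deep Learning Toolbox, which provides dlarray, relu, and extractdata) applies the operation to a small vector and checks it against the equivalent elementwise threshold max(x,0); the input values are arbitrary:

    % Apply relu to an unformatted dlarray and compare against
    % the equivalent elementwise threshold max(x,0).
    X = dlarray([-2 -0.5 0 1.5 3]);
    Y = relu(X);                                    % 0  0  0  1.5  3
    isequal(extractdata(Y), max(extractdata(X),0))  % logical 1 (true)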
Note
This function applies the ReLU operation to dlarray data. If you want to apply the ReLU activation within a layerGraph object or Layer array, use the following layer:
reluLayer
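As a brief illustration, a reluLayer can be placed in a Layer array like this; the surrounding layers and their sizes are arbitrary placeholders, not part of this reference page:

    % Sketch: reluLayer inside a Layer array. The neighboring
    % layer choices and sizes are arbitrary placeholders.
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3,16)
        reluLayer
        fullyConnectedLayer(10)
        softmaxLayer];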
Examples
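One representative example, sketched under the assumption that the input is image-like data stored as a formatted dlarray in "SSCB" format (spatial, spatial, channel, batch); the array sizes are arbitrary:

    % Create random input data as a formatted dlarray.
    X = dlarray(randn(28,28,3,16),"SSCB");

    % Apply the ReLU activation; every negative element becomes zero.
    Y = relu(X);

    % Confirm that no negative values remain in the output.
    any(extractdata(Y) < 0, "all")   % logical 0 (false)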
Input Arguments
X - Input data, specified as a dlarray object, with or without dimension labels.
Output Arguments
Y - Rectified output, returned as a dlarray with the same underlying data type and dimension labels as X.
Extended Capabilities
Version History
Introduced in R2019b