## Define Custom Recurrent Deep Learning Layer

If Deep Learning Toolbox™ does not provide the layer you require for your task, then you can define your own custom layer using this example as a guide. For a list of built-in layers, see List of Deep Learning Layers.

To define a custom deep learning layer, you can use the template provided in this example, which takes you through the following steps:

1. Name the layer — Give the layer a name so that you can use it in MATLAB®.

2. Declare the layer properties — Specify the properties of the layer, including learnable parameters and state parameters.

3. Create a constructor function (optional) — Specify how to construct the layer and initialize its properties. If you do not specify a constructor function, then at creation, the software initializes the `Name`, `Description`, and `Type` properties with `[]` and sets the number of layer inputs and outputs to 1.

4. Create initialize function (optional) — Specify how to initialize the learnable and state parameters when the software initializes the network. If you do not specify an initialize function, then the software does not initialize parameters when it initializes the network.

5. Create forward functions — Specify how data passes forward through the layer (forward propagation) at prediction time and at training time.

6. Create reset state function (optional) — Specify how to reset state parameters.

7. Create a backward function (optional) — Specify the derivatives of the loss with respect to the input data and the learnable parameters (backward propagation). If you do not specify a backward function, then the forward functions must support `dlarray` objects.

When defining the layer functions, you can use `dlarray` objects. Using `dlarray` objects makes working with high dimensional data easier by allowing you to label the dimensions. For example, you can label which dimensions correspond to spatial, time, channel, and batch dimensions using the `"S"`, `"T"`, `"C"`, and `"B"` labels, respectively. For unspecified and other dimensions, use the `"U"` label. For `dlarray` object functions that operate over particular dimensions, you can specify the dimension labels by formatting the `dlarray` object directly, or by using the `DataFormat` option.
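For example, here is a minimal sketch of both approaches (the array sizes are arbitrary):

```
% Label the dimensions of a mini-batch of images directly.
X = rand(28,28,3,16);
dlX = dlarray(X,"SSCB");                  % spatial, spatial, channel, batch
meanSpatial = mean(dlX,finddim(dlX,"S")); % operate over the spatial dimensions

% Alternatively, keep the data unformatted and pass the DataFormat option.
Y = dlarray(rand(10,16));
Z = softmax(Y,DataFormat="CB");           % rows are channels, columns are batch
```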

Using formatted `dlarray` objects in custom layers also allows you to define layers where the inputs and outputs have different formats, such as layers that permute, add, or remove dimensions. For example, you can define a layer that takes as input a mini-batch of images with the format `"SSCB"` (spatial, spatial, channel, batch) and output a mini-batch of sequences with the format `"CBT"` (channel, batch, time). Using formatted `dlarray` objects also allows you to define layers that can operate on data with different input formats, for example, layers that support inputs with the formats `"SSCB"` (spatial, spatial, channel, batch) and `"CBT"` (channel, batch, time).

`dlarray` objects also enable support for automatic differentiation. Consequently, if your forward functions fully support `dlarray` objects, then defining the backward function is optional.
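As a minimal sketch of the automatic differentiation that `dlarray` enables (the function `modelGradients` here is invented for this example):

```
x = dlarray([1 2 3]);
[y,dydx] = dlfeval(@modelGradients,x)

function [y,dydx] = modelGradients(x)
    y = sum(tanh(x).^2,"all"); % scalar function of x
    dydx = dlgradient(y,x);    % gradient computed by automatic differentiation
end
```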

To enable support for using formatted `dlarray` objects in custom layer forward functions, also inherit from the `nnet.layer.Formattable` class when defining the custom layer. For an example, see Define Custom Deep Learning Layer with Formatted Inputs.

This example shows how to define a peephole LSTM layer [1], which is a recurrent layer with learnable parameters, and use it in a neural network. A peephole LSTM layer is a variant of an LSTM layer, where the gate calculations use the layer cell state.

### Intermediate Layer Template

Copy the intermediate layer template into a new file in MATLAB. This template gives the structure of an intermediate layer class definition. It outlines:

• The optional `properties` blocks for the layer properties, learnable parameters, and state parameters.

• The layer constructor function.

• The optional `initialize` function.

• The `predict` function and the optional `forward` function.

• The optional `resetState` function for layers with state properties.

• The optional `backward` function.

```
classdef myLayer < nnet.layer.Layer % ...
        % & nnet.layer.Formattable ... % (Optional)
        % & nnet.layer.Acceleratable % (Optional)

    properties
        % (Optional) Layer properties.

        % Declare layer properties here.
    end

    properties (Learnable)
        % (Optional) Layer learnable parameters.

        % Declare learnable parameters here.
    end

    properties (State)
        % (Optional) Layer state parameters.

        % Declare state parameters here.
    end

    properties (Learnable, State)
        % (Optional) Nested dlnetwork objects with both learnable
        % parameters and state parameters.

        % Declare nested networks with learnable and state parameters here.
    end

    methods
        function layer = myLayer()
            % (Optional) Create a myLayer.
            % This function must have the same name as the class.

            % Define layer constructor function here.
        end

        function layer = initialize(layer,layout)
            % (Optional) Initialize layer learnable and state parameters.
            %
            % Inputs:
            %         layer  - Layer to initialize
            %         layout - Data layout, specified as a networkDataLayout
            %                  object
            %
            % Outputs:
            %         layer - Initialized layer
            %
            %  - For layers with multiple inputs, replace layout with
            %    layout1,...,layoutN, where N is the number of inputs.

            % Define layer initialization function here.
        end

        function [Z,state] = predict(layer,X)
            % Forward input data through the layer at prediction time and
            % output the result and updated state.
            %
            % Inputs:
            %         layer - Layer to forward propagate through
            %         X     - Input data
            % Outputs:
            %         Z     - Output of layer forward function
            %         state - (Optional) Updated layer state
            %
            %  - For layers with multiple inputs, replace X with X1,...,XN,
            %    where N is the number of inputs.
            %  - For layers with multiple outputs, replace Z with
            %    Z1,...,ZM, where M is the number of outputs.
            %  - For layers with multiple state parameters, replace state
            %    with state1,...,stateK, where K is the number of state
            %    parameters.

            % Define layer predict function here.
        end

        function [Z,state,memory] = forward(layer,X)
            % (Optional) Forward input data through the layer at training
            % time and output the result, the updated state, and a memory
            % value.
            %
            % Inputs:
            %         layer - Layer to forward propagate through
            %         X     - Layer input data
            % Outputs:
            %         Z      - Output of layer forward function
            %         state  - (Optional) Updated layer state
            %         memory - (Optional) Memory value for custom backward
            %                  function
            %
            %  - For layers with multiple inputs, replace X with X1,...,XN,
            %    where N is the number of inputs.
            %  - For layers with multiple outputs, replace Z with
            %    Z1,...,ZM, where M is the number of outputs.
            %  - For layers with multiple state parameters, replace state
            %    with state1,...,stateK, where K is the number of state
            %    parameters.

            % Define layer forward function here.
        end

        function layer = resetState(layer)
            % (Optional) Reset layer state.

            % Define reset state function here.
        end

        function [dLdX,dLdW,dLdSin] = backward(layer,X,Z,dLdZ,dLdSout,memory)
            % (Optional) Backward propagate the derivative of the loss
            % function through the layer.
            %
            % Inputs:
            %         layer   - Layer to backward propagate through
            %         X       - Layer input data
            %         Z       - Layer output data
            %         dLdZ    - Derivative of loss with respect to layer
            %                   output
            %         dLdSout - (Optional) Derivative of loss with respect
            %                   to state output
            %         memory  - Memory value from forward function
            % Outputs:
            %         dLdX   - Derivative of loss with respect to layer input
            %         dLdW   - (Optional) Derivative of loss with respect to
            %                  learnable parameter
            %         dLdSin - (Optional) Derivative of loss with respect to
            %                  state input
            %
            %  - For layers with state parameters, the backward syntax must
            %    include both dLdSout and dLdSin, or neither.
            %  - For layers with multiple inputs, replace X and dLdX with
            %    X1,...,XN and dLdX1,...,dLdXN, respectively, where N is
            %    the number of inputs.
            %  - For layers with multiple outputs, replace Z and dLdZ with
            %    Z1,...,ZM and dLdZ1,...,dLdZM, respectively, where M is
            %    the number of outputs.
            %  - For layers with multiple learnable parameters, replace
            %    dLdW with dLdW1,...,dLdWP, where P is the number of
            %    learnable parameters.
            %  - For layers with multiple state parameters, replace dLdSin
            %    and dLdSout with dLdSin1,...,dLdSinK and
            %    dLdSout1,...,dLdSoutK, respectively, where K is the number
            %    of state parameters.

            % Define layer backward function here.
        end
    end
end
```

### Name Layer

First, give the layer a name. In the first line of the class file, replace the existing name `myLayer` with `peepholeLSTMLayer`. To allow the layer to output different data formats, for example data with the format `"CBT"` (channel, batch, time) for sequence output and the format `"CB"` (channel, batch) for single time step or feature output, also include the `nnet.layer.Formattable` mixin.

```
classdef peepholeLSTMLayer < nnet.layer.Layer & nnet.layer.Formattable
    ...
end
```

Next, rename the `myLayer` constructor function (the first function in the `methods` section) so that it has the same name as the layer.

```
    methods
        function layer = peepholeLSTMLayer()
            ...
        end

        ...
    end
```

#### Save Layer

Save the layer class file in a new file named `peepholeLSTMLayer.m`. The file name must match the layer name. To use the layer, you must save the file in the current folder or in a folder on the MATLAB path.

### Declare Properties, State, and Learnable Parameters

Declare the layer properties in the `properties` section, the layer states in the `properties (State)` section, and the learnable parameters in the `properties (Learnable)` section.

By default, custom intermediate layers have these properties. Do not declare these properties in the `properties` section.

| Property | Description |
| --- | --- |
| `Name` | Layer name, specified as a character vector or a string scalar. For `Layer` array input, the `trainNetwork`, `assembleNetwork`, `layerGraph`, and `dlnetwork` functions automatically assign names to layers with the name `''`. |
| `Description` | One-line description of the layer, specified as a string scalar or a character vector. This description appears when the layer is displayed in a `Layer` array. If you do not specify a layer description, then the software displays the layer class name. |
| `Type` | Type of the layer, specified as a character vector or a string scalar. The value of `Type` appears when the layer is displayed in a `Layer` array. If you do not specify a layer type, then the software displays the layer class name. |
| `NumInputs` | Number of inputs of the layer, specified as a positive integer. If you do not specify this value, then the software automatically sets `NumInputs` to the number of names in `InputNames`. The default value is 1. |
| `InputNames` | Input names of the layer, specified as a cell array of character vectors. If you do not specify this value and `NumInputs` is greater than 1, then the software automatically sets `InputNames` to `{'in1',...,'inN'}`, where `N` is equal to `NumInputs`. The default value is `{'in'}`. |
| `NumOutputs` | Number of outputs of the layer, specified as a positive integer. If you do not specify this value, then the software automatically sets `NumOutputs` to the number of names in `OutputNames`. The default value is 1. |
| `OutputNames` | Output names of the layer, specified as a cell array of character vectors. If you do not specify this value and `NumOutputs` is greater than 1, then the software automatically sets `OutputNames` to `{'out1',...,'outM'}`, where `M` is equal to `NumOutputs`. The default value is `{'out'}`. |

If the layer has no other properties, then you can omit the `properties` section.

Tip

If you are creating a layer with multiple inputs, then you must set either the `NumInputs` or `InputNames` properties in the layer constructor. If you are creating a layer with multiple outputs, then you must set either the `NumOutputs` or `OutputNames` properties in the layer constructor. For an example, see Define Custom Deep Learning Layer with Multiple Inputs.
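For instance, here is a hypothetical constructor sketch for a two-input layer; the layer name and behavior are invented for illustration:

```
function layer = exampleTwoInputLayer(args)
    arguments
        args.Name = "";
    end
    layer.Name = args.Name;

    % Layers with multiple inputs must set NumInputs or InputNames here.
    layer.NumInputs = 2;
end
```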

Declare the following layer properties in the `properties` section:

• `NumHiddenUnits` — Number of hidden units in the peephole LSTM operation

• `OutputMode` — Flag indicating whether the layer returns a sequence or a single time step

```
    properties
        % Layer properties.
        NumHiddenUnits
        OutputMode
    end
```

A peephole LSTM layer has four learnable parameters: the input weights, the recurrent weights, the peephole weights, and the bias. Declare these learnable parameters in the `properties (Learnable)` section with the names `InputWeights`, `RecurrentWeights`, `PeepholeWeights`, and `Bias`, respectively.

```
    properties (Learnable)
        % Layer learnable parameters.
        InputWeights
        RecurrentWeights
        PeepholeWeights
        Bias
    end
```

A peephole LSTM layer has two state parameters: the hidden state and the cell state. Declare these state parameters in the `properties (State)` section with the names `HiddenState` and `CellState`, respectively.

```
    properties (State)
        % Layer state parameters.
        HiddenState
        CellState
    end
```

Parallel training of networks containing custom layers with state parameters using the `trainNetwork` function is not supported. When you train a network with custom layers with state parameters, the `ExecutionEnvironment` training option must be `"auto"`, `"gpu"`, or `"cpu"`.
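For example, this sketch sets a compatible execution environment explicitly (the solver choice is arbitrary):

```
options = trainingOptions("adam",ExecutionEnvironment="auto");
```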

### Create Constructor Function

Create the function that constructs the layer and initializes the layer properties. Specify any variables required to create the layer as inputs to the constructor function.

The peephole LSTM layer constructor function requires one input argument (the number of hidden units) and two optional arguments (the layer name and output mode). Specify one input argument named `numHiddenUnits` in the `peepholeLSTMLayer` function that corresponds to the number of hidden units. The constructor does not need the number of input channels; the `initialize` function determines it later from the input data layout. Specify the optional input arguments as a single argument with the name `args`. Add a comment to the top of the function that explains the syntaxes of the function.

```
    function layer = peepholeLSTMLayer(numHiddenUnits,args)
        %PEEPHOLELSTMLAYER Peephole LSTM Layer
        %   layer = peepholeLSTMLayer(numHiddenUnits) creates a peephole
        %   LSTM layer with the specified number of hidden units.
        %
        %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value) creates
        %   a peephole LSTM layer and specifies additional options using
        %   one or more name-value arguments:
        %
        %       Name       - Name of the layer, specified as a string.
        %                    The default is "".
        %
        %       OutputMode - Output mode, specified as one of the
        %                    following:
        %                        "sequence" - Output the entire sequence
        %                                     of data.
        %                        "last"     - Output the last time step
        %                                     of the data.
        %                    The default is "sequence".

        ...
    end
```

#### Initialize Layer Properties

Initialize the layer properties in the constructor function. Replace the comment `% Define layer constructor function here.` with code that initializes the layer properties. Do not initialize learnable or state parameters in the constructor function; initialize them in the `initialize` function instead.

Parse the input arguments using an `arguments` block and set the `NumHiddenUnits`, `Name`, and `OutputMode` properties.

```
        arguments
            numHiddenUnits
            args.Name = "";
            args.OutputMode = "sequence";
        end

        layer.NumHiddenUnits = numHiddenUnits;
        layer.Name = args.Name;
        layer.OutputMode = args.OutputMode;
```

Give the layer a one-line description by setting the `Description` property of the layer. Set the description to describe the type of the layer and its size.

```
        % Set layer description.
        layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";
```

View the completed constructor function.

```
    function layer = peepholeLSTMLayer(numHiddenUnits,args)
        %PEEPHOLELSTMLAYER Peephole LSTM Layer
        %   layer = peepholeLSTMLayer(numHiddenUnits) creates a peephole
        %   LSTM layer with the specified number of hidden units.
        %
        %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value) creates
        %   a peephole LSTM layer and specifies additional options using
        %   one or more name-value arguments:
        %
        %       Name       - Name of the layer, specified as a string.
        %                    The default is "".
        %
        %       OutputMode - Output mode, specified as one of the
        %                    following:
        %                        "sequence" - Output the entire sequence
        %                                     of data.
        %                        "last"     - Output the last time step
        %                                     of the data.
        %                    The default is "sequence".

        % Parse input arguments.
        arguments
            numHiddenUnits
            args.Name = "";
            args.OutputMode = "sequence";
        end

        layer.NumHiddenUnits = numHiddenUnits;
        layer.Name = args.Name;
        layer.OutputMode = args.OutputMode;

        % Set layer description.
        layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";
    end
```

With this constructor function, the command `peepholeLSTMLayer(200,OutputMode="last",Name="peephole")` creates a peephole LSTM layer with 200 hidden units and the name `"peephole"` that outputs the last time step of the peephole LSTM operation.

### Create Initialize Function

Because the size of the input data is unknown until the network is ready to use, you must create an initialize function that initializes the learnable and state parameters using `networkDataLayout` objects that the software provides to the function. Network data layout objects contain information about the sizes and formats of expected input data. Create an initialize function that uses the size and format information to initialize learnable and state parameters such that they have the correct size.

Initialize the input weights using Glorot initialization. Initialize the recurrent weights using orthogonal initialization. Initialize the bias using unit-forget-gate normalization. This code uses the helper functions `initializeGlorot`, `initializeOrthogonal`, and `initializeUnitForgetGate`. To access these functions, open the example in the Include Custom Layer in Network section as a live script. For more information about initializing weights, see Initialize Learnable Parameters for Model Function.
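If you do not have the live script at hand, the following minimal sketches show one common way to implement these helpers, based on the initialization schemes named above; treat them as assumptions rather than the shipped definitions.

```
function weights = initializeGlorot(sz,numOut,numIn)
    % Glorot (Xavier) uniform initialization.
    Z = 2*rand(sz,"single") - 1;
    bound = sqrt(6 / (numIn + numOut));
    weights = dlarray(bound * Z);
end

function parameter = initializeOrthogonal(sz)
    % Orthogonal initialization via the QR decomposition of a random matrix.
    Z = randn(sz,"single");
    [Q,R] = qr(Z,0);
    D = diag(R);
    Q = Q * diag(D ./ abs(D)); % make the factorization unique
    parameter = dlarray(Q);
end

function bias = initializeUnitForgetGate(numHiddenUnits)
    % Zero bias, except ones for the forget gate section.
    bias = zeros(4*numHiddenUnits,1,"single");
    idx = numHiddenUnits+1:2*numHiddenUnits;
    bias(idx) = 1;
    bias = dlarray(bias);
end
```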

Note that a peephole LSTM layer has an additional learnable parameter that a standard LSTM layer does not: the peephole weights. The gate calculations use peephole connections for the input, forget, and output gates only, not for the cell candidate, so the peephole weights is a `3*NumHiddenUnits`-by-1 array. The input weights and recurrent weights have the same sizes as in a standard LSTM layer.

For convenience, initialize the state parameters using the `resetState` function defined in the section Create Reset State Function.

```
    function layer = initialize(layer,layout)
        % layer = initialize(layer,layout) initializes the layer
        % learnable and state parameters.
        %
        % Inputs:
        %         layer  - Layer to initialize.
        %         layout - Data layout, specified as a
        %                  networkDataLayout object.
        %
        % Outputs:
        %         layer - Initialized layer.

        numHiddenUnits = layer.NumHiddenUnits;

        % Find number of channels.
        idx = finddim(layout,"C");
        numChannels = layout.Size(idx);

        % Initialize weights and bias.
        sz = [4*numHiddenUnits numChannels];
        numOut = 4*numHiddenUnits;
        numIn = numChannels;
        layer.InputWeights = initializeGlorot(sz,numOut,numIn);

        sz = [4*numHiddenUnits numHiddenUnits];
        layer.RecurrentWeights = initializeOrthogonal(sz);

        sz = [3*numHiddenUnits 1];
        numOut = 3*numHiddenUnits;
        numIn = 1;
        layer.PeepholeWeights = initializeGlorot(sz,numOut,numIn);

        layer.Bias = initializeUnitForgetGate(numHiddenUnits);

        % Initialize layer states.
        layer = resetState(layer);
    end
```
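As a usage sketch (assuming the helper initializers above are on the path and you have a release with `networkDataLayout`, R2022b or later), you can call `initialize` directly to check the parameter sizes:

```
layer = peepholeLSTMLayer(200);
layout = networkDataLayout([12 NaN NaN],"CBT"); % 12 channels, unknown batch and time
layer = initialize(layer,layout);
size(layer.InputWeights)                        % expected: 800 12
```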

### Create Predict Function

Create the layer forward functions to use at prediction time and training time.

Create a function named `predict` that propagates the data forward through the layer at prediction time and outputs the result.

The `predict` function syntax depends on the type of layer.

• `Z = predict(layer,X)` forwards the input data `X` through the layer and outputs the result `Z`, where `layer` has a single input and a single output.

• `[Z,state] = predict(layer,X)` also outputs the updated state parameter `state`, where `layer` has a single state parameter.

You can adjust the syntaxes for layers with multiple inputs, multiple outputs, or multiple state parameters:

• For layers with multiple inputs, replace `X` with `X1,...,XN`, where `N` is the number of inputs. The `NumInputs` property must match `N`.

• For layers with multiple outputs, replace `Z` with `Z1,...,ZM`, where `M` is the number of outputs. The `NumOutputs` property must match `M`.

• For layers with multiple state parameters, replace `state` with `state1,...,stateK`, where `K` is the number of state parameters.

Tip

If the number of inputs to the layer can vary, then use `varargin` instead of `X1,…,XN`. In this case, `varargin` is a cell array of the inputs, where `varargin{i}` corresponds to `Xi`.

If the number of outputs can vary, then use `varargout` instead of `Z1,…,ZN`. In this case, `varargout` is a cell array of the outputs, where `varargout{j}` corresponds to `Zj`.
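As a hypothetical illustration, a `predict` function for a layer that sums a variable number of inputs might look like this:

```
function Z = predict(layer,varargin)
    % varargin is a cell array of the layer inputs.
    X = varargin;
    Z = X{1};
    for i = 2:numel(X)
        Z = Z + X{i}; % elementwise sum of all inputs
    end
end
```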

Tip

If the custom layer has a `dlnetwork` object for a learnable parameter, then in the `predict` function of the custom layer, use the `predict` function for the `dlnetwork`. When you do so, the `dlnetwork` object `predict` function uses the appropriate layer operations for prediction.

Because a peephole LSTM layer has only one input, one output, and two state parameters, the syntax for `predict` for a peephole LSTM layer is `[Z,hiddenState,cellState] = predict(layer,X)`.

By default, the layer uses `predict` as the forward function at training time. To use a different forward function at training time, or retain a value required for a custom backward function, you must also create a function named `forward`.

Because the layer inherits from `nnet.layer.Formattable`, the layer inputs are formatted `dlarray` objects and the `predict` function must also output data as formatted `dlarray` objects.

The hidden state at time step t is given by

$h_t = \tanh(c_t) \odot o_t,$

where $\odot$ denotes the Hadamard product (element-wise multiplication of vectors).

The cell state at time step t is given by

$c_t = g_t \odot i_t + c_{t-1} \odot f_t.$

The following formulas describe the components at time step t.

| Component | Formula |
| --- | --- |
| Input gate | $i_t = \sigma_g(W_i x_t + R_i h_{t-1} + p_i \odot c_{t-1} + b_i)$ |
| Forget gate | $f_t = \sigma_g(W_f x_t + R_f h_{t-1} + p_f \odot c_{t-1} + b_f)$ |
| Cell candidate | $g_t = \sigma_c(W_g x_t + R_g h_{t-1} + b_g)$ |
| Output gate | $o_t = \sigma_g(W_o x_t + R_o h_{t-1} + p_o \odot c_t + b_o)$ |

Note that the output gate calculation requires the updated cell state ${c}_{t}$.

In these calculations, ${\sigma }_{g}$ and ${\sigma }_{c}$ denote the gate and state activation functions. For peephole LSTM layers, use the sigmoid and hyperbolic tangent functions as the gate and state activation functions, respectively.

Implement this operation in the `predict` function. Because the layer does not require a different forward function for training or a memory value for a custom backward function, you can remove the `forward` function from the class file. Add a comment to the top of the function that explains the syntaxes of the function.

Tip

If you preallocate arrays using functions such as `zeros`, then you must ensure that the data types of these arrays are consistent with the layer function inputs. To create an array of zeros of the same data type as another array, use the `"like"` option of `zeros`. For example, to initialize an array of zeros of size `sz` with the same data type as the array `X`, use `Z = zeros(sz,"like",X)`.

```
    function [Z,hiddenState,cellState] = predict(layer,X)
        %PREDICT Peephole LSTM predict function
        %   [Z,hiddenState,cellState] = predict(layer,X) forward
        %   propagates the data X through the layer and returns the
        %   layer output Z and the updated hidden and cell states. X
        %   is a dlarray with format "CBT" and Z is a dlarray with
        %   format "CB" or "CBT", depending on the layer OutputMode
        %   property.

        % Initialize sequence output.
        numHiddenUnits = layer.NumHiddenUnits;
        miniBatchSize = size(X,finddim(X,"B"));
        numTimeSteps = size(X,finddim(X,"T"));

        if layer.OutputMode == "sequence"
            Z = zeros(numHiddenUnits,miniBatchSize,numTimeSteps,"like",X);
            Z = dlarray(Z,"CBT");
        end

        % Calculate WX + b.
        X = stripdims(X);
        WX = pagemtimes(layer.InputWeights,X) + layer.Bias;

        % Indices of concatenated weight arrays.
        idx1 = 1:numHiddenUnits;
        idx2 = 1+numHiddenUnits:2*numHiddenUnits;
        idx3 = 1+2*numHiddenUnits:3*numHiddenUnits;
        idx4 = 1+3*numHiddenUnits:4*numHiddenUnits;

        % Initial states.
        hiddenState = layer.HiddenState;
        cellState = layer.CellState;

        % Loop over time steps.
        for t = 1:numTimeSteps
            % Calculate R*h_{t-1}.
            Rht = layer.RecurrentWeights * hiddenState;

            % Calculate p*c_{t-1}.
            pict = layer.PeepholeWeights(idx1) .* cellState;
            pfct = layer.PeepholeWeights(idx2) .* cellState;

            % Gate calculations.
            it = sigmoid(WX(idx1,:,t) + Rht(idx1,:) + pict);
            ft = sigmoid(WX(idx2,:,t) + Rht(idx2,:) + pfct);
            gt = tanh(WX(idx3,:,t) + Rht(idx3,:));

            % Calculate ot using updated cell state.
            cellState = gt .* it + cellState .* ft;
            poct = layer.PeepholeWeights(idx3) .* cellState;
            ot = sigmoid(WX(idx4,:,t) + Rht(idx4,:) + poct);

            % Update hidden state.
            hiddenState = tanh(cellState) .* ot;

            % Update sequence output.
            if layer.OutputMode == "sequence"
                Z(:,:,t) = hiddenState;
            end
        end

        % Last time step output.
        if layer.OutputMode == "last"
            Z = dlarray(hiddenState,"CB");
        end
    end
```

Because the `predict` function uses only functions that support `dlarray` objects, defining the `backward` function is optional. For a list of functions that support `dlarray` objects, see List of Functions with dlarray Support.

### Create Reset State Function

When `DAGNetwork` or `SeriesNetwork` objects contain layers with state parameters, you can make predictions and update the layer states using the `predictAndUpdateState` and `classifyAndUpdateState` functions. You can reset the network state using the `resetState` function.

The `resetState` function for `DAGNetwork`, `SeriesNetwork`, and `dlnetwork` objects, by default, has no effect on custom layers with state parameters. To define the layer behavior for the `resetState` function for network objects, define the optional layer `resetState` function in the layer definition that resets the state parameters.

The `resetState` function must have the syntax `layer = resetState(layer)`, where the returned layer has the reset state properties.

Create a function named `resetState` that resets the layer state parameters to vectors of zeros.

```
    function layer = resetState(layer)
        %RESETSTATE Reset layer state
        %   layer = resetState(layer) resets the state properties of the
        %   layer.

        numHiddenUnits = layer.NumHiddenUnits;
        layer.HiddenState = zeros(numHiddenUnits,1);
        layer.CellState = zeros(numHiddenUnits,1);
    end
```
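For example, assuming `net` is a trained network containing this layer and `X` is new sequence data, you can make stateful predictions and then clear the state between independent sequences:

```
[net,Y] = predictAndUpdateState(net,X);
net = resetState(net); % calls the layer resetState function
```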

### Completed Layer

View the completed layer class file.

```
classdef peepholeLSTMLayer < nnet.layer.Layer & nnet.layer.Formattable
    %PEEPHOLELSTMLAYER Peephole LSTM Layer

    properties
        % Layer properties.
        NumHiddenUnits
        OutputMode
    end

    properties (Learnable)
        % Layer learnable parameters.
        InputWeights
        RecurrentWeights
        PeepholeWeights
        Bias
    end

    properties (State)
        % Layer state parameters.
        HiddenState
        CellState
    end

    methods
        function layer = peepholeLSTMLayer(numHiddenUnits,args)
            %PEEPHOLELSTMLAYER Peephole LSTM Layer
            %   layer = peepholeLSTMLayer(numHiddenUnits) creates a
            %   peephole LSTM layer with the specified number of hidden
            %   units.
            %
            %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value)
            %   creates a peephole LSTM layer and specifies additional
            %   options using one or more name-value arguments:
            %
            %       Name       - Name of the layer, specified as a string.
            %                    The default is "".
            %
            %       OutputMode - Output mode, specified as one of the
            %                    following:
            %                        "sequence" - Output the entire
            %                                     sequence of data.
            %                        "last"     - Output the last time
            %                                     step of the data.
            %                    The default is "sequence".

            % Parse input arguments.
            arguments
                numHiddenUnits
                args.Name = "";
                args.OutputMode = "sequence";
            end

            layer.NumHiddenUnits = numHiddenUnits;
            layer.Name = args.Name;
            layer.OutputMode = args.OutputMode;

            % Set layer description.
            layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";
        end

        function layer = initialize(layer,layout)
            % layer = initialize(layer,layout) initializes the layer
            % learnable and state parameters.
            %
            % Inputs:
            %         layer  - Layer to initialize.
            %         layout - Data layout, specified as a
            %                  networkDataLayout object.
            %
            % Outputs:
            %         layer - Initialized layer.

            numHiddenUnits = layer.NumHiddenUnits;

            % Find number of channels.
            idx = finddim(layout,"C");
            numChannels = layout.Size(idx);

            % Initialize weights and bias.
            sz = [4*numHiddenUnits numChannels];
            numOut = 4*numHiddenUnits;
            numIn = numChannels;
            layer.InputWeights = initializeGlorot(sz,numOut,numIn);

            sz = [4*numHiddenUnits numHiddenUnits];
            layer.RecurrentWeights = initializeOrthogonal(sz);

            sz = [3*numHiddenUnits 1];
            numOut = 3*numHiddenUnits;
            numIn = 1;
            layer.PeepholeWeights = initializeGlorot(sz,numOut,numIn);

            layer.Bias = initializeUnitForgetGate(numHiddenUnits);

            % Initialize layer states.
            layer = resetState(layer);
        end

        function [Z,hiddenState,cellState] = predict(layer,X)
            %PREDICT Peephole LSTM predict function
            %   [Z,hiddenState,cellState] = predict(layer,X) forward
            %   propagates the data X through the layer and returns the
            %   layer output Z and the updated hidden and cell states. X
            %   is a dlarray with format "CBT" and Z is a dlarray with
            %   format "CB" or "CBT", depending on the layer OutputMode
            %   property.

            % Initialize sequence output.
            numHiddenUnits = layer.NumHiddenUnits;
            miniBatchSize = size(X,finddim(X,"B"));
            numTimeSteps = size(X,finddim(X,"T"));

            if layer.OutputMode == "sequence"
                Z = zeros(numHiddenUnits,miniBatchSize,numTimeSteps,"like",X);
                Z = dlarray(Z,"CBT");
            end

            % Calculate WX + b.
            X = stripdims(X);
            WX = pagemtimes(layer.InputWeights,X) + layer.Bias;

            % Indices of concatenated weight arrays.
            idx1 = 1:numHiddenUnits;
            idx2 = 1+numHiddenUnits:2*numHiddenUnits;
            idx3 = 1+2*numHiddenUnits:3*numHiddenUnits;
            idx4 = 1+3*numHiddenUnits:4*numHiddenUnits;

            % Initial states.
            hiddenState = layer.HiddenState;
            cellState = layer.CellState;

            % Loop over time steps.
            for t = 1:numTimeSteps
                % Calculate R*h_{t-1}.
                Rht = layer.RecurrentWeights * hiddenState;

                % Calculate p*c_{t-1}.
                pict = layer.PeepholeWeights(idx1) .* cellState;
                pfct = layer.PeepholeWeights(idx2) .* cellState;

                % Gate calculations.
                it = sigmoid(WX(idx1,:,t) + Rht(idx1,:) + pict);
                ft = sigmoid(WX(idx2,:,t) + Rht(idx2,:) + pfct);
                gt = tanh(WX(idx3,:,t) + Rht(idx3,:));

                % Calculate ot using updated cell state.
                cellState = gt .* it + cellState .* ft;
                poct = layer.PeepholeWeights(idx3) .* cellState;
                ot = sigmoid(WX(idx4,:,t) + Rht(idx4,:) + poct);

                % Update hidden state.
                hiddenState = tanh(cellState) .* ot;

                % Update sequence output.
                if layer.OutputMode == "sequence"
                    Z(:,:,t) = hiddenState;
                end
            end

            % Last time step output.
            if layer.OutputMode == "last"
                Z = dlarray(hiddenState,"CB");
            end
        end

        function layer = resetState(layer)
            %RESETSTATE Reset layer state
            %   layer = resetState(layer) resets the state properties of
            %   the layer.

            numHiddenUnits = layer.NumHiddenUnits;
            layer.HiddenState = zeros(numHiddenUnits,1);
            layer.CellState = zeros(numHiddenUnits,1);
        end
    end
end
```

### GPU Compatibility

If the layer forward functions fully support `dlarray` objects, then the layer is GPU compatible. Otherwise, to be GPU compatible, the layer functions must support inputs and return outputs of type `gpuArray` (Parallel Computing Toolbox).

Many MATLAB built-in functions support `gpuArray` (Parallel Computing Toolbox) and `dlarray` input arguments. For a list of functions that support `dlarray` objects, see List of Functions with dlarray Support. For a list of functions that execute on a GPU, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox). To use a GPU for deep learning, you must also have a supported GPU device. For information on supported devices, see GPU Computing Requirements (Parallel Computing Toolbox). For more information on working with GPUs in MATLAB, see GPU Computing in MATLAB (Parallel Computing Toolbox).

In this example, the MATLAB functions used in `predict` all support `dlarray` objects, so the layer is GPU compatible.
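For example, a hedged pattern for moving input data to the GPU only when one is available (requires Parallel Computing Toolbox; the data sizes are arbitrary):

```
X = rand(12,1,20,"single"); % 12 channels, 1 observation, 20 time steps
if canUseGPU
    X = gpuArray(X);        % transfer the data to the GPU
end
dlX = dlarray(X,"CBT");
```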

### Include Custom Layer in Network

You can use a custom layer in the same way as any other layer in Deep Learning Toolbox. Create and train a network for sequence classification using the peephole LSTM layer you created earlier.

Load the Japanese Vowels training data.

```
[XTrain,TTrain] = japaneseVowelsTrainData;
```

Define the network architecture. Create a layer array containing a peephole LSTM layer.

```
inputSize = 12;
numHiddenUnits = 100;
numClasses = 9;

layers = [
    sequenceInputLayer(inputSize)
    peepholeLSTMLayer(numHiddenUnits,OutputMode="last")
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
```

Specify the training options and train the network. Train with a mini-batch size of 27 and left-pad the data.

```
options = trainingOptions("adam", ...
    MiniBatchSize=27, ...
    SequencePaddingDirection="left");

net = trainNetwork(XTrain,TTrain,layers,options);
```
```
Training on single CPU.
|========================================================================================|
|  Epoch  |  Iteration  |  Time Elapsed  |  Mini-batch  |  Mini-batch  |  Base Learning  |
|         |             |   (hh:mm:ss)   |   Accuracy   |     Loss     |      Rate       |
|========================================================================================|
|       1 |           1 |       00:00:01 |        3.70% |       2.2060 |          0.0010 |
|       5 |          50 |       00:00:21 |       92.59% |       0.5917 |          0.0010 |
|      10 |         100 |       00:00:33 |       92.59% |       0.2182 |          0.0010 |
|      15 |         150 |       00:00:45 |      100.00% |       0.0587 |          0.0010 |
|      20 |         200 |       00:00:58 |       96.30% |       0.0825 |          0.0010 |
|      25 |         250 |       00:01:13 |      100.00% |       0.0328 |          0.0010 |
|      30 |         300 |       00:01:24 |      100.00% |       0.0129 |          0.0010 |
|========================================================================================|
Training finished: Max epochs completed.
```

Evaluate the network performance by predicting on new data and calculating the accuracy.

```
[XTest,TTest] = japaneseVowelsTestData;
YTest = classify(net,XTest,MiniBatchSize=27);
accuracy = mean(YTest==TTest)
```
```
accuracy = 0.8703
```

## References

[1] Greff, Klaus, Rupesh K. Srivastava, Jan Koutník, Bas R. Steunebrink, and Jürgen Schmidhuber. "LSTM: A Search Space Odyssey." IEEE Transactions on Neural Networks and Learning Systems 28, no. 10 (2017): 2222–2232.