Global average pooling layer


A global average pooling layer performs down-sampling by computing the mean over the height and width dimensions of the input, producing a single value per channel.
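The operation can be illustrated by hand with `mean` (a sketch, assuming a random H-by-W-by-C input; the sizes here are arbitrary):

```matlab
% Global average pooling computed manually: each channel of a
% 28-by-28-by-16 input collapses to a single mean value, giving a
% 1-by-1-by-16 output.
X = rand(28,28,16);
Y = mean(X,[1 2]);   % size(Y) is 1-by-1-by-16
```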



layer = globalAveragePooling2dLayer creates a global average pooling layer.


layer = globalAveragePooling2dLayer('Name',name) sets the optional Name property.



Layer name, specified as a character vector or a string scalar. To include a layer in a layer graph, you must specify a nonempty unique layer name. If you train a series network with the layer and Name is set to '', then the software automatically assigns a name to the layer at training time.

Data Types: char | string

Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

Input names of the layer. This layer accepts a single input only.

Data Types: cell

Number of outputs of the layer. This layer has a single output only.

Data Types: double

Output names of the layer. This layer has a single output only.

Data Types: cell



Create a global average pooling layer with the name 'gap1'.

layer = globalAveragePooling2dLayer('Name','gap1')
layer = 
  GlobalAveragePooling2DLayer with properties:

    Name: 'gap1'

Include a global average pooling layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    globalAveragePooling2dLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input              28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution              20 5x5 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU                     ReLU
     4   ''   Global Average Pooling   Global average pooling
     5   ''   Fully Connected          10 fully connected layer
     6   ''   Softmax                  softmax
     7   ''   Classification Output    crossentropyex


  • In an image classification network, you can use a globalAveragePooling2dLayer before the final fully connected layer to reduce the size of the activations without sacrificing performance. The reduced size of the activations means that the downstream fully connected layers will have fewer weights, reducing the size of your network.

  • You can use a globalAveragePooling2dLayer toward the end of a classification network instead of a fullyConnectedLayer. Because global pooling layers have no learnable parameters, they can be less prone to overfitting and can reduce the size of the network. These networks can also be more robust to spatial translations of the input data. You can also replace a fully connected layer with a globalMaxPooling2dLayer instead. Whether a globalMaxPooling2dLayer or a globalAveragePooling2dLayer is more appropriate depends on your data set.

    To use a global average pooling layer instead of a fully connected layer, the number of channels in the input to globalAveragePooling2dLayer must match the number of classes in the classification problem.
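    This replacement pattern can be sketched as follows (a hypothetical architecture; the layer sizes are illustrative, not prescribed by the documentation). The final convolution produces numClasses channels, so the pooled output matches the number of classes:

```matlab
% Classification network that ends with global average pooling
% instead of a fully connected layer.
numClasses = 10;
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    reluLayer
    convolution2dLayer(3,numClasses,'Padding','same')  % numClasses channels
    globalAveragePooling2dLayer                        % 1-by-1-by-numClasses output
    softmaxLayer
    classificationLayer];
```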

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Introduced in R2019b