Keras Layers

In this page, we will learn what Keras layers are, then survey the Core Layers, Pooling Layers, Locally Connected Layers, RNN Layers, Noise Layers, Layer Wrappers, Normalization Layer, Embedding Layer, and Advanced Activation Layers.


What are Keras layers?

Keras ships with a variety of ready-made layers and also lets you construct your own. Layers are the key building blocks of a Keras model: each layer takes an input, performs some computation, and produces a transformed output, and the output of one layer is fed as the input to the next.

The Keras core layers include the Dense layer, which computes a dot product of the input with a kernel plus a bias; the Activation layer, which applies an activation function to its input; the Dropout layer, which randomly sets a fraction of the input units to zero at each training update to help prevent overfitting; the Lambda layer, which wraps an arbitrary expression as a Layer object; and so on.
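As a minimal sketch of how these core layers fit together (the layer sizes and the doubling Lambda are arbitrary choices for illustration):

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, Lambda

# Arbitrary toy model: sizes and the doubling Lambda are illustrative only.
model = Sequential([
    Input(shape=(10,)),        # 10 input features
    Dense(32),                 # dot product with a (10, 32) kernel, plus bias
    Activation('relu'),        # apply an activation function
    Dropout(0.5),              # randomly zero half the units during training
    Lambda(lambda x: x * 2),   # wrap an arbitrary expression as a layer
    Dense(1),
])

out = model.predict(np.zeros((4, 10)), verbose=0)
print(out.shape)  # (4, 1)
```

Dropout is only active during training, so `predict` passes all units through unchanged.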

The Keras convolution layers employ filters to generate feature maps, ranging from 1D to 3D. They include the most common variants, such as cropping and transposed convolution layers for each dimension. 2D convolution, inspired by the visual cortex, is widely employed for image recognition.

The pooling (downscaling) layers likewise range from 1D to 3D and include the most popular variants, such as max and average pooling. Locally connected layers behave like convolution layers, except that their weights are not shared. Noise layers help reduce overfitting. Recurrent layers, which include simple, gated (GRU), and LSTM variants, are used in applications such as language processing.

Every Keras layer provides a number of common methods:

  • get_weights(): Returns the weights of the layer as a list of numpy arrays. 
  • set_weights(weights): Sets the weights of the layer from a list of numpy arrays with the same shapes as the output of get_weights(). 
  • get_config(): Returns a dictionary containing the layer's configuration, allowing you to re-instantiate the layer from that configuration.

  layer = Dense(32)  
  config = layer.get_config()  
  reconstructed_layer = Dense.from_config(config)  
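The weight accessors can be exercised on a freshly built Dense layer (the sizes here are arbitrary):

```python
import numpy as np
from keras.layers import Dense

layer = Dense(4)
layer.build((None, 3))           # creates the (3, 4) kernel and (4,) bias

kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)  # (3, 4) (4,)

# set_weights expects arrays of the same shapes as get_weights returns
layer.set_weights([np.ones((3, 4)), np.zeros(4)])
```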

Alternatively, a layer can be rebuilt from its configuration via deserialization:


  from keras import layers

  config = layer.get_config()

  layer = layers.deserialize({'class_name': layer.__class__.__name__,
                              'config': config})

If the layer has a single node (i.e., it is not a shared layer), you can obtain its input tensor, output tensor, input shape, and output shape through the following properties:

  • input
  • output
  • input_shape
  • output_shape
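A quick sanity check of these properties on a single-node layer (the model sizes are arbitrary):

```python
from keras import Input
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Input(shape=(8,)), Dense(3)])
dense = model.layers[-1]

# The batch dimension is None because it is not fixed at build time.
print(tuple(dense.input.shape))   # (None, 8)
print(tuple(dense.output.shape))  # (None, 3)
```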

Otherwise, if the layer has multiple nodes, use the node-indexed methods listed below:

  • get_input_at(node_index)
  • get_output_at(node_index)
  • get_input_shape_at(node_index)
  • get_output_shape_at(node_index)

Core Layers:

  1. 
      keras.layers.Dense(units, activation=None, use_bias=True, 
                         kernel_initializer='glorot_uniform', 
                         bias_initializer='zeros', kernel_regularizer=None,
                         bias_regularizer=None, activity_regularizer=None,
                         kernel_constraint=None, bias_constraint=None)
    
    
  2. 
      keras.layers.Activation(activation)  
    
    
  3. 
      keras.layers.Dropout(rate, noise_shape=None, seed=None)
    
    
  4. 
      keras.layers.Flatten() 
    
    
  5. 
      keras.layers.Input(shape) 
    
    
  6. 
      keras.layers.Reshape(target_shape) 
    
    
  7. 
      keras.layers.Permute(dims)
    
    
  8. 
      keras.layers.RepeatVector(n) 
    
    
  9. 
      keras.layers.Lambda(function, output_shape=None, mask=None, arguments=None)
    
    
  10. 
      keras.layers.ActivityRegularization(l1=0.0, l2=0.0)  
    
    
  11. 
      keras.layers.Masking(mask_value=0.0)  
    
    
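The shape-manipulating core layers can be combined like this (the shapes are arbitrary, chosen to make each transformation visible):

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import Reshape, Permute, Flatten

model = Sequential([
    Input(shape=(12,)),
    Reshape((3, 4)),   # (None, 12) -> (None, 3, 4)
    Permute((2, 1)),   # swap the two non-batch axes -> (None, 4, 3)
    Flatten(),         # collapse back to (None, 12)
])

out = model.predict(np.zeros((2, 12)), verbose=0)
print(out.shape)  # (2, 12)
```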

Convolution Layer:

  1. 
      keras.layers.Conv1D(filters, kernel_size, strides=1, padding='valid', 
                          dilation_rate=1, activation=None, use_bias=True, 
                          kernel_initializer='glorot_uniform', bias_initializer='zeros', 
                          kernel_regularizer=None, bias_regularizer=None, 
                          activity_regularizer=None, kernel_constraint=None, 
                          bias_constraint=None)  
    
    
  2. 
      keras.layers.Conv2D(filters, kernel_size, strides=(1, 1), 
                          padding='valid', data_format=None, 
                          dilation_rate=(1, 1), activation=None, 
                          use_bias=True, kernel_initializer='glorot_uniform', 
                          bias_initializer='zeros', kernel_regularizer=None, 
                          bias_regularizer=None, activity_regularizer=None, 
                          kernel_constraint=None, bias_constraint=None)  
    
    
  3. 
      keras.layers.SeparableConv2D(filters, kernel_size, strides=(1, 1), 
                                   padding='valid', data_format=None, 
                                   depth_multiplier=1, activation=None, 
                                   use_bias=True, depthwise_initializer='glorot_uniform', 
                                   pointwise_initializer='glorot_uniform', 
                                   bias_initializer='zeros', depthwise_regularizer=None, 
                                   pointwise_regularizer=None, bias_regularizer=None, 
                                   activity_regularizer=None, depthwise_constraint=None, 
                                   pointwise_constraint=None, bias_constraint=None)
    
    
  4. 
      keras.layers.Conv2DTranspose(filters, kernel_size, strides=(1, 1), 
                                   padding='valid', data_format=None, activation=None, 
                                   use_bias=True, kernel_initializer='glorot_uniform', 
                                   bias_initializer='zeros', kernel_regularizer=None, 
                                   bias_regularizer=None, activity_regularizer=None, 
                                   kernel_constraint=None, bias_constraint=None)  
    
    
  5. 
      keras.layers.Conv3D(filters, kernel_size, strides=(1, 1, 1),
                          padding='valid', data_format=None, 
                          dilation_rate=(1, 1, 1), activation=None, 
                          use_bias=True, kernel_initializer='glorot_uniform', 
                          bias_initializer='zeros', kernel_regularizer=None, 
                          bias_regularizer=None, activity_regularizer=None, 
                          kernel_constraint=None, bias_constraint=None)  
    
    
  6. 
      keras.layers.Cropping1D(cropping=(1, 1)) 
    
    
  7. 
      keras.layers.Cropping2D(cropping=((0, 0), (0, 0)), data_format=None)  
    
    
  8. 
      keras.layers.Cropping3D(cropping=((1, 1), (1, 1), (1, 1)), data_format=None)  
    
    
  9. 
      keras.layers.UpSampling1D(size=2)  
    
    
  10. 
      keras.layers.UpSampling2D(size=(2, 2), data_format=None)  
    
    
  11. 
      keras.layers.UpSampling3D(size=(2, 2, 2), data_format=None)
    
    
  12. 
      keras.layers.ZeroPadding1D(padding=1)  
    
    
  13. 
      keras.layers.ZeroPadding2D(padding=(1, 1), data_format=None)  
    
    
  14. 
      keras.layers.ZeroPadding3D(padding=(1, 1, 1), data_format=None)  
    
    
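A short sketch showing how padding, cropping, and upsampling affect spatial dimensions (the 28x28 input size, e.g. an MNIST-sized image, is an arbitrary choice):

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import Conv2D, Cropping2D, UpSampling2D

model = Sequential([
    Input(shape=(28, 28, 1)),
    Conv2D(16, (3, 3), padding='valid'),    # 'valid' padding shrinks 28 -> 26
    Cropping2D(cropping=((1, 1), (1, 1))),  # trim one row/column per side -> 24
    UpSampling2D(size=(2, 2)),              # repeat rows and columns -> 48
])

out = model.predict(np.zeros((1, 28, 28, 1)), verbose=0)
print(out.shape)  # (1, 48, 48, 16)
```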

Pooling Layers:

  1. 
      keras.layers.MaxPooling1D(pool_size=2, strides=None, padding='valid')  
    
    
  2. 
      keras.layers.MaxPooling2D(pool_size=(2, 2), strides=None, 
                                padding='valid', data_format=None)  
    
    
  3. 
      keras.layers.MaxPooling3D(pool_size=(2, 2, 2), strides=None, 
                                padding='valid', data_format=None) 
    
    
  4. 
      keras.layers.AveragePooling1D(pool_size=2, strides=None, padding='valid')  
    
    
  5. 
      keras.layers.AveragePooling2D(pool_size=(2, 2), strides=None, 
                                    padding='valid', data_format=None)
    
    
  6. 
      keras.layers.AveragePooling3D(pool_size=(2, 2, 2), strides=None, 
                                    padding='valid', data_format=None)  
    
    
  7. 
      keras.layers.GlobalMaxPooling1D()  
    
    
  8. 
      keras.layers.GlobalAveragePooling1D()  
    
    
  9. 
      keras.layers.GlobalMaxPooling2D(data_format=None)  
    
    
  10. 
      keras.layers.GlobalAveragePooling2D(data_format=None)  
    
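Local pooling shrinks the spatial dimensions, while global pooling collapses them entirely; a minimal sketch with arbitrary sizes:

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import AveragePooling2D, GlobalMaxPooling2D

model = Sequential([
    Input(shape=(8, 8, 3)),
    AveragePooling2D(pool_size=(2, 2)),  # halves each spatial dim -> (4, 4, 3)
    GlobalMaxPooling2D(),                # max over all spatial positions -> (3,)
])

out = model.predict(np.zeros((2, 8, 8, 3)), verbose=0)
print(out.shape)  # (2, 3)
```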
    

Locally Connected Layers:

  1. 
      keras.layers.LocallyConnected1D(filters, kernel_size, strides=1, 
                                      padding='valid', data_format=None, 
                                      activation=None, use_bias=True, 
                                      kernel_initializer='glorot_uniform', 
                                      bias_initializer='zeros', kernel_regularizer=None, 
                                      bias_regularizer=None, activity_regularizer=None, 
                                      kernel_constraint=None, bias_constraint=None)  
    
    
  2. 
      keras.layers.LocallyConnected2D(filters, kernel_size, strides=(1, 1),
                                      padding='valid', data_format=None, 
                                      activation=None, use_bias=True, 
                                      kernel_initializer='glorot_uniform', 
                                      bias_initializer='zeros', kernel_regularizer=None, 
                                      bias_regularizer=None, activity_regularizer=None, 
                                      kernel_constraint=None, bias_constraint=None)  
    
    

RNN Layers:

  1. 
      keras.layers.RNN(cell, return_sequences=False, return_state=False, 
                       go_backwards=False, stateful=False, unroll=False)  
    
    
  2. 
      keras.layers.SimpleRNN(units, activation='tanh', use_bias=True, 
                             kernel_initializer='glorot_uniform', 
                             recurrent_initializer='orthogonal', bias_initializer='zeros',
                             kernel_regularizer=None, recurrent_regularizer=None,
                             bias_regularizer=None, activity_regularizer=None,
                             kernel_constraint=None, recurrent_constraint=None,
                             bias_constraint=None, dropout=0.0, recurrent_dropout=0.0, 
                             return_sequences=False, return_state=False, 
                             go_backwards=False, stateful=False, unroll=False)  
    
    
  3. 
      keras.layers.GRU(units, activation='tanh', recurrent_activation='hard_sigmoid',
                       use_bias=True, kernel_initializer='glorot_uniform', 
                       recurrent_initializer='orthogonal', bias_initializer='zeros', 
                       kernel_regularizer=None, recurrent_regularizer=None, 
                       bias_regularizer=None, activity_regularizer=None, 
                       kernel_constraint=None, recurrent_constraint=None, 
                       bias_constraint=None, dropout=0.0, recurrent_dropout=0.0, 
                       implementation=1, return_sequences=False, return_state=False, 
                       go_backwards=False, stateful=False, unroll=False)  
    
    
  4. 
      keras.layers.LSTM(units, activation='tanh', recurrent_activation='hard_sigmoid',
                        use_bias=True, kernel_initializer='glorot_uniform', 
                        recurrent_initializer='orthogonal', bias_initializer='zeros', 
                        unit_forget_bias=True, kernel_regularizer=None, 
                        recurrent_regularizer=None, bias_regularizer=None, 
                        activity_regularizer=None, kernel_constraint=None, 
                        recurrent_constraint=None, bias_constraint=None, 
                        dropout=0.0, recurrent_dropout=0.0, implementation=1, 
                        return_sequences=False, return_state=False, 
                        go_backwards=False, stateful=False, unroll=False)  
    
    
  5. 
      keras.layers.ConvLSTM2D(filters, kernel_size, strides=(1, 1), 
                              padding='valid', data_format=None, 
                              dilation_rate=(1, 1), activation='tanh', 
                              recurrent_activation='hard_sigmoid', 
                              use_bias=True, kernel_initializer='glorot_uniform', 
                              recurrent_initializer='orthogonal', 
                              bias_initializer='zeros', unit_forget_bias=True, 
                              kernel_regularizer=None, recurrent_regularizer=None, 
                              bias_regularizer=None, activity_regularizer=None, 
                              kernel_constraint=None, recurrent_constraint=None, 
                              bias_constraint=None, return_sequences=False, 
                              go_backwards=False, stateful=False, 
                              dropout=0.0, recurrent_dropout=0.0)  
    
    
  6. 
      keras.layers.SimpleRNNCell(units, activation='tanh', 
                                 use_bias=True, kernel_initializer='glorot_uniform', 
                                 recurrent_initializer='orthogonal', bias_initializer='zeros', 
                                 kernel_regularizer=None, recurrent_regularizer=None, 
                                 bias_regularizer=None, kernel_constraint=None, 
                                 recurrent_constraint=None, bias_constraint=None, 
                                 dropout=0.0, recurrent_dropout=0.0)  
    
    
  7. 
      keras.layers.GRUCell(units, activation='tanh', recurrent_activation='hard_sigmoid', 
                           use_bias=True, kernel_initializer='glorot_uniform', 
                           recurrent_initializer='orthogonal', bias_initializer='zeros', 
                           kernel_regularizer=None, recurrent_regularizer=None, 
                           bias_regularizer=None, kernel_constraint=None, 
                           recurrent_constraint=None, bias_constraint=None, dropout=0.0, 
                           recurrent_dropout=0.0, implementation=1)  
    
    
  8. 
      keras.layers.LSTMCell(units, activation='tanh', recurrent_activation='hard_sigmoid', 
                            use_bias=True, kernel_initializer='glorot_uniform', 
                            recurrent_initializer='orthogonal', bias_initializer='zeros', 
                            unit_forget_bias=True, kernel_regularizer=None, 
                            recurrent_regularizer=None, bias_regularizer=None, 
                            kernel_constraint=None, recurrent_constraint=None, 
                            bias_constraint=None, dropout=0.0, 
                            recurrent_dropout=0.0, implementation=1)  
    
    
  9. 
      keras.layers.StackedRNNCells(cells)  
    
    
  10. 
      keras.layers.CuDNNGRU(units, kernel_initializer='glorot_uniform', 
                            recurrent_initializer='orthogonal', bias_initializer='zeros', 
                            kernel_regularizer=None, recurrent_regularizer=None, 
                            bias_regularizer=None, activity_regularizer=None, 
                            kernel_constraint=None, recurrent_constraint=None, 
                            bias_constraint=None, return_sequences=False, 
                            return_state=False, stateful=False)  
    
    
  11. 
      keras.layers.CuDNNLSTM(units, kernel_initializer='glorot_uniform', 
                             recurrent_initializer='orthogonal', bias_initializer='zeros', 
                             unit_forget_bias=True, kernel_regularizer=None, 
                             recurrent_regularizer=None, bias_regularizer=None, 
                             activity_regularizer=None, kernel_constraint=None, 
                             recurrent_constraint=None, bias_constraint=None, 
                             return_sequences=False, return_state=False, stateful=False)  
    
    
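A toy sequence model illustrating the recurrent layers (the 10 timesteps of 8 features and the unit counts are arbitrary):

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential([
    Input(shape=(10, 8)),
    LSTM(16),   # return_sequences=False: emit only the final hidden state
    Dense(1),
])

out = model.predict(np.zeros((4, 10, 8)), verbose=0)
print(out.shape)  # (4, 1)
```

With `return_sequences=True`, the LSTM would instead emit one 16-dimensional vector per timestep, shape `(4, 10, 16)`.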

Noise Layers:

  1. 
      keras.layers.GaussianNoise(stddev)  
    
    
  2. 
      keras.layers.GaussianDropout(rate)  
    
    
  3. 
      keras.layers.AlphaDropout(rate, noise_shape=None, seed=None)
    
    
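Noise layers are active only during training; at inference time they pass their input through unchanged, as this sketch demonstrates:

```python
import numpy as np
from keras.layers import GaussianNoise

layer = GaussianNoise(stddev=0.5)
x = np.zeros((3, 4), dtype='float32')

# Inference: identity. Training: zero-mean Gaussian noise is added.
y_infer = np.asarray(layer(x, training=False))
y_train = np.asarray(layer(x, training=True))

print(np.allclose(y_infer, 0.0))  # True
```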

Layer Wrappers:

  1. 
      keras.layers.TimeDistributed(layer)  
    
    
  2. 
      keras.layers.Bidirectional(layer, merge_mode='concat', weights=None)  
    
    
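TimeDistributed applies a layer independently at every timestep, and Bidirectional runs a recurrent layer in both directions; a short sketch (all sizes arbitrary):

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, TimeDistributed, Dense

model = Sequential([
    Input(shape=(10, 8)),
    Bidirectional(LSTM(16, return_sequences=True)),  # 'concat' mode doubles units -> 32
    TimeDistributed(Dense(2)),                       # Dense applied at every timestep
])

out = model.predict(np.zeros((3, 10, 8)), verbose=0)
print(out.shape)  # (3, 10, 2)
```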

Normalization Layer:

  1. 
      keras.layers.BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001, 
                                      center=True, scale=True, beta_initializer='zeros', 
                                      gamma_initializer='ones', moving_mean_initializer='zeros', 
                                      moving_variance_initializer='ones', beta_regularizer=None, 
                                      gamma_regularizer=None, beta_constraint=None, 
                                      gamma_constraint=None) 
    
    
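In training mode, BatchNormalization standardizes each feature using the statistics of the current batch; a sketch with arbitrary input statistics:

```python
import numpy as np
from keras.layers import BatchNormalization

layer = BatchNormalization(axis=-1)

# Features with a deliberately non-zero mean and non-unit variance.
x = (np.random.randn(256, 4) * 3.0 + 5.0).astype('float32')

# Training mode: normalize with the batch mean and variance,
# so the output has mean ~0 and standard deviation ~1 per feature.
y = np.asarray(layer(x, training=True))
```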

Embedding Layer:

  1. 
      keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', 
                             embeddings_regularizer=None, activity_regularizer=None, 
                             embeddings_constraint=None, mask_zero=False, input_length=None)
    
    
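The Embedding layer maps integer token ids to dense vectors; in this sketch the vocabulary size of 1000, embedding dimension of 16, and sequence length of 5 are arbitrary:

```python
import numpy as np
from keras import Input
from keras.models import Sequential
from keras.layers import Embedding

model = Sequential([
    Input(shape=(5,), dtype='int32'),          # sequences of 5 token ids
    Embedding(input_dim=1000, output_dim=16),  # ids in [0, 1000) -> 16-dim vectors
])

tokens = np.array([[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]], dtype='int32')
out = model.predict(tokens, verbose=0)
print(out.shape)  # (2, 5, 16)
```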

Advanced Activation Layers:

  1. 
      keras.layers.LeakyReLU(alpha=0.3)
    
    
  2. 
      keras.layers.PReLU(alpha_initializer='zeros', alpha_regularizer=None,
                         alpha_constraint=None, shared_axes=None)  
    
    
  3. 
      keras.layers.ELU(alpha=1.0)  
    
    
  4. 
      keras.layers.ThresholdedReLU(theta=1.0)  
    
    
  5. 
      keras.layers.Softmax(axis=-1)
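Unlike plain activation functions, these are full layers with their own (sometimes trainable) parameters; two of them in action on a hand-picked input:

```python
import numpy as np
from keras.layers import LeakyReLU, Softmax

x = np.array([[-1.0, 0.0, 2.0]], dtype='float32')

# LeakyReLU keeps positive values and scales negative ones by a small slope.
leaky = np.asarray(LeakyReLU()(x))

# Softmax turns the values into a probability distribution over the last axis.
soft = np.asarray(Softmax(axis=-1)(x))
print(abs(soft.sum() - 1.0) < 1e-5)  # True
```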