Keras layers – Parameters and Properties


Layers are the primary building blocks of neural networks in Keras. We compose a deep learning architecture by stacking successive layers: each layer performs some computation on the input it receives and then propagates its output to the next layer. Finally, we read the desired result from the output of the last layer. In this Keras article, we will walk through the different types of Keras layers, their properties, and their parameters.

Keras Layers


To define or create a Keras layer, we need the following information (a sketch putting it all together follows the list):

  • Input shape: To understand the structure of the input data.
  • Units: To determine the number of nodes/neurons in the layer.
  • Initializer: To set the initial weights for each input before training.
  • Activation: To transform the input nonlinearly, so that each neuron can learn better.
  • Constraints: To put restrictions on the weights at the time of optimization.
  • Regularizers: To apply a penalty on the parameters during optimization.
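Here is a minimal sketch (assuming the TensorFlow 2 tensorflow.keras API; all concrete values are illustrative) of a single layer that sets each of these ingredients explicitly:

from tensorflow import keras
from tensorflow.keras import constraints, initializers, layers, regularizers

layer = layers.Dense(
    units=32,                                         # number of neurons
    activation="relu",                                # nonlinear transformation
    kernel_initializer=initializers.GlorotUniform(),  # initial weights
    kernel_constraint=constraints.MaxNorm(3.0),       # restriction on weights
    kernel_regularizer=regularizers.l2(1e-4),         # penalty during optimization
)
model = keras.Sequential([keras.Input(shape=(16,)), layer])  # input shape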

Different Layers in Keras

1. Core Keras Layers

  • Dense
    It computes the output in the following way:
output = activation(dot(input, kernel) + bias)

Here,
“activation” is the activation function, “kernel” is a weight matrix that we apply to the input tensor, and “bias” is a constant vector that helps the model fit the data better.

A Dense layer receives input from all the nodes of the previous layer. It has the following arguments and default values:

Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)
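For illustration, a minimal sketch (assuming the TensorFlow 2 tensorflow.keras API; the shapes and unit counts are arbitrary) of a Dense layer in action:

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# A Dense layer with 16 units and ReLU activation on 8-feature inputs
model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(units=16, activation="relu"),
])

x = np.random.rand(4, 8).astype("float32")  # a batch of 4 samples
print(model(x).shape)  # (4, 16): one output per unit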
  • Activation

It applies an activation function to the output of the previous layer. This is the same as passing activation directly to the Dense layer, and it takes the following argument:

Activation(activation_function)

If you do not give an activation_function, it performs a linear (identity) activation.
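The two forms below build the same kind of computation; a sketch, again assuming tensorflow.keras:

from tensorflow.keras import layers

# Passing the activation inside Dense ...
fused = layers.Dense(16, activation="relu")

# ... builds the same kind of transformation as a linear Dense
# followed by a standalone Activation layer:
linear = layers.Dense(16)
act = layers.Activation("relu")  # act(linear(x)) matches the fused form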

  • Dropout

We use Dropout in a neural network to protect it from overfitting. To prevent overfitting, it randomly chooses a fraction of the input units and sets them to 0 at each update during training.

It has the following arguments:

Dropout(rate, noise_shape=None, seed=None)
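A sketch of the behavior (the rate and shapes are illustrative), assuming tensorflow.keras; note that dropout is only active during training:

import numpy as np
from tensorflow.keras import layers

drop = layers.Dropout(rate=0.5, seed=42)
x = np.ones((1, 6), dtype="float32")

print(drop(x, training=True))   # roughly half the entries zeroed, the rest scaled by 1/(1-rate)
print(drop(x, training=False))  # inference: the input passes through unchanged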
  • Flatten

We use Flatten to collapse the input into a single dimension, keeping the batch axis.

For example: an input of shape (batch_size, 3, 2) is flattened to an output of shape (batch_size, 6). It has the following argument:

Flatten(data_format=None)
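The shape example above, as a runnable sketch (assuming tensorflow.keras):

import numpy as np
from tensorflow.keras import layers

x = np.zeros((4, 3, 2), dtype="float32")  # batch of 4, shape (3, 2) per sample
print(layers.Flatten()(x).shape)          # (4, 6): the 3x2 grid becomes 6 values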
  • Input

We use this layer as the entry point into the model graph; it lets us create a Keras model from just the model's input and output tensors.

It has following arguments:

Input(shape, batch_shape, name, dtype, sparse=False, tensor=None)
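A minimal functional-API sketch (assuming tensorflow.keras; the layer sizes are arbitrary):

from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(8,))                         # entry point of the model graph
outputs = layers.Dense(1, activation="sigmoid")(inputs)  # any stack of layers
model = keras.Model(inputs=inputs, outputs=outputs)      # model from input and output
model.summary()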
  • Reshape

Reshapes the output to a particular shape.

Argument:

Reshape(target_shape)

Gives output of shape:

(batch_size,)+ target_shape
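For example, a sketch (assuming tensorflow.keras) that reverses the Flatten example above:

import numpy as np
from tensorflow.keras import layers

x = np.zeros((4, 6), dtype="float32")                # batch of 4, 6 features each
print(layers.Reshape(target_shape=(3, 2))(x).shape)  # (4, 3, 2): the batch axis is kept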
  • Permute

Permutes the dimensions of the input according to the given pattern; the batch axis is left untouched. We may use the Permute layer, for example, to swap the time and feature axes of a sequence.

Arguments:

Permute(dims)
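For example, swapping the two non-batch axes (a sketch assuming tensorflow.keras; dims is 1-indexed and excludes the batch axis):

import numpy as np
from tensorflow.keras import layers

x = np.zeros((4, 10, 64), dtype="float32")   # (batch, timesteps, features)
print(layers.Permute(dims=(2, 1))(x).shape)  # (4, 64, 10): axes 1 and 2 swapped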
  • Lambda

We use Lambda layers to wrap arbitrary expressions as layers when Keras does not provide a built-in layer for them.

Arguments:

Lambda(function, output_shape=None, mask=None, arguments=None)
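A sketch (assuming tensorflow.keras) that wraps a simple doubling expression as a layer:

import numpy as np
from tensorflow.keras import layers

double = layers.Lambda(lambda t: t * 2.0)  # no built-in Keras layer does exactly this
x = np.ones((2, 3), dtype="float32")
print(double(x))  # every element multiplied by 2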
  • Masking

Skips a timestep in downstream layers if all of its features are equal to mask_value.

Arguments:

Masking(mask_value=0.0)
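A sketch (assuming tensorflow.keras) where an all-zero timestep marks padding that the downstream LSTM skips:

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(3, 2)),       # 3 timesteps, 2 features each
    layers.Masking(mask_value=0.0),  # a timestep of all 0.0s is treated as padding
    layers.LSTM(4),                  # masked timesteps are skipped here
])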

2. Convolution Layers of Keras

  • Conv1D and Conv2D

Here we define a weight kernel and convolve it over the entire input to produce the output tensor.

Arguments:

Conv1D(filters, kernel_size, strides=1, padding='valid', data_format='channels_last', dilation_rate=1, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)

Conv2D(filters, kernel_size, strides=(1,1), padding='valid', data_format='channels_last', dilation_rate=(1,1), activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None)
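A sketch (assuming tensorflow.keras; the image size and filter count are arbitrary) showing how 'valid' padding shrinks the output:

import numpy as np
from tensorflow.keras import layers

x = np.zeros((1, 28, 28, 3), dtype="float32")  # one 28x28 RGB image
conv = layers.Conv2D(filters=16, kernel_size=(3, 3), padding="valid", activation="relu")
print(conv(x).shape)  # (1, 26, 26, 16): a 3x3 kernel trims 2 pixels per spatial axis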

3. Pooling Layers

We use pooling to reduce the size of the input and extract important information.

  • MaxPooling1D and MaxPooling2D

Extracts the maximum value from each pooling window.

Arguments:

MaxPooling1D(pool_size=2, strides=None, padding='valid', data_format='channels_last')

MaxPooling2D(pool_size=(2,2), strides=None, padding='valid', data_format='channels_last')
  • AveragePooling1D and AveragePooling2D

Computes the average of each pooling window.

Arguments:

AveragePooling1D(pool_size=2, strides=None, padding='valid', data_format='channels_last')

AveragePooling2D(pool_size=(2,2), strides=None, padding='valid', data_format='channels_last')
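A sketch (assuming tensorflow.keras) comparing the two pooling layers; both halve the spatial axes here:

import numpy as np
from tensorflow.keras import layers

x = np.zeros((1, 26, 26, 16), dtype="float32")
print(layers.MaxPooling2D(pool_size=(2, 2))(x).shape)      # (1, 13, 13, 16)
print(layers.AveragePooling2D(pool_size=(2, 2))(x).shape)  # (1, 13, 13, 16)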

4. Recurrent Layers

We use these layers to process sequence data, e.g., time series or natural language.

  • SimpleRNN

This is a fully connected RNN, where the output of the layer is fed back as input at the next timestep.

Arguments:

SimpleRNN(units, activation, use_bias, kernel_initializer, recurrent_initializer, bias_initializer, kernel_regularizer, recurrent_regularizer, bias_regularizer, activity_regularizer, kernel_constraint, recurrent_constraint, bias_constraint, dropout, recurrent_dropout, return_sequences, return_state)
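A sketch (assuming tensorflow.keras; shapes are illustrative) showing the effect of return_sequences:

import numpy as np
from tensorflow.keras import layers

x = np.zeros((4, 10, 8), dtype="float32")  # (batch, timesteps, features)

rnn = layers.SimpleRNN(units=32)           # returns only the last hidden state
print(rnn(x).shape)                        # (4, 32)

rnn_seq = layers.SimpleRNN(units=32, return_sequences=True)
print(rnn_seq(x).shape)                    # (4, 10, 32): one output per timestep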
  • LSTM

LSTM (Long Short-Term Memory) is an extended form of RNN with internal memory cells that retain information across long sequences. It has the following arguments:

LSTM(units, activation , recurrent_activation, use_bias, kernel_initializer, recurrent_initializer, bias_initializer, unit_forget_bias, kernel_regularizer, recurrent_regularizer, bias_regularizer, activity_regularizer, kernel_constraint, recurrent_constraint, bias_constraint, dropout, recurrent_dropout, implementation, return_sequences, return_state)
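A sketch (assuming tensorflow.keras) showing return_state, which also exposes the LSTM's internal cell state:

import numpy as np
from tensorflow.keras import layers

x = np.zeros((4, 10, 8), dtype="float32")        # (batch, timesteps, features)
lstm = layers.LSTM(units=32, return_state=True)  # also return the final states
output, state_h, state_c = lstm(x)               # output equals the final hidden state
print(output.shape, state_h.shape, state_c.shape)  # all (4, 32)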

Keras provides many other layers, but in general we work with the layers described above.

Summary

This article explains the concept of layers in building Keras models. We learned about the basic attributes required to build a layer.
Then we discussed the different types of Keras layers, i.e., Core Layers, Convolution Layers, Pooling Layers, and Recurrent Layers, along with their properties and parameters.

Any suggestions or changes are most welcome in the comments section.
