
Keras - Dense Layer
Dense layer is the regular deeply connected neural network layer. It is the most common and frequently used layer. Dense layer performs the below operation on the input and returns the output.
output = activation(dot(input, kernel) + bias)
where,
input represents the input data
kernel represents the weight data
dot represents the numpy dot product of the input and its corresponding weights
bias represents a bias value used in machine learning to optimize the model
activation represents the activation function
Let us consider a sample input and weights as below and try to find the result −
input as 2 x 2 matrix [ [1, 2], [3, 4] ]
kernel as 2 x 2 matrix [ [0.5, 0.75], [0.25, 0.5] ]
bias value as 0
activation as linear. As we learned earlier, linear activation does nothing.
>>> import numpy as np
>>> input = [ [1, 2], [3, 4] ]
>>> kernel = [ [0.5, 0.75], [0.25, 0.5] ]
>>> result = np.dot(input, kernel)
>>> result
array([[1.  , 1.75],
       [2.5 , 4.25]])
result is the output and it will be passed into the next layer.
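To see how the bias and activation enter the same formula, here is a small NumPy sketch of the Dense operation. The non-zero bias and the ReLU choice below are illustrative values, not part of the tutorial's example:

```python
import numpy as np

def dense_forward(x, kernel, bias, activation):
    """Compute activation(dot(input, kernel) + bias), as a Dense layer does."""
    return activation(np.dot(x, kernel) + bias)

x = np.array([[1, 2], [3, 4]], dtype=np.float32)
kernel = np.array([[0.5, 0.75], [0.25, 0.5]], dtype=np.float32)

# Linear activation with zero bias reproduces the result above
linear = lambda z: z
print(dense_forward(x, kernel, np.zeros(2), linear))
# [[1.   1.75]
#  [2.5  4.25]]

# A hypothetical non-zero bias with ReLU activation
relu = lambda z: np.maximum(z, 0)
print(dense_forward(x, kernel, np.array([-2.0, 0.5]), relu))
# [[0.   2.25]
#  [0.5  4.75]]
```

With the bias of −2 in the first column, the first row's value 1.0 becomes −1.0 and is clipped to 0 by ReLU, showing how bias and activation reshape the raw dot product.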
The output shape of the Dense layer is determined by the number of neurons / units specified in the Dense layer. For example, if the input shape is (8,) and the number of units is 16, then the output shape is (16,). All layers will have the batch size as the first dimension and so, the input shape will be represented by (None, 8) and the output shape by (None, 16). Currently, the batch size is None as it is not set. The batch size is usually set during the training phase.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.input_shape
(None, 8)
>>> layer_1.output_shape
(None, 16)
where,
layer_1.input_shape returns the input shape of the layer.
layer_1.output_shape returns the output shape of the layer.
The arguments supported by the Dense layer are as follows −
units represents the number of units and it affects the output shape of the layer.
activation represents the activation function.
use_bias represents whether the layer uses a bias vector.
kernel_initializer represents the initializer to be used for the kernel.
bias_initializer represents the initializer to be used for the bias vector.
kernel_regularizer represents the regularizer function to be applied to the kernel weights matrix.
bias_regularizer represents the regularizer function to be applied to the bias vector.
activity_regularizer represents the regularizer function to be applied to the output of the layer.
kernel_constraint represents the constraint function to be applied to the kernel weights matrix.
bias_constraint represents the constraint function to be applied to the bias vector.
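As a sketch of how several of these arguments fit together, the snippet below builds a standalone Dense layer. The particular choices here (ones/zeros initializers, an L2 regularizer, a max-norm constraint, 3 input features) are assumptions for illustration, not values from the tutorial:

```python
from keras.layers import Dense
from keras.initializers import Ones, Zeros
from keras.regularizers import L2
from keras.constraints import MaxNorm

# Illustrative argument choices; any initializer/regularizer/constraint works
layer = Dense(
    units=4,
    activation='relu',
    use_bias=True,
    kernel_initializer=Ones(),     # kernel starts as all ones
    bias_initializer=Zeros(),      # bias starts as all zeros
    kernel_regularizer=L2(0.01),   # penalize large kernel weights
    kernel_constraint=MaxNorm(3.0) # cap the norm of each kernel column
)
layer.build((None, 3))  # create the weights for 3 input features

kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)  # (3, 4) (4,)
```

Because of the chosen initializers, the kernel comes out as a (3, 4) matrix of ones and the bias as a length-4 vector of zeros; the regularizer and constraint only take effect during training.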
As you have seen, there is no argument available to specify the input_shape of the input data. input_shape is a special argument, which the layer will accept only if it is designed as the first layer in the model.
Also, all Keras layers have a few common methods and they are as follows −
get_weights
Fetch the full list of the weights used in the layer.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.get_weights()
[array([[-0.19929028,  0.4162618 ,  0.20081699, -0.25589502,  0.3612864 ,  0.25088787, -0.47544873,  0.0321095 , -0.26070702, -0.24102116,  0.32778358,  0.4667952 , -0.43322265, -0.14500427,  0.04341269, -0.34929228],
       [ 0.41898954,  0.42256463,  0.2399621 , -0.272717  , -0.37069297, -0.37802136,  0.11428618,  0.12749982,  0.10182762,  0.14897704,  0.06569374,  0.15424263,  0.42638576,  0.34037888, -0.15504825, -0.0740819 ],
       [-0.3132702 ,  0.34885168, -0.3259498 , -0.47076607,  0.33696914, -0.49143505, -0.04318619, -0.11252558,  0.29669464, -0.28431225, -0.43165374, -0.49687648,  0.13632   , -0.21099591, -0.10608876, -0.13568914],
       [-0.27421212, -0.180812  ,  0.37240648,  0.25100648, -0.07199466, -0.23680925, -0.21271884, -0.08706653,  0.4393121 ,  0.23259485,  0.2616762 ,  0.23966897, -0.4502542 ,  0.0058881 ,  0.14847124,  0.08835125],
       [-0.36905527,  0.08948278, -0.19254792,  0.26783705,  0.25979865, -0.46963632,  0.32761025, -0.25718856,  0.48987913,  0.3588251 , -0.06586111,  0.2591269 ,  0.48289275,  0.3368858 , -0.17145419, -0.35674667],
       [-0.32851398,  0.42289603, -0.47025883,  0.29027188, -0.0498147 ,  0.46215963, -0.10123312,  0.23069787,  0.00844061, -0.11867595, -0.2602347 , -0.27917898,  0.22910392,  0.18214619, -0.40857887,  0.2606709 ],
       [-0.19066167, -0.11464512, -0.06768692, -0.21878994, -0.2573272 ,  0.13698077,  0.45221198,  0.10634196,  0.06784797,  0.07192957,  0.2946936 ,  0.04968262, -0.15899467,  0.15757453, -0.1343019 ,  0.24561536],
       [-0.04272163,  0.48315823, -0.13382411,  0.01752126, -0.1630218 ,  0.4629662 , -0.21412933, -0.1445911 , -0.03567278, -0.20948446,  0.15742278,  0.11139905,  0.11066687,  0.17430818,  0.36413217,  0.19864106]], dtype=float32),
 array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.], dtype=float32)]
set_weights − Set the weights for the layer
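A minimal sketch of set_weights, reusing the kernel and input from the worked example above; with a linear activation and a zero bias, the layer should reproduce the earlier numpy result:

```python
import numpy as np
from keras.layers import Dense

# Build a standalone Dense layer for 2 input features and 2 units
layer = Dense(2, activation='linear')
layer.build((None, 2))  # creates kernel of shape (2, 2) and bias of shape (2,)

# Install the kernel from the worked example and a zero bias
kernel = np.array([[0.5, 0.75], [0.25, 0.5]], dtype=np.float32)
bias = np.zeros(2, dtype=np.float32)
layer.set_weights([kernel, bias])

x = np.array([[1, 2], [3, 4]], dtype=np.float32)
print(np.asarray(layer(x)))
# [[1.   1.75]
#  [2.5  4.25]]
```

set_weights expects the arrays in the same order and with the same shapes as get_weights returns them, i.e. [kernel, bias] for a Dense layer with use_bias=True.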
get_config − Get the complete configuration of the layer as an object which can be reloaded at any time.
config = layer_1.get_config()
from_config
Load the layer from the configuration object of the layer.
config = layer_1.get_config()
reload_layer = Dense.from_config(config)
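A quick round-trip sketch showing that the reloaded layer carries the same configuration as the original; the units and activation values here are illustrative:

```python
from keras.layers import Dense

layer_1 = Dense(16, activation='relu')
config = layer_1.get_config()
reload_layer = Dense.from_config(config)

# The reloaded layer shares the configuration but has fresh, unbuilt weights
print(reload_layer.units)                       # 16
print(reload_layer.get_config()['activation'])  # relu
```

Note that from_config restores only the configuration, not the trained weights; to restore weights as well, pair it with get_weights / set_weights.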
input_shape
Get the input shape, if the layer has only a single node.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.input_shape
(None, 8)
input
Get the input data, if the layer has only a single node.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.input
<tf.Tensor 'dense_1_input:0' shape=(?, 8) dtype=float32>
get_input_at − Get the input data at the specified index, if the layer has multiple nodes
get_input_shape_at − Get the input shape at the specified index, if the layer has multiple nodes
output_shape − Get the output shape, if the layer has only a single node.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.output_shape
(None, 16)
output
Get the output data, if the layer has only a single node.
>>> from keras.models import Sequential
>>> from keras.layers import Activation, Dense
>>> model = Sequential()
>>> layer_1 = Dense(16, input_shape = (8,))
>>> model.add(layer_1)
>>> layer_1.output
<tf.Tensor 'dense_1/BiasAdd:0' shape=(?, 16) dtype=float32>
get_output_at − Get the output data at the specified index, if the layer has multiple nodes
get_output_shape_at − Get the output shape at the specified index, if the layer has multiple nodes
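A layer acquires multiple nodes when it is called on more than one input, i.e. when it is shared in the functional API. The sketch below sets up that situation; note that the node-index accessors themselves have varied across Keras versions, so this only demonstrates how multiple nodes arise:

```python
from keras.layers import Dense, Input
from keras.models import Model

# Calling the same Dense layer on two different inputs gives it two nodes;
# this is the situation the get_*_at methods address.
shared = Dense(4)
a = Input(shape=(8,))
b = Input(shape=(8,))
out_a = shared(a)   # creates node 0
out_b = shared(b)   # creates node 1

model = Model(inputs=[a, b], outputs=[out_a, out_b])

# With two nodes, plain shared.input / shared.output would be ambiguous,
# which is why the *_at methods take a node index.
print(tuple(out_a.shape), tuple(out_b.shape))  # (None, 4) (None, 4)
```

Both outputs are produced by the same kernel and bias, so the shared layer contributes exactly one set of weights to the model.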