How can Tensorflow be used to compose layers using Python?

Tensorflow can be used to compose layers by defining a class that inherits from ‘tf.keras.Model’. In the example below, such a class, ‘ResnetIdentityBlock’, defines a single block that composes several layers and can itself be called like a layer.
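Before subclassing, the simplest way to compose layers is ‘tf.keras.Sequential’, which stacks layers one after another. The sketch below is a minimal illustration (the layer sizes are arbitrary choices, not taken from the article's code):

```python
import tensorflow as tf

# A minimal sequential composition: two 1x1 convolutions with a
# batch normalization in between. 1x1 kernels leave the spatial
# dimensions unchanged; the last Conv2D sets the channel count to 3.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(2, (1, 1), input_shape=(None, None, 3)),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Conv2D(3, (1, 1)),
])

out = model(tf.zeros([1, 2, 3, 3]))
print(out.shape)  # (1, 2, 3, 3)
```

Sequential works when the data flows straight through; the subclassing approach shown next is needed for blocks with shortcuts, such as a residual block.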

Read More: What is TensorFlow and how Keras work with TensorFlow to create Neural Networks?

A neural network that contains at least one convolutional layer is known as a convolutional neural network. We can use a Convolutional Neural Network to build a learning model.

TensorFlow Hub is a repository of pre-trained TensorFlow models, and TensorFlow can be used to fine-tune these learning models. We will understand how to use models from TensorFlow Hub with tf.keras, including an image classification model. Once this is done, transfer learning can be performed to fine-tune the model for customized image classes. Alternatively, a pretrained classifier model can take an image and predict what it is without any additional training.
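As a sketch of the TensorFlow Hub workflow described above, ‘hub.KerasLayer’ wraps a pre-trained model so it composes like any other Keras layer. The model handle below is the commonly published MobileNetV2 classifier; treat the exact URL and the 224-pixel input size as assumptions and substitute any image-classification handle:

```python
import tensorflow as tf

# Assumed handle for a pre-trained ImageNet classifier on TF Hub.
CLASSIFIER_URL = "https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification/4"

def build_classifier(image_size=224):
    # Imported lazily so the sketch only needs tensorflow_hub when called.
    import tensorflow_hub as hub
    # hub.KerasLayer wraps the pre-trained SavedModel as a Keras layer.
    return tf.keras.Sequential([
        hub.KerasLayer(CLASSIFIER_URL, input_shape=(image_size, image_size, 3))
    ])

# classifier = build_classifier()  # downloads the pre-trained weights
# logits = classifier(tf.zeros([1, 224, 224, 3]))  # one score per class
```

Calling the wrapped classifier on an image yields class scores with no training; for transfer learning, the same pattern is used with a headless feature-vector handle plus a new trainable Dense layer on top.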

We are using Google Colaboratory to run the code below. Google Colab, or Colaboratory, helps run Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphics Processing Units). Colaboratory has been built on top of Jupyter Notebook.


import tensorflow as tf

print("Composing layers")
class ResnetIdentityBlock(tf.keras.Model):
    def __init__(self, kernel_size, filters):
        super(ResnetIdentityBlock, self).__init__(name='')
        filters1, filters2, filters3 = filters
        self.conv2a = tf.keras.layers.Conv2D(filters1, (1, 1))
        self.bn2a = tf.keras.layers.BatchNormalization()
        self.conv2b = tf.keras.layers.Conv2D(filters2, kernel_size, padding='same')
        self.bn2b = tf.keras.layers.BatchNormalization()
        self.conv2c = tf.keras.layers.Conv2D(filters3, (1, 1))
        self.bn2c = tf.keras.layers.BatchNormalization()
    def call(self, input_tensor, training=False):
        x = self.conv2a(input_tensor)
        x = self.bn2a(x, training=training)
        x = tf.nn.relu(x)
        x = self.conv2b(x)
        x = self.bn2b(x, training=training)
        x = tf.nn.relu(x)
        x = self.conv2c(x)
        x = self.bn2c(x, training=training)
        x += input_tensor
        return tf.nn.relu(x)
print("The layer is called")
block = ResnetIdentityBlock(1, [1, 2, 3])
_ = block(tf.zeros([1, 2, 3, 3]))
block.summary()

Code credit −

Output
Composing layers
The layer is called
Model: "resnet_identity_block"
Layer (type)                                Output Shape    Param #
conv2d (Conv2D)                             multiple        4
batch_normalization (BatchNormalization)    multiple        4
conv2d_1 (Conv2D)                           multiple        4
batch_normalization_1 (BatchNormalization)  multiple        8
conv2d_2 (Conv2D)                           multiple        9
batch_normalization_2 (BatchNormalization)  multiple        12
Total params: 41
Trainable params: 29
Non-trainable params: 12


  • Every residual block in a resnet is composed of convolutions, batch normalizations, and a shortcut.

  • Layers can be nested inside other layers too.

  • When we need model methods such as Model.fit, Model.evaluate, and Model.save, the class can inherit from keras.Model.

  • keras.Model is used instead of keras.layers.Layer, which helps in tracking variables.

  • A keras.Model tracks its internal layers, thereby making it easier to inspect them.
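To illustrate the last two points, the minimal model below (a hypothetical ‘TinyBlock’, not the article's full residual block) shows how a keras.Model automatically tracks its sublayers and their variables:

```python
import tensorflow as tf

class TinyBlock(tf.keras.Model):
    """A hypothetical two-layer model used only to illustrate tracking."""
    def __init__(self):
        super().__init__()
        self.conv = tf.keras.layers.Conv2D(2, (1, 1))
        self.bn = tf.keras.layers.BatchNormalization()

    def call(self, x, training=False):
        return self.bn(self.conv(x), training=training)

block = TinyBlock()
_ = block(tf.zeros([1, 2, 3, 3]))  # build the variables with one call

# The model tracked both sublayers and their variables for us.
print(len(block.layers))               # 2: the Conv2D and the BatchNormalization
print(len(block.trainable_variables))  # 4: conv kernel + bias, bn gamma + beta
```

The same inspection works on the ResnetIdentityBlock above: its six sublayers appear in block.layers, and block.trainable_variables collects every weight without any manual bookkeeping.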