How can TensorFlow be used with keras.Model to track the variables defined using sequential model?


TensorFlow's 'tf.keras.Sequential' can be used to create a model that automatically tracks the variables of its internal layers. The model is built by calling it on a sample input, for example a tensor created with the 'tf.zeros' method.
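As a minimal sketch of this idea (the layer sizes here are illustrative, not from the article): a Sequential model exposes the variables of every layer it contains through its 'variables' attribute, once the model has been built by calling it on an input.

```python
import tensorflow as tf

# A Sequential model automatically tracks the variables of its layers.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(3,)),
    tf.keras.layers.Dense(2),
])

# Calling the model on a sample input (here tf.zeros) builds it.
model(tf.zeros([1, 3]))

# Each Dense layer contributes a kernel and a bias, so 4 variables total.
print(len(model.variables))
```

Accessing 'model.variables' is what makes optimizers and checkpointing work without the user listing each layer's weights by hand.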

Read More: What is TensorFlow and how Keras work with TensorFlow to create Neural Networks?

A neural network that contains at least one convolutional layer is known as a convolutional neural network (CNN). We can use a convolutional neural network to build a learning model.

The intuition behind transfer learning for image classification is that a model trained on a large and general dataset can serve as a generic model of the visual world. Because it has already learned useful feature maps, the user doesn't have to start from scratch by training a large model on a large dataset.
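The transfer-learning idea described above can be sketched as follows. This is a hedged example, not the article's code: it uses 'tf.keras.applications.MobileNetV2' as the pretrained base (the article does not name a specific model), freezes it, and stacks a new classifier head on top. In practice you would pass weights='imagenet' to download the pretrained feature maps; weights=None is used here only so the sketch runs offline.

```python
import tensorflow as tf

# Reuse a base model as a frozen feature extractor.
# Use weights="imagenet" in practice to get the pretrained feature maps.
base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights=None)
base.trainable = False  # freeze the learned feature maps

# Only the small new head below would be trained on the custom classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),  # task-specific classifier head
])
```

Freezing the base means only the head's few parameters are updated, which is why transfer learning works with small custom datasets.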

TensorFlow Hub is a repository that contains pre-trained TensorFlow models. TensorFlow can be used to fine-tune learning models.

We will understand how to use models from TensorFlow Hub with tf.keras, for example an image classification model. Once this is done, transfer learning can be performed to fine-tune a model for customized image classes. A pretrained classifier model can also be used to take an image and predict what it is, without needing any additional training.

We are using Google Colaboratory to run the code below. Google Colab, or Colaboratory, runs Python code in the browser, requires zero configuration, and gives free access to GPUs (Graphical Processing Units). Colaboratory has been built on top of Jupyter Notebook.

Example

import tensorflow as tf

print("It tracks internal layers")
my_seq = tf.keras.Sequential([tf.keras.layers.Conv2D(1, (1, 1), input_shape=(None, None, 3)),
   tf.keras.layers.BatchNormalization(),
   tf.keras.layers.Conv2D(2, 1, padding='same'),
   tf.keras.layers.BatchNormalization(),
   tf.keras.layers.Conv2D(3, (1, 1)),
   tf.keras.layers.BatchNormalization()])
my_seq(tf.zeros([1, 2, 3, 3]))
print("The architecture of the model is")
my_seq.summary()

Code credit − https://www.tensorflow.org/tutorials/customization/custom_layers

Output

It tracks internal layers
The architecture of the model is
Model: "sequential_4"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_3 (Conv2D)            (None, None, None, 1)    4
_________________________________________________________________
batch_normalization_3 (Batch (None, None, None, 1)    4
_________________________________________________________________
conv2d_4 (Conv2D)            (None, None, None, 2)    4
_________________________________________________________________
batch_normalization_4 (Batch (None, None, None, 2)    8
_________________________________________________________________
conv2d_5 (Conv2D)            (None, None, None, 3)    9
_________________________________________________________________
batch_normalization_5 (Batch (None, None, None, 3)    12
=================================================================
Total params: 41
Trainable params: 29
Non-trainable params: 12
_________________________________________________________________

Explanation

  • Models with many layers often simply call one layer after the other, feeding each layer's output into the next.

  • This is done using tf.keras.Sequential, which also tracks the variables of every layer it contains.
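The "one layer after the other" behaviour can be checked directly. In this sketch (a small Dense example, not the article's code), calling the layers by hand produces the same result as wrapping the same layer objects in a Sequential model:

```python
import tensorflow as tf

layer1 = tf.keras.layers.Dense(4, activation="relu")
layer2 = tf.keras.layers.Dense(2)
seq = tf.keras.Sequential([layer1, layer2])

x = tf.ones([1, 3])
manual = layer2(layer1(x))   # one layer after the other
wrapped = seq(x)             # the same computation via Sequential

# Both paths use the same layer objects, so the outputs match.
print(bool(tf.reduce_all(manual == wrapped)))
```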

raja
Published on 13-Feb-2021 09:08:36