How can TensorFlow and a pre-trained model be used to configure a dataset for performance?


TensorFlow and a pre-trained model can be used to configure a dataset for performance using the ‘AUTOTUNE’ attribute in the ‘tf.data’ module. Buffered prefetching ensures that data can be read from disk without I/O becoming a bottleneck: Dataset.prefetch() overlaps data preprocessing with model execution during training.
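To illustrate the idea in isolation, here is a minimal sketch of a tf.data pipeline. The toy range dataset stands in for images loaded from disk; cache() and prefetch() are the standard performance options, with AUTOTUNE letting tf.data choose the buffer size at runtime.

```python
import tensorflow as tf

# A toy dataset standing in for images read from disk (hypothetical data).
dataset = tf.data.Dataset.range(8).map(lambda x: x * 2)

# AUTOTUNE lets tf.data pick the prefetch buffer size dynamically at runtime.
AUTOTUNE = tf.data.AUTOTUNE

# cache() keeps elements in memory after the first pass;
# prefetch() overlaps preprocessing with model execution.
dataset = dataset.cache().prefetch(buffer_size=AUTOTUNE)

print(list(dataset.as_numpy_iterator()))  # → [0, 2, 4, 6, 8, 10, 12, 14]
```

The same two calls apply unchanged to an image dataset built with image_dataset_from_directory.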

Read More: What is TensorFlow and how Keras work with TensorFlow to create Neural Networks?

We will understand how to classify images of cats and dogs with the help of transfer learning from a pre-trained network.

The intuition behind transfer learning for image classification is that a model trained on a large and general dataset can effectively serve as a generic model of the visual world. Because it has already learned useful feature maps, the user does not have to start from scratch by training a large model on a large dataset.
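As a concrete sketch of this idea, the snippet below loads a pre-trained MobileNetV2 base (the model used in the linked TensorFlow tutorial), freezes its learned feature maps, and stacks a new binary head on top for the cats-vs-dogs task. The 160×160 input size is an assumption for illustration.

```python
import tensorflow as tf

IMG_SHAPE = (160, 160, 3)  # hypothetical input size

# Load MobileNetV2 pre-trained on ImageNet, without its classification head.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SHAPE,
    include_top=False,
    weights='imagenet')

# Freeze the convolutional base so its learned feature maps are reused as-is.
base_model.trainable = False

# Add a new head for the binary cats-vs-dogs task.
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1)  # a single logit: cat vs. dog
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
              metrics=['accuracy'])
```

Only the small head is trained; the frozen base contributes its general-purpose features unchanged.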

Read More: How can a customized model be pre-trained?

We are using Google Colaboratory to run the code below. Google Colab, or Colaboratory, runs Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook.

Example

import tensorflow as tf

# train_dataset, validation_dataset and test_dataset are created in the
# earlier steps of the linked transfer-learning tutorial.
AUTOTUNE = tf.data.AUTOTUNE
print("Configuring dataset for performance")
train_dataset = train_dataset.prefetch(buffer_size=AUTOTUNE)
validation_dataset = validation_dataset.prefetch(buffer_size=AUTOTUNE)
test_dataset = test_dataset.prefetch(buffer_size=AUTOTUNE)

Code credit − https://www.tensorflow.org/tutorials/images/transfer_learning

Output

Configuring dataset for performance

Explanation

  • Buffered prefetching loads images from disk ahead of time.
  • This prevents I/O from becoming a bottleneck during training.

Updated on: 25-Feb-2021
