# How can Tensorflow be used with pre-trained model to determine the number of batches in the dataset?


TensorFlow can be used with a pre-trained model to determine the number of batches in a dataset using the ‘cardinality’ method, which is present in the ‘tf.data.experimental’ module.
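Before looking at the full transfer-learning workflow, here is a minimal sketch of the ‘cardinality’ method on its own. It uses a small synthetic dataset (the `range`/`batch` pipeline below is illustrative, not part of the cats-and-dogs example) to show that the method returns the number of batches as a scalar tensor:

```python
import tensorflow as tf

# Build an illustrative dataset of 100 elements, grouped into batches of 10.
ds = tf.data.Dataset.range(100).batch(10)

# cardinality returns a scalar int64 tensor holding the number of batches.
num_batches = tf.data.experimental.cardinality(ds)
print(int(num_batches))  # 10
```

Note that for some pipelines (e.g. after `filter`) the cardinality cannot be known statically, in which case the method returns `tf.data.experimental.UNKNOWN_CARDINALITY`.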

We will understand how to classify images of cats and dogs with the help of transfer learning from a pre-trained network.

The intuition behind transfer learning for image classification is that a model trained on a large and general dataset can effectively serve as a generic model of the visual world. It will already have learned useful feature maps, which means the user won’t have to start from scratch by training a large model on a large dataset.

We are using Google Colaboratory to run the code below. Google Colab, or Colaboratory, helps run Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook.

## Example

import tensorflow as tf

# 'validation_dataset' is assumed to be a batched tf.data.Dataset created
# earlier in the workflow (e.g. with tf.keras.utils.image_dataset_from_directory).
print("Number of batches in the data is determined")
val_batches = tf.data.experimental.cardinality(validation_dataset)
test_dataset = validation_dataset.take(val_batches // 5)
validation_dataset = validation_dataset.skip(val_batches // 5)
print('Number of validation batches are: %d' % tf.data.experimental.cardinality(validation_dataset))
print('Number of test batches are: %d' % tf.data.experimental.cardinality(test_dataset))

## Output

Number of batches in the data is determined
Number of validation batches are: 21
Number of test batches are: 5

## Explanation

• The original dataset doesn’t have a test set, so one has to be created.
• For this purpose, the number of batches in the validation data is determined using the ‘cardinality’ method.
• Once this is known, 20 percent (one fifth) of the validation batches are moved into the test dataset using ‘take’, and removed from the validation dataset using ‘skip’.
Published on 12-Feb-2021 13:18:36