# How can Tensorflow be used to standardize the flower dataset?


Data standardization refers to rescaling the dataset so that all the features are represented in comparable units. The rescaling layer is built using the ‘Rescaling’ method from the Keras module, and it is applied to the entire dataset using the ‘map’ method.
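The arithmetic behind the rescaling layer is simply a multiplication by a constant factor. A minimal sketch of that computation, using a NumPy array as a stand-in for an image batch, might look like this:

```python
import numpy as np

# Stand-in for a batch of image pixel values in the [0, 255] range.
image_batch = np.array([[0, 127, 255]], dtype=np.float32)

# What Rescaling(1./255) computes: multiply every value by 1/255,
# mapping the [0, 255] range onto [0, 1].
rescaled = image_batch * (1.0 / 255.0)
print(rescaled.min(), rescaled.max())  # → 0.0 1.0
```

The layer applies this same element-wise scaling to every image that passes through it.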

We will be using the flowers dataset, which contains several thousand flower images. It contains 5 sub-directories, one for each class.
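Keras infers the class names from this one-sub-directory-per-class layout by sorting the sub-directory names. A small sketch of that inference, using a temporary directory tree with the five flower class names (the directory names here are illustrative, matching the standard flowers dataset):

```python
import pathlib
import tempfile

# Build a mock dataset root: one sub-directory per class,
# mirroring the flowers dataset layout.
root = pathlib.Path(tempfile.mkdtemp())
for name in ["daisy", "dandelion", "roses", "sunflowers", "tulips"]:
    (root / name).mkdir()

# Class names are inferred from the sorted sub-directory names.
class_names = sorted(d.name for d in root.iterdir() if d.is_dir())
print(class_names)
# → ['daisy', 'dandelion', 'roses', 'sunflowers', 'tulips']
```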

We are using Google Colaboratory to run the below code. Google Colab, or Colaboratory, runs Python code in the browser, requires zero configuration, and offers free access to GPUs (Graphical Processing Units). Colaboratory is built on top of Jupyter Notebook.

import numpy as np
import tensorflow as tf

print("Standardizing the data using a rescaling layer")
normalization_layer = tf.keras.layers.experimental.preprocessing.Rescaling(1./255)

print("This layer can be applied by calling the map function on the dataset")
# 'train_ds' is the tf.data.Dataset built from the flowers images earlier
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]
print(np.min(first_image), np.max(first_image))

## Output

Standardizing the data using a rescaling layer
This layer can be applied by calling the map function on the dataset
0.0 0.96902645

## Explanation

• The RGB channel values fall in the range of 0 to 255.
• Values this large are not ideal for a neural network; in general, the input values should be kept small.
• The values in the image are therefore standardized to lie in the range of 0 to 1.
• This is done with the help of a rescaling layer.
• An alternative is to include the rescaling layer in the model definition itself, which simplifies deployment.
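The alternative mentioned above can be sketched as follows. This is a hedged sketch, not the article's code: it assumes a recent TensorFlow version where the layer is available as `tf.keras.layers.Rescaling` (in older TF 2.x versions it lives at `tf.keras.layers.experimental.preprocessing.Rescaling`, as in the snippet above), and the rest of the architecture is omitted:

```python
import tensorflow as tf

# Bake the rescaling into the model itself, so raw [0, 255] images
# can be fed directly at deployment time without a separate
# preprocessing step.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1./255),
    # ... the rest of the architecture (Conv2D, Dense, etc.) would follow
])

# Raw pixel values pass straight in; the model scales them internally.
raw_batch = tf.fill([1, 4, 4, 3], 255.0)
print(float(tf.reduce_max(model(raw_batch))))  # → 1.0
```

With this approach the saved model carries its own preprocessing, so inference code does not need to remember to rescale inputs.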
Published on 11-Feb-2021 06:31:09