
A normalization layer can be built using the 'Normalization' method present in the 'preprocessing' module. This layer is adapted to the features of the abalone dataset. In addition, a dense layer is added to improve the training capacity of the model. The normalization layer pre-computes the mean and variance associated with every column; these values are then used to normalize the data.
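The arithmetic the layer performs can be sketched with plain NumPy. The feature values below are illustrative toy data, not the abalone dataset; each column is shifted by its mean and scaled by its standard deviation:

```python
import numpy as np

# Toy feature matrix: each row is a sample, each column a feature.
features = np.array([[1.0, 100.0],
                     [2.0, 200.0],
                     [3.0, 300.0]])

mean = features.mean(axis=0)   # per-column mean
var = features.var(axis=0)     # per-column variance

# Normalization: subtract the mean, divide by the standard deviation.
normalized = (features - mean) / np.sqrt(var)

print(normalized.mean(axis=0))  # each column now has mean ~0
print(normalized.std(axis=0))   # and standard deviation ~1
```

This is exactly the state that the layer's 'adapt' step computes once up front, so the same mean and variance can be reused at training and inference time.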


We will be using the abalone dataset, which contains a set of measurements of abalone, a type of sea snail. The goal is to predict the age from the other measurements.

We are using Google Colaboratory to run the below code. Google Colab, or Colaboratory, runs Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook.

```python
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.layers.experimental import preprocessing

# abalone_features is assumed to have been loaded in an earlier step of the tutorial
print("A normalization layer is being built")
normalize = preprocessing.Normalization()
normalize.adapt(abalone_features)

print("A dense layer is being added")
norm_abalone_model = tf.keras.Sequential([
    normalize,
    layers.Dense(64),
    layers.Dense(1)
])
```

Code credit: https://www.tensorflow.org/tutorials/load_data/csv

Output:

```
A normalization layer is being built
A dense layer is being added
```

- The inputs to the model are normalized.
- This normalization is incorporated by adding the 'experimental.preprocessing' layer.
- The layer pre-computes the mean and variance associated with every column.
- These mean and variance values are used to normalize the data.
- First, the normalization layer is created with 'preprocessing.Normalization', and its state is computed by calling its 'adapt' method.
- Only training data should be used with the 'adapt' method of preprocessing layers.
- This normalization layer is then used to build the model.
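The steps above can be sketched end to end with toy data. This is a minimal sketch assuming TensorFlow 2.x, where the layer is also available as `tf.keras.layers.Normalization` (the modern location of the `experimental.preprocessing` layer); the feature values are illustrative, not the abalone data:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for the training features (illustrative values only).
train_features = np.array([[1.0, 10.0],
                           [2.0, 20.0],
                           [3.0, 30.0]], dtype=np.float32)

# adapt() computes the per-column mean and variance from the training data.
normalize = tf.keras.layers.Normalization()
normalize.adapt(train_features)

# Applying the layer normalizes each column using the adapted statistics.
normalized = normalize(train_features).numpy()
print(normalized.mean(axis=0))  # each column mean is now close to 0

# The adapted layer is placed first in the model, so every input is
# normalized before it reaches the dense layers.
model = tf.keras.Sequential([
    normalize,
    tf.keras.layers.Dense(64),
    tf.keras.layers.Dense(1)
])
```

Because 'adapt' was called only on the training features, the same mean and variance are reused for validation and test data, avoiding leakage of test statistics into the model.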
