A normalization layer can be built using the Normalization layer from the preprocessing module. Calling its adapt() method on the abalone features lets the layer pre-compute the mean and variance of every column; these values are then used to normalize the data. On top of the normalization layer, dense layers are added to give the model its training capacity.
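The mean-and-variance computation that adapt() performs can be sketched with plain NumPy (the feature values below are made up purely for illustration, not taken from the real dataset):

```python
import numpy as np

# Toy stand-in for the abalone feature matrix (3 samples, 2 columns).
features = np.array([[0.435, 0.335],
                     [0.585, 0.450],
                     [0.655, 0.510]], dtype=np.float32)

# adapt() pre-computes the per-column mean and variance ...
mean = features.mean(axis=0)
variance = features.var(axis=0)

# ... and the layer then rescales each column to zero mean, unit variance.
normalized = (features - mean) / np.sqrt(variance)

print(normalized.mean(axis=0))  # approximately 0 per column
print(normalized.var(axis=0))   # approximately 1 per column
```

This is exactly the transformation the Normalization layer applies at inference time, using the statistics frozen during adapt().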
We will be using the abalone dataset, which contains a set of physical measurements of abalone, a type of sea snail. The goal is to predict an abalone's age from the other measurements.
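Before the model can be built, the CSV must be split into features and labels. A minimal sketch with pandas is shown below; the column names follow the TensorFlow tutorial's abalone file, but the inline rows are made-up stand-ins so the example runs without downloading anything:

```python
import io
import pandas as pd

# Inline stand-in for the abalone CSV; the real file has no header row,
# so column names are supplied explicitly (names as in the TF tutorial).
csv_data = io.StringIO(
    "0.435,0.335,0.110,0.334,0.136,0.077,0.097,7\n"
    "0.585,0.450,0.125,0.874,0.354,0.207,0.225,6\n"
)
column_names = ["Length", "Diameter", "Height", "Whole weight",
                "Shucked weight", "Viscera weight", "Shell weight", "Age"]
abalone_train = pd.read_csv(csv_data, names=column_names)

# Pop off the label column; the remaining columns are the input features.
abalone_features = abalone_train.copy()
abalone_labels = abalone_features.pop("Age")
print(abalone_features.shape)  # (2, 7)
```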
We are using Google Colaboratory to run the code below. Google Colab, or Colaboratory, lets you run Python code in the browser: it requires zero configuration and provides free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook.
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.layers.experimental import preprocessing

print("A normalization layer is being built")
normalize = preprocessing.Normalization()
normalize.adapt(abalone_features)

print("A dense layer is being added")
norm_abalone_model = tf.keras.Sequential([
    normalize,
    layers.Dense(64),
    layers.Dense(1)
])
Code credit: https://www.tensorflow.org/tutorials/load_data/csv
A normalization layer is being built A dense layer is being added
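To actually train the model, the next step is to compile and fit it. The sketch below is self-contained, using random stand-in data instead of the real abalone features; the loss and optimizer choices are reasonable defaults for a regression task, not mandated by the text, and tf.keras.layers.Normalization is the current, non-experimental name of the preprocessing layer:

```python
import numpy as np
import tensorflow as tf

# Random stand-in for the 7-column abalone feature matrix and age labels.
abalone_features = np.random.rand(100, 7).astype(np.float32)
abalone_labels = np.random.rand(100).astype(np.float32)

# Same architecture as in the text: normalization, then two dense layers.
normalize = tf.keras.layers.Normalization()
normalize.adapt(abalone_features)

norm_abalone_model = tf.keras.Sequential([
    normalize,
    tf.keras.layers.Dense(64),
    tf.keras.layers.Dense(1),
])

# Mean squared error is a natural loss when predicting a number (age).
norm_abalone_model.compile(loss=tf.keras.losses.MeanSquaredError(),
                           optimizer=tf.keras.optimizers.Adam())
history = norm_abalone_model.fit(abalone_features, abalone_labels,
                                 epochs=2, verbose=0)
print(len(history.history["loss"]))  # 2 (one loss value per epoch)
```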