How can TensorFlow be used to build the model for Fashion MNIST dataset in Python?
TensorFlow is a machine learning framework provided by Google. It is an open-source framework used in conjunction with Python to implement algorithms, deep learning applications, and much more. It is used both in research and in production.
The ‘tensorflow’ package can be installed on Windows using the following command:
pip install tensorflow
A tensor is the core data structure used in TensorFlow. Tensors flow along the edges of a flow diagram, known as the ‘data flow graph’, which is how TensorFlow represents a computation. A tensor is essentially a multidimensional array or list.
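As a rough illustration of what “multidimensional array” means here, the sketch below uses NumPy (chosen only to keep the example dependency-light; TensorFlow's tf.constant behaves analogously) to show tensors of different ranks:

import numpy as np

# A scalar, a vector, and a matrix -- tensors of rank 0, 1, and 2
scalar = np.array(5)
vector = np.array([1.0, 2.0, 3.0])
matrix = np.array([[1, 2], [3, 4]])

print(scalar.ndim, vector.ndim, matrix.ndim)   # ranks: 0 1 2
print(matrix.shape)                            # (2, 2)

# A single Fashion MNIST image would be a rank-2 tensor of shape (28, 28)
image = np.zeros((28, 28))
print(image.shape)                             # (28, 28)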
The ‘Fashion MNIST’ dataset contains grayscale images of more than 70,000 articles of clothing belonging to 10 different categories. The images are low resolution (28 x 28 pixels). We use Google Colaboratory to run the code below.
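The 10 categories in Fashion MNIST are fixed. A common convention (used in the official TensorFlow tutorial) is to keep the human-readable names in a list indexed by the integer label:

# The 10 Fashion MNIST class labels, indexed 0-9
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']

print(len(class_names))   # 10 categories
print(class_names[9])     # label 9 -> 'Ankle boot'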
Google Colab, or Colaboratory, runs Python code in the browser, requires zero configuration, and offers free access to GPUs (Graphics Processing Units). Colaboratory is built on top of Jupyter Notebook.
Following is the code snippet to build the model for the Fashion MNIST dataset in Python:
Example
import tensorflow as tf

# Build a simple feed-forward network for 28 x 28 grayscale images
model = tf.keras.Sequential([
   tf.keras.layers.Flatten(input_shape=(28, 28)),   # 2D image -> 1D vector of 784 values
   tf.keras.layers.Dense(128, activation='relu'),   # fully connected hidden layer
   tf.keras.layers.Dense(10)                        # one logit per clothing class
])
print("Sequential model is being built")

model.compile(optimizer='adam',
   loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
   metrics=['accuracy'])
print("Sequential model is being compiled")
Code credit: https://www.tensorflow.org/tutorials/keras/classification
Output
Sequential model is being built
Sequential model is being compiled
Explanation
The layers in the model are configured.
A layer, the basic building block of a neural network, extracts representations from the data given to it as input.
Many simple layers are grouped together.
Some layers also have parameters that are tuned toward optimal values during the training phase.
The first layer, ‘Flatten’, transforms each image from a 2D array to a 1D array.
This layer has no parameters to learn.
Once the pixels are flattened, two ‘Dense’ layers follow: the first has 128 neurons, and the last returns a logits array of length 10.
Every neuron/node in the output layer holds a score indicating how strongly the image belongs to the corresponding class.
Then the model is compiled, which attaches the optimizer, the loss function, and the metric used during training.
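The sizes described above can be checked with a little arithmetic. This sketch (plain Python, no TensorFlow required) reproduces the length of the flattened input and the parameter counts Keras would report for the two Dense layers:

# Flatten: a 28 x 28 image becomes a vector of 784 values
flat_size = 28 * 28
print(flat_size)   # 784

# Dense layer parameters = inputs * neurons + biases (one bias per neuron)
dense1_params = flat_size * 128 + 128   # 784 inputs -> 128 neurons
dense2_params = 128 * 10 + 10           # 128 inputs -> 10 logits

print(dense1_params)   # 100480
print(dense2_params)   # 1290

These are the same totals shown in the ‘Param #’ column of model.summary() for this architecture.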
