
How can TensorFlow be used to define a model for the MNIST dataset?
TensorFlow is an open-source machine learning framework provided by Google. It is used in conjunction with Python to implement algorithms, deep learning applications, and much more. It includes optimization techniques that help perform complicated mathematical operations quickly, because it builds on NumPy and multi-dimensional arrays. These multi-dimensional arrays are known as 'tensors'. The framework supports working with deep neural networks.
The 'tensorflow' package can be installed on Windows using the following line of code:
pip install tensorflow
A tensor is the basic data structure used in TensorFlow. Tensors flow along the edges of a flow diagram known as the 'data flow graph', which represents the computation. A tensor is essentially a multidimensional array or a list.
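As a small illustration (not part of the original example), a tensor can be created from a Python list and inspected as follows:

import tensorflow as tf

# Create a rank-2 tensor (a 2x2 matrix) from a nested Python list
t = tf.constant([[1.0, 2.0], [3.0, 4.0]])

print(t.shape)   # (2, 2)
print(t.dtype)   # <dtype: 'float32'>

# Tensors support fast mathematical operations, e.g. matrix multiplication
print(tf.matmul(t, t))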
Keras is a deep learning API written in Python. It is a high-level API with a productive interface that helps solve machine learning problems. It runs on top of the TensorFlow framework and was built to enable quick experimentation. Keras ships with the TensorFlow package and can be accessed using the below lines of code.
import tensorflow as tf
from tensorflow import keras
We are using Google Colaboratory to run the below code. Google Colab, or Colaboratory, helps run Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphical Processing Units). Colaboratory is built on top of Jupyter Notebook. Following is the code snippet:
Example
print("Defining a sequential model") def create_model(): model = tf.keras.models.Sequential([ keras.layers.Dense(512, activation='relu', input_shape=(784,)), keras.layers.Dropout(0.2), keras.layers.Dense(10) ]) model.compile(optimizer='adam', loss=tf.losses.SparseCategoricalCrossentropy(from_logits=True), metrics=[tf.metrics.SparseCategoricalAccuracy()]) return model print("Creating a model instance") model = create_model() print("Displaying the architecture of the sequential model") model.summary()
Code credit: https://www.tensorflow.org/tutorials/keras/save_and_load
Output
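The summary printed by 'model.summary()' looks similar to the following (exact layer names may vary between runs):

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense (Dense)                (None, 512)               401920
_________________________________________________________________
dropout (Dropout)            (None, 512)               0
_________________________________________________________________
dense_1 (Dense)              (None, 10)                5130
=================================================================
Total params: 407,050
Trainable params: 407,050
Non-trainable params: 0
_________________________________________________________________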
Explanation
- A sequential model is created using Keras.
- Two 'Dense' layers are added, with a 'Dropout' layer in between.
- The model is compiled with the 'adam' optimizer and a sparse categorical cross-entropy loss.
- An instance of this model is created by calling 'create_model'.
- Details about the model's architecture are displayed on the screen using the 'summary' method, as shown above.
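The code above only defines the model. As a minimal sketch of how it could then be trained on the MNIST digits (the dataset loader shown is the standard 'keras.datasets.mnist' helper; the choice of 5 epochs is illustrative, not from the original tutorial):

import tensorflow as tf
from tensorflow import keras

# Load the MNIST digits and flatten each 28x28 image into a 784-length vector
(train_images, train_labels), (test_images, test_labels) = keras.datasets.mnist.load_data()
train_images = train_images.reshape(-1, 784).astype('float32') / 255.0
test_images = test_images.reshape(-1, 784).astype('float32') / 255.0

# Reuse the create_model function defined in the example above
model = create_model()

# Train for a few epochs and evaluate on the held-out test images
model.fit(train_images, train_labels, epochs=5, validation_data=(test_images, test_labels))
model.evaluate(test_images, test_labels, verbose=2)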
