How can Keras be used to manually save the weights using Python?

TensorFlow is a machine learning framework provided by Google. It is an open-source framework used in conjunction with Python to implement algorithms, deep learning applications, and much more. It is used both in research and in production.

Keras is a high-level deep learning API written in Python that runs on top of TensorFlow. It provides essential abstractions and building blocks for developing machine learning solutions through a productive interface. Keras models can be exported to run in web browsers or on mobile phones, and they scale across TPUs and clusters of GPUs.

Keras is already present within the TensorFlow package and can be accessed using:

import tensorflow as tf
from tensorflow import keras

Creating a Sample Model

First, let's create a simple model function that we can use to demonstrate saving and loading weights:

import tensorflow as tf
from tensorflow import keras

def create_model():
    model = keras.Sequential([
        keras.layers.Dense(512, activation='relu', input_shape=(784,)),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(10)
    ])
    
    model.compile(optimizer='adam',
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    
    return model

# Create and display model summary
model = create_model()
print("Model created successfully")
print(f"Model has {len(model.layers)} layers")
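Since save_weights() stores exactly these learned parameters, it can help to see how many each layer holds. The sketch below counts parameters per layer; it mirrors the layer sizes above but uses keras.Input in place of the input_shape argument, which newer Keras releases prefer:

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(512, activation='relu'),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(10)
])

# Each Dense layer holds a kernel matrix plus a bias vector;
# Dropout has no trainable parameters of its own.
for layer in model.layers:
    print(layer.name, layer.count_params())

# (784*512 + 512) + (512*10 + 10) = 407050 parameters in total
print("Total:", model.count_params())
```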

Saving Model Weights Manually

Keras provides the save_weights() method to manually save model weights to a checkpoint file. This is useful when you want to save only the learned parameters without the model architecture:

import tensorflow as tf
from tensorflow import keras
import tempfile
import os

# Create a simple model
def create_model():
    model = keras.Sequential([
        keras.layers.Dense(512, activation='relu', input_shape=(784,)),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(10)
    ])
    
    model.compile(optimizer='adam',
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    return model

# Create model and train briefly (simulation)
model = create_model()

# Create some dummy data for demonstration
import numpy as np
dummy_x = np.random.random((100, 784))
dummy_y = np.random.randint(0, 10, (100,))

# Train the model briefly
model.fit(dummy_x, dummy_y, epochs=1, verbose=0)

# Save the weights manually (tf.keras defaults to the TensorFlow checkpoint format)
checkpoint_path = "./model_weights"
print("Saving weights...")
model.save_weights(checkpoint_path)
print("Weights saved successfully!")

Loading Weights into a New Model

After saving weights, you can load them into a new model instance with the same architecture:

# Create a new model instance
print("Creating a new model instance...")
new_model = create_model()

# Load the previously saved weights
print("Loading weights from checkpoint...")
new_model.load_weights(checkpoint_path)

# Verify that weights are loaded
print("Weights loaded successfully!")

# Compare predictions (they should be identical)
test_input = np.random.random((1, 784))
original_prediction = model.predict(test_input, verbose=0)
loaded_prediction = new_model.predict(test_input, verbose=0)

print(f"Predictions match: {np.allclose(original_prediction, loaded_prediction)}")
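Comparing predictions is one sanity check; another is comparing the weight arrays themselves via get_weights(). The sketch below uses a smaller hypothetical model (the layer sizes are illustrative) and also shows that set_weights() can copy parameters entirely in memory, without touching disk:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

def build():
    # Layer sizes here are illustrative, not tied to the model above.
    return keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(3, activation='relu'),
        keras.layers.Dense(2)
    ])

src_model = build()
clone = build()

# set_weights() expects arrays in the same order and with the same
# shapes that get_weights() returns them.
clone.set_weights(src_model.get_weights())

for a, b in zip(src_model.get_weights(), clone.get_weights()):
    assert np.allclose(a, b)
print("All weight arrays match")
```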

Different Weight Saving Formats

Keras supports different formats for saving weights:

# Save in TensorFlow format (default)
model.save_weights('./weights_tf_format')

# Save in HDF5 format (selected by the .h5 extension)
model.save_weights('./weights_hdf5_format.h5')

# Inspect the weights of a specific layer
layer_weights = model.layers[0].get_weights()
print(f"First layer has {len(layer_weights)} weight arrays")
print(f"Weight shape: {layer_weights[0].shape}")
print(f"Bias shape: {layer_weights[1].shape}")
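A minimal round trip through the HDF5 format can be sketched as follows, saving into a temporary directory and restoring into a fresh model. The model here is a small illustrative one; the .weights.h5 extension selects HDF5 and is also the extension newer Keras releases require for save_weights():

```python
import os
import tempfile

import numpy as np
import tensorflow as tf
from tensorflow import keras

def build():
    # A small illustrative model; any architecture works the same way.
    return keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(3, activation='relu'),
        keras.layers.Dense(2)
    ])

trained = build()

with tempfile.TemporaryDirectory() as tmpdir:
    # The '.weights.h5' suffix selects the HDF5 weight format.
    path = os.path.join(tmpdir, "demo.weights.h5")
    trained.save_weights(path)
    print("Checkpoint exists:", os.path.exists(path))

    restored = build()
    restored.load_weights(path)

# Identical weights imply identical outputs.
x = np.random.random((2, 4)).astype("float32")
assert np.allclose(trained.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
print("Round trip OK")
```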

Key Benefits of Manual Weight Saving

- Smaller file size: only the weights are saved, not the full model.
- Flexibility: weights can be loaded into any model with a compatible architecture.
- Version control: changes in model parameters are easier to track.
- Transfer learning: pre-trained weights can be loaded into new models.
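The transfer-learning benefit can be sketched without any file at all: copy a matching layer's weights from a hypothetical "pre-trained" model into a new model with a different output head. The layer names and sizes below are made up purely for illustration:

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical "pre-trained" base: one feature layer plus a 10-way head.
pretrained = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation='relu', name='features'),
    keras.layers.Dense(10, name='old_head')
])

# New model reuses the feature layer but swaps in a 2-way head.
new_model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation='relu', name='features'),
    keras.layers.Dense(2, name='new_head')
])

# Transfer only the matching layer's weights; the heads stay independent.
new_model.get_layer('features').set_weights(
    pretrained.get_layer('features').get_weights()
)
print("Feature weights transferred")
```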

Conclusion

Manually saving weights in Keras using save_weights() provides flexibility and efficiency when you only need to preserve the learned parameters. This approach is ideal for transfer learning, model versioning, and reducing storage requirements compared to saving the entire model.

Updated on: 2026-03-25T15:42:43+05:30
