How can Keras be used with an Embedding layer to share layers in Python?

Keras was developed as part of the research project ONEIROS (Open-ended Neuro-Electronic Intelligent Robot Operating System). It is a deep learning API written in Python: a high-level, productive interface for solving machine learning problems. It runs on top of the TensorFlow framework and was built to enable quick experimentation. It provides the essential abstractions and building blocks needed to develop and encapsulate machine learning solutions.

It is highly scalable and cross-platform: Keras can run on TPUs or clusters of GPUs, and Keras models can be exported to run in a web browser or on a mobile phone.

Keras is already present within the TensorFlow package. It can be accessed using the below lines of code −

import tensorflow
from tensorflow import keras
from tensorflow.keras import layers

The Keras functional API helps create models that are more flexible than those created with the sequential API. The functional API can handle models with non-linear topology, shared layers, and multiple inputs and outputs. A deep learning model is usually a directed acyclic graph (DAG) of layers, and the functional API is a way to build that graph of layers.
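As a minimal illustration of the functional API (the layer sizes and names here are illustrative assumptions, not taken from the original article), each layer is called on the output of the previous one to build the graph, and the graph is then wrapped into a `keras.Model`:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# A minimal functional-API model: one input, a hidden Dense layer,
# and a softmax output layer (all sizes are illustrative).
inputs = keras.Input(shape=(784,))                       # node in the graph of layers
x = layers.Dense(64, activation="relu")(inputs)          # hidden layer
outputs = layers.Dense(10, activation="softmax")(x)      # output layer

model = keras.Model(inputs=inputs, outputs=outputs)
model.summary()
```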

We are using Google Colaboratory to run the below code. Google Colab, built on top of Jupyter Notebook, runs Python code in the browser, requires zero configuration, and provides free access to GPUs (Graphical Processing Units). Following is the code snippet that uses Keras with an Embedding layer shared between two inputs −


print("Embedding for 2000 unique words mapped to 128-dimensional vectors")
shared_embedding = layers.Embedding(2000, 128)
print("Variable-length integer sequence")
text_input_a = keras.Input(shape=(None,), dtype="int32")
print("Variable-length integer sequence")
text_input_b = keras.Input(shape=(None,), dtype="int32")
print("Reuse the same layers to encode both the inputs")
encoded_input_a = shared_embedding(text_input_a)
encoded_input_b = shared_embedding(text_input_b)

Code credit −

Output
Embedding for 2000 unique words mapped to 128-dimensional vectors
Variable-length integer sequence
Variable-length integer sequence
Reuse the same layers to encode both the inputs
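The two encoded tensors above can be combined into a complete model. The sketch below (the pooling layers and the classification head are illustrative assumptions, not part of the original snippet) pools each variable-length sequence to a fixed-size vector, concatenates the two, and adds a binary output:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Embedding for 2000 unique words mapped to 128-dimensional vectors
shared_embedding = layers.Embedding(2000, 128)

# Two variable-length integer sequences
text_input_a = keras.Input(shape=(None,), dtype="int32")
text_input_b = keras.Input(shape=(None,), dtype="int32")

# Reuse the same layer to encode both inputs
encoded_input_a = shared_embedding(text_input_a)
encoded_input_b = shared_embedding(text_input_b)

# Pool each sequence to a fixed-size vector, merge the two branches,
# and add a (hypothetical) binary classifier head
pooled_a = layers.GlobalAveragePooling1D()(encoded_input_a)
pooled_b = layers.GlobalAveragePooling1D()(encoded_input_b)
merged = layers.concatenate([pooled_a, pooled_b])
output = layers.Dense(1, activation="sigmoid")(merged)

model = keras.Model(inputs=[text_input_a, text_input_b], outputs=output)
model.summary()
```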


  • Functional API models use shared layers.

  • These shared layers are layer instances that are reused multiple times in the same model.

  • These layers learn features that correspond to multiple paths in the graph.

  • Shared layers can also be used to encode inputs from two different pieces of text that have a similar vocabulary.

  • This way, information sharing across different inputs is possible.

  • Due to this, the model can be trained on less data.

  • The ‘Embedding’ layer is shared across two different texts in the above code.
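One way to confirm that the layer really is shared (an illustrative check, not part of the original article) is to build a model from both encoded branches and inspect its trainable weights: only one embedding table exists, used by both paths of the graph.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

shared_embedding = layers.Embedding(2000, 128)

text_input_a = keras.Input(shape=(None,), dtype="int32")
text_input_b = keras.Input(shape=(None,), dtype="int32")
encoded_a = shared_embedding(text_input_a)
encoded_b = shared_embedding(text_input_b)

model = keras.Model([text_input_a, text_input_b], [encoded_a, encoded_b])

# Only one trainable weight matrix exists: the 2000 x 128 embedding table,
# shared by both branches of the graph.
print(len(model.trainable_weights))
print(model.trainable_weights[0].shape)
```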