How to normalize a tensor in PyTorch?
A tensor in PyTorch can be normalized using the normalize() function provided in the torch.nn.functional module. It performs Lp normalization of a given tensor over a specified dimension and returns a tensor containing the normalized values of the elements of the original tensor.
A 1D tensor can be normalized over dimension 0, whereas a 2D tensor can be normalized over dimension 0 (column-wise) or dimension 1 (row-wise). In general, an n-dimensional tensor can be normalized over any of its dimensions 0, 1, 2, ..., n-1.
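As an intuition, Lp normalization divides each element by the Lp norm computed along the chosen dimension. The following is a minimal sketch of this equivalence; the manual division is shown only for illustration and is not part of the normalize() API.
import torch
import torch.nn.functional as F

t = torch.tensor([[1., 2., 3.], [4., 5., 6.]])

# divide each row by its L2 norm; keepdim=True keeps the dimension
# so that broadcasting works
manual = t / t.norm(p=2, dim=1, keepdim=True)

# normalize() gives the same result for the same p and dim
print(torch.allclose(manual, F.normalize(t, p=2.0, dim=1)))   # True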
Syntax
torch.nn.functional.normalize(input, p=2.0, dim=1)
Parameters
input – the input tensor of any shape.
p – the power (exponent) value in the norm formulation (default 2.0).
dim – the dimension over which the elements are normalized (default 1).
Steps
We could use the following steps to normalize a tensor −
Import the torch library. Make sure you have it already installed.
import torch
from torch.nn.functional import normalize
Create a tensor and print it.
t = torch.tensor([[1.,2.,3.],[4.,5.,6.]])
print("Tensor:", t)
Normalize the tensor using different p values and over different dimensions. The tensor defined above is a 2D tensor, so it can be normalized over either of its two dimensions.
t1 = normalize(t, p=1.0, dim=1)
t2 = normalize(t, p=2.0, dim=0)
Print the normalized tensors computed above.
print("Normalized tensor:\n", t1)
print("Normalized tensor:\n", t2)
Example 1
# import torch library
import torch
from torch.nn.functional import normalize

# define a torch tensor
t = torch.tensor([1., 2., 3., -2., -5.])

# print the above tensor
print("Tensor:\n", t)

# normalize the tensor
t1 = normalize(t, p=1.0, dim=0)
t2 = normalize(t, p=2.0, dim=0)

# print normalized tensor
print("Normalized tensor with p=1:\n", t1)
print("Normalized tensor with p=2:\n", t2)
Output
Tensor:
 tensor([ 1.,  2.,  3., -2., -5.])
Normalized tensor with p=1:
 tensor([ 0.0769,  0.1538,  0.2308, -0.1538, -0.3846])
Normalized tensor with p=2:
 tensor([ 0.1525,  0.3050,  0.4575, -0.3050, -0.7625])
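As a quick sanity check on the p=1 result (this manual computation is shown only for illustration): the L1 norm of the tensor is |1| + |2| + |3| + |-2| + |-5| = 13, and 1/13 ≈ 0.0769, which matches the first element of the normalized tensor above.
import torch

t = torch.tensor([1., 2., 3., -2., -5.])

# L1 norm is the sum of absolute values: 1 + 2 + 3 + 2 + 5 = 13
l1 = t.abs().sum()
print(t / l1)   # matches the p=1 output above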
Example 2
# import torch library
import torch
from torch.nn.functional import normalize

# define a 2D tensor
t = torch.tensor([[1.,2.,3.],[4.,5.,6.]])

# print the above tensor
print("Tensor:\n", t)

# normalize the tensor
t0 = normalize(t, p=2.0)

# print the normalized tensor
print("Normalized tensor:\n", t0)

# normalize the tensor in dim 0 or column-wise
tc = normalize(t, p=2.0, dim=0)

# print the normalized tensor
print("Column-wise Normalized tensor:\n", tc)

# normalize the tensor in dim 1 or row-wise
tr = normalize(t, p=2.0, dim=1)

# print the normalized tensor
print("Row-wise Normalized tensor:\n", tr)
Output
Tensor:
 tensor([[1., 2., 3.],
        [4., 5., 6.]])
Normalized tensor:
 tensor([[0.2673, 0.5345, 0.8018],
        [0.4558, 0.5698, 0.6838]])
Column-wise Normalized tensor:
 tensor([[0.2425, 0.3714, 0.4472],
        [0.9701, 0.9285, 0.8944]])
Row-wise Normalized tensor:
 tensor([[0.2673, 0.5345, 0.8018],
        [0.4558, 0.5698, 0.6838]])
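Note that the tensor normalized without an explicit dim argument matches the row-wise result, because dim defaults to 1 (see the syntax above). A minimal sketch confirming this, shown only for illustration:
import torch
from torch.nn.functional import normalize

t = torch.tensor([[1., 2., 3.], [4., 5., 6.]])

# omitting dim uses the default dim=1, i.e. row-wise normalization
print(torch.allclose(normalize(t, p=2.0), normalize(t, p=2.0, dim=1)))   # True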