PyTorch – How to compute element-wise entropy of an input tensor?


To compute the element-wise entropy of an input tensor, we use the torch.special.entr() method. It returns a new tensor with the entropy computed element-wise.

  • If an element of the tensor is negative, its entropy is negative infinity.

  • If an element of the tensor is zero, its entropy is zero.

  • For a positive element, the entropy is computed as the negative of the element multiplied by its natural logarithm. The method accepts a torch tensor of any dimension.
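The three rules above can be checked directly. Here is a small sketch (the tensor values are illustrative, not from the examples below):

```python
import torch

x = torch.tensor([-1.0, 0.0, 0.5, 2.0])
ent = torch.special.entr(x)

# negative element -> -inf
print(torch.isinf(ent[0]).item())

# zero element -> 0
print(ent[1].item())

# positive elements -> -x * ln(x)
print(torch.allclose(ent[2:], -x[2:] * torch.log(x[2:])))
```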

Steps

We can use the following steps to compute the entropy of a tensor element-wise −

  • Import the required library. In all the following examples, the required Python library is torch. Make sure you have already installed it.

import torch
  • Define a torch tensor. Here we define a 3D tensor of random numbers.

tensor = torch.randn(2,3,3)
  • Compute the entropy of the above-defined tensor using torch.special.entr(tensor). Optionally assign this value to a new variable.

ent = torch.special.entr(tensor)
  • Print the computed entropy.

print("Entropy:", ent)
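Putting the steps together, here is the complete sketch in one runnable script:

```python
import torch

# define a 3D tensor of random numbers
tensor = torch.randn(2, 3, 3)

# compute the element-wise entropy
ent = torch.special.entr(tensor)

# the result has the same shape as the input tensor
print("Entropy:", ent)
```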

Example 1

In this example, we compute the entropy of a 1D user-defined tensor.

# import necessary libraries
import torch

# define a 1D tensor
tensor1 = torch.tensor([-1,1,2,0,.4])

# print above created tensor
print("Tensor:", tensor1)

# compute the entropy on input tensor element wise
ent = torch.special.entr(tensor1)

# Display the computed entropies
print("Entropy:", ent)

Output

It will produce the following output −

Tensor: tensor([-1.0000, 1.0000, 2.0000, 0.0000, 0.4000])
Entropy: tensor([ -inf, -0.0000, -1.3863, 0.0000, 0.3665])

Notice that the entropy of a negative element is -inf, and the entropy of zero is zero.

Example 2

In this example, we compute the entropy of a 3D torch tensor element-wise.

# import necessary libraries
import torch

# define a tensor of random numbers
tensor1 = torch.randn(2,3,3)

# print above created tensor
print("Tensor:\n", tensor1)

# compute the entropy on input tensor element wise
ent = torch.special.entr(tensor1)

# Display the computed entropies
print("Entropy:\n", ent)

Output

It will produce the following output −

Tensor:
tensor([[[ 0.5996, -0.7526, -1.0233],
   [-0.9907, -0.0358, 0.6433],
   [ 0.4527, -0.1434, 0.3338]],
   [[ 0.0521, -0.3729, -0.1162],
   [ 0.2417, 0.7732, -0.6362],
   [-0.7942, -0.2582, 1.0860]]])
Entropy:
tensor([[[ 0.3067, -inf, -inf],
   [ -inf, -inf, 0.2838],
   [ 0.3588, -inf, 0.3663]],

   [[ 0.1539, -inf, -inf],
   [ 0.3432, 0.1989, -inf],
   [ -inf, -inf, -0.0896]]])

Updated on: 06-Jan-2022
