Jacobian matrix in PyTorch

In this article, we will learn what the Jacobian matrix is and how to calculate it using different methods in PyTorch. The Jacobian matrix appears in various machine learning applications.

What is a Jacobian Matrix?

The Jacobian matrix is a mathematical tool that describes how the outputs of a vector-valued function change with respect to its inputs: it collects all of the function's first-order partial derivatives. This matrix has various applications in machine learning and computational mathematics:

  • Analyzing gradients and derivatives of functions in multivariable calculus

  • Solving systems of differential equations

  • Calculating inverses of vector-valued functions

  • Analyzing stability of dynamic systems
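As a quick sanity check of the definition, consider the made-up function f(x, y) = (x·y, x + y), whose Jacobian [[y, x], [1, 1]] can be worked out by hand and compared against PyTorch (this example function is our own illustration, not part of the article's later examples):

```python
import torch

# f maps a 2-vector (x, y) to the 2-vector (x*y, x + y).
# Its analytical Jacobian is [[y, x], [1, 1]].
def f(v):
    x, y = v[0], v[1]
    return torch.stack([x * y, x + y])

v = torch.tensor([2.0, 3.0])
jac = torch.autograd.functional.jacobian(f, v)
print(jac)  # tensor([[3., 2.], [1., 1.]])
```

At (x, y) = (2, 3), the top row is (y, x) = (3, 2), matching the hand computation.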

Installing PyTorch

First, install the PyTorch module using the following command:

pip install torch

torch.autograd.functional.jacobian() Function

To calculate the Jacobian matrix in PyTorch, we use torch.autograd.functional.jacobian(). This function takes the following parameters:

  • func − A Python function that takes a tensor (or a tuple of tensors) as input and returns a tensor or a tuple of tensors

  • inputs − The input tensor, or tuple of tensors, to pass to the function
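When func takes several tensors, inputs can be passed as a tuple and jacobian() returns one Jacobian per input. A minimal sketch with a made-up elementwise product (the function f below is our own illustration):

```python
import torch

# func may take several tensors; pass them as a tuple of inputs
# and jacobian() returns one Jacobian per input.
def f(a, b):
    return a * b  # elementwise product

a = torch.tensor([1.0, 2.0])
b = torch.tensor([3.0, 4.0])
jac_a, jac_b = torch.autograd.functional.jacobian(f, (a, b))
print(jac_a)  # d(a*b)/da = diag(b) -> tensor([[3., 0.], [0., 4.]])
print(jac_b)  # d(a*b)/db = diag(a) -> tensor([[1., 0.], [0., 2.]])
```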

Example 1: Simple Matrix Sum

Let's calculate the Jacobian for a simple sum function:

import torch

mat = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
print("Jacobian Matrix:")
print(jacobian)
Output

Jacobian Matrix:
tensor([[1., 1.],
        [1., 1.]])

The Jacobian contains all ones because the derivative of the sum with respect to each element is 1.
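Because sum() is scalar-valued, its Jacobian is simply the ordinary gradient, so the result above can be cross-checked with backward() (a sketch reusing the same matrix):

```python
import torch

mat = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
mat.sum().backward()  # fills mat.grad with d(sum)/d(mat)
print(mat.grad)       # tensor([[1., 1.], [1., 1.]])
```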

Example 2: Matrix Multiplication

Here we calculate the Jacobian for matrix multiplication:

import torch

def mat_mul(A, B):
    return torch.mm(A, B)

mat1 = torch.tensor([[2.0, 3.0], [4.0, 5.0]], requires_grad=True)
mat2 = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
jacobian = torch.autograd.functional.jacobian(lambda x: mat_mul(mat1, x), mat2)
print("Jacobian shape:", jacobian.shape)
print("Jacobian Matrix:")
print(jacobian)
Output

Jacobian shape: torch.Size([2, 2, 2, 2])
Jacobian Matrix:
tensor([[[[2., 0.],
          [3., 0.]],

         [[0., 2.],
          [0., 3.]]],


        [[[4., 0.],
          [5., 0.]],

         [[0., 4.],
          [0., 5.]]]])
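The four-dimensional shape comes from pairing output indices with input indices: jac[i, j, a, b] holds d(A @ B)[i, j] / dB[a, b], which works out to A[i, a] when j == b and 0 otherwise. One entry can be spot-checked (a sketch reusing the matrices above):

```python
import torch

A = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
B = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
jac = torch.autograd.functional.jacobian(lambda x: torch.mm(A, x), B)

# jac[i, j, a, b] = d(A @ B)[i, j] / dB[a, b] = A[i, a] if j == b else 0
print(jac[0, 1, 1, 1])  # A[0, 1] = tensor(3.)
```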

Example 3: Random Matrix

Let's work with a random matrix:

import torch

torch.manual_seed(42)  # For reproducible results
rand_matrix = torch.randn((3, 2), requires_grad=True)
print("Random matrix:")
print(rand_matrix)

jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), rand_matrix)
print("Jacobian Matrix:")
print(jacobian)
Output

Random matrix:
tensor([[ 0.3367,  0.1288],
        [ 0.2345,  0.2303],
        [-1.1229, -0.1863]], requires_grad=True)
Jacobian Matrix:
tensor([[1., 1.],
        [1., 1.],
        [1., 1.]])

Example 4: Matrix Squaring

Computing the Jacobian for a matrix squaring operation:

import torch

def mat_square(M):
    return torch.mm(M, M)

mat = torch.tensor([[2.0, 3.0], [4.0, 5.0]], requires_grad=True)
jacobian = torch.autograd.functional.jacobian(lambda x: mat_square(x), mat)
print("Jacobian shape:", jacobian.shape)
print("Jacobian Matrix:")
print(jacobian)
Output

Jacobian shape: torch.Size([2, 2, 2, 2])
Jacobian Matrix:
tensor([[[[ 4.,  4.],
          [ 3.,  0.]],

         [[ 3.,  7.],
          [ 0.,  3.]]],


        [[[ 4.,  0.],
          [ 7.,  4.]],

         [[ 0.,  4.],
          [ 3., 10.]]]])
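The entries above follow the product rule: d(M @ M)[i, j] / dM[a, b] equals M[b, j] when i == a, plus M[i, a] when j == b. A sketch spot-checking one entry against this formula:

```python
import torch

M = torch.tensor([[2.0, 3.0], [4.0, 5.0]])
jac = torch.autograd.functional.jacobian(lambda x: torch.mm(x, x), M)

# Product rule: jac[i, j, a, b] = (1 if i == a else 0) * M[b, j]
#                               + M[i, a] * (1 if j == b else 0)
# e.g. jac[1, 1, 1, 1] = M[1, 1] + M[1, 1] = 10
print(jac[1, 1, 1, 1])  # tensor(10.)
```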

Example 5: Converting NumPy Array

Working with NumPy arrays converted to PyTorch tensors:

import torch
import numpy as np

array = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
mat = torch.tensor(array, requires_grad=True)
jacobian = torch.autograd.functional.jacobian(lambda val: val.sum(), mat)
print("Jacobian Matrix:")
print(jacobian)
Output

Jacobian Matrix:
tensor([[1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
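One detail worth noting in the output: NumPy arrays default to float64, and torch.tensor() preserves that dtype, which is why dtype=torch.float64 appears above. A sketch of casting to PyTorch's default float32 instead:

```python
import torch
import numpy as np

array = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])

# NumPy defaults to float64; pass dtype=torch.float32 to use
# PyTorch's default single precision instead.
mat32 = torch.tensor(array, dtype=torch.float32, requires_grad=True)
jac = torch.autograd.functional.jacobian(lambda v: v.sum(), mat32)
print(jac.dtype)  # torch.float32
```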

Key Points

  • jacobian() enables gradient tracking internally, so setting requires_grad=True on the inputs is optional (though harmless)

  • The Jacobian shape depends on input and output dimensions

  • For scalar outputs, the Jacobian has the same shape as the input

  • Use torch.manual_seed() for reproducible random results
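For functions with many outputs, jacobian() also accepts a vectorize=True argument that batches the backward passes instead of computing one output row at a time, which can be faster. A minimal sketch with a made-up elementwise square:

```python
import torch

def f(x):
    return x ** 2  # elementwise square; Jacobian is diag(2 * x)

x = torch.arange(1.0, 5.0)  # tensor([1., 2., 3., 4.])
jac = torch.autograd.functional.jacobian(f, x, vectorize=True)
print(jac)  # diagonal matrix with entries 2., 4., 6., 8.
```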

Conclusion

The Jacobian matrix in PyTorch is computed using torch.autograd.functional.jacobian(). This powerful tool helps analyze gradients in machine learning models and can be applied to various mathematical operations for optimization tasks.

Updated on: 2026-03-27T14:37:32+05:30
