Shahid Akhtar Khan has published 120 answers

What does Tensor.detach() do in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 11:24:29

Tensor.detach() is used to detach a tensor from the current computational graph. It returns a new tensor that doesn't require a gradient. When we don't need a tensor to be traced for the gradient computation, we detach the tensor from the current computational graph. We also need to detach a tensor when ... Read More
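As a quick sketch of the idea (the tensor values here are made up for illustration):

import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)   # example values
y = x * 2                     # y is tracked in the computational graph
z = y.detach()                # z shares data with y but requires no gradient

print(y.requires_grad)        # True
print(z.requires_grad)        # False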

How to compute gradients in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 11:20:48

To compute the gradients, a tensor must have its parameter requires_grad = True. The gradients are the same as the partial derivatives. For example, in the function y = 2*x + 1, x is a tensor with requires_grad = True. We can compute the gradients using the y.backward() function, and the gradient can ... Read More
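A minimal sketch of the y = 2*x + 1 example mentioned above (the value of x is an arbitrary choice):

import torch

x = torch.tensor(3.0, requires_grad=True)   # arbitrary example value
y = 2 * x + 1                               # y = 2*x + 1

y.backward()                                # compute dy/dx
print(x.grad)                               # tensor(2.) since dy/dx = 2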

PyTorch – How to compute element-wise logical XOR of tensors?

Shahid Akhtar Khan

Updated on 06-Dec-2021 11:13:01

torch.logical_xor() computes the element-wise logical XOR of the two given input tensors. In a tensor, elements with zero values are treated as False and non-zero elements are treated as True. It takes two tensors as input parameters and returns a tensor with the values obtained after computing the logical XOR. Syntax: torch.logical_xor(tensor1, tensor2), where ... Read More
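A short example with made-up input values showing how zeros and non-zeros are treated:

import torch

t1 = torch.tensor([0, 1, 2, 0])   # example values
t2 = torch.tensor([0, 0, 3, 4])   # example values

# zeros count as False, non-zeros as True
print(torch.logical_xor(t1, t2))  # tensor([False,  True, False,  True])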

How to narrow down a tensor in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 11:08:19

The torch.narrow() method is used to perform a narrow operation on a PyTorch tensor. It returns a new tensor that is a narrowed version of the original input tensor. For example, a tensor of size [4, 3] can be narrowed to a tensor of size [2, 3] or [4, 2]. We can narrow down ... Read More
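A minimal sketch of the [4, 3] example above (the tensor contents are arbitrary):

import torch

t = torch.arange(12).reshape(4, 3)   # a tensor of size [4, 3]

rows = torch.narrow(t, 0, 0, 2)      # first 2 rows    -> size [2, 3]
cols = torch.narrow(t, 1, 0, 2)      # first 2 columns -> size [4, 2]

print(rows.shape, cols.shape)        # torch.Size([2, 3]) torch.Size([4, 2])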

How to perform a permute operation in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 11:03:59

The torch.permute() method is used to perform a permute operation on a PyTorch tensor. It returns a view of the input tensor with its dimensions permuted. It doesn't make a copy of the original tensor. For example, a tensor with dimensions [2, 3] can be permuted to [3, 2]. We can also ... Read More
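A minimal sketch of permuting a [2, 3] tensor to [3, 2] (the tensor contents are random example values):

import torch

t = torch.randn(2, 3)            # a tensor with dimensions [2, 3]
p = torch.permute(t, (1, 0))     # a view with dimensions [3, 2]

print(p.shape)                   # torch.Size([3, 2])
p[0, 0] = 100.0                  # p is a view, so t changes too
print(t[0, 0])                   # tensor(100.)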

How to perform an expand operation in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 10:59:35

The Tensor.expand() method is used to perform an expand operation. It expands the tensor to new dimensions along its singleton dimensions. Expanding a tensor only creates a new view of the original tensor; it doesn't make a copy of the original tensor. If you set a particular dimension to -1, the tensor will not ... Read More
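A quick sketch of expanding along a singleton dimension (the values are illustrative):

import torch

t = torch.tensor([[1], [2], [3]])   # size [3, 1], dim 1 is a singleton
e = t.expand(3, 4)                  # size [3, 4], no data is copied
f = t.expand(-1, 4)                 # -1 keeps the existing size of that dim

print(e.shape, f.shape)             # torch.Size([3, 4]) torch.Size([3, 4])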

How to create tensors with gradients in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 10:54:45

To create a tensor with gradients, we use an extra parameter "requires_grad = True" while creating the tensor. requires_grad is a flag that controls whether a tensor requires a gradient or not. Only floating-point and complex dtype tensors can require gradients. If requires_grad is False, then the tensor is the same as the ... Read More
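A minimal sketch with arbitrary example values:

import torch

# a floating-point (or complex) dtype is needed for gradients
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
print(x.requires_grad)        # True

y = torch.randn(2, 2)         # requires_grad defaults to False
y.requires_grad_(True)        # enable gradients in place
print(y.requires_grad)        # True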

How to find element-wise remainder in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 10:49:37

The element-wise remainder when one tensor is divided by another tensor is computed using the torch.remainder() method. We can also apply torch.fmod() to find the remainder. The difference between these two methods is that in torch.remainder(), when the sign of the result differs from the sign of the divisor, then the divisor is ... Read More
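A short comparison of the two methods on made-up values (chosen so the signs of dividend and divisor differ):

import torch

a = torch.tensor([-3.0, 5.0,  3.0])   # example dividends
b = torch.tensor([ 2.0, 3.0, -2.0])   # example divisors

print(torch.remainder(a, b))   # tensor([ 1.,  2., -1.])  sign follows the divisor
print(torch.fmod(a, b))        # tensor([-1.,  2.,  1.])  sign follows the dividend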

PyTorch – How to compute Singular Value Decomposition (SVD) of a matrix?

Shahid Akhtar Khan

Updated on 06-Dec-2021 10:43:33

torch.linalg.svd() computes the singular value decomposition (SVD) of a matrix or a batch of matrices. The singular value decomposition is represented as a named tuple (U, S, Vh). U and Vh are orthogonal for a real input matrix and unitary for a complex input matrix. Vh is the transpose of V when V is real-valued ... Read More
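A minimal sketch on a random real matrix (the 4x3 shape is an arbitrary choice):

import torch

A = torch.randn(4, 3)          # a real 4x3 matrix with random example values

U, S, Vh = torch.linalg.svd(A, full_matrices=False)

# reconstruct A from its factors: A ≈ U @ diag(S) @ Vh
A_rec = U @ torch.diag(S) @ Vh
print(torch.allclose(A, A_rec, atol=1e-5))   # True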

How to perform in-place operations in PyTorch?

Shahid Akhtar Khan

Updated on 06-Dec-2021 10:31:27

In-place operations directly change the content of a tensor without making a copy of it. Since they do not create a copy of the input, they reduce memory usage when dealing with high-dimensional data. In-place operations help to utilize less GPU memory. In PyTorch, in-place operations are always post-fixed ... Read More
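A brief sketch contrasting an in-place operation with its out-of-place counterpart (values are illustrative):

import torch

t = torch.tensor([1.0, 2.0, 3.0])   # example values

t.add_(5)                     # in-place: trailing underscore, no copy made
print(t)                      # tensor([6., 7., 8.])

out = t.add(5)                # out-of-place: returns a new tensor
print(t)                      # tensor([6., 7., 8.])  (t is unchanged)
print(out)                    # tensor([11., 12., 13.])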
