Fix PyTorch RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

December 29, 2022

In this tutorial, we will show you how to fix the PyTorch error: RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation.

Look at the example code below:

    def softmax(self, inputs, length, max_length):
        # batch = 4, inputs shape = [160, 1, 50]
        inputs = torch.exp(inputs)
        length = torch.reshape(length, [-1, 1])
        mask = self.sequence_mask(length, max_length)
        mask = torch.reshape(mask, [inputs.shape[0], 1, inputs.shape[2]])
        inputs *= mask  # in-place multiplication: this line causes the RuntimeError
        _sum = torch.sum(inputs, dim=2, keepdim=True) + 1e-9
        return inputs / _sum

Running this code, we find that the line inputs *= mask reports this error.


How can we fix this error?

During the forward pass, PyTorch autograd saves intermediate tensors that it needs later to compute gradients in backward(). An in-place operator such as +=, -=, *=, or /= overwrites one of these saved tensors (here, the output of torch.exp), so backward() fails. We should therefore avoid in-place operators on tensors that participate in gradient computation.
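The problem can be reproduced in a few lines, independent of the softmax code above. This is a minimal sketch: torch.exp() saves its output for the backward pass, and modifying that output in place makes backward() raise the error.

```python
import torch

x = torch.randn(3, requires_grad=True)
y = torch.exp(x)   # autograd saves y: d(exp(x))/dx = exp(x) = y
y *= 2             # in-place edit of a tensor needed for the backward pass

try:
    y.sum().backward()
except RuntimeError as e:
    print("RuntimeError:", e)
```

Running this prints the same "modified by an inplace operation" message as the softmax example.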

In order to fix this error, we can replace the in-place multiplication with an out-of-place one:

    inputs = inputs * mask

This creates a new tensor instead of overwriting inputs, so the saved tensor stays intact and the error is fixed.
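To verify the fix end to end, here is a minimal standalone version of the masked softmax above. It is a sketch under assumptions: the function name masked_softmax is ours, the mask is built inline instead of with self.sequence_mask, and the shapes are small for illustration.

```python
import torch

def masked_softmax(inputs, mask):
    # inputs: [batch, 1, max_length]; mask: same shape, 1.0 for valid steps, 0.0 for padding
    inputs = torch.exp(inputs)
    inputs = inputs * mask  # out-of-place multiplication: safe for autograd
    _sum = torch.sum(inputs, dim=2, keepdim=True) + 1e-9
    return inputs / _sum

x = torch.randn(4, 1, 5, requires_grad=True)
mask = torch.ones(4, 1, 5)
mask[:, :, 3:] = 0.0          # pretend the last two steps are padding

out = masked_softmax(x, mask)
out.sum().backward()          # completes without the RuntimeError
```

The masked positions come out as zero, the remaining probabilities sum to (approximately) one along the last dimension, and backward() now runs through the function without complaint.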