Oscar Chang 晴れ男

In PyTorch, x += 1 is not the same as x = x+1

In plain Python, x += 1 on an int behaves exactly like x = x+1, because ints are immutable. More precisely, x += 1 calls __iadd__ when the type defines it, and a type with __iadd__ can mutate the object in place instead of creating a new one.
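A quick sketch of this dispatch in pure Python, using an int (immutable, no in-place behavior) and a list (whose __iadd__ mutates in place):

```python
# Ints are immutable: += creates a new object and rebinds the name
n = 1
n_before = id(n)
n += 1

# Lists define __iadd__: += extends the same object in place
lst = [1, 2]
lst_before = id(lst)
lst += [3]
same_object = id(lst) == lst_before      # True: same list, mutated

lst = lst + [4]
new_object = id(lst) != lst_before       # True: concatenation made a new list
```

The same mechanism is what lets PyTorch give += in-place semantics on Tensors.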

In PyTorch, however, the former modifies the Tensor x in place, while the latter creates a new Tensor and rebinds the name x to it. You can use id(x) to verify this.
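A minimal check with id(x), assuming torch is installed:

```python
import torch

x = torch.zeros(3)
x_before = id(x)

x += 1                                   # in-place: the same Tensor is modified
in_place = id(x) == x_before             # True

x = x + 1                                # out-of-place: a new Tensor is created
out_of_place = id(x) != x_before         # True
```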

Understanding this difference is important for building a good mental picture of how autograd works behind the scenes. For example, if we initialize x = torch.ones(3, requires_grad=True), then x += 1 throws the error "RuntimeError: a leaf Variable that requires grad has been used in an in-place operation", but x = x+1 does not.
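The behavior above can be reproduced directly (the exact error message wording may vary by PyTorch version):

```python
import torch

x = torch.ones(3, requires_grad=True)

# Out-of-place addition is fine: it just adds a node to the autograd graph,
# and x remains an untouched leaf
y = x + 1

# In-place addition on a leaf that requires grad raises a RuntimeError,
# since mutating the leaf would corrupt autograd's gradient bookkeeping
error = None
try:
    x += 1
except RuntimeError as err:
    error = err
```

If you really do need to mutate such a Tensor without tracking gradients, wrapping the operation in torch.no_grad() is the usual workaround.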