Calling `backward()` on a detached result fails, because the result will never require gradient:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

If `c` is modified at this point, the modification is tracked by autograd, and an error is raised when `backward()` is called on `out.sum()`, since `c` and `out` share the same underlying storage. The recorded outputs:

tensor([0.7311, 0.8808, 0.9526])
tensor([0., 0., 0.])
tensor([0., 0., 0.], grad_fn=<SigmoidBackward0>)

None
tensor(2.5644, grad_fn=<SumBackward0>)
tensor(2.5644)
tensor(0.)
tensor(0., grad_fn=<SumBackward0>)
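The behavior above can be reproduced with a short sketch. This is an illustrative reconstruction, not the original author's exact script: the variable names `a`, `out`, and `c` follow the text, and the values match the outputs (`sigmoid([1, 2, 3]) ≈ [0.7311, 0.8808, 0.9526]`, whose sum is ≈ 2.5644). `detach()` returns a tensor that shares storage with `out` but is cut out of the graph, so zeroing `c` in place also zeroes `out`'s data, and autograd's version counter then rejects the subsequent `backward()`:

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
out = a.sigmoid()
print(out)            # tensor([0.7311, 0.8808, 0.9526], grad_fn=<SigmoidBackward0>)

c = out.detach()      # shares storage with out, but requires_grad=False, no grad_fn
c.zero_()             # in-place change is visible through out as well
print(c)              # tensor([0., 0., 0.])
print(out)            # tensor([0., 0., 0.], grad_fn=<SigmoidBackward0>)

# Autograd tracked the in-place modification via the tensor's version counter,
# so backward() on out.sum() now raises a RuntimeError.
try:
    out.sum().backward()
except RuntimeError as e:
    print("RuntimeError:", e)
```

Calling `c.sum().backward()` directly would instead raise the "element 0 of tensors does not require grad and does not have a grad_fn" error from the text, because the detached tensor is outside the computation graph.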