
PyTorch Pitfalls I've Hit

loss.backward()

Calling loss.backward() raises an error:

RuntimeError: grad can be implicitly created only for scalar outputs

Cause: the tensor you call backward() on must hold a single scalar value; mine was a tensor of shape [batch_size].
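A minimal sketch reproducing the error and two common fixes (the toy squared-error loss and tensor shapes here are illustrative assumptions, not from the original post):

```python
import torch

# Per-sample losses have shape [batch_size], not a scalar.
pred = torch.randn(4, requires_grad=True)
target = torch.zeros(4)
per_sample_loss = (pred - target) ** 2   # shape: [4]

# per_sample_loss.backward()  # RuntimeError: grad can be implicitly
#                             # created only for scalar outputs

# Fix 1: reduce to a scalar before calling backward().
loss = per_sample_loss.mean()
loss.backward()

# Fix 2: pass an explicit gradient tensor of the same shape;
# torch.ones_like(...) makes this equivalent to backward() on the sum.
pred2 = torch.randn(4, requires_grad=True)
loss2 = (pred2 - target) ** 2
loss2.backward(gradient=torch.ones_like(loss2))
```

Reducing with .mean() (or .sum()) is the usual choice; passing an explicit gradient is useful when you want to weight each sample's contribution differently.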
