need zero grad now

George Hotz
2020-12-07 23:10:43 -08:00
parent b355cd2571
commit c63f950348


@@ -73,6 +73,7 @@ optim = optim.SGD([model.l1, model.l2], lr=0.001)
 out = model.forward(x)
 loss = out.mul(y).mean()
+optim.zero_grad()
 loss.backward()
 optim.step()
 ```
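
For context, here is a minimal sketch of the full training step this README diff produces, assuming tinygrad's API at the time of this commit (`Tensor`, `tinygrad.optim.SGD`, and the `dot`/`relu`/`logsoftmax` ops come from the project's README of that era; the layer shapes and the random placeholder data are hypothetical, not part of this commit):

```python
import numpy as np
from tinygrad.tensor import Tensor
import tinygrad.optim as optim

class TinyNet:
  def __init__(self):
    # weight Tensors; the 784->128->10 shapes are illustrative only
    self.l1 = Tensor(np.random.randn(784, 128).astype(np.float32))
    self.l2 = Tensor(np.random.randn(128, 10).astype(np.float32))

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).logsoftmax()

model = TinyNet()
opt = optim.SGD([model.l1, model.l2], lr=0.001)

# hypothetical placeholder batch and targets
x = Tensor(np.random.randn(32, 784).astype(np.float32))
y = Tensor(np.zeros((32, 10), dtype=np.float32))

out = model.forward(x)
loss = out.mul(y).mean()
opt.zero_grad()   # the change in this commit: clear stale gradients before backward()
loss.backward()
opt.step()
```

This matches PyTorch's convention: gradients accumulate across backward passes, so each iteration must zero them before computing new ones, which is why `zero_grad()` is now required in the loop.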