
Commit

Merge pull request #1086 from tberends/main
Additional explanation for the training loop
mrdbourke authored Sep 12, 2024
2 parents 8974543 + 81a7b92 commit b99a203
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion 01_pytorch_workflow.ipynb
@@ -880,7 +880,7 @@
">\n",
"> And on the ordering of things, the above is a good default order but you may see slightly different orders. Some rules of thumb: \n",
"> * Calculate the loss (`loss = ...`) *before* performing backpropagation on it (`loss.backward()`).\n",
"> * Zero gradients (`optimizer.zero_grad()`) *before* stepping them (`optimizer.step()`).\n",
"> * Zero gradients (`optimizer.zero_grad()`) *before* computing the gradients of the loss with respect to every model parameter (`loss.backward()`).\n",
"> * Step the optimizer (`optimizer.step()`) *after* performing backpropagation on the loss (`loss.backward()`).\n",
"\n",
"For resources to help understand what's happening behind the scenes with backpropagation and gradient descent, see the extra-curriculum section.\n"
2 changes: 1 addition & 1 deletion docs/01_pytorch_workflow.ipynb
@@ -881,7 +881,7 @@
">\n",
"> And on the ordering of things, the above is a good default order but you may see slightly different orders. Some rules of thumb: \n",
"> * Calculate the loss (`loss = ...`) *before* performing backpropagation on it (`loss.backward()`).\n",
"> * Zero gradients (`optimizer.zero_grad()`) *before* stepping them (`optimizer.step()`).\n",
"> * Zero gradients (`optimizer.zero_grad()`) *before* computing the gradients of the loss with respect to every model parameter (`loss.backward()`).\n",
"> * Step the optimizer (`optimizer.step()`) *after* performing backpropagation on the loss (`loss.backward()`).\n",
"\n",
"For resources to help understand what's happening behind the scenes with backpropagation and gradient descent, see the extra-curriculum section.\n"
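For context, the rule of thumb edited above concerns the ordering of steps inside a PyTorch training loop: zero the gradients before computing them with `loss.backward()`, and only then step the optimizer. The following is a minimal sketch of that ordering, using a hypothetical toy linear-regression setup (the model, data, loss function, and optimizer here are illustrative assumptions, not taken from the notebook diff):

```python
import torch
from torch import nn

# Toy data: fit y = 2x + 1 (illustrative only)
X = torch.arange(0, 10, dtype=torch.float32).unsqueeze(dim=1)
y = 2 * X + 1

model = nn.Linear(in_features=1, out_features=1)
loss_fn = nn.L1Loss()
optimizer = torch.optim.SGD(params=model.parameters(), lr=0.01)

for epoch in range(100):
    model.train()
    y_pred = model(X)            # 1. Forward pass
    loss = loss_fn(y_pred, y)    # 2. Calculate the loss (before backpropagation)
    optimizer.zero_grad()        # 3. Zero gradients (before computing new ones)
    loss.backward()              # 4. Compute gradients of the loss w.r.t. every model parameter
    optimizer.step()             # 5. Step the optimizer (after backpropagation)
```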
