
Commit

Merge pull request #1054 from CarterT27/main
Fix typos and tensorboard installation
mrdbourke authored Aug 22, 2024
2 parents e32e544 + 554233b commit bc304a1
Showing 3 changed files with 17 additions and 5 deletions.
10 changes: 8 additions & 2 deletions 07_pytorch_experiment_tracking.ipynb
@@ -745,7 +745,13 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from torch.utils.tensorboard import SummaryWriter\n",
+"try:\n",
+"    from torch.utils.tensorboard import SummaryWriter\n",
+"except:\n",
+"    print(\"[INFO] Couldn't find tensorboard... installing it.\")\n",
+"    !pip install -q tensorboard\n",
+"    from torch.utils.tensorboard import SummaryWriter\n",
+"\n",
 "\n",
 "# Create a writer with all default settings\n",
 "writer = SummaryWriter()"
@@ -2298,7 +2304,7 @@
 "source": [
 "Running the cell above we should get an output similar to the following.\n",
 "\n",
-"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inheret randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
+"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inherent randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
 "\n",
 "<img src=\"https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/07-tensorboard-lowest-test-loss.png\" alt=\"various modelling experiments visualized on tensorboard with model that has the lowest test loss highlighted\" width=900/>\n",
 "\n",
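For context on the change above (this is not part of the commit): a minimal sketch of how the guarded import and the default SummaryWriter are typically used afterwards. The loop, tags and loss values below are illustrative assumptions, not code from the notebook.

# Minimal usage sketch (illustrative only, not from this commit).
# Assumes torch and tensorboard are installed; metric values are made up.
try:
    from torch.utils.tensorboard import SummaryWriter
except ModuleNotFoundError:
    raise SystemExit("Install tensorboard first: pip install tensorboard")

writer = SummaryWriter()  # writes event files to ./runs/<timestamp> by default

for epoch in range(3):
    train_loss = 1.0 / (epoch + 1)   # placeholder for a real training loss
    test_loss = 1.2 / (epoch + 1)    # placeholder for a real test loss
    writer.add_scalars(main_tag="Loss",
                       tag_scalar_dict={"train_loss": train_loss,
                                        "test_loss": test_loss},
                       global_step=epoch)

writer.close()  # flush events so TensorBoard can display them

The logged runs can then be viewed with tensorboard --logdir runs.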
2 changes: 1 addition & 1 deletion docs/02_pytorch_classification.ipynb
@@ -2391,7 +2391,7 @@
 "\n",
 "PyTorch has a bunch of [ready-made non-linear activation functions](https://pytorch.org/docs/stable/nn.html#non-linear-activations-weighted-sum-nonlinearity) that do similiar but different things. \n",
 "\n",
-"One of the most common and best performing is [ReLU](https://en.wikipedia.org/wiki/Rectifier_(neural_networks) (rectified linear-unit, [`torch.nn.ReLU()`](https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html)).\n",
+"One of the most common and best performing is [ReLU](https://en.wikipedia.org/wiki/Rectifier_(neural_networks)) (rectified linear-unit, [`torch.nn.ReLU()`](https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html)).\n",
 "\n",
 "Rather than talk about it, let's put it in our neural network between the hidden layers in the forward pass and see what happens."
 ]
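As background for the fixed cell (again, not part of the commit): the sentence in the hunk above leads into placing nn.ReLU() between hidden layers in the forward pass. A minimal sketch of that pattern; the class name and layer sizes are assumptions for the example.

import torch
from torch import nn

# Illustrative sketch only: ReLU applied between hidden layers in forward().
class CircleClassifier(nn.Module):  # hypothetical name for illustration
    def __init__(self, in_features: int = 2, hidden_units: int = 10, out_features: int = 1):
        super().__init__()
        self.layer_1 = nn.Linear(in_features, hidden_units)
        self.layer_2 = nn.Linear(hidden_units, hidden_units)
        self.layer_3 = nn.Linear(hidden_units, out_features)
        self.relu = nn.ReLU()  # the non-linear activation being introduced

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU after each hidden layer gives the model its non-linearity
        return self.layer_3(self.relu(self.layer_2(self.relu(self.layer_1(x)))))

model = CircleClassifier()
print(model(torch.randn(4, 2)).shape)  # expected: torch.Size([4, 1])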
10 changes: 8 additions & 2 deletions docs/07_pytorch_experiment_tracking.ipynb
@@ -725,7 +725,13 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from torch.utils.tensorboard import SummaryWriter\n",
+"try:\n",
+"    from torch.utils.tensorboard import SummaryWriter\n",
+"except:\n",
+"    print(\"[INFO] Couldn't find tensorboard... installing it.\")\n",
+"    !pip install -q tensorboard\n",
+"    from torch.utils.tensorboard import SummaryWriter\n",
+"\n",
 "\n",
 "# Create a writer with all default settings\n",
 "writer = SummaryWriter()"
@@ -2254,7 +2260,7 @@
 "source": [
 "Running the cell above we should get an output similar to the following.\n",
 "\n",
-"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inheret randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
+"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inherent randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
 "\n",
 "<img src=\"https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/07-tensorboard-lowest-test-loss.png\" alt=\"various modelling experiments visualized on tensorboard with model that has the lowest test loss highlighted\" width=900/>\n",
 "\n",
