Course v3 Lesson 5 Notes

Back propagation: calculate the loss between the output layer (the final activations) and the actual target values, then use the resulting loss to compute the gradients with respect to the parameters and update the parameters: $\text{parameters} -= \text{learning rate} \cdot \text{gradient of parameters}$.

Fine tuning example: ResNet-34. The final layer, i.e. the final weight matrix, of ResNet-34 has 1000 columns because each image can belong to one of 1000 different categories (the ImageNet classes).

Course v3 Lesson 2 Notes
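The update rule from the Lesson 5 notes above can be sketched in a few lines of plain Python. This is a toy illustration on a one-parameter quadratic loss, not the course's training loop; the loss function, learning rate, and step count are arbitrary choices for the sketch:

```python
# Toy loss L(w) = (w - 3)**2, whose minimum is at w = 3.
lr = 0.1      # learning rate (arbitrary illustrative value)
w = 0.0       # initial parameter value

for _ in range(100):
    grad = 2 * (w - 3)   # analytic gradient dL/dw; backprop computes this automatically
    w -= lr * grad       # parameters -= learning rate * gradient of parameters

# w has converged close to the minimum at 3
```

Each step moves the parameter a small distance against the gradient, which is why too large a learning rate can overshoot while too small a one converges slowly.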

General notes regarding the previous lesson: no need to feel intimidated by all the good projects in lesson 1. Just open your imagination and start an interesting project! Keep going: code and experiment –> "the whole game" –> concepts –> lesson 2.

Creating your own dataset from Google Images

Download images: after opening the image search results page, run the following JavaScript code in the JavaScript console, which you open by pressing Ctrl+Shift+J on Windows/Linux or Cmd+Opt+J on Mac.
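The console snippet used in the course looked roughly like the following. This is a sketch: the `.rg_di .rg_meta` selector and the `ou` (original URL) field are assumptions about Google Images' markup at the time, which Google changes regularly, so they may need updating.

```javascript
// Collect the original image URLs from a Google Images results page.
// The selector and the 'ou' field are assumptions about Google's markup
// at the time of the course and may no longer match the live page.
const nodes = (typeof document !== 'undefined')
    ? document.querySelectorAll('.rg_di .rg_meta')
    : [];  // guard so the sketch also runs outside a browser
const urls = Array.from(nodes).map(el => JSON.parse(el.textContent).ou);

if (typeof window !== 'undefined' && urls.length > 0) {
    // Opening a data: URL prompts the browser to save the list of URLs
    // as a text file, one URL per line.
    window.open('data:text/csv;charset=utf-8,' + escape(urls.join('\n')));
}
```

Pasting this into the console downloads a text file of image URLs, which can then be fed to a bulk downloader to build the dataset.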