
Backpropagation

3Blue1Brown, Part 1

What is backpropagation really doing? | Chapter 3, Deep learning

Andrej Karpathy

The spelled-out intro to neural networks and backpropagation: building micrograd

Which weights are adjusted?

All.

Won't this introduce regressions in neural circuits that already have a low loss?
Yes, it can. Training makes the network gradually better at everything in the training set, rather than improving drastically at one thing quickly.

Learning Rate - The actual size of each weight update is scaled by a learning rate hyperparameter. This controls how large each adjustment step should be.
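A minimal sketch of the idea above, with assumed toy values (not from the videos): each weight moves opposite its gradient, and the learning rate scales the size of that step.

```python
import numpy as np

# Illustrative values only: three weights and their gradients dLoss/dWeight.
weights = np.array([0.5, -1.2, 0.3])
grads = np.array([0.1, -0.4, 0.05])

learning_rate = 0.01  # hyperparameter controlling step size

# One gradient-descent update: every weight takes a small step downhill.
weights -= learning_rate * grads
```

A larger learning rate takes bigger steps (faster but risks overshooting); a smaller one takes finer steps (slower but more stable).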

Importantly, backpropagation doesn't selectively choose individual weights to update. Instead, it calculates updates for all weights simultaneously. Each weight is adjusted proportionally to its contribution to the error.
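To illustrate, here is an assumed two-weight toy model (y = w2 · (w1 · x), squared-error loss): one backward pass of the chain rule produces a gradient for every weight at once, each reflecting that weight's contribution to the error.

```python
# Toy example for illustration; values are assumptions, not from the videos.
x, target = 2.0, 10.0
w1, w2 = 1.5, 0.8

# Forward pass
h = w1 * x            # hidden value
y = w2 * h            # prediction
loss = (y - target) ** 2

# Backward pass: chain rule gives a gradient for ALL weights simultaneously.
dloss_dy = 2 * (y - target)
dloss_dw2 = dloss_dy * h       # how much w2 contributed to the error
dloss_dw1 = dloss_dy * w2 * x  # how much w1 contributed to the error
```

Karpathy's micrograd automates exactly this bookkeeping: a backward pass walks the computation graph and fills in a gradient for every parameter.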

3Blue1Brown, Part 2 - heavier math

Backpropagation calculus | Chapter 4, Deep learning

Created: September 17, 2024.
Modified: September 25, 2024.