Backpropagation
Backlinks
- Exploding Gradient
Gradients (error signals) grow extremely large during backpropagation, producing huge, unstable weight updates that lead to loss spikes, divergence, and training failure
- Vanishing Gradient
Gradients used for weight updates become extremely small as they travel back through the layers, effectively halting learning in early layers (a sketch of both failure modes follows this list)
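
Both effects share one cause: the gradient reaching an early layer is a product of all later layers' Jacobians, so its magnitude scales roughly geometrically with depth. Below is a minimal NumPy sketch of that idea, assuming a stack of plain linear layers; the depth, width, and the two weight scales are arbitrary illustration values, not anything prescribed by the linked notes.

```python
# Illustrative sketch: backpropagating a unit error signal through a
# deep stack of random linear layers. The gradient norm is multiplied
# by roughly `weight_scale` per layer, so it decays or blows up
# geometrically with depth. All constants here are assumed values.
import numpy as np

rng = np.random.default_rng(0)
depth, width = 50, 64  # arbitrary depth and layer width

def backprop_gradient_norms(weight_scale):
    """Push a unit error signal backward through `depth` linear layers
    and record the gradient norm after each one."""
    g = np.ones(width)
    norms = []
    for _ in range(depth):
        # Random linear layer; the backward pass is g_{l-1} = W_l^T g_l.
        W = rng.normal(0.0, weight_scale / np.sqrt(width), (width, width))
        g = W.T @ g
        norms.append(float(np.linalg.norm(g)))
    return norms

# weight_scale < 1: each layer shrinks the gradient  -> vanishing.
# weight_scale > 1: each layer amplifies the gradient -> exploding.
for scale, label in [(0.7, "vanishing"), (1.4, "exploding")]:
    norms = backprop_gradient_norms(scale)
    print(f"{label:9s} scale={scale}: after 1 layer {norms[0]:.2e}, "
          f"after {depth} layers {norms[-1]:.2e}")
```

Running it shows the gradient norm shrinking by several orders of magnitude in the vanishing case and growing by several in the exploding case; with nonlinear activations the same geometric product appears, with each layer's activation derivative as an extra factor.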