Training a neural network happens in two main passes:
- Forward pass: Predict output from input
- Backward pass: Update weights using gradients
🔁 Forward Pass
In the forward pass:
- Inputs flow through the layers
- Neurons apply weights and activations
- The output is generated (e.g., a classification)
```
z = w₁x₁ + w₂x₂ + b
a = activation(z)
```
Each layer transforms data until we reach the output.
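To make this concrete, here is a minimal sketch of a single-neuron forward pass in Python. The sigmoid activation and the example weights are assumptions for illustration; the text above does not fix a particular activation function.

```python
import math

def forward(x, w, b):
    """Single-neuron forward pass: weighted sum, then activation."""
    z = w[0] * x[0] + w[1] * x[1] + b   # z = w1*x1 + w2*x2 + b
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid activation (assumed)

# Two inputs, two weights, one bias -- values chosen arbitrarily
print(forward([2.0, 1.0], [0.5, -0.3], 0.1))
```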
🔁 Backward Pass (Backpropagation)
After comparing the output to the correct label with a loss function, we need to update the weights.
This happens via backpropagation:
- Gradients of loss are computed using chain rule
- Gradients flow from the output layer → input layer
- Weights are adjusted to reduce the loss (see the sketch below)
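The sketch below extends the forward pass to one full training step for the same single neuron. The squared-error loss, sigmoid activation, and learning rate are assumptions for this example; a real network repeats the chain rule layer by layer.

```python
import math

def train_step(x, y, w, b, lr=0.1):
    """One forward + backward pass for a single sigmoid neuron
    with squared-error loss (both assumed for this sketch)."""
    # Forward pass
    z = w[0] * x[0] + w[1] * x[1] + b
    y_hat = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation

    loss = (y_hat - y) ** 2              # L = (y_hat - y)^2

    # Backward pass via the chain rule, output -> weights:
    #   dL/dy_hat = 2 * (y_hat - y)
    #   dy_hat/dz = y_hat * (1 - y_hat)  (sigmoid derivative)
    #   dz/dw_i   = x_i,  dz/db = 1
    dz = 2 * (y_hat - y) * y_hat * (1 - y_hat)
    grad_w = [dz * x[0], dz * x[1]]
    grad_b = dz

    # Gradient-descent update: w <- w - lr * dL/dw
    w = [w[0] - lr * grad_w[0], w[1] - lr * grad_w[1]]
    b = b - lr * grad_b
    return w, b, loss
```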
🔍 Example
- Input:
x = [2.0, 1.0]
- Forward pass → prediction:
ŷ = 0.76
- Compute loss:
L(ŷ, y)
- Backward pass → ∂L/∂w₁, ∂L/∂w₂
- Update weights using gradient descent: w ← w − η · ∂L/∂w
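Running these steps end to end with the `train_step` sketch above: the input x = [2.0, 1.0] comes from the example, but the target label and the initial weights are invented for illustration, so the prediction will not match ŷ = 0.76 exactly.

```python
x = [2.0, 1.0]             # input from the example
w, b = [0.3, 0.2], 0.1     # assumed initial weights and bias
y = 1.0                    # assumed target label

for step in range(3):
    w, b, loss = train_step(x, y, w, b)
    print(f"step {step}: loss = {loss:.4f}")
# The loss shrinks each step as gradient descent nudges w and b.
```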
📊 Visualization
Activations flow forward through the network; during backpropagation, gradients flow through the same connections in reverse.
🧠 Summary
| Phase    | Role                                  |
|----------|---------------------------------------|
| Forward  | Compute prediction                    |
| Backward | Compute gradients and update weights  |
✅ Self-Check
- What happens in the forward pass?
- How are weights updated?
- Why is the backward pass necessary?