Forward and Backward Propagation

Summary

Understand how data flows through a neural network in the forward pass, and how gradients flow backward during training.

basic
neural-network-basics

Training a neural network happens in two main passes:

  • Forward pass: Predict output from input
  • Backward pass: Update weights using gradients

๐Ÿ” Forward Pass

In the forward pass:

  1. Inputs flow through the layers
  2. Neurons apply weights and activations
  3. Output is generated (e.g. classification)
```
z = w₁x₁ + w₂x₂ + b
a = activation(z)
```

Each layer transforms data until we reach the output.
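A minimal sketch of this computation in Python. The specific weights, inputs, and the sigmoid activation below are illustrative assumptions, not values given in the lesson:

```python
import math

def sigmoid(z):
    # Squashes z into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative weights, bias, and inputs (assumed for this sketch)
w1, w2, b = 0.4, -0.2, 0.1
x1, x2 = 2.0, 1.0

z = w1 * x1 + w2 * x2 + b   # weighted sum: z = w₁x₁ + w₂x₂ + b
a = sigmoid(z)              # activation:   a = activation(z)
```

In a multi-layer network, `a` would become the input to the next layer, repeating the same weighted-sum-plus-activation step until the output layer is reached.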


🔙 Backward Pass (Backpropagation)

After comparing the output to the correct label (using a loss function), we need to update the weights.

This happens via backpropagation:

  • Gradients of loss are computed using chain rule
  • Gradients flow from output layer → input layer
  • Weights are adjusted to reduce loss
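As a sketch, the chain rule can be traced by hand for a single neuron. The sigmoid activation and squared-error loss here are illustrative assumptions, not choices mandated by the lesson:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative forward pass (assumed weights, inputs, and label)
w1, w2, b = 0.4, -0.2, 0.1
x1, x2 = 2.0, 1.0
y = 1.0                          # correct label

z = w1 * x1 + w2 * x2 + b
a = sigmoid(z)                   # prediction
loss = (a - y) ** 2              # squared-error loss

# Backward pass: apply the chain rule from the output toward the input
dL_da = 2 * (a - y)              # ∂L/∂a
da_dz = a * (1 - a)              # derivative of the sigmoid
dL_dz = dL_da * da_dz            # ∂L/∂z = ∂L/∂a · ∂a/∂z
dL_dw1 = dL_dz * x1              # ∂L/∂w1 = ∂L/∂z · ∂z/∂w1
dL_dw2 = dL_dz * x2              # ∂L/∂w2 = ∂L/∂z · ∂z/∂w2
dL_db = dL_dz                    # ∂L/∂b  = ∂L/∂z · 1
```

Each local derivative is cheap to compute, and multiplying them along the path from loss to weight is exactly the chain rule in action.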

🔗 Example

  1. Input: x = [2.0, 1.0]
  2. Forward pass → prediction: ŷ = 0.76
  3. Compute loss: L(ลท, y)
  4. Backward pass → ∂L/∂w₁, ∂L/∂w₂
  5. Update weights using gradient descent
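The five steps above can be combined into one runnable sketch. Only the input x = [2.0, 1.0] comes from the example; the initial weights, label, learning rate, and squared-error loss are assumed for illustration:

```python
import math

def sigmoid(z):
    # Squashes z into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

x1, x2 = 2.0, 1.0            # 1. input (from the example)
w1, w2, b = 0.4, -0.2, 0.1   # assumed initial weights and bias
y = 1.0                      # assumed correct label
lr = 0.5                     # assumed learning rate

losses = []
for step in range(3):
    # 2. Forward pass → prediction ŷ
    z = w1 * x1 + w2 * x2 + b
    y_hat = sigmoid(z)

    # 3. Loss L(ŷ, y), here squared error
    loss = (y_hat - y) ** 2
    losses.append(loss)

    # 4. Backward pass → ∂L/∂w₁, ∂L/∂w₂ via the chain rule
    dL_dz = 2 * (y_hat - y) * y_hat * (1 - y_hat)
    dL_dw1, dL_dw2, dL_db = dL_dz * x1, dL_dz * x2, dL_dz

    # 5. Gradient-descent update
    w1 -= lr * dL_dw1
    w2 -= lr * dL_dw2
    b -= lr * dL_db

print(losses)  # the loss shrinks on each step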

🔄 Visualization

Track data in the forward direction, and gradients in reverse.


🧠 Summary

| Phase    | Role                                  |
|----------|---------------------------------------|
| Forward  | Compute prediction                    |
| Backward | Compute gradients and update weights  |


✅ Self-Check

  • What happens in the forward pass?
  • How are weights updated?
  • Why is the backward pass necessary?