AI Lessons
Explore individual AI lessons covering fundamental concepts, practical applications, and advanced topics. Each lesson is designed to be self-contained and focused on a specific aspect of artificial intelligence.
What is AI, ML and DL?
Understand the differences between AI, machine learning, and deep learning, and how they relate to each other with real-world examples.
Types of AI
Learn how AI is classified by capability (Narrow, General, Superintelligent) and by its cognitive features.
Where AI is Used Today
Explore real-world applications of artificial intelligence across industries and everyday life.
Myths and Realities About AI
Debunk common AI myths about consciousness, jobs, bias, and language understanding. Separate fact from fiction.
What is an AI Model?
Understand what an AI model is, how it makes predictions, and how it learns from data.
Supervised, Unsupervised, Reinforcement Learning
Explore the three main types of machine learning: supervised, unsupervised, and reinforcement learning.
What is a Loss Function?
Understand what a loss function is, why it matters, and how it's used during training to improve models.
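As a quick preview, here is a minimal sketch of one common loss function, mean squared error, assuming NumPy; the numbers are made up purely for illustration:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error: the average squared difference between
    the true values and the model's predictions."""
    return np.mean((y_true - y_pred) ** 2)

# Toy example: the closer the predictions, the smaller the loss.
y_true = np.array([3.0, 5.0, 7.0])
print(mse_loss(y_true, np.array([2.5, 5.5, 6.0])))  # larger loss
print(mse_loss(y_true, np.array([3.0, 5.1, 6.9])))  # smaller loss
```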
How Models Learn via Gradient Descent
Learn how gradient descent helps models improve step by step by minimizing the loss function.
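To give a feel for those steps, here is a tiny gradient descent loop on a single-variable loss, f(w) = (w - 3)^2; the starting point and learning rate are arbitrary choices for illustration:

```python
# Gradient descent on f(w) = (w - 3)^2, whose derivative is 2 * (w - 3).
# The minimum is at w = 3; each step moves w a little closer to it.
w = 0.0              # arbitrary starting point
learning_rate = 0.1

for step in range(25):
    gradient = 2 * (w - 3)             # slope of the loss at the current w
    w = w - learning_rate * gradient   # step downhill, opposite the gradient

print(w)  # close to 3.0 after a few dozen steps
```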
Gradient Descent: Math, Derivatives, Optimizers
Dive deeper into the math behind gradient descent, including partial derivatives and popular optimization methods.
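For reference, the standard update rule that this lesson unpacks can be written as follows, where theta denotes the parameters, eta the learning rate, and L the loss:

```latex
\theta_{t+1} = \theta_t - \eta \, \nabla_{\theta} L(\theta_t),
\qquad
\nabla_{\theta} L = \left( \frac{\partial L}{\partial \theta_1}, \ldots, \frac{\partial L}{\partial \theta_n} \right)
```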
Overfitting and Generalization
Understand what overfitting means in machine learning and how to detect and prevent it.
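One simple way to spot overfitting is to compare error on the training data with error on held-out data. A rough sketch, assuming NumPy and a toy dataset invented for illustration: a very flexible model typically gets a lower training error but a higher held-out error than a simple one.

```python
import numpy as np

# Toy data: a noisy line y = 2x. A high-degree polynomial can chase the
# noise (low training error) yet typically predicts new points worse.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 12)
y_train = 2 * x_train + rng.normal(0, 0.1, size=12)
x_test = np.linspace(0, 1, 50)
y_test = 2 * x_test

for degree in (1, 8):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
```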
Neurons, Weights, and Layers
Understand how artificial neurons, weights, and layers form the building blocks of neural networks.
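As a taste of the building block itself, here is a minimal sketch of a single artificial neuron, assuming NumPy; the inputs, weights, and bias are illustrative values:

```python
import numpy as np

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of the inputs plus a bias,
    passed through an activation function (here, ReLU)."""
    z = np.dot(weights, inputs) + bias   # weighted sum
    return max(0.0, z)                   # ReLU activation

# A "layer" is simply several such neurons applied to the same inputs.
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.1, 0.2])
print(neuron(x, w, bias=0.1))
```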
ReLU, Sigmoid, Tanh: Activation Functions
Learn how activation functions like ReLU, Sigmoid, and Tanh shape the outputs of neurons.
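For a quick preview, the three activations can be written in a few lines with NumPy:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)          # passes positives, zeroes out negatives

def sigmoid(x):
    return 1 / (1 + np.exp(-x))      # squashes any input into (0, 1)

def tanh(x):
    return np.tanh(x)                # squashes any input into (-1, 1)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x), sigmoid(x), tanh(x))
```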
Forward and Backward Propagation
Understand how data flows through a neural network in the forward pass, and how gradients flow backward during training.
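A minimal sketch of both directions, assuming NumPy and a single linear neuron with one training example (all values are illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0])   # input
y = 1.0                    # target
w = np.array([0.1, -0.3])  # weights
b = 0.0                    # bias

# Forward pass: compute a prediction and the loss.
y_pred = np.dot(w, x) + b
loss = (y_pred - y) ** 2

# Backward pass: gradients of the loss via the chain rule.
dloss_dpred = 2 * (y_pred - y)
grad_w = dloss_dpred * x
grad_b = dloss_dpred

print(loss, grad_w, grad_b)
```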
Gradients and Derivatives: Backpropagation Deep Dive
A deeper look into how backpropagation works using calculus and partial derivatives.
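The central tool is the chain rule: for a weight w that feeds a pre-activation z, which produces an activation a and finally the loss L, the gradient is assembled as

```latex
\frac{\partial L}{\partial w}
= \frac{\partial L}{\partial a}
\cdot \frac{\partial a}{\partial z}
\cdot \frac{\partial z}{\partial w}
```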
Epochs, Batches and Learning Rate
Learn the concepts of epochs, batch size, and learning rate in the training loop of a neural network.
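To show where each of these knobs appears, here is a bare-bones training loop, assuming NumPy; the "model" is a single weight fitted to the toy relationship y = 2x, and the hyperparameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=100)
y = 2 * X

w = 0.0
learning_rate = 0.1   # how big each update step is
batch_size = 10       # how many examples per update
epochs = 5            # how many full passes over the data

for epoch in range(epochs):                     # one epoch = one pass over all data
    for start in range(0, len(X), batch_size):  # split the data into mini-batches
        xb = X[start:start + batch_size]
        yb = y[start:start + batch_size]
        grad = np.mean(2 * (w * xb - yb) * xb)  # gradient of MSE for this batch
        w -= learning_rate * grad               # one update step
    print(epoch, w)                             # w creeps toward 2.0
```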
What Happens During Training?
A high-level overview of how a neural network trains step by step: from data to improved predictions.