Neural Networks
From the basic perceptron to convolutional, recurrent, and sequence models — how neural networks are structured and how they learn.
In This Section
Perceptrons & Feedforward Networks
Neurons, layers, activations, and the universal approximation theorem.
Backpropagation — How Gradients Flow
The computational graph, chain rule in practice, and how weights are updated.
CNNs — Convolutional Networks
Filters, pooling, receptive fields, and why convolutions work for images.
RNNs, LSTMs & Sequence Models
Recurrent architectures for sequences, the vanishing gradient problem, and LSTM gating.
Encoder-Decoder & Seq2Seq
The seq2seq architecture, attention in encoder-decoder models, and T5/BART.
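The first topic above, the perceptron, can be sketched in a few lines of plain Python. This is a minimal illustration of the classic perceptron learning rule (weights nudged by the prediction error on each sample), not code from any of the linked pages; the AND-gate dataset and learning rate are chosen here just for demonstration.

```python
def step(z):
    # Heaviside step activation: fire only on strictly positive input.
    return 1 if z > 0 else 0

def train_perceptron(samples, lr=1.0, epochs=20):
    """Perceptron learning rule on 2-D binary inputs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = step(w[0] * x1 + w[1] * x2 + b)
            err = target - y          # +1, 0, or -1
            w[0] += lr * err * x1     # nudge each weight toward
            w[1] += lr * err * x2     # reducing the error
            b += lr * err
    return w, b

# Learn the AND function (linearly separable, so the rule converges).
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in and_data]
print(predictions)  # matches the AND truth table: [0, 0, 0, 1]
```

A single perceptron can only separate classes with a line (or hyperplane), which is why it fails on XOR; stacking layers with nonlinear activations, covered in the feedforward-networks page, removes that limitation.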