Vinit Vyas

Topics

backpropagation

3 posts

Oct 30, 2025·93 min read·advanced

Backpropagation Part 3: Systems, Stability, Interpretability, Frontiers

Theory assumes infinite precision; hardware delivers float16. Bridge the gap between mathematical backprop and production systems. In this post, we cover a lot of "practical" ground: from PyTorch's tape to mixed-precision training, from numerical disasters to systematic testing, from gradient monitoring to interpretability. What breaks, why, and how to fix it.

Oct 27, 2025·112 min read·intermediate

Backpropagation Part 2: Patterns, Architectures, and Training

Every gradient rule, from convolutions to attention, follows one pattern: the vector-Jacobian product. See past the memorized formulas to the unifying abstraction, understand how residuals and normalization tame deep networks, and learn why modern architectures are really just careful gradient engineering.

Oct 24, 2025·138 min read·foundation

Backpropagation Part 1: From Graphs to a Working MLP

Backprop computes a million gradients for the price of two forward passes. From computational graphs to adjoints, from the chain rule to a working neural network, this is the algorithm that made deep learning possible, demystified here step by step.
