Vinit Vyas

Topics

gradient-checking

2 posts

Oct 30, 2025·93 min read·advanced

Backpropagation Part 3: Systems, Stability, Interpretability, Frontiers

Theory assumes infinite precision; hardware delivers float16. This post bridges the gap between mathematical backprop and production systems, covering a lot of "practical" ground: from PyTorch's tape to mixed-precision training, from numerical disasters to systematic testing, from gradient monitoring to interpretability. What breaks, why, and how to fix it.
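Since this topic collects posts on gradient checking, a minimal sketch of the core technique is worth having alongside the listing: compare an analytic gradient against central finite differences and report the relative error. The toy squared-error model, step size, and tolerance below are illustrative assumptions, not code from the posts.

```python
import numpy as np

def loss(w, x, y):
    # Squared-error loss for a toy linear model: L = (w.x - y)^2
    return (w @ x - y) ** 2

def analytic_grad(w, x, y):
    # Hand-derived gradient: dL/dw = 2 * (w.x - y) * x
    return 2 * (w @ x - y) * x

def numeric_grad(f, w, eps=1e-6):
    # Central differences: g_i ≈ (f(w + eps*e_i) - f(w - eps*e_i)) / (2*eps)
    g = np.zeros_like(w)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (f(wp) - f(wm)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
w, x, y = rng.normal(size=3), rng.normal(size=3), 1.0

g_a = analytic_grad(w, x, y)
g_n = numeric_grad(lambda v: loss(v, x, y), w)

# Symmetric relative error; for a correct analytic gradient in float64
# this should be far below the (assumed) 1e-7 tolerance.
rel_err = np.linalg.norm(g_a - g_n) / (np.linalg.norm(g_a) + np.linalg.norm(g_n))
print(rel_err < 1e-7)
```

In practice the same check is run against an autograd engine's output rather than a hand-derived formula, and the tolerance must be loosened for lower-precision dtypes like float16.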

Oct 24, 2025·138 min read·foundation

Backpropagation Part 1: From Graphs to a Working MLP

Backprop computes a million gradients for the price of two forward passes. From computational graphs to adjoints, from the chain rule to a working neural network, this is the algorithm that made deep learning possible, demystified here step by step.
