Vinit Vyas

Topics

exploding-gradients

1 post

Oct 24, 2025·138 min read·foundation

Backpropagation Part 1: From Graphs to a Working MLP

Backprop computes a million gradients for the price of two forward passes. From computational graphs to adjoints, from the chain rule to a working neural network, this is the algorithm that made deep learning possible, demystified here step by step.
