Discrete convolutions, from probability, to image processing and FFTs.
Help fund future projects: www.patreon.com/3blue1brown
Special thanks to these supporters: 3b1b.co/lessons/convolutions#thanks
An equally valuable form of support is to simply share the videos.
Other videos I referenced
Live lecture on image convolutions for the MIT Julia lab
Lecture on Discrete Fourier Transforms
Reducible video on FFTs
Veritasium video on FFTs
A small correction for the integer multiplication algorithm mentioned at the end: a “straightforward” application of the FFT gives a runtime of O(n · log(n) · log(log(n))). That log(log(n)) term is tiny, but it was only recently, in 2019, that Harvey and van der Hoeven found an algorithm that removes it.
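For a rough illustration of what that “straightforward” FFT application looks like (not the Harvey–van der Hoeven algorithm), here is a hedged sketch in Python, assuming NumPy is available: two integers are multiplied by convolving their digit sequences with an FFT, then propagating carries.

```python
import numpy as np

def multiply_via_fft(a_digits, b_digits):
    """Multiply two non-negative integers, each given as a little-endian
    list of base-10 digits, by FFT-convolving the digit sequences."""
    n = len(a_digits) + len(b_digits) - 1
    size = 1
    while size < n:  # pad to a power of two at least the convolution length
        size *= 2
    fa = np.fft.rfft(a_digits, size)
    fb = np.fft.rfft(b_digits, size)
    # Pointwise product of spectra = convolution of digit sequences.
    conv = np.rint(np.fft.irfft(fa * fb, size)[:n]).astype(int)
    # Carry propagation turns the convolution back into base-10 digits.
    digits, carry = [], 0
    for c in conv:
        carry, d = divmod(c + carry, 10)
        digits.append(d)
    while carry:
        carry, d = divmod(carry, 10)
        digits.append(d)
    return digits

# 1234 * 5678 = 7006652, written little-endian
print(multiply_via_fft([4, 3, 2, 1], [8, 7, 6, 5]))
```

The rounding step works because the convolution of small digit values is exactly representable; a production big-integer multiply would use a number-theoretic transform to avoid floating-point error entirely.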
Another small correction at 17:00: I describe O(N^2) as meaning “the number of operations needed scales with N^2”. Technically, that is what Theta(N^2) would mean. O(N^2) means the number of operations needed is at most a constant times N^2; in particular, it includes algorithms whose runtimes don't actually have any N^2 term but are bounded by it. The distinction doesn't matter in this case, since there is an explicit N^2 term.
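In standard notation, the distinction between the two bounds is:

```latex
\begin{align}
f(N) = O(N^2) &\iff \exists\, C > 0,\ N_0 : f(N) \le C N^2 \text{ for all } N \ge N_0, \\
f(N) = \Theta(N^2) &\iff \exists\, c, C > 0,\ N_0 : c N^2 \le f(N) \le C N^2 \text{ for all } N \ge N_0.
\end{align}
```

So Theta gives a matching lower bound as well; for example, f(N) = N log(N) is O(N^2) but not Theta(N^2).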
These animations are largely made using a custom Python library, manim. See the FAQ comments here:
You can find code for specific videos and projects here:
Music by Vincent Rubinetti.
Download the music on Bandcamp:
Stream the music on Spotify:
- Where do convolutions show up?
- Add two random variables
- A simple example
- Moving averages
- Image processing
- Measuring runtime
- Polynomial multiplication
- Speeding up with FFTs
- Concluding thoughts
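As a taste of the first chapters (adding two random variables via a discrete convolution), here is a minimal sketch in Python, assuming NumPy: the distribution of the sum of two fair dice is the convolution of their individual distributions.

```python
import numpy as np

# Probability distribution for one fair six-sided die (faces 1..6).
die = np.full(6, 1 / 6)

# The distribution of the sum of two independent dice is the
# convolution of their distributions; entries cover sums 2..12.
sum_dist = np.convolve(die, die)

# Index 5 corresponds to a sum of 7, the most likely outcome (6/36).
print(sum_dist)
```

The same `np.convolve` call also computes moving averages (convolve with a constant window) and polynomial products (convolve the coefficient lists), which is the thread the video follows.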
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe: 3b1b.co/subscribe
Various social media stuffs: