♾️ Infinite Widths Part II: The Neural Tangent Kernel
This is the second post of a short series on the infinite-width limits of deep neural networks (DNNs). Previously, we reviewed the correspondence between neural networks and Gaussian Processes (GPs): as the number of neurons in the hidden layers grows to infinity, the output of a randomly initialized network becomes Gaussian distributed.
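To make the recap concrete, here is a minimal sketch (not code from the post; the input dimension, widths, ReLU activation, and 1/√n scaling are illustrative assumptions) that samples many random one-hidden-layer networks and checks that a fixed output becomes increasingly Gaussian as the width grows:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10)  # a fixed input, d = 10

def random_net_output(x, width, rng):
    """Output of a random 1-hidden-layer ReLU net with 1/sqrt(n) scaling."""
    d = x.shape[0]
    W1 = rng.standard_normal((width, d)) / np.sqrt(d)   # input-to-hidden weights
    w2 = rng.standard_normal(width) / np.sqrt(width)    # hidden-to-output weights
    return w2 @ np.maximum(W1 @ x, 0.0)                 # ReLU activations

for width in (1, 10, 1000):
    outs = np.array([random_net_output(x, width, rng) for _ in range(5000)])
    # As width grows, the skewness and excess kurtosis of the output
    # distribution should approach the Gaussian values (both 0).
    z = (outs - outs.mean()) / outs.std()
    print(f"width={width:5d}  skew={np.mean(z**3):+.3f}  "
          f"excess kurtosis={np.mean(z**4) - 3:+.3f}")
```

For width 1 the output is far from Gaussian (half its mass sits at values determined by a single rectified unit), while at width 1000 the empirical skewness and excess kurtosis are close to zero, matching the GP limit described above.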