Exponential expressivity in deep neural networks through transient chaos
Ben Poole, Stanford University
Subhaneil Lahiri, Stanford University
Maithra Raghu, Google Brain and Cornell University
Surya Ganguli, Stanford University
Jascha Sohl-Dickstein, Google Brain
We combine Riemannian geometry with the mean field theory of high-dimensional chaos to study the nature of signal propagation in generic, deep neural networks with random weights. Our results reveal an order-to-chaos expressivity phase transition, with networks in the chaotic phase computing nonlinear functions whose global curvature grows exponentially with depth but not with width. We prove that this generic class of deep random functions cannot be efficiently computed by any shallow network, going beyond prior work restricted to the analysis of single functions. Moreover, we formalize and quantitatively demonstrate the long-conjectured idea that deep networks can disentangle highly curved manifolds in input space into flat manifolds in hidden space. Our theoretical analysis of the expressive power of deep networks applies broadly to arbitrary nonlinearities, and provides a quantitative underpinning for previously abstract notions about the geometry of deep functions.
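The order-to-chaos transition described above can be observed numerically: propagating two nearby inputs through the same random network, their separation contracts when the weight variance is small (ordered phase) and expands when it is large (chaotic phase). The following is a minimal illustrative sketch, not the paper's actual experimental code; the specific values of the weight scale `sigma_w`, bias scale `sigma_b`, width, and depth are assumptions chosen to land on either side of the transition for a tanh network.

```python
import numpy as np

N, depth = 1000, 20  # assumed width and depth for illustration

def propagate(x, sigma_w, sigma_b, seed=0):
    """Push an input through a random tanh network.

    Weights ~ N(0, sigma_w^2 / N), biases ~ N(0, sigma_b^2), drawn fresh
    per layer but reproducibly from `seed`, so two calls with the same
    seed traverse the same network.
    """
    rng = np.random.default_rng(seed)
    h = x.copy()
    for _ in range(depth):
        W = rng.normal(0.0, sigma_w / np.sqrt(N), size=(N, N))
        b = rng.normal(0.0, sigma_b, size=N)
        h = np.tanh(W @ h + b)
    return h

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, N)
x_pert = x + 1e-3 * rng.normal(0.0, 1.0, N)  # nearby input

d0 = np.linalg.norm(x - x_pert)
for sigma_w, label in [(0.5, "ordered"), (2.5, "chaotic")]:
    d = np.linalg.norm(propagate(x, sigma_w, 0.3)
                       - propagate(x_pert, sigma_w, 0.3))
    print(f"{label:8s} (sigma_w={sigma_w}): "
          f"distance {d0:.4f} -> {d:.4f}")
```

In the ordered phase the two trajectories converge layer by layer, while in the chaotic phase the initially tiny perturbation is amplified until the hidden representations decorrelate, which is the mechanism underlying the exponential growth of curvature with depth.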