Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions

Abstract

There is a growing literature on the study of large-width properties of deep Gaussian neural networks (NNs), i.e. deep NNs with Gaussian-distributed parameters or weights, and Gaussian stochastic processes. Motivated by some empirical and theoretical studies showing the potential of replacing Gaussian distributions with Stable distributions, namely distributions with heavy tails, in this paper we investigate large-width properties of deep Stable NNs, i.e. deep NNs with Stable-distributed parameters. For sub-linear activation functions, a recent work has characterized the infinitely wide limit of a suitably rescaled deep Stable NN in terms of a Stable stochastic process, both under the assumption of a “joint growth” and under the assumption of a “sequential growth” of the width over the NN’s layers. Here, assuming a “sequential growth” of the width, we extend such a characterization to a general class of activation functions, which includes sub-linear, asymptotically linear and super-linear functions. As a novelty with respect to previous works, our results rely on the use of a generalized central limit theorem for heavy-tailed distributions, which allows for an interesting unified treatment of infinitely wide limits for deep Stable NNs. Our study shows that the scaling of Stable NNs and the stability of their infinitely wide limits may depend on the choice of the activation function, bringing out a critical difference with respect to the Gaussian setting.
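To make the scaling discussed in the abstract concrete, here is a minimal sketch (not the paper's code) of a single hidden-layer "Stable NN" unit with symmetric alpha-stable weights, rescaled by n^(-1/alpha) as suggested by the generalized central limit theorem for heavy-tailed sums. The activation, width, stability index and input values are illustrative assumptions, not quantities taken from the paper.

```python
# Illustrative sketch: a one-hidden-layer network with symmetric
# alpha-stable weights and the n^(-1/alpha) rescaling of the output sum.
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5                  # stability index of the weights (assumed)
n = 5_000                    # hidden-layer width (assumed)
x = np.array([0.3, -1.2])    # a fixed input (assumed)

rng = np.random.default_rng(0)

def stable_rvs(size):
    # symmetric alpha-stable samples (skewness beta = 0)
    return levy_stable.rvs(alpha, 0.0, size=size, random_state=rng)

# first-layer pre-activations: g_j(x) = sum_i w_ji * x_i
W1 = stable_rvs((n, x.size))
g = W1 @ x

phi = np.tanh                # a bounded, hence sub-linear, activation (for illustration)

# output unit, rescaled by n^(-1/alpha) so the heavy-tailed sum stabilizes
w2 = stable_rvs(n)
f = n ** (-1.0 / alpha) * np.sum(w2 * phi(g))
print(f)
```

Repeating this simulation over many weight draws and increasing n gives an empirical view of the heavy-tailed limiting distribution; for super-linear activations the appropriate rescaling exponent changes, which is the phenomenon the paper analyzes.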

Publication
In Transactions on Machine Learning Research
Alberto Bordino
PhD Student in Statistics at Warwick University

My research interests include Statistical Machine Learning, Nonparametric Statistics and Empirical Process Theory.