{"article_type":"original","citation":{"ista":"Javanmard A, Mondelli M, Montanari A. 2020. Analysis of a two-layer neural network via displacement convexity. Annals of Statistics. 48(6), 3619–3642.","chicago":"Javanmard, Adel, Marco Mondelli, and Andrea Montanari. “Analysis of a Two-Layer Neural Network via Displacement Convexity.” Annals of Statistics. Institute of Mathematical Statistics, 2020. https://doi.org/10.1214/20-AOS1945.","apa":"Javanmard, A., Mondelli, M., & Montanari, A. (2020). Analysis of a two-layer neural network via displacement convexity. Annals of Statistics. Institute of Mathematical Statistics. https://doi.org/10.1214/20-AOS1945","ama":"Javanmard A, Mondelli M, Montanari A. Analysis of a two-layer neural network via displacement convexity. Annals of Statistics. 2020;48(6):3619-3642. doi:10.1214/20-AOS1945","short":"A. Javanmard, M. Mondelli, A. Montanari, Annals of Statistics 48 (2020) 3619–3642.","mla":"Javanmard, Adel, et al. “Analysis of a Two-Layer Neural Network via Displacement Convexity.” Annals of Statistics, vol. 48, no. 6, Institute of Mathematical Statistics, 2020, pp. 3619–42, doi:10.1214/20-AOS1945.","ieee":"A. Javanmard, M. Mondelli, and A. Montanari, “Analysis of a two-layer neural network via displacement convexity,” Annals of Statistics, vol. 48, no. 6. Institute of Mathematical Statistics, pp. 3619–3642, 2020."},"language":[{"iso":"eng"}],"_id":"6748","page":"3619-3642","doi":"10.1214/20-AOS1945","publication":"Annals of Statistics","author":[{"first_name":"Adel","last_name":"Javanmard","full_name":"Javanmard, Adel"},{"first_name":"Marco","last_name":"Mondelli","id":"27EB676C-8706-11E9-9510-7717E6697425","full_name":"Mondelli, Marco","orcid":"0000-0002-3242-7020"},{"first_name":"Andrea","last_name":"Montanari","full_name":"Montanari, Andrea"}],"publication_identifier":{"issn":["1932-6157"],"eissn":["1941-7330"]},"external_id":{"isi":["000598369200021"],"arxiv":["1901.01375"]},"isi":1,"volume":48,"intvolume":" 48","year":"2020","issue":"6","publication_status":"published","date_published":"2020-12-11T00:00:00Z","abstract":[{"text":"Fitting a function by using linear combinations of a large number N of `simple' components is one of the most fruitful ideas in statistical learning. This idea lies at the core of a variety of methods, from two-layer neural networks to kernel regression, to boosting. In general, the resulting risk minimization problem is non-convex and is solved by gradient descent or its variants. Unfortunately, little is known about global convergence properties of these approaches.\r\nHere we consider the problem of learning a concave function f on a compact convex domain Ω⊆ℝd, using linear combinations of `bump-like' components (neurons). The parameters to be fitted are the centers of N bumps, and the resulting empirical risk minimization problem is highly non-convex. We prove that, in the limit in which the number of neurons diverges, the evolution of gradient descent converges to a Wasserstein gradient flow in the space of probability distributions over Ω. Further, when the bump width δ tends to 0, this gradient flow has a limit which is a viscous porous medium equation. Remarkably, the cost function optimized by this gradient flow exhibits a special property known as displacement convexity, which implies exponential convergence rates for N→∞, δ→0. Surprisingly, this asymptotic theory appears to capture well the behavior for moderate values of δ,N. 
Explaining this phenomenon, and understanding the dependence on δ,N in a quantitative manner remains an outstanding challenge.","lang":"eng"}],"main_file_link":[{"url":"https://arxiv.org/abs/1901.01375","open_access":"1"}],"department":[{"_id":"MaMo"}],"day":"11","type":"journal_article","date_updated":"2024-03-06T08:28:50Z","month":"12","article_processing_charge":"No","status":"public","quality_controlled":"1","title":"Analysis of a two-layer neural network via displacement convexity","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","date_created":"2019-07-31T09:39:42Z","oa":1,"oa_version":"Preprint","publisher":"Institute of Mathematical Statistics"}
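The abstract describes fitting a concave target f by the centers of N bumps and passing to a mean-field limit. The display below is a minimal sketch of that setup, assuming a bump kernel K_δ of width δ and a first-variation potential Ψ; K_δ, Ψ, R_N, and the 1/N scaling are illustrative notation of ours, not the paper's exact formulation.

% A minimal sketch, assuming a bump kernel K_delta and potential Psi
% (illustrative symbols; see the paper/arXiv:1901.01375 for the exact model).
\[
  \hat f_N(x) \,=\, \frac{1}{N}\sum_{i=1}^{N} K_\delta(x - w_i),
  \qquad
  R_N(w_1,\dots,w_N) \,=\, \mathbb{E}\!\left[\big(f(X) - \hat f_N(X)\big)^2\right].
\]
% As N grows, the empirical distribution of the centers
% (delta_{w_i} below are Dirac masses, distinct from the bump width delta)
% evolves, under gradient descent, toward a Wasserstein gradient flow:
\[
  \hat\rho_N \,=\, \frac{1}{N}\sum_{i=1}^{N} \delta_{w_i}
  \;\longrightarrow\; \rho_t,
  \qquad
  \partial_t \rho_t \,=\, \nabla\cdot\big(\rho_t\,\nabla\Psi(x;\rho_t)\big).
\]

Per the abstract, as δ → 0 this flow approaches a viscous porous medium equation, and the limiting cost is displacement convex (convex along Wasserstein geodesics), which is the property that yields the exponential convergence rates for N → ∞, δ → 0.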