{"oa_version":"Preprint","oa":1,"date_created":"2019-02-14T14:51:57Z","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","publisher":"ML Research Press","scopus_import":"1","title":"Data-dependent stability of stochastic gradient descent","quality_controlled":"1","status":"public","type":"conference","day":"01","article_processing_charge":"No","month":"02","date_updated":"2023-10-17T09:51:13Z","date_published":"2018-02-01T00:00:00Z","publication_status":"published","department":[{"_id":"ChLa"}],"main_file_link":[{"url":"https://arxiv.org/abs/1703.01678","open_access":"1"}],"abstract":[{"text":"We establish a data-dependent notion of algorithmic stability for Stochastic Gradient Descent (SGD), and employ it to develop novel generalization bounds. This is in contrast to previous distribution-free algorithmic stability results for SGD which depend on the worst-case constants. By virtue of the data-dependent argument, our bounds provide new insights into learning with SGD on convex and non-convex problems. In the convex case, we show that the bound on the generalization error depends on the risk at the initialization point. In the non-convex case, we prove that the expected curvature of the objective function around the initialization point has crucial influence on the generalization error. In both cases, our results suggest a simple data-driven strategy to stabilize SGD by pre-screening its initialization. As a corollary, our results allow us to show optimistic generalization bounds that exhibit fast convergence rates for SGD subject to a vanishing empirical risk and low noise of stochastic gradient.","lang":"eng"}],
"intvolume":" 80","volume":80,"year":"2018","isi":1,"project":[{"call_identifier":"FP7","name":"Lifelong Learning of Visual Scene Understanding","_id":"2532554C-B435-11E9-9278-68D0E5697425","grant_number":"308036"}],"publication":"Proceedings of the 35th International Conference on Machine Learning","author":[{"full_name":"Kuzborskij, Ilja","first_name":"Ilja","last_name":"Kuzborskij"},{"orcid":"0000-0001-8622-7887","last_name":"Lampert","first_name":"Christoph","id":"40C20FD2-F248-11E8-B48F-1D18A9856A87","full_name":"Lampert, Christoph"}],"page":"2815-2824","external_id":{"arxiv":["1703.01678"],"isi":["000683379202095"]},"citation":{"ama":"Kuzborskij I, Lampert C. Data-dependent stability of stochastic gradient descent. In: Proceedings of the 35th International Conference on Machine Learning. Vol 80. ML Research Press; 2018:2815-2824.","ista":"Kuzborskij I, Lampert C. 2018. Data-dependent stability of stochastic gradient descent. Proceedings of the 35th International Conference on Machine Learning. ICML: International Conference on Machine Learning vol. 80, 2815–2824.","chicago":"Kuzborskij, Ilja, and Christoph Lampert. “Data-Dependent Stability of Stochastic Gradient Descent.” In Proceedings of the 35th International Conference on Machine Learning, 80:2815–24. ML Research Press, 2018.","apa":"Kuzborskij, I., & Lampert, C. (2018). Data-dependent stability of stochastic gradient descent. In Proceedings of the 35th International Conference on Machine Learning (Vol. 80, pp. 2815–2824). Stockholm, Sweden: ML Research Press.","short":"I. Kuzborskij, C. Lampert, in:, Proceedings of the 35th International Conference on Machine Learning, ML Research Press, 2018, pp. 2815–2824.","mla":"Kuzborskij, Ilja, and Christoph Lampert. “Data-Dependent Stability of Stochastic Gradient Descent.” Proceedings of the 35th International Conference on Machine Learning, vol. 80, ML Research Press, 2018, pp. 2815–24.","ieee":"I. Kuzborskij and C. Lampert, “Data-dependent stability of stochastic gradient descent,” in Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, 2018, vol. 80, pp. 2815–2824."},
"conference":{"name":"ICML: International Conference on Machine Learning","start_date":"2018-07-10","location":"Stockholm, Sweden","end_date":"2018-07-15"},"ec_funded":1,"_id":"6011","language":[{"iso":"eng"}]}