{"date_updated":"2023-09-13T08:32:23Z","publication_identifier":{"isbn":["9781510860964"]},"article_processing_charge":"No","month":"05","external_id":{"arxiv":["1705.11041"]},"publication":"Advances in Neural Information Processing Systems","author":[{"orcid":"0000-0002-4850-0683","last_name":"Locatello","first_name":"Francesco","full_name":"Locatello, Francesco","id":"26cfd52f-2483-11ee-8040-88983bcc06d4"},{"last_name":"Tschannen","first_name":"Michael","full_name":"Tschannen, Michael"},{"last_name":"Rätsch","first_name":"Gunnar","full_name":"Rätsch, Gunnar"},{"last_name":"Jaggi","first_name":"Martin","full_name":"Jaggi, Martin"}],"day":"31","type":"conference","main_file_link":[{"url":"https://arxiv.org/abs/1705.11041","open_access":"1"}],"abstract":[{"lang":"eng","text":"Greedy optimization methods such as Matching Pursuit (MP) and Frank-Wolfe (FW) algorithms regained popularity in recent years due to their simplicity, effectiveness and theoretical guarantees. MP and FW address optimization over the linear span and the convex hull of a set of atoms, respectively. In this paper, we consider the intermediate case of optimization over the convex cone, parametrized as the conic hull of a generic atom set, leading to the first principled definitions of non-negative MP algorithms for which we give explicit convergence rates and demonstrate excellent empirical performance. In particular, we derive sublinear (O(1/t)) convergence on general smooth and convex objectives, and linear convergence (O(e−t)) on strongly convex objectives, in both cases for general sets of atoms. Furthermore, we establish a clear correspondence of our algorithms to known algorithms from the MP and FW literature. Our novel algorithms and analyses target general atom sets and general objective functions, and hence are directly applicable to a large variety of learning settings."}],"_id":"14206","department":[{"_id":"FrLo"}],"language":[{"iso":"eng"}],"extern":"1","conference":{"name":"NeurIPS: Neural Information Processing Systems","start_date":"2017-12-04","location":"Long Beach, CA, United States","end_date":"2017-12-09"},"date_published":"2017-05-31T00:00:00Z","citation":{"short":"F. Locatello, M. Tschannen, G. Rätsch, M. Jaggi, in:, Advances in Neural Information Processing Systems, 2017.","ama":"Locatello F, Tschannen M, Rätsch G, Jaggi M. Greedy algorithms for cone constrained optimization with convergence guarantees. In: Advances in Neural Information Processing Systems. ; 2017.","ista":"Locatello F, Tschannen M, Rätsch G, Jaggi M. 2017. Greedy algorithms for cone constrained optimization with convergence guarantees. Advances in Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems.","apa":"Locatello, F., Tschannen, M., Rätsch, G., & Jaggi, M. (2017). Greedy algorithms for cone constrained optimization with convergence guarantees. In Advances in Neural Information Processing Systems. Long Beach, CA, United States.","chicago":"Locatello, Francesco, Michael Tschannen, Gunnar Rätsch, and Martin Jaggi. “Greedy Algorithms for Cone Constrained Optimization with Convergence Guarantees.” In Advances in Neural Information Processing Systems, 2017.","ieee":"F. Locatello, M. Tschannen, G. Rätsch, and M. Jaggi, “Greedy algorithms for cone constrained optimization with convergence guarantees,” in Advances in Neural Information Processing Systems, Long Beach, CA, United States, 2017.","mla":"Locatello, Francesco, et al. 
“Greedy Algorithms for Cone Constrained Optimization with Convergence Guarantees.” Advances in Neural Information Processing Systems, 2017."},"publication_status":"published","year":"2017","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","date_created":"2023-08-22T14:17:38Z","oa_version":"Preprint","oa":1,"status":"public","title":"Greedy algorithms for cone constrained optimization with convergence guarantees","quality_controlled":"1"}
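The abstract above describes a greedy template: optimize over the conic hull of an atom set by repeatedly selecting an atom and adding it with a non-negative weight. Below is a minimal, hedged Python sketch of that idea, not the paper's pseudocode: the function name nnmp, the assumed known smoothness constant L, the quadratic-upper-bound step size, and the toy least-squares demo are all illustrative assumptions, and the paper's variants additionally use corrective/away steps that this sketch omits.

```python
import numpy as np

def nnmp(grad_f, atoms, x0, L=1.0, steps=100):
    """Sketch of a non-negative Matching Pursuit iteration: greedily pick
    the atom best aligned with the negative gradient and take a
    non-negative step toward it, so the iterate stays in the conic hull."""
    x = x0.copy()
    for _ in range(steps):
        g = grad_f(x)
        scores = atoms @ (-g)      # alignment of each atom with -gradient
        i = int(np.argmax(scores))
        if scores[i] <= 0.0:       # no improving direction left in the cone
            break
        z = atoms[i]
        # Step size minimizing the L-smooth quadratic upper bound,
        # clipped at 0 so the atom coefficients remain non-negative.
        gamma = max(0.0, float(scores[i]) / (L * float(z @ z)))
        x = x + gamma * z
    return x

# Toy usage: least squares toward a non-negative target over random atoms
# (here grad f(x) = x - t, so f is 1-smooth and L=1.0 is exact).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))        # 50 atoms in R^10
t = np.abs(rng.standard_normal(10))      # target in the positive orthant
x = nnmp(lambda x: x - t, A, np.zeros(10), L=1.0, steps=200)
print(0.5 * np.sum((x - t) ** 2))        # objective after 200 greedy steps
```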