{"date_updated":"2022-06-27T06:54:31Z","month":"12","article_processing_charge":"No","day":"06","type":"conference","main_file_link":[{"url":"https://proceedings.neurips.cc/paper/2021/file/3b92d18aa7a6176dd37d372bc2f1eb71-Paper.pdf","open_access":"1"}],"abstract":[{"lang":"eng","text":"We consider a standard distributed optimisation setting where N machines, each holding a d-dimensional function f_i, aim to jointly minimise the sum of the functions ∑_{i=1}^N f_i(x). This problem arises naturally in large-scale distributed optimisation, where a standard solution is to apply variants of (stochastic) gradient descent. We focus on the communication complexity of this problem: our main result provides the first fully unconditional bounds on the total number of bits which need to be sent and received by the N machines to solve this problem under point-to-point communication, within a given error tolerance. Specifically, we show that Ω(Nd log(d/Nε)) total bits need to be communicated between the machines to find an additive ε-approximation to the minimum of ∑_{i=1}^N f_i(x). The result holds for both deterministic and randomised algorithms, and, importantly, requires no assumptions on the algorithm structure. The lower bound is tight under certain restrictions on parameter values, and is matched within constant factors for quadratic objectives by a new variant of quantised gradient descent, which we describe and analyse.
Our results bring over tools from communication complexity to distributed optimisation, which has potential for further applications."}],"department":[{"_id":"DaAl"}],"publication_status":"published","date_published":"2021-12-06T00:00:00Z","scopus_import":"1","publisher":"Curran Associates","date_created":"2022-06-26T22:01:35Z","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","oa":1,"oa_version":"Published Version","status":"public","quality_controlled":"1","title":"Towards tight communication lower bounds for distributed optimisation","publication_identifier":{"isbn":["9781713845393"],"issn":["1049-5258"]},"external_id":{"arxiv":["2010.08222"]},"page":"7254-7266","author":[{"full_name":"Alistarh, Dan-Adrian","id":"4A899BFC-F248-11E8-B48F-1D18A9856A87","last_name":"Alistarh","first_name":"Dan-Adrian","orcid":"0000-0003-3650-940X"},{"first_name":"Janne","last_name":"Korhonen","id":"C5402D42-15BC-11E9-A202-CA2BE6697425","full_name":"Korhonen, Janne"}],"publication":"35th Conference on Neural Information Processing Systems","language":[{"iso":"eng"}],"_id":"11464","ec_funded":1,"citation":{"ama":"Alistarh D-A, Korhonen J. Towards tight communication lower bounds for distributed optimisation. In: 35th Conference on Neural Information Processing Systems. Vol 34. Curran Associates; 2021:7254-7266.","apa":"Alistarh, D.-A., & Korhonen, J. (2021). Towards tight communication lower bounds for distributed optimisation. In 35th Conference on Neural Information Processing Systems (Vol. 34, pp. 7254–7266). Virtual, Online: Curran Associates.","chicago":"Alistarh, Dan-Adrian, and Janne Korhonen. “Towards Tight Communication Lower Bounds for Distributed Optimisation.” In 35th Conference on Neural Information Processing Systems, 34:7254–66. Curran Associates, 2021.","ista":"Alistarh D-A, Korhonen J. 2021. Towards tight communication lower bounds for distributed optimisation. 35th Conference on Neural Information Processing Systems. 
NeurIPS: Neural Information Processing Systems vol. 34, 7254–7266.","short":"D.-A. Alistarh, J. Korhonen, in:, 35th Conference on Neural Information Processing Systems, Curran Associates, 2021, pp. 7254–7266.","mla":"Alistarh, Dan-Adrian, and Janne Korhonen. “Towards Tight Communication Lower Bounds for Distributed Optimisation.” 35th Conference on Neural Information Processing Systems, vol. 34, Curran Associates, 2021, pp. 7254–66.","ieee":"D.-A. Alistarh and J. Korhonen, “Towards tight communication lower bounds for distributed optimisation,” in 35th Conference on Neural Information Processing Systems, Virtual, Online, 2021, vol. 34, pp. 7254–7266."},"conference":{"name":"NeurIPS: Neural Information Processing Systems","start_date":"2021-12-06","location":"Virtual, Online","end_date":"2021-12-14"},"year":"2021","volume":34,"intvolume":" 34","project":[{"call_identifier":"H2020","grant_number":"805223","name":"Elastic Coordination for Scalable Machine Learning","_id":"268A44D6-B435-11E9-9278-68D0E5697425"}],"acknowledgement":"We thank the NeurIPS reviewers for insightful comments that helped us improve the positioning of our results, as well as for pointing out the subsampling approach for complementing the randomised lower bound. We also thank Foivos Alimisis and Peter Davies for useful discussions. This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 805223 ScaleML)."}