Towards tight communication lower bounds for distributed optimisation
Alistarh D-A, Korhonen J. 2021. Towards tight communication lower bounds for distributed optimisation. 35th Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems vol. 34, 7254–7266.
Conference Paper | Published | English | Scopus indexed
Abstract
We consider a standard distributed optimisation setting where N machines, each holding a d-dimensional function f_i, aim to jointly minimise the sum of the functions ∑_{i=1}^{N} f_i(x). This problem arises naturally in large-scale distributed optimisation, where a standard solution is to apply variants of (stochastic) gradient descent. We focus on the communication complexity of this problem: our main result provides the first fully unconditional bounds on the total number of bits which need to be sent and received by the N machines to solve this problem under point-to-point communication, within a given error tolerance. Specifically, we show that Ω(Nd log(d/Nε)) total bits need to be communicated between the machines to find an additive ε-approximation to the minimum of ∑_{i=1}^{N} f_i(x). The result holds for both deterministic and randomised algorithms, and, importantly, requires no assumptions on the algorithm structure. The lower bound is tight under certain restrictions on parameter values, and is matched within constant factors for quadratic objectives by a new variant of quantised gradient descent, which we describe and analyse. Our results bring over tools from communication complexity to distributed optimisation, which has potential for further applications.
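The quantised gradient descent variant mentioned in the abstract is analysed for quadratic objectives, where it matches the lower bound within constant factors. As a rough illustration of the general idea (not the authors' exact algorithm), the Python sketch below simulates N machines that each hold a quadratic f_i(x) = ½·xᵀA_i x − b_iᵀx and exchange stochastically quantised, bounded-precision gradients each round; the helpers `quantise` and `quantised_gd`, the uniform quantisation scheme, and all parameter choices are assumptions made for this sketch.

```python
import numpy as np

def quantise(v, levels=2**16):
    """Unbiased uniform stochastic quantisation of a vector to `levels`
    values, a stand-in for the bounded-precision messages studied in the
    paper (hypothetical helper, not the authors' exact scheme)."""
    scale = np.max(np.abs(v))
    if scale == 0:
        return v
    step = 2 * scale / (levels - 1)
    shifted = (v + scale) / step          # map [-scale, scale] -> [0, levels-1]
    low = np.floor(shifted)
    prob = shifted - low                  # stochastic rounding keeps E[q(v)] = v
    rounded = low + (np.random.rand(*v.shape) < prob)
    return rounded * step - scale

def quantised_gd(As, bs, eps=1e-3, max_iters=10_000):
    """Minimise sum_i f_i(x) with f_i(x) = 0.5 x^T A_i x - b_i^T x, where
    machine i only 'sends' a quantised gradient (O(d) bits) per round."""
    d = bs[0].shape[0]
    A_sum = sum(As)
    lr = 1.0 / np.linalg.eigvalsh(A_sum).max()   # step size 1/L
    x = np.zeros(d)
    for _ in range(max_iters):
        # Each machine quantises its local gradient before communicating.
        g = sum(quantise(A @ x - b) for A, b in zip(As, bs))
        if np.linalg.norm(g) < eps:
            break
        x -= lr * g
    return x

# Toy run: N = 4 machines, d = 8 dimensional quadratics.
np.random.seed(0)
rng = np.random.default_rng(0)
As = [(lambda M: M @ M.T + np.eye(8))(rng.standard_normal((8, 8))) for _ in range(4)]
bs = [rng.standard_normal(8) for _ in range(4)]
x_hat = quantised_gd(As, bs)
x_star = np.linalg.solve(sum(As), sum(bs))
print("distance to optimum:", np.linalg.norm(x_hat - x_star))
```

With 16-bit coordinates each round costs O(d) bits per machine, so a run of O(log(1/ε)) rounds uses on the order of Nd log(1/ε) bits in total, which is the kind of budget the Ω(Nd log(d/Nε)) lower bound is measured against.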
Publishing Year
2021
Date Published
2021-12-06
Proceedings Title
35th Conference on Neural Information Processing Systems
Publisher
Curran Associates
Acknowledgement
We thank the NeurIPS reviewers for insightful comments that helped us improve the positioning of our results, as well as for pointing out the subsampling approach for complementing the randomised lower bound. We also thank Foivos Alimisis and Peter Davies for useful discussions. This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 805223 ScaleML).
Volume
34
Page
7254-7266
Conference
NeurIPS: Neural Information Processing Systems
Conference Location
Virtual, Online
Conference Date
2021-12-06 – 2021-12-14
Cite this
Alistarh D-A, Korhonen J. Towards tight communication lower bounds for distributed optimisation. In: 35th Conference on Neural Information Processing Systems. Vol 34. Curran Associates; 2021:7254-7266.
Alistarh, D.-A., & Korhonen, J. (2021). Towards tight communication lower bounds for distributed optimisation. In 35th Conference on Neural Information Processing Systems (Vol. 34, pp. 7254–7266). Virtual, Online: Curran Associates.
Alistarh, Dan-Adrian, and Janne Korhonen. “Towards Tight Communication Lower Bounds for Distributed Optimisation.” In 35th Conference on Neural Information Processing Systems, 34:7254–66. Curran Associates, 2021.
D.-A. Alistarh and J. Korhonen, “Towards tight communication lower bounds for distributed optimisation,” in 35th Conference on Neural Information Processing Systems, Virtual, Online, 2021, vol. 34, pp. 7254–7266.
Alistarh, Dan-Adrian, and Janne Korhonen. “Towards Tight Communication Lower Bounds for Distributed Optimisation.” 35th Conference on Neural Information Processing Systems, vol. 34, Curran Associates, 2021, pp. 7254–66.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Open Access
Sources
arXiv 2010.08222