Communication-optimal distributed clustering

Jiecao Chen, He Sun, David Woodruff, Qin Zhang

Research output: Contribution to conference - Conference Poster (peer-reviewed)


Abstract

Clustering large datasets is a fundamental problem with a number of applications in machine learning. Data is often collected on different sites, and clustering needs to be performed in a distributed manner with low communication. We would like the quality of the clustering in the distributed setting to match that in the centralized setting, in which all the data resides on a single site. In this work, we study both graph and geometric clustering problems in two distributed models: (1) a point-to-point model, and (2) a model with a broadcast channel. We give protocols in both models which we show are nearly optimal by proving almost matching communication lower bounds. Our work highlights the surprising power of a broadcast channel for clustering problems; roughly speaking, to cluster n points or n vertices in a graph distributed across s servers, for a worst-case partitioning the communication complexity in a point-to-point model is n · s, while in the broadcast model it is n + s. We implement our algorithms and demonstrate this phenomenon on real-world datasets, showing that our algorithms are also very efficient in practice.
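To make the setting concrete, the sketch below shows a generic coordinator-based baseline for distributed geometric clustering in the point-to-point model: each of s sites compresses its local data into a small set of weighted representatives and ships it to a coordinator, which clusters the union. This is only an illustration of the model, not the protocol from the paper; the helper names (local_summary, coordinator_cluster) and the parameter SUMMARY_SIZE are assumptions made for the example.

    # Illustrative sketch only: a coordinator-based distributed clustering baseline
    # in the point-to-point model. NOT the paper's protocol; SUMMARY_SIZE and the
    # helper names are assumptions for this example.
    import numpy as np

    SUMMARY_SIZE = 20   # weighted representatives each site sends (assumed)
    K = 5               # number of clusters

    def lloyd_kmeans(points, weights, k, iters=50, seed=0):
        """Weighted Lloyd's k-means; returns k centers."""
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), size=k, replace=False)]
        for _ in range(iters):
            # assign each point to its nearest current center
            d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                mask = labels == j
                if mask.any() and weights[mask].sum() > 0:
                    centers[j] = np.average(points[mask], axis=0, weights=weights[mask])
        return centers

    def local_summary(site_points, size):
        """A site compresses its data to `size` weighted representatives."""
        centers = lloyd_kmeans(site_points, np.ones(len(site_points)), size)
        d = np.linalg.norm(site_points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        weights = np.bincount(labels, minlength=size).astype(float)
        return centers, weights

    def coordinator_cluster(summaries, k):
        """Coordinator merges the weighted summaries and clusters the union."""
        pts = np.vstack([c for c, _ in summaries])
        wts = np.concatenate([w for _, w in summaries])
        return lloyd_kmeans(pts, wts, k)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        s = 4  # number of sites
        sites = [rng.normal(size=(500, 2)) + rng.normal(scale=5, size=2)
                 for _ in range(s)]
        summaries = [local_summary(p, SUMMARY_SIZE) for p in sites]
        centers = coordinator_cluster(summaries, K)
        # Point-to-point communication here is s * SUMMARY_SIZE representatives,
        # independent of the total number of points n held by the sites.
        print("final centers:\n", centers)

In this baseline the coordinator receives one summary per site, so the cost scales with the number of sites times the summary size; how small the summaries can be while preserving centralized-quality clustering, and how a broadcast channel changes the picture, is exactly what the paper analyzes.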
Original language: English
Publication status: Unpublished - Dec 2016
Event: Annual Conference on Neural Information Processing Systems
Duration: 4 Dec 2016 - 9 Dec 2016
Conference number: 30th

Conference

Conference: Annual Conference on Neural Information Processing Systems
Abbreviated title: NIPS 16
Period: 4/12/16 - 9/12/16

Keywords

  • distributed computing
  • graph clustering

