Abstract
Many problems require learning a model from data owned by different participants who are restricted from sharing their examples due to privacy concerns; this setting is referred to as multiparty learning in the literature. In conventional multiparty learning, a global model is usually trained from scratch via a communication protocol, ignoring the fact that each party may already have a local model trained on her own dataset. In this paper, we define a multiparty multiclass margin to measure the global behavior of a set of heterogeneous local models, and propose a general learning method called HMR (Heterogeneous Model Reuse) to optimize this margin. Our method reuses local models to approximate a global model, even when data are non-i.i.d. distributed among parties, by exchanging a few examples under a predefined budget. Experiments on synthetic and real-world data covering different multiparty scenarios demonstrate the effectiveness of our proposal.
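To make the idea of measuring the global behavior of heterogeneous local models concrete, here is a minimal sketch of one plausible reading of the multiparty multiclass margin. It assumes each party's local model exposes per-class scores over only the classes present in its own data; the max-composition rule, the `-inf` padding, and the function names `global_scores` and `multiparty_multiclass_margin` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Sketch only, not the paper's implementation: each party owns a local model
# that scores only the classes appearing in its own data. Scores for classes
# a party has never seen are padded with -inf so they can never win.

def global_scores(local_models, x, num_classes):
    """Compose heterogeneous local models: for every class, take the highest
    score that any party assigns to it."""
    padded = np.full((len(local_models), num_classes), -np.inf)
    for j, (model, classes) in enumerate(local_models):
        padded[j, classes] = model(x)  # model(x) returns scores over `classes`
    return padded.max(axis=0)          # best per-class score across parties

def multiparty_multiclass_margin(local_models, x, y, num_classes):
    """Margin of the composed model on one labeled example (x, y): the score
    of the true class minus the best score among all other classes. A positive
    margin means the composition classifies x correctly."""
    s = global_scores(local_models, x, num_classes)
    return s[y] - np.delete(s, y).max()

# Toy usage: two parties covering disjoint subsets of 3 global classes.
f1 = (lambda x: np.array([0.9, 0.1]), [0, 1])  # party 1 scores classes 0 and 1
f2 = (lambda x: np.array([0.8]), [2])          # party 2 scores class 2 only
print(multiparty_multiclass_margin([f1, f2], x=None, y=0, num_classes=3))  # ≈ 0.1
```

Under this reading, optimizing the margin would push the composed model's true-class score above every competing score, which is what a method reusing fixed local models would need to achieve on the examples exchanged under the budget.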
Original language | English
---|---
Title of host publication | Proceedings of the 36th International Conference on Machine Learning
Pages | 6840-6849
Volume | 97
Publication status | Published - Jun 2019
Event | International Conference on Machine Learning - Long Beach, United States; Duration: 9 Jun 2019 → 15 Jun 2019; Conference number: 36; https://icml.cc/Conferences/2019
Conference
Conference | International Conference on Machine Learning
---|---
Abbreviated title | ICML 2019
Country/Territory | United States
City | Long Beach
Period | 9/06/19 → 15/06/19
Internet address | https://icml.cc/Conferences/2019