Heterogeneous Model Reuse via Optimizing Multiparty Multiclass Margin

Xi-Zhu Wu*, Song Liu, Zhi-Hua Zhou

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Nowadays, many problems require learning a model from data owned by different participants who are restricted from sharing their examples due to privacy concerns, a setting referred to as multiparty learning in the literature. In conventional multiparty learning, a global model is usually trained from scratch via a communication protocol, ignoring the fact that each party may already have a local model trained on her own dataset. In this paper, we define a multiparty multiclass margin to measure the global behavior of a set of heterogeneous local models, and propose a general learning method called HMR (Heterogeneous Model Reuse) to optimize the margin. Our method reuses local models to approximate a global model, even when data are non-i.i.d. distributed among parties, by exchanging a few examples under a predefined budget. Experiments on synthetic and real-world data covering different multiparty scenarios show the effectiveness of our proposal.
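As a rough illustration of the central quantity, the sketch below computes a multiclass margin for a model assembled from several parties' local classifiers. The score layout, class names, and the max-based combination rule are illustrative assumptions only, not the paper's HMR method.

```python
# Hedged sketch: a multiclass margin for a combined multiparty model.
# Each party's local model may cover only a subset of the classes
# (non-i.i.d. data across parties); the combination rule here (per-class
# max over parties) is an assumption for illustration.

def combine_scores(local_scores):
    """Merge per-party score dicts {class: score} into one global dict,
    keeping the highest score proposed for each class."""
    combined = {}
    for scores in local_scores:
        for label, s in scores.items():
            combined[label] = max(s, combined.get(label, float("-inf")))
    return combined

def multiclass_margin(combined, true_label):
    """Margin = score of the true class minus the best competing score.
    Positive means the combined model classifies the example correctly,
    with confidence growing in the margin."""
    best_other = max(s for label, s in combined.items() if label != true_label)
    return combined[true_label] - best_other

# Two parties, each with a local model over different class subsets.
party_a = {"cat": 0.9, "dog": 0.3}   # party A's data never contained "bird"
party_b = {"dog": 0.2, "bird": 0.6}  # party B's data never contained "cat"
global_scores = combine_scores([party_a, party_b])
print(multiclass_margin(global_scores, "cat"))  # positive margin
```

Optimizing the global behavior then amounts to making this margin large and positive on as many examples as possible, which in the paper is done by exchanging a small number of examples under a budget.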
Original language: English
Title of host publication: Proceedings of the 36th International Conference on Machine Learning
Publication status: Published - Jun 2019
Event: International Conference on Machine Learning - Long Beach, United States
Duration: 9 Jun 2019 - 15 Jun 2019
Conference number: 36


Conference: International Conference on Machine Learning
Abbreviated title: ICML 2019
Country/Territory: United States
City: Long Beach


