JCSE, vol. 16, no. 1, pp. 52-62, 2022
DOI: http://dx.doi.org/10.5626/JCSE.2022.16.1.52
Stable Federated Learning with Dataset Condensation
Seong-Woong Kim and Dong-Wan Choi
Department of Computer Science and Engineering, Inha University, Incheon, Korea
Abstract: Federated learning (FL) is a machine learning paradigm in which multiple clients train local models that are collaboratively integrated into a single global model. Unlike in centralized learning, the integrated global model cannot be tested in FL because the server does not collect any data samples; moreover, the global model is often sent back and immediately deployed to clients even in the middle of training, as in Gboard. Therefore, if the performance of the global model is not stable, which is unfortunately the case in many FL scenarios with non-IID data, clients can be provided with an inaccurate model. This paper investigates the main cause of this training instability in FL, namely what we call temporary imbalance that occurs across rounds and leads to the loss of knowledge acquired in previous rounds. To address this problem, we propose a dataset condensation method that summarizes the local data of each client without compromising privacy. The condensed data are transmitted to the server together with the local model and are utilized by the server to ensure the stable and consistent performance of the global model. Experimental results show that the global model not only achieves training stability but also exhibits a fast convergence speed.
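The abstract does not give implementation details, but the server-side use of condensed data can be pictured roughly as a FedAvg-style aggregation followed by a brief stabilization pass over the clients' condensed samples. The sketch below is a minimal illustration of that idea in PyTorch; the function name server_round, the payload format, and all hyperparameters are illustrative assumptions, not the authors' actual algorithm.

```python
import copy
import torch
import torch.nn.functional as F

def server_round(global_model, client_payloads, condensed_loader,
                 lr=0.01, finetune_steps=10):
    """Hypothetical server round: average client models (FedAvg-style),
    then take a few gradient steps on the clients' condensed data to
    stabilize the aggregate. `client_payloads` is a list of
    (state_dict, num_local_samples) pairs uploaded by the clients."""
    # Weighted average of client parameters (standard FedAvg aggregation).
    total = sum(n for _, n in client_payloads)
    avg_state = copy.deepcopy(client_payloads[0][0])
    for key in avg_state:
        if avg_state[key].is_floating_point():
            avg_state[key] = sum(sd[key] * (n / total) for sd, n in client_payloads)
    global_model.load_state_dict(avg_state)

    # Brief fine-tuning on condensed samples gathered from clients,
    # intended to recover knowledge that plain averaging may have lost
    # when client data are non-IID across rounds.
    optimizer = torch.optim.SGD(global_model.parameters(), lr=lr)
    global_model.train()
    for step, (x, y) in enumerate(condensed_loader):
        if step >= finetune_steps:
            break
        optimizer.zero_grad()
        loss = F.cross_entropy(global_model(x), y)
        loss.backward()
        optimizer.step()
    return global_model
```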
Keywords: Deep learning; Federated learning; Dataset compression; Class imbalance