Heppe/etal/2020a: Resource-Constrained On-Device Learning by Dynamic Averaging

Bibtype Inproceedings
Bibkey Heppe/etal/2020a
Author Heppe, Lukas and Kamp, Michael and Adilova, Linara and Piatkowski, Nico and Heinrich, Danny and Morik, Katharina
Ls8autor Heinrich, Danny
         Heppe, Lukas
         Morik, Katharina
         Piatkowski, Nico
Title Resource-Constrained On-Device Learning by Dynamic Averaging
Booktitle ECML PKDD 2020 - Workshop on Parallel, Distributed and Federated Learning
Abstract The communication between data-generating devices is partially
responsible for a growing portion of the world's power consumption.
Thus, reducing communication is vital from both an economic and an
ecological perspective. For machine learning, on-device learning avoids
sending raw data, which can reduce communication substantially.
Furthermore, not centralizing the data protects privacy-sensitive data.
However, most learning algorithms require hardware with high computation
power and thus high energy consumption. In contrast, ultra-low-power
processors, like FPGAs or micro-controllers, allow for energy-efficient
learning of local models. Combined with communication-efficient
distributed learning strategies, this reduces the overall energy
consumption and enables applications that were previously impossible due
to limited energy on local devices. The major challenge is that such
low-power processors typically only have integer processing
capabilities. This paper investigates an approach to
communication-efficient on-device learning of integer exponential
families that can be executed on low-power processors, is
privacy-preserving, and effectively minimizes communication. The
empirical evaluation shows that the approach can reach a model quality
comparable to a centrally learned regular model with an order of
magnitude less communication. Comparing the overall energy consumption,
this reduces the energy required to solve the machine learning task by a
significant amount.
Year 2020
Projekt SFB876-A1
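
The dynamic-averaging idea named in the title can be illustrated with a
short sketch: devices keep local models and only communicate when their
model diverges too far from the last synchronized reference model; the
coordinator then averages the communicated models. The function name,
the norm-based divergence test, and the integer rounding step below are
illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

def dynamic_averaging_round(local_models, reference, threshold):
    """One communication round of a dynamic-averaging sketch.

    Each device compares its local integer model to the last
    synchronized reference and communicates only when the divergence
    exceeds the threshold; the coordinator averages the communicated
    models. Returns the new reference and the number of devices that
    communicated. (Illustrative; not the paper's exact protocol.)
    """
    # Devices whose local model drifted too far from the reference.
    violators = [w for w in local_models
                 if np.linalg.norm(w - reference) > threshold]
    if not violators:
        return reference, 0  # no communication in this round
    new_reference = np.mean(violators, axis=0)
    # Integer models: round the average back to integer parameters so
    # the result stays executable on integer-only hardware.
    new_reference = np.rint(new_reference).astype(int)
    return new_reference, len(violators)

# Example: only the first device exceeds the divergence threshold,
# so only one model is communicated and averaged.
models = [np.array([4, 4]), np.array([0, 0]), np.array([1, 1])]
ref, num_sent = dynamic_averaging_round(models, np.array([1, 1]), 2.0)
```

Communication is saved whenever no device crosses the threshold; the
threshold trades off model quality against communication cost.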
