Poster 10 Oct 2023

A major challenge in federated learning is data non-IIDness. Existing federated learning methods for non-IID data generally assume that the data is globally balanced; however, real-world multi-class data often exhibits a long-tailed distribution. We therefore propose a new federated learning method, Federated Aggregated Meta Mapping (FedAMM), to address the joint problem of non-IID and globally long-tailed data in a federated learning scenario. FedAMM assigns different weights to local training samples through a trainable loss-weight mapping learned in a meta-learning manner. To handle both data non-IIDness and the global long tail, the meta loss-weight mappings are aggregated on the server, implicitly acquiring knowledge of the global long-tailed distribution. We further propose an asynchronous meta updating mechanism that reduces the communication cost of meta-learning. Experiments show that FedAMM outperforms state-of-the-art federated learning methods.
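To illustrate the re-weighting idea the abstract describes, below is a minimal sketch of a trainable loss-to-weight mapping used on a client: per-sample losses are fed through a small learnable network that produces sample weights, and the weighted loss drives the local update. The names (LossWeightMapper, client_step), the network architecture, and the use of PyTorch are illustrative assumptions, not the paper's exact design.

import torch
import torch.nn as nn

class LossWeightMapper(nn.Module):
    """Maps each sample's loss value to a weight in (0, 1). Illustrative only."""
    def __init__(self, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, per_sample_loss: torch.Tensor) -> torch.Tensor:
        return self.net(per_sample_loss.unsqueeze(1)).squeeze(1)

def client_step(model, mapper, x, y, optimizer):
    """One local update in which per-sample losses are re-weighted by the mapper."""
    per_sample_loss = nn.functional.cross_entropy(model(x), y, reduction="none")
    weights = mapper(per_sample_loss.detach())   # mapper assigns a weight per sample
    loss = (weights * per_sample_loss).mean()    # weighted training objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In a sketch like this, the mapper's parameters could be aggregated on the server alongside the model parameters (FedAvg-style averaging is assumed here), which is one plausible way the aggregated mappings would pick up global long-tail knowledge implicitly, as the abstract states.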
