
This repository contains our implementation of Federated Learning with PEFT methods (e.g., adapters) integrated with a frozen WavLM.



Description

This repository contains the implementation for our research "EFL-PEFT: A Communication Efficient Federated Learning Framework Using PEFT Sparsification for ASR". The paper has been submitted to ICASSP 2025. More details on how to use the code, along with advanced scripts, will be available soon. Stay tuned...
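The core idea is that each client trains only small adapter modules on top of a frozen WavLM and uploads a sparsified version of those weights, so the per-round communication cost is a small fraction of the full model. Below is a minimal sketch of that client-side step, assuming PyTorch and a hypothetical "adapter" substring marking PEFT parameters; the paper defines the actual sparsification scheme.

    import torch

    def sparsify_adapter_update(model, keep_ratio=0.1, adapter_tag="adapter"):
        """Keep only the largest-magnitude fraction of each adapter tensor.

        `adapter_tag` is a hypothetical naming convention; the real code may
        identify PEFT parameters differently.
        """
        update = {}
        for name, param in model.state_dict().items():
            if adapter_tag not in name:  # frozen WavLM weights never leave the client
                continue
            flat = param.abs().flatten()
            k = max(1, int(keep_ratio * flat.numel()))
            threshold = torch.topk(flat, k).values.min()
            mask = param.abs() >= threshold
            update[name] = (param * mask).to_sparse()  # only ~k values per tensor are sent
        return update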

EFL-PEFT architecture

WavLM + EL adapters
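As a rough illustration of this setup, here is a minimal sketch of a frozen WavLM backbone with a residual bottleneck adapter, assuming the Hugging Face transformers implementation of WavLM; the repository's EL adapters and their exact placement inside the encoder may differ.

    import torch
    import torch.nn as nn
    from transformers import WavLMModel

    class BottleneckAdapter(nn.Module):
        """Residual bottleneck adapter: down-project, non-linearity, up-project."""

        def __init__(self, dim, bottleneck=64):
            super().__init__()
            self.down = nn.Linear(dim, bottleneck)
            self.act = nn.GELU()
            self.up = nn.Linear(bottleneck, dim)

        def forward(self, x):
            return x + self.up(self.act(self.down(x)))

    wavlm = WavLMModel.from_pretrained("microsoft/wavlm-base-plus")
    wavlm.eval()
    for p in wavlm.parameters():
        p.requires_grad = False  # the backbone stays frozen; only adapters train

    adapter = BottleneckAdapter(wavlm.config.hidden_size)
    wav = torch.randn(1, 16000)  # placeholder for one second of 16 kHz audio
    hidden = wavlm(wav).last_hidden_state
    adapted = adapter(hidden)  # adapted features go on to the ASR head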

How to use

Dataset preparation

Dataset preparation instructions will be added as soon as possible.

Train

To train with EL adapters:

    python client.py --train_lawithea true
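Note that the flag takes the literal string true. Below is a hedged sketch of how such a flag is commonly parsed; the converter and help text are illustrative, not necessarily what client.py actually does.

    import argparse

    def str2bool(value: str) -> bool:
        # argparse's type=bool would turn any non-empty string (even "false")
        # into True, so convert the string explicitly.
        return value.lower() in ("true", "1", "yes")

    parser = argparse.ArgumentParser()
    parser.add_argument("--train_lawithea", type=str2bool, default=False,
                        help="Enable EL adapters during federated training.")
    args = parser.parse_args()
    print("EL adapters enabled:", args.train_lawithea)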

Inference

To run inference with EL adapters:

    python inference.py --train_lawithea true
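Since only the adapters are trained, an inference script typically restores just those weights on top of the pretrained backbone. The sketch below follows the same assumptions as above; the checkpoint name el_adapter_ckpt.pt and the bare Hugging Face WavLM backbone are hypothetical stand-ins for whatever inference.py actually loads.

    import torch
    from transformers import WavLMModel

    wavlm = WavLMModel.from_pretrained("microsoft/wavlm-base-plus")
    wavlm.eval()

    # Hypothetical checkpoint holding only the trained adapter tensors.
    # strict=False leaves the pretrained backbone weights untouched and
    # ignores checkpoint keys absent from this bare backbone.
    adapter_state = torch.load("el_adapter_ckpt.pt", map_location="cpu")
    wavlm.load_state_dict(adapter_state, strict=False)

    wav = torch.randn(1, 16000)  # placeholder for a real 16 kHz utterance
    with torch.no_grad():
        features = wavlm(wav).last_hidden_state  # features for the ASR decoder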

References

Our implementation is based on this nice work:

@inproceedings{otake2023parameter,
  title     = {Parameter Efficient Transfer Learning for Various Speech Processing Tasks},
  author    = {Otake, Shinta and Kawakami, Rei and Inoue, Nakamasa},
  booktitle = {Proc. ICASSP},
  year      = {2023},
  url       = {https://github.com/sinhat98/adapter-wavlm}
}

Publication

@inproceedings{mnabihali,
  title     = {EFL-PEFT: A Communication Efficient Federated Learning Framework Using PEFT Sparsification for ASR},
  author    = {Nabih, M. and Falavigna, D. and Brutti, A.},
  booktitle = {Proc. ICASSP},
  year      = {2025}
}

Acknowledgment

  • We acknowledge the support of the PNRR project FAIR - Future AI Research (PE00000013), under the NRRP MUR program funded by NextGenerationEU.

  • We acknowledge the CINECA award under the ISCRA initiative, for the availability of high-performance computing resources and support.
