Hi @mahdibeit, this paper looks cool! Let me know if you need any help getting started with the baseline creation process. If some of the steps aren't clear, ping me!
Hi @mahdibeit, thank you for the ping. I've approved the workflow that runs the CI tests (one of which checks your baseline). I'll review your code soon.
@mahdibeit, many thanks for adding FedPFT to the Flower Baselines! Would you be interested in presenting it at an upcoming Flower Monthly event? It runs on the first Wednesday of each month. Would the June session (June 5th) work for you? More info is on our Flower Monthly page. Feel free to reach out to me via the Flower Slack! (https://flower.ai/join-slack/)
Paper
Mahdi Beitollahi, Alex Bie, Sobhan Hemati, Leo Maxime Brunswic, Xu Li, Xi Chen, Guojun Zhang (2024). Parametric Feature Transfer: One-shot Federated Learning with Foundation Models
Link
https://arxiv.org/abs/2402.01862
Maybe give motivations about why the paper should be implemented as a baseline.
This paper introduces FedPFT (Federated Learning with Parametric Feature Transfer), a method that utilizes the transferability of foundation models to enhance both accuracy and communication efficiency in one-shot FL. FedPFT brings foundation models to one-shot FL without requiring users to train or transmit large foundation models to the server.
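The core idea of parametric feature transfer can be sketched in a few lines. The snippet below is an illustrative toy, not the paper's implementation: it stands in sklearn's `GaussianMixture` for the per-class feature distributions, uses random 2-D vectors in place of real foundation-model features, and all function names (`client_fit_gmms`, `server_train`, `make_client`) are hypothetical. Each client fits one GMM per class over its (frozen) foundation-model features, sends only the GMM parameters, and the server samples synthetic features from them to train a global classifier head.

```python
# Toy sketch of FedPFT's parametric feature transfer (illustrative names,
# not the paper's API): clients communicate GMM parameters, never raw data
# and never the foundation model itself.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def client_fit_gmms(features, labels, n_components=2):
    """Client side: fit one GMM per class over foundation-model features."""
    gmms = {}
    for c in np.unique(labels):
        gmm = GaussianMixture(n_components=n_components, random_state=0)
        gmm.fit(features[labels == c])
        gmms[int(c)] = gmm
    return gmms  # only these parameters are sent to the server

def server_train(client_gmms, samples_per_class=200):
    """Server side: sample synthetic features from every GMM, train a head."""
    X, y = [], []
    for gmms in client_gmms:
        for c, gmm in gmms.items():
            Xs, _ = gmm.sample(samples_per_class)
            X.append(Xs)
            y.append(np.full(len(Xs), c))
    X, y = np.vstack(X), np.concatenate(y)
    return LogisticRegression(max_iter=1000).fit(X, y)

def make_client(shift):
    """Fake 'foundation-model features': two well-separated 2-D clusters."""
    f0 = rng.normal(loc=0.0 + shift, scale=0.3, size=(100, 2))
    f1 = rng.normal(loc=3.0 + shift, scale=0.3, size=(100, 2))
    feats = np.vstack([f0, f1])
    labs = np.array([0] * 100 + [1] * 100)
    return client_fit_gmms(feats, labs)

clf = server_train([make_client(0.0), make_client(0.2)])
print(clf.predict([[0.0, 0.0], [3.0, 3.0]]))  # → [0 1]
```

Note that this is one-shot by construction: a single upload of GMM parameters per client suffices, and the payload scales with the number of classes and mixture components rather than with model or dataset size.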
Is there something else you want to add?
This baseline surpasses previous one-shot SOTA methods. I think it can be very useful for the Flower community to utilize large, pre-trained models.
I will be implementing the main experiment of the paper (Sec. 5.2) using 50 clients, the Caltech101 dataset, and CLIP-ViT as the feature extractor.
Implementation
To implement this baseline, it is recommended to do the following items in that order:
For first time contributors
- Read the `first contribution` doc

Prepare - understand the scope

Verify your implementation
- Follow the steps in the `EXTENDED_README.md` that was created in your baseline directory.
- Ensure your `README.md` is ready to be run by someone who is not familiar with your code. Are all step-by-step instructions clear?
- Go through your `README.md` and verify everything runs.