EmInspector

This repository contains the code of the paper "EmInspector: Combating Backdoor Attacks in Federated Self-Supervised Learning Through Embedding Inspection", an unpolished yet functional version created by a novice 🙈. EmInspector is a defense mechanism designed to combat backdoor attacks in federated self-supervised learning by inspecting the representation/embedding space of uploaded local models and excluding suspicious ones. Datasets in .npz format and some of the models we trained can be accessed at datasets & models.
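For intuition, here is a minimal sketch of the embedding-inspection idea in hypothetical form: the server embeds a small inspection set with every uploaded local model, scores each model by how well its embeddings agree with those of the other models, and excludes the least-consistent ones before aggregation. The helper name `inspect_embeddings`, the cosine-similarity scoring, and the fixed number of excluded models are illustrative assumptions, not the exact procedure from the paper or this repository.

```python
# Illustrative sketch of embedding inspection (hypothetical helper, not the
# repository's exact implementation).
import torch
import torch.nn.functional as F

def inspect_embeddings(local_models, inspection_loader, num_exclude=1, device="cpu"):
    """Keep the local models whose embeddings agree most with the others."""
    all_embeddings = []
    with torch.no_grad():
        for model in local_models:
            model.eval()
            model.to(device)
            # Assumes the loader yields (image, label) batches and each model
            # maps a batch of images to a batch of embeddings.
            feats = [model(x.to(device)) for x, *_ in inspection_loader]
            all_embeddings.append(F.normalize(torch.cat(feats), dim=1))  # (N, d)

    # Score each model by its mean cosine similarity to the embeddings the
    # other models produce for the same inspection images.
    k = len(all_embeddings)
    scores = torch.zeros(k)
    for i in range(k):
        for j in range(k):
            if i != j:
                scores[i] += (all_embeddings[i] * all_embeddings[j]).sum(dim=1).mean().item()
    scores /= k - 1

    # The least-consistent models are treated as suspicious and excluded.
    keep = scores.argsort(descending=True)[: k - num_exclude].tolist()
    return [local_models[i] for i in keep]
```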

Pre-train an encoder

Following BadEncoder, we pre-train the image encoder with SimCLR, a widely adopted contrastive learning framework, and build our federated system on top of their implementation. Once the training data is prepared (primarily CIFAR10, STL10, and CIFAR100, as discussed in our paper), you can start training with the following script:

python fed_simclr.py
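For reference, the sketch below shows the overall shape of one federated pre-training round as described above: each client runs SimCLR-style local updates on its own data, and the server averages the resulting weights (FedAvg). Function names, hyperparameters, and the data-loading format are illustrative assumptions; fed_simclr.py is the actual entry point.

```python
# Rough sketch of one federated SimCLR round (assumed structure, not the
# repository's exact code).
import copy
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """SimCLR's NT-Xent contrastive loss for two augmented views of a batch."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))           # ignore self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def federated_round(global_encoder, client_loaders, local_epochs=1, lr=1e-3):
    local_states = []
    for loader in client_loaders:
        model = copy.deepcopy(global_encoder)
        model.train()
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(local_epochs):
            # Assumes each batch provides two augmented views of every image.
            for (view1, view2), _ in loader:
                loss = nt_xent(model(view1), model(view2))
                opt.zero_grad()
                loss.backward()
                opt.step()
        local_states.append(model.state_dict())

    # FedAvg: element-wise average of the clients' parameters.
    avg = {key: torch.stack([s[key].float() for s in local_states]).mean(dim=0)
           for key in local_states[0]}
    global_encoder.load_state_dict(avg)
    return global_encoder
```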

Backdoor the pre-trained model

Since BadEncoder is one of the most well-known and effective backdoor attacks against self-supervised models, we use it to evaluate the robustness of our federated self-supervised learning system. Given the distributed nature of federated learning, we additionally adopt DBA, a well-known backdoor attack tailored for distributed machine learning systems. In our paper, the original BadEncoder attack is termed a single-pattern attack, while the DBA-incorporated version is termed a coordinated-pattern attack (the triggers are illustrated below).

To execute the backdoor attack, simply modify the trigger used by each malicious client and run the following script:

python backdoor_fssl.py
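To make the difference between the two trigger schemes concrete, here is an illustrative sketch (with hypothetical helpers and trigger shapes, not the repository's exact code): in the single-pattern attack every malicious client stamps the same patch onto its poisoned images, whereas in the coordinated-pattern (DBA-style) attack the global trigger is split into pieces and each malicious client embeds only its own piece.

```python
# Illustrative comparison of single-pattern vs. coordinated-pattern triggers
# (hypothetical patch size and placement).
import torch

def stamp_trigger(images, patch, top, left):
    """Paste a small trigger patch onto a batch of images (B, C, H, W)."""
    images = images.clone()
    h, w = patch.shape[-2:]
    images[:, :, top:top + h, left:left + w] = patch
    return images

# Single-pattern attack: every malicious client uses the identical 3x3 white patch.
full_patch = torch.ones(3, 3, 3)

# Coordinated-pattern attack: the global trigger is split into per-client pieces;
# each malicious client only ever poisons with its own piece.
client_pieces = [
    (full_patch[:, :, 0:1], 0, 0),   # left strip   -> malicious client 0
    (full_patch[:, :, 1:2], 0, 1),   # middle strip -> malicious client 1
    (full_patch[:, :, 2:3], 0, 2),   # right strip  -> malicious client 2
]

batch = torch.rand(8, 3, 32, 32)                       # e.g. CIFAR10-sized inputs
single = stamp_trigger(batch, full_patch, 0, 0)        # single-pattern poisoning
piece, top, left = client_pieces[0]
coordinated = stamp_trigger(batch, piece, top, left)   # one client's partial trigger
```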

Downstream evaluation

To evaluate the trained image encoders (main-task accuracy and backdoor attack success rate), we adopt the linear probing protocol: the parameters of the pre-trained encoder are kept frozen, and a separate classifier head (an MLP) is trained for each downstream dataset.

python downstream_classifier.py
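The sketch below shows what linear probing amounts to in code, assuming a hypothetical `linear_probe` helper and feature dimension; downstream_classifier.py implements the actual evaluation. The attack success rate is then measured analogously, by stamping the trigger onto test images and checking how often the classifier predicts the attacker's target class.

```python
# Minimal linear-probing sketch: freeze the encoder, train only an MLP head
# (hypothetical shapes and hyperparameters).
import torch
import torch.nn as nn

def linear_probe(encoder, train_loader, num_classes, feat_dim=512, epochs=10, device="cpu"):
    encoder.eval().to(device)
    for p in encoder.parameters():          # keep the pre-trained weights frozen
        p.requires_grad_(False)

    head = nn.Sequential(                   # dataset-specific MLP classifier head
        nn.Linear(feat_dim, 512), nn.ReLU(), nn.Linear(512, num_classes)
    ).to(device)
    opt = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            with torch.no_grad():
                feats = encoder(x)          # embeddings from the frozen encoder
            loss = loss_fn(head(feats), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return head
```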

Citation

If you use this code, please cite the paper:

@article{qian2024eminspector,
  title={EmInspector: Combating Backdoor Attacks in Federated Self-Supervised Learning Through Embedding Inspection},
  author={Qian, Yuwen and Wu, Shuchi and Wei, Kang and Ding, Ming and Xiao, Di and Xiang, Tao and Ma, Chuan and Guo, Song},
  journal={arXiv preprint arXiv:2405.13080},
  year={2024}
}
