Prepare your keyboards, it’s time for hacking!
Come and try to develop attacks capable of penetrating our defences in a federated learning environment. You’ll have the opportunity to attack our MUSKETEER platform and disrupt our federated learning training in three different scenarios.
- No defences. Participants will develop and implement their own poisoning attacks in a scenario with a number of honest clients and malicious users, and no defences in place.
- Defences in place. Participants will develop and implement their own poisoning attacks in the same scenario, but this time facing a defence method deployed by the consortium members.
- Black box. Participants will develop and implement their own poisoning attacks without knowledge of the system’s internals, performing a black-box attack against our platform.
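To give a flavour of what a poisoning attack looks like, here is a minimal sketch of a label-flipping attack against plain federated averaging (FedAvg). This is purely illustrative: it uses a toy logistic-regression model in NumPy, and none of the function names correspond to the MUSKETEER platform or its API.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1):
    # One gradient-descent step of logistic regression (an honest client).
    preds = 1 / (1 + np.exp(-X @ weights))
    grad = X.T @ (preds - y) / len(y)
    return weights - lr * grad

def poisoned_update(weights, X, y, lr=0.1):
    # A malicious client: flips the labels before computing its update,
    # pushing the global model away from the true decision boundary.
    return local_update(weights, X, 1 - y, lr)

def federated_average(updates):
    # The server aggregates client updates by simple averaging (FedAvg).
    return np.mean(updates, axis=0)

# Toy data: the label is the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] > 0).astype(float)
w = np.zeros(3)

# One training round: two honest clients and one label-flipping attacker.
updates = [local_update(w, X, y), local_update(w, X, y), poisoned_update(w, X, y)]
w = federated_average(updates)
```

Starting from zero weights, the attacker's update is exactly the negative of an honest update, so a single attacker among two honest clients already cancels a third of the round's progress; the defended scenarios are about detecting and down-weighting such contributions.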
COVID-19 Update: the current crisis does not allow us to welcome participants to a physical meeting. This hackathon will therefore be 100% online.
- Solid Python 3 programming skills and experience training simple classifier models in Keras.
- A basic understanding of federated learning; deeper familiarity is a plus.
- Participants are required to use their own laptops.
MUSKETEER aims to develop an industrial data platform with scalable algorithms for federated and privacy-preserving machine learning, detection and mitigation of adversarial attacks, and a rewarding model capable of fairly monetizing datasets according to their real data value. MUSKETEER is an H2020 project that has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 824988.
More information here: https://hopin.com/events/musketeer-2nd-hackathon-attacking-federated-learning-scenarios