An Exploration of Federated Learning for Privacy-Preserving Machine Learning
Published in: 2024 5th International Conference on Innovative Trends in Information Technology (ICITIIT), pp. 1-6
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 15-03-2024
Summary: In recent years, privacy has become a fundamental challenge for organizations seeking to protect their business models and meet end-user expectations. By definition, however, no computer system is absolutely secure: security issues such as data poisoning and adversarial attacks can introduce bias into model predictions. This work presents PFMLP, a machine learning framework compatible with federated learning that preserves the privacy of multiple participating parties. PFMLP enables collaborative learning through shared gradients. Experiments demonstrate its effectiveness for privacy-preserving machine learning across multiple parties: models trained with PFMLP achieve comparable accuracy, with deviations consistently around 1%.
DOI: 10.1109/ICITIIT61487.2024.10580759
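The central idea in the summary, parties that keep their data local and exchange only gradients during training, can be illustrated with a minimal sketch. The snippet below is an illustrative toy in plain NumPy with a single-layer model; the names `Party` and `federated_round` are hypothetical, and it omits the privacy mechanisms PFMLP applies to the exchanged gradients. It shows only the shared-gradient collaboration pattern the summary describes, not the paper's actual protocol.

```python
import numpy as np

# Minimal sketch, assuming a binary classification task split across parties.
# Each party computes a gradient on its own data; only gradients are shared.

rng = np.random.default_rng(0)


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class Party:
    """One data holder that computes gradients on its local data only."""

    def __init__(self, X, y):
        self.X, self.y = X, y

    def local_gradient(self, w):
        # Gradient of the binary cross-entropy loss w.r.t. the shared weights w.
        preds = sigmoid(self.X @ w)
        return self.X.T @ (preds - self.y) / len(self.y)


def federated_round(parties, w, lr=0.5):
    # Parties share only gradients; the aggregator averages them and
    # applies a single update to the shared model.
    grads = [p.local_gradient(w) for p in parties]
    return w - lr * np.mean(grads, axis=0)


def make_data(n):
    # Synthetic, linearly separable data so the demo is self-contained.
    X = rng.normal(size=(n, 3))
    y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)
    return X, y


parties = [Party(*make_data(200)) for _ in range(2)]
w = np.zeros(3)
for _ in range(300):
    w = federated_round(parties, w)

X_test, y_test = make_data(500)
acc = ((sigmoid(X_test @ w) > 0.5) == y_test).mean()
print(f"shared-gradient model accuracy: {acc:.3f}")
```

In a real deployment the raw training data never leaves each party; only the gradient vectors (in PFMLP, additionally protected before sharing) are sent to the aggregator, which is the property the experiments in the paper evaluate.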