Efficient and Privacy-Preserving Outsourcing of Gradient Boosting Decision Tree Inference
Published in: IEEE Transactions on Services Computing, Vol. 17, No. 5, pp. 2334-2348
Format: Journal Article
Language: English
Published: IEEE, 01-09-2024
Summary: Recently, outsourcing machine learning inference services to the cloud has become increasingly popular. However, it remains an open question how to effectively protect the model owner's proprietary model, the user's sensitive data, and the prediction results during inference. In this work, we propose an efficient and comprehensive privacy-preserving framework for outsourcing Gradient Boosting Decision Tree (GBDT) inference, built on pseudorandom functions and additively homomorphic encryption. Specifically, we first design a transformation method for GBDT that protects the node and structure privacy of the owner's model. On top of the protected model, we further propose customized comparison and random tree permutation protocols, which substantially speed up computation and reduce the communication cost of the outsourced inference, while preventing the user from inferring private information about the GBDT. In addition, we provide a rigorous security analysis, and extensive experiments on 7 real-world datasets and various models demonstrate that our scheme achieves up to 36x lower runtime and 69x lower communication cost than the state of the art.
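The record does not include code; purely as an illustration of one ingredient the summary mentions, the following is a minimal Python sketch of aggregating per-tree GBDT leaf scores under additively homomorphic (Paillier-style) encryption, so the aggregating party never sees an individual tree's contribution. The toy key size, fixed-point scale, and all names here are assumptions for the demo, not the paper's actual protocol.

```python
# Toy illustration (not the paper's protocol): additively homomorphic
# aggregation of GBDT leaf scores with a from-scratch Paillier cryptosystem.
# Key sizes are deliberately tiny; real deployments need >= 2048-bit moduli.
import math
import secrets

# --- Minimal Paillier keypair (illustrative small primes only) ---
p, q = 1789, 1867                      # toy primes; an assumption for the demo
n = p * q
n2 = n * n
g = n + 1                              # standard choice g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)  # mu = (L(g^lam mod n^2))^-1 mod n

def encrypt(m: int) -> int:
    """Enc(m) = g^m * r^n mod n^2 for a random r coprime to n."""
    r = secrets.randbelow(n - 2) + 2
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    return (pow(g, m % n, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Dec(c) = L(c^lam mod n^2) * mu mod n, with L(u) = (u - 1) / n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add_encrypted(c1: int, c2: int) -> int:
    """Additive homomorphism: Enc(m1) * Enc(m2) = Enc(m1 + m2) mod n^2."""
    return (c1 * c2) % n2

# --- GBDT score aggregation under encryption ---
# Each tree's evaluation yields a (fixed-point) leaf score; the aggregator
# sums the encrypted scores without learning any individual contribution.
SCALE = 100                            # fixed-point scaling, an assumption
leaf_scores = [0.52, -0.13, 0.27]      # per-tree outputs (plaintext at owner)
ciphertexts = [encrypt(round(s * SCALE)) for s in leaf_scores]

acc = ciphertexts[0]
for c in ciphertexts[1:]:
    acc = add_encrypted(acc, c)

total = decrypt(acc)
total = total - n if total > n // 2 else total    # map back to signed range
print("aggregated GBDT score:", total / SCALE)    # 0.52 - 0.13 + 0.27 = 0.66
```

The paper's framework additionally uses a pseudorandom function and custom comparison/permutation protocols to hide the tree structure and node contents; the sketch above only shows the additive aggregation step.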
ISSN: 1939-1374, 2372-0204
DOI: 10.1109/TSC.2024.3395928