FasDL: An Efficient Serverless-Based Training Architecture with Communication Optimization and Resource Configuration

Bibliographic Details
Published in: IEEE Transactions on Computers, pp. 1-14
Main Authors: Chen, Xinglei; Cai, Zinuo; Zhang, Hanwen; Ma, Ruhui; Buyya, Rajkumar
Format: Journal Article
Language: English
Published: IEEE 22-10-2024
Description
Summary: Deploying distributed training workloads of deep learning models atop serverless architecture relieves deep learning practitioners of the burden of managing servers. However, when supporting deep model training, the current serverless architecture faces the challenges of inefficient communication patterns and rigid resource configuration, which lead to subpar and unpredictable training performance. In this paper, we propose FasDL, an efficient serverless-based deep learning training architecture that addresses these two challenges. FasDL adopts a novel training framework, K-REDUCE, to reduce the communication overhead and accelerate training. Additionally, FasDL builds a lightweight mathematical model of K-REDUCE training, offering predictable performance and supporting subsequent resource configuration. It achieves the optimal resource configuration by formulating an optimization problem over system-level and application-level parameters and solving it with a pruning-based heuristic search algorithm. Extensive experiments on AWS Lambda verify a prediction accuracy of over 94% and demonstrate performance and cost advantages over the state-of-the-art architecture LambdaML by up to 16.8% and 28.3%, respectively.
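
The abstract mentions that resource configuration is found by solving an optimization problem over system-level and application-level parameters with a pruning-based heuristic search. The sketch below is only an illustration of that general idea, not the paper's actual model or algorithm: the configuration space, the `predicted_time` and `predicted_cost` functions, and the budget value are all hypothetical placeholders.

```python
# Hypothetical sketch of a pruning-based heuristic search over serverless
# resource configurations. The cost model is a placeholder, NOT the model
# from the FasDL paper; it only illustrates pruning configurations whose
# predicted cost or time cannot improve on the incumbent.

from itertools import product

# Candidate system-level and application-level parameters (illustrative values).
WORKERS = [8, 16, 32, 64]          # number of serverless functions
MEMORY_MB = [1024, 2048, 3072]     # memory allocated per function
BATCH_SIZES = [32, 64, 128]        # per-worker mini-batch size


def predicted_time(workers: int, memory_mb: int, batch: int) -> float:
    """Placeholder per-epoch time model: compute shrinks with memory and
    worker count, while communication grows with the number of workers."""
    compute = 1e5 * batch / (workers * memory_mb)
    communication = 0.05 * workers
    return compute + communication


def predicted_cost(workers: int, memory_mb: int, batch: int) -> float:
    """Placeholder monetary cost model: GB-seconds billed across workers."""
    return workers * (memory_mb / 1024) * predicted_time(workers, memory_mb, batch)


def search(budget: float):
    """Enumerate configurations, pruning any whose predicted cost exceeds
    the budget or whose predicted time cannot beat the current best."""
    best_cfg, best_time = None, float("inf")
    for w, m, b in product(WORKERS, MEMORY_MB, BATCH_SIZES):
        if predicted_cost(w, m, b) > budget:
            continue  # prune: over budget
        t = predicted_time(w, m, b)
        if t >= best_time:
            continue  # prune: cannot improve on the incumbent
        best_cfg, best_time = (w, m, b), t
    return best_cfg, best_time


if __name__ == "__main__":
    cfg, t = search(budget=50.0)
    print(f"best configuration {cfg}, predicted epoch time {t:.2f}s")
```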
ISSN: 0018-9340, 1557-9956
DOI: 10.1109/TC.2024.3485202