MiCS: Near-linear Scaling for Training Gigantic Model on Public Cloud

Zhen Zhang, Shuai Zheng, Yida Wang, Justin Chiu, George Karypis, Trishul Chilimbi, Mu Li, Xin Jin

Research output: Contribution to journal › Conference article › peer-review

Abstract

Existing general-purpose frameworks for gigantic model training, i.e., dense models with billions of parameters, cannot scale efficiently in cloud environments with varying networking conditions due to large communication overheads. In this paper, we propose MiCS, which Minimizes the Communication Scale to bring down communication overhead. Specifically, by decreasing the number of participants in a communication collective, MiCS can utilize heterogeneous network bandwidth, reduce network traffic over slower links, reduce the latency of communication to maintain high network bandwidth utilization, and amortize the expensive global gradient synchronization overhead. Our evaluation on AWS shows that the system throughput of MiCS is up to 2.89× that of state-of-the-art large-model training systems. MiCS achieves near-linear scaling efficiency, up to 1.27× that of DeepSpeed. MiCS allows us to train a proprietary model with 100 billion parameters on 512 GPUs with 99.4% weak-scaling efficiency, and it is able to saturate over 54.5% of the theoretical computation power of each GPU on a public cloud with less GPU memory and more restricted networks than DGX-A100 clusters.

Original language: English (US)
Pages (from-to): 37-50
Number of pages: 14
Journal: Proceedings of the VLDB Endowment
Volume: 16
Issue number: 1
DOIs
State: Published - 2022
Externally published: Yes
Event: 49th International Conference on Very Large Data Bases, VLDB 2023, Vancouver, Canada
Duration: Aug 28, 2023 - Sep 1, 2023

Bibliographical note

Funding Information:
We sincerely thank the anonymous reviewers for their valuable feedback. We thank the Amazon Search M5 team for providing large clusters. Xin Jin and Shuai Zheng are the corresponding authors. Xin Jin is with the Key Laboratory of High Confidence Software Technologies (Peking University), Ministry of Education. Zhen Zhang is supported in part by NSF grants CNS-1813487 and CCF-1918757. Xin Jin is supported in part by the National Natural Science Foundation of China under grant number 62172008 and the National Natural Science Fund for the Excellent Young Scientists Fund Program (Overseas).

Publisher Copyright:
© 2022 VLDB Endowment.
