CPM-2: Large-scale cost-effective pre-trained language models
Abstract
In recent years, the size of pre-trained language models (PLMs) has grown by leaps and bounds. However, the efficiency issues of these large-scale PLMs limit their use in real-world scenarios. We present a suite of cost-effective techniques that address the efficiency of PLMs across pre-training, fine-tuning, and inference. (1) We introduce knowledge inheritance to accelerate pre-training by exploiting existing PLMs instead of training models from scratch. (2) We explore best practices for prompt tuning with large-scale PLMs. Compared with conventional fine-tuning, prompt tuning significantly reduces the number of task-specific parameters. (3) We implement a new inference toolkit, namely infmoe, for using large-scale PLMs with limited computational resources. Based on our cost-effective pipeline, we pre-train two models: an encoder-decoder bilingual model with 11 billion parameters (CPM-2) and its corresponding Mixture-of-Experts (MoE) version with 198 billion parameters. In our experiments, we compare CPM-2 with mT5 on downstream tasks. Experimental results show that CPM-2 exhibits strong general language ability. Moreover, we validate the efficiency of infmoe when performing inference with large-scale models of tens of billions of parameters on a single GPU. All source code and model parameters are available at https://github.com/TsinghuaAI/CPM.
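To make the parameter-efficiency claim of point (2) concrete, the sketch below shows the general idea of prompt tuning in PyTorch: the PLM backbone is frozen, and only a small matrix of continuous "soft prompt" embeddings prepended to the input is trained. This is a minimal illustration of the technique, not the CPM-2 authors' implementation; the class name `PromptTuningWrapper`, the prompt length of 20, and the assumption that the backbone accepts input embeddings directly are all illustrative choices.

```python
import torch
import torch.nn as nn

class PromptTuningWrapper(nn.Module):
    """Minimal sketch of prompt tuning (illustrative; not the CPM-2 code).

    The pre-trained backbone is frozen; the only trainable parameters are
    a small (prompt_len x embed_dim) matrix of soft-prompt embeddings.
    """

    def __init__(self, backbone: nn.Module, embed_dim: int, prompt_len: int = 20):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # freeze every PLM parameter
        # Task-specific parameters: prompt_len * embed_dim floats in total.
        self.soft_prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) token embeddings.
        # Assumes the backbone can consume embeddings instead of token ids.
        batch = input_embeds.size(0)
        prompt = self.soft_prompt.unsqueeze(0).expand(batch, -1, -1)
        # Prepend the learned soft prompt to every sequence in the batch.
        return self.backbone(torch.cat([prompt, input_embeds], dim=1))
```

Under these assumptions, a prompt length of 20 with a hidden size of 4,096 trains roughly 8 x 10^4 parameters per task, versus all 11 billion for full fine-tuning of CPM-2, which is the source of the large reduction in task-specific parameters described in the abstract.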