Commit 380b5e7

Update README.md
1 parent ca15d62 commit 380b5e7

File tree

1 file changed: +1 −1 lines changed


README.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@
 ## News 📢

 * **2023.6.12 Released [PaddleNLP v2.6rc preview](https://github.com/PaddlePaddle/PaddleNLP/releases/tag/v2.6.0rc)**
-  * 🔨 Full-pipeline large-model examples: end-to-end training and inference support for the mainstream open-source large models [Bloom](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/bloom), [ChatGLM](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/chatglm), [GLM](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/glm), [Llama](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/llama), and [OPT](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/opt); the [Trainer API](./docs/trainer.md) adds tensor-parallel training, so distributed training can be enabled with simple configuration; new parameter-efficient fine-tuning support via [PEFT](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/paddlenlp/peft) for efficient large-model fine-tuning
+  * 🔨 Full-pipeline large-model examples: end-to-end training and inference support for the mainstream open-source large models [BLOOM](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/bloom), [ChatGLM](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/chatglm), [GLM](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/glm), [LLaMA](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/llama), and [OPT](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/examples/language_model/opt); the [Trainer API](./docs/trainer.md) adds tensor-parallel training, so distributed training can be enabled with simple configuration; new parameter-efficient fine-tuning support via [PEFT](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/paddlenlp/peft) for efficient large-model fine-tuning

 * **2023.1.12 Released [PaddleNLP v2.5](https://github.com/PaddlePaddle/PaddleNLP/releases/tag/v2.5.0)**
   * 🔨 NLP tools: released [PPDiffusers](./ppdiffusers), a China-developed diffusion-model toolbox that integrates a variety of Diffusion model weights and components, provides a complete Diffusion model training pipeline, and supports high-performance FastDeploy inference acceleration and multi-hardware deployment (including Ascend and Kunlunxin chips)
