baichuan-7B is an open-source large-scale pre-trained model developed by Baichuan Intelligent Technology. Based on the Transformer architecture, it is a model with 7 billion parameters trained on approximately 1.2 trillion tokens. It supports both Chinese and English, with a context window length of 4096. It achieves the best performance of its size on standard Chinese and English authoritative benchmarks (C-EVAL/MMLU).

If you want to use baichuan-7B (e.g., for inference or fine-tuning), we recommend using the companion code repository [baichuan-7B](https://….com/baichuan-inc/baichuan-7B).
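For the inference case mentioned above, a minimal loading sketch is shown below. It assumes the `transformers` library, that the weights are published on the Hugging Face Hub under the id `baichuan-inc/baichuan-7B`, and that `trust_remote_code=True` is needed because the model ships custom modeling code; the helper names are illustrative, not part of the official repository.

```python
def load_baichuan(model_id: str = "baichuan-inc/baichuan-7B"):
    """Return (tokenizer, model); downloads ~14 GB of weights on first call."""
    # Lazy import so the sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # trust_remote_code is assumed necessary: baichuan-7B ships its own modeling code.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, device_map="auto", trust_remote_code=True
    )
    return tokenizer, model


def complete(prompt: str, max_new_tokens: int = 64) -> str:
    """Plain causal-LM continuation; baichuan-7B is a base model, not chat-tuned."""
    tokenizer, model = load_baichuan()
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Since this is a base model, prompts are plain text to be continued, e.g. `complete("The capital of France is")`; for fine-tuning, the companion repository is the better starting point.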
The superscript in the Model column indicates the source of the results.
1: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
2: https://paperswithcode.com/sota/multi-task-language-understanding-on-mmlu
```

## Our Group

[WeChat](https://huggingface.co/baichuan-inc/baichuan-7B/blob/main/wechat.jpeg)