JeongwonChoi committed
Commit • 28fda5c
1 Parent(s): 0325e40
Update README.md
README.md
CHANGED
@@ -23,7 +23,7 @@ datasets:
 ### **Trained On**
 
 - **OS**: Ubuntu 20.04
-- **GPU**: H100 80GB
+- **GPU**: H100 80GB 4ea
 - **transformers**: v4.36.2
 
 ### **Dataset**
@@ -38,16 +38,15 @@ E.g.
 
 ```python
 text = """\
-### System:
 당신은 사람들이 정보를 찾을 수 있도록 도와주는 인공지능 비서입니다.
 
-###
+### Instruction:
 대한민국의 수도는 어디야?
 
-###
+### Response:
 대한민국의 수도는 서울입니다.
 
-###
+### Instruction:
 서울 인구는 총 몇 명이야?
 """
 ```
@@ -56,9 +55,9 @@ text = """\
 
 ### **Ko-LLM-Leaderboard**
 
-| Model
-|
-| **DataVortexM-7B-Instruct-v0.1** | **39.81** | **34.13** | **42.35** | **38.73** | **45.46** | **38.37** |
+| Model | Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
+| ---------------------------------------- | --------- | --------- | ------------ | --------- | ------------- | --------------- |
+| **Edentns/DataVortexM-7B-Instruct-v0.1** | **39.81** | **34.13** | **42.35** | **38.73** | **45.46** | **38.37** |
 
 ## **Implementation Code**
 
@@ -70,8 +69,8 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 
 device = "cuda" # the device to load the model onto
 
-model = AutoModelForCausalLM.from_pretrained("DataVortexM-7B-Instruct-v0.1")
-tokenizer = AutoTokenizer.from_pretrained("DataVortexM-7B-Instruct-v0.1")
+model = AutoModelForCausalLM.from_pretrained("Edentns/DataVortexM-7B-Instruct-v0.1")
+tokenizer = AutoTokenizer.from_pretrained("Edentns/DataVortexM-7B-Instruct-v0.1")
 
 messages = [
     {"role": "system", "content": "당신은 사람들이 정보를 찾을 수 있도록 도와주는 인공지능 비서입니다."},
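The commit reshapes the example prompt into an Alpaca-style layout: the system prompt comes first with no header, then turns alternate under `### Instruction:` and `### Response:`. As a minimal sketch of that mapping — `build_prompt` is a hypothetical helper for illustration, not a function from the model card:

```python
def build_prompt(messages):
    """Render chat messages into the template shown in the README diff.

    Illustrative only: assumes the system prompt appears bare, user turns
    under "### Instruction:", and assistant turns under "### Response:".
    """
    parts = []
    for m in messages:
        if m["role"] == "system":
            parts.append(m["content"])
        elif m["role"] == "user":
            parts.append("### Instruction:\n" + m["content"])
        else:  # assistant
            parts.append("### Response:\n" + m["content"])
    # A trailing "### Response:" header cues the model to answer next.
    return "\n\n".join(parts) + "\n\n### Response:\n"

messages = [
    {"role": "system", "content": "당신은 사람들이 정보를 찾을 수 있도록 도와주는 인공지능 비서입니다."},
    {"role": "user", "content": "대한민국의 수도는 어디야?"},
]
prompt = build_prompt(messages)
```

In practice the model's tokenizer chat template (via `tokenizer.apply_chat_template`) would be the authoritative source for this layout; the helper above only mirrors the literal example in the diff.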
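The Average column in the leaderboard row added by this commit appears to be the arithmetic mean of the five benchmark scores; a quick check (illustrative, not part of the README):

```python
# Scores from the Ko-LLM-Leaderboard row for Edentns/DataVortexM-7B-Instruct-v0.1.
scores = {
    "Ko-ARC": 34.13,
    "Ko-HellaSwag": 42.35,
    "Ko-MMLU": 38.73,
    "Ko-TruthfulQA": 45.46,
    "Ko-CommonGen V2": 38.37,
}

# Mean of the five benchmarks, rounded to two decimals.
average = round(sum(scores.values()) / len(scores), 2)
print(average)  # 39.81, matching the Average column
```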