mayacinka committed
Commit 1587166
1 Parent(s): 179703f

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -13,9 +13,9 @@ base_model:
   - bardsai/jaskier-7b-dpo-v5.6
  ---
 
- # ExpertButtercup-7Bx2_MoE
+ # ExpertRamonda-7Bx2_MoE
 
- ExpertButtercup-7Bx2_MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+ ExpertRamonda-7Bx2_MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
  * [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
  * [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)
 
@@ -47,7 +47,7 @@ from transformers import AutoTokenizer
  import transformers
  import torch
 
- model = "mayacinka/ExpertButtercup-7Bx2_MoE"
+ model = "mayacinka/ExpertRamonda-7Bx2_MoE"
 
  tokenizer = AutoTokenizer.from_pretrained(model)
  pipeline = transformers.pipeline(
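The updated README text describes the model as a two-expert Mixture of Experts assembled from mlabonne/AlphaMonarch-7B and bardsai/jaskier-7b-dpo-v5.6 with LazyMergekit. The merge configuration itself is not part of this commit, so the sketch below only shows one way to confirm the MoE structure from the published repo by reading its config. It assumes the merge produced a Mixtral-style architecture, which is what LazyMergekit frankenMoE merges normally emit; the printed fields are standard MixtralConfig attributes, not values taken from this diff.

```python
from transformers import AutoConfig

# Hedged sketch: inspect the merged model's config to see the MoE structure.
# Assumes a Mixtral-style frankenMoE, which LazyMergekit typically produces;
# this is not confirmed anywhere in this commit.
config = AutoConfig.from_pretrained("mayacinka/ExpertRamonda-7Bx2_MoE")

print(config.model_type)           # expected "mixtral" for a mergekit MoE
print(config.num_local_experts)    # expected 2: AlphaMonarch-7B and jaskier-7b-dpo-v5.6
print(config.num_experts_per_tok)  # how many experts the router activates per token
```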
 
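The second hunk shows only the start of the README's usage snippet; the diff cuts off at `transformers.pipeline(`. For reference, here is a minimal sketch of how such a snippet usually continues. The generation settings and the example prompt are illustrative assumptions, not text from this commit.

```python
from transformers import AutoTokenizer
import transformers
import torch

model = "mayacinka/ExpertRamonda-7Bx2_MoE"

# Hedged continuation of the truncated README snippet: a standard
# text-generation pipeline with a chat-templated prompt.
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Example prompt (illustrative only).
messages = [{"role": "user", "content": "What is a Mixture of Experts model?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```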