pszemraj committed on
Commit 4454b05
1 parent: 35c49df

Update README.md

Files changed (1): README.md (+14, −15)
README.md CHANGED
@@ -1,34 +1,33 @@
  ---
  license: other
  ---

- https://huggingface.co/chargoddard/Yi-6B-200K-Llama


```python
- # !pip install -U -q transformers datasets accelerate sentencepiece bitsandbytes
import torch
- from huggingface_hub import notebook_login

# Load model directly
from transformers import AutoModelForCausalLM, AutoTokenizer

- notebook_login()
-
-
- tokenizer = AutoTokenizer.from_pretrained("pszemraj/Yi-6B-200K-Llama-sharded")
model = AutoModelForCausalLM.from_pretrained(
    "pszemraj/Yi-6B-200K-Llama-sharded", load_in_4bit=True
)

- prompt = """God, I just love smoking weed-"
- "I know, I love smoking weed too," God replied, "but what would you say if I were to tell you that the earth is 10,000 years old, that men and dinosaurs walked the earth together, that there was once a great flood, and that a man named Jonah once walked in the belly of a fish?"
- "That's some weird shit," the man said.
- "And that's the Gospel, the Good News," God continued, "that Jesus was raised from the dead, that he died for your sins, and that if you have faith in him you will go to heaven when you die."
- "What do you mean?" the man asked.
- "I mean," God answered, "that if you have any sense at all, you'll roll another joint."
- "I don't know why but I'm feeling"""


if torch.cuda.is_available():
@@ -46,4 +45,4 @@ tokens = model.generate(
    repetition_penalty=1.05,
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
- ```
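In both the old and new versions of the snippet, `load_in_4bit=True` delegates weight quantization to bitsandbytes. As a rough illustration of the underlying idea, here is a toy blockwise absmax quantizer in plain Python; it uses a uniform signed-int4 grid rather than the NF4/FP4 codebooks bitsandbytes actually implements, and every name in it is illustrative:

```python
def quantize_block(block):
    """Quantize one block of floats to signed int4 codes with a shared absmax scale."""
    absmax = max(abs(x) for x in block) or 1.0  # guard against all-zero blocks
    q = [max(-8, min(7, round(x / absmax * 7))) for x in block]
    return q, absmax

def dequantize_block(q, absmax):
    """Recover approximate floats from the int4 codes and the stored scale."""
    return [v / 7 * absmax for v in q]

w = [0.5, -1.0, 0.25, 0.75]
q, scale = quantize_block(w)
w_hat = dequantize_block(q, scale)
```

Each block stores 4-bit codes plus a single scale, which is roughly where the memory savings of `load_in_4bit` come from.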
 
1
  ---
2
  license: other
3
+ pipeline_tag: text-generation
4
+ tags:
5
+ - sharded
6
+ - yi
7
+ - bf16
8
+ - colab
9
  ---
10
 
11
+ # Yi-6B-200K-Llama - sharded bf16
12
 
13
+ https://huggingface.co/chargoddard/Yi-6B-200K-Llama sharded & in bf16 (colab free GPU loadable)
14
+
15
+ Please note the custom license from [the original](https://huggingface.co/01-ai/Yi-6B-200K) still applies 🌝
16
 
17
 
18
  ```python
19
+ # !pip install -U -q transformers accelerate sentencepiece bitsandbytes
20
  import torch
 
21
 
22
  # Load model directly
23
  from transformers import AutoModelForCausalLM, AutoTokenizer
24
 
25
+ tokenizer = AutoTokenizer.from_pretrained("pszemraj/Yi-6B-200K-Llama-sharded", use_fast=False)
 
 
 
26
  model = AutoModelForCausalLM.from_pretrained(
27
  "pszemraj/Yi-6B-200K-Llama-sharded", load_in_4bit=True
28
  )
29
 
30
+ prompt = "Custom non-commercial licenses are just so fun, aren't they? The best thing about them is"
 
 
 
 
 
 
31
 
32
 
33
  if torch.cuda.is_available():
 
45
  repetition_penalty=1.05,
46
  )
47
  print(tokenizer.decode(tokens[0], skip_special_tokens=True))
48
+ ```
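"Sharded" in the new model card means the checkpoint is split into several size-capped weight files, so no single file has to fit in memory at once while downloading or loading. A minimal sketch of that packing logic, assuming a greedy first-fit policy in the spirit of `save_pretrained(max_shard_size=...)` (the function and tensor names here are made up for illustration):

```python
def shard_state_dict(sizes, max_shard_bytes):
    """Greedily pack tensors (name -> byte size) into size-capped shards."""
    shards, current, used = [], [], 0
    for name, size in sizes.items():
        # start a new shard when the next tensor would overflow the cap
        if current and used + size > max_shard_bytes:
            shards.append(current)
            current, used = [], 0
        current.append(name)
        used += size
    if current:
        shards.append(current)
    return shards

tensors = {"embed": 600, "layer0": 400, "layer1": 400, "head": 600}
shards = shard_state_dict(tensors, max_shard_bytes=1000)
```

A single tensor larger than the cap still gets its own shard; the cap only limits how tensors are grouped, not the size of any one tensor.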