SAELens
ArthurConmyGDM committed
Commit fded926 (parent: 75faebf)

Update README.md

Files changed (1): README.md (+18 −1)
README.md CHANGED
@@ -15,7 +15,20 @@ See our [landing page](https://huggingface.co/google/gemma-scope) for details on
 - `9b-pt-`: These SAEs were trained on Gemma v2 9B base model.
 - `mlp`: These SAEs were trained on the model's MLP sublayer outputs.
 
-## 3. Point of Contact
+# 3. How can I use these SAEs straight away?
+
+```python
+from sae_lens import SAE # pip install sae-lens
+
+sae, cfg_dict, sparsity = SAE.from_pretrained(
+    release = "gemma-scope-9b-pt-mlp-canonical",
+    sae_id = "layer_0/width_16k/canonical",
+)
+```
+
+See https://github.com/jbloomAus/SAELens for details on this library.
+
+# 4. Point of Contact
 
 Point of contact: Arthur Conmy
 
@@ -27,3 +40,7 @@ Contact by email:
 
 HuggingFace account:
 https://huggingface.co/ArthurConmyGDM
+
+# 5. Citation
+
+Paper: https://arxiv.org/abs/2408.05147
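
The snippet added by this commit only loads the SAE. As a minimal follow-on sketch (not part of the commit), assuming SAELens's `SAE.encode` / `SAE.decode` methods and the `sae.cfg.d_in` field behave as in recent versions of the library, running the loaded SAE on an activation tensor could look like this:

```python
import torch
from sae_lens import SAE  # pip install sae-lens

# Load the canonical 16k-width SAE trained on Gemma v2 9B layer-0 MLP outputs,
# exactly as in the README snippet added by this commit.
sae, cfg_dict, sparsity = SAE.from_pretrained(
    release="gemma-scope-9b-pt-mlp-canonical",
    sae_id="layer_0/width_16k/canonical",
)

# Stand-in for real activations: a random batch with the SAE's input width.
# In practice these would be layer-0 MLP outputs gathered from Gemma v2 9B.
activations = torch.randn(4, sae.cfg.d_in)

feature_acts = sae.encode(activations)      # sparse feature activations, here shape (4, 16384)
reconstruction = sae.decode(feature_acts)   # approximate reconstruction of the input activations

print(feature_acts.shape, reconstruction.shape)
```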