jeiku committed
Commit: 821c3ee
Parent: 5048298

Upload 8 files
.gitattributes CHANGED
@@ -33,3 +33,10 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-f16.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q3_K.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q4_K.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q5_K.gguf filter=lfs diff=lfs merge=lfs -text
+ Kielbasa-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
Kielbasa-Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:043c95426a7cf58e77682a948c19f1c7dc54066aec47f83d1e9de36a4d3e3a39
+ size 1083755840
Kielbasa-Q3_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53e2592e18488fe59205aad1b61d9c4c2337aa81674dce8313ac191b1e02afe0
+ size 1391419200
Kielbasa-Q4_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0ceac89f2b828457f65b0270bca801f82bf210b70723df1413853e779f47e153
+ size 1708595520
Kielbasa-Q4_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b1ac85f196e2d04e26b08d9937e9261815595f581bb0c66b01a57808bf3ff736
+ size 1620695360
Kielbasa-Q5_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2678004809832a5d38deb57331487a70683a54b4eb5dc5cf3c2d18f4cefb92bf
+ size 1993390400
Kielbasa-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:059484f97b750ba2ddb2de30403fa1c93c8dac9fcfae0643733ff864d85a3713
+ size 2295984960
Kielbasa-f16.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:74b1c68014d35996f7f698411a1a0c5ff212597079db478f34eeb2666146fd09
+ size 5593341696
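Each of the `*.gguf` entries above is stored as a Git LFS pointer file: three text lines (`version`, `oid`, `size`) standing in for the binary blob. A minimal parser for that format (an illustrative sketch, not part of git-lfs itself) looks like this, using the f16 pointer from this commit:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", separated by a single space.
        key, _, value = line.partition(" ")
        fields[key] = value
    # The oid is prefixed with its hash algorithm, e.g. "sha256:<hex digest>".
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "hash_algo": algo,
        "digest": digest,
        "size": int(fields["size"]),
    }

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:74b1c68014d35996f7f698411a1a0c5ff212597079db478f34eeb2666146fd09
size 5593341696"""

info = parse_lfs_pointer(pointer)
print(info["hash_algo"], info["size"])  # sha256 5593341696
```

The `size` field is the byte count of the real blob, which is why the f16 file (~5.6 GB) dwarfs the Q2_K quant (~1.1 GB) even though both pointers are three lines long.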
README.md ADDED
@@ -0,0 +1,107 @@
+ ---
+ base_model:
+ - jeiku/Rosa_v1_3B
+ - jeiku/PIPPA_128_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Theory_of_Mind_RP_128_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Bluemoon_cleaned_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Alpaca_128_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Toxic_DPO_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Rosa_v1_3B
+ - jeiku/LimaRP_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Gnosis_256_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Humiliation_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Theory_of_Mind_128_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Futa_Erotica_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/Everything_v3_128_StableLM
+ - jeiku/Rosa_v1_3B
+ - jeiku/No_Robots_Alpaca_StableLM
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # kielbasa
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged with the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, using [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) as the base.
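The core of the DARE step is simple to state: randomly drop a fraction `p` of each task vector (the per-model delta from the base) and rescale the survivors by `1/(1-p)` so the expected contribution is unchanged, then sum the sparsified deltas onto the base. A toy sketch of that step (illustrative numbers only; the TIES sign-election part of `dare_ties` is omitted for brevity):

```python
import random

def dare(delta, p, rng):
    """Randomly drop a fraction p of a task vector; rescale the rest by 1/(1-p)."""
    return [0.0 if rng.random() < p else d / (1.0 - p) for d in delta]

def merge(base, task_vectors, p, rng):
    """Add DARE-sparsified task vectors (equal weight) onto the base weights."""
    merged = list(base)
    for delta in task_vectors:
        for i, d in enumerate(dare(delta, p, rng)):
            merged[i] += d
    return merged

base = [1.0, 2.0, 3.0]
deltas = [[0.1, -0.2, 0.3], [0.05, 0.0, -0.1]]
# With p = 0 nothing is dropped, so the result is exactly base + sum(deltas).
out = merge(base, deltas, 0.0, random.Random(0))
print([round(v, 2) for v in out])  # [1.15, 1.8, 3.2]
```

With a nonzero drop rate each merged parameter keeps only a random subset of the deltas, which is what lets a dozen fine-tunes be combined without their interference accumulating.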
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/PIPPA_128_StableLM](https://huggingface.co/jeiku/PIPPA_128_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Theory_of_Mind_RP_128_StableLM](https://huggingface.co/jeiku/Theory_of_Mind_RP_128_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Bluemoon_cleaned_StableLM](https://huggingface.co/jeiku/Bluemoon_cleaned_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Alpaca_128_StableLM](https://huggingface.co/jeiku/Alpaca_128_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Toxic_DPO_StableLM](https://huggingface.co/jeiku/Toxic_DPO_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/LimaRP_StableLM](https://huggingface.co/jeiku/LimaRP_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Gnosis_256_StableLM](https://huggingface.co/jeiku/Gnosis_256_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Humiliation_StableLM](https://huggingface.co/jeiku/Humiliation_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Theory_of_Mind_128_StableLM](https://huggingface.co/jeiku/Theory_of_Mind_128_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Futa_Erotica_StableLM](https://huggingface.co/jeiku/Futa_Erotica_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/Everything_v3_128_StableLM](https://huggingface.co/jeiku/Everything_v3_128_StableLM)
+ * [jeiku/Rosa_v1_3B](https://huggingface.co/jeiku/Rosa_v1_3B) + [jeiku/No_Robots_Alpaca_StableLM](https://huggingface.co/jeiku/No_Robots_Alpaca_StableLM)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ merge_method: dare_ties
+ base_model: jeiku/Rosa_v1_3B
+ parameters:
+   normalize: true
+ models:
+   - model: jeiku/Rosa_v1_3B+jeiku/No_Robots_Alpaca_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Toxic_DPO_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Alpaca_128_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Everything_v3_128_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Futa_Erotica_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Gnosis_256_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Humiliation_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Theory_of_Mind_128_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/PIPPA_128_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/LimaRP_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Theory_of_Mind_RP_128_StableLM
+     parameters:
+       weight: 1
+   - model: jeiku/Rosa_v1_3B+jeiku/Bluemoon_cleaned_StableLM
+     parameters:
+       weight: 1
+ dtype: float16
+ ```
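A config in this shape has a few invariants worth checking before running a merge: every entry under `models:` is a `base+adapter` pair, and every entry carries a `weight`. A small line-based checker (a hypothetical helper, not part of mergekit; shown here on a two-model excerpt of the config above) can verify them without a YAML library:

```python
def check_merge_config(text: str) -> dict:
    """Collect method, base, model entries, and weights from a flat mergekit config."""
    models, weights = [], []
    method = base = None
    for line in text.splitlines():
        s = line.strip()
        if s.startswith("merge_method:"):
            method = s.split(":", 1)[1].strip()
        elif s.startswith("base_model:"):
            base = s.split(":", 1)[1].strip()
        elif s.startswith("- model:"):
            models.append(s.split(":", 1)[1].strip())
        elif s.startswith("weight:"):
            weights.append(float(s.split(":", 1)[1]))
    return {"method": method, "base": base, "models": models, "weights": weights}

sample = """
merge_method: dare_ties
base_model: jeiku/Rosa_v1_3B
models:
  - model: jeiku/Rosa_v1_3B+jeiku/Toxic_DPO_StableLM
    parameters:
      weight: 1
  - model: jeiku/Rosa_v1_3B+jeiku/LimaRP_StableLM
    parameters:
      weight: 1
dtype: float16
"""
info = check_merge_config(sample)
assert info["method"] == "dare_ties"
# Every merged entry is the shared base plus a LoRA-style adapter.
assert all(m.startswith(info["base"] + "+") for m in info["models"])
assert len(info["models"]) == len(info["weights"])
```

With all twelve weights equal to 1 and `normalize: true`, mergekit rescales the contributions so the equal weighting behaves like an average rather than a twelve-fold sum.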