| Column | Dtype | Range / classes |
| --- | --- | --- |
| eval_name | string | lengths 12–96 |
| Precision | string | 3 classes |
| Type | string | 5 classes |
| T | string | 5 classes |
| Weight type | string | 2 classes |
| Architecture | string | 39 classes |
| Model | string | lengths 355–605 |
| fullname | string | lengths 4–87 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 1.41–50.3 |
| Hub License | string | 23 classes |
| Hub ❤️ | int64 | 0–5.66k |
| #Params (B) | int64 | -1–140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| IFEval Raw | float64 | 0–0.87 |
| IFEval | float64 | 0–86.7 |
| BBH Raw | float64 | 0.28–0.75 |
| BBH | float64 | 0.81–62.8 |
| MATH Lvl 5 Raw | float64 | 0–0.41 |
| MATH Lvl 5 | float64 | 0–41.2 |
| GPQA Raw | float64 | 0.22–0.41 |
| GPQA | float64 | 0–20.9 |
| MUSR Raw | float64 | 0.3–0.58 |
| MUSR | float64 | 0–34.6 |
| MMLU-PRO Raw | float64 | 0.1–0.7 |
| MMLU-PRO | float64 | 0–66.7 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0–10 |
| Submission Date | string | 87 classes |
| Generation | int64 | 0–6 |
| Base Model | string | lengths 4–79 |
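The records below follow the column order of the schema above, one field per line. As an illustration only, here is a hypothetical two-row slice of the table rebuilt as plain Python dicts (values copied from the first two records below), with a typical leaderboard-style query: the best average among the 8B entries.

```python
# Hypothetical in-memory slice of the leaderboard table; column names follow
# the schema above, values are copied from the first two records.
rows = [
    {"fullname": "DeepMount00/Llama-3.1-8b-Ita",
     "Type": "🔶 fine-tuned on domain-specific datasets",
     "#Params (B)": 8, "Average ⬆️": 25.963617, "Chat Template": False},
    {"fullname": "DreadPoor/Aspire-8B-model_stock",
     "Type": "🤝 base merges and moerges",
     "#Params (B)": 8, "Average ⬆️": 28.283991, "Chat Template": True},
]

# Pick the highest-scoring 8B model from the slice.
best = max((r for r in rows if r["#Params (B)"] == 8),
           key=lambda r: r["Average ⬆️"])
print(best["fullname"])  # → DreadPoor/Aspire-8B-model_stock
```

With the full dataset this same filter-and-sort pattern is what the leaderboard UI applies per column; the dict-of-fields shape mirrors one viewer record.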
DeepMount00_Llama-3.1-8b-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
DeepMount00/Llama-3.1-8b-Ita (https://huggingface.co/DeepMount00/Llama-3.1-8b-Ita) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-Ita-details
DeepMount00/Llama-3.1-8b-Ita
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
25.963617
3
8
false
true
true
false
false
0.536484
53.648431
0.517
31.333639
0.152568
15.256798
0.306208
7.494407
0.448719
15.15651
0.396027
32.891918
false
2024-08-13
2024-08-23
2
meta-llama/Meta-Llama-3.1-8B
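Reading across the record above, the Average ⬆️ value appears to be the plain arithmetic mean of the six scaled benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sanity check, using the numbers from the DeepMount00/Llama-3.1-8b-Ita row (an observation from this data, not documented behavior):

```python
# Scaled benchmark scores from the DeepMount00/Llama-3.1-8b-Ita record.
scores = {
    "IFEval": 53.648431,
    "BBH": 31.333639,
    "MATH Lvl 5": 15.256798,
    "GPQA": 7.494407,
    "MUSR": 15.15651,
    "MMLU-PRO": 32.891918,
}

# Unweighted mean of the six scaled scores.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # → 25.963617, matching the record's Average ⬆️
```

Note that the scaled columns are not simply the Raw columns times 100 (compare BBH Raw 0.517 with BBH 31.333639), so the mean is taken over the already-normalized values.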
DreadPoor_Aspire-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Aspire-8B-model_stock (https://huggingface.co/DreadPoor/Aspire-8B-model_stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire-8B-model_stock-details
DreadPoor/Aspire-8B-model_stock
5c23cb2aff877d0b7bdcfa4de43d1bc8a1852de0
28.283991
apache-2.0
1
8
true
false
true
false
true
0.714062
71.406202
0.527825
32.53427
0.129909
12.990937
0.314597
8.612975
0.42125
13.45625
0.37633
30.70331
false
2024-09-16
2024-09-17
1
DreadPoor/Aspire-8B-model_stock (Merge)
DreadPoor_Heart_Stolen-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Heart_Stolen-8B-Model_Stock (https://huggingface.co/DreadPoor/Heart_Stolen-8B-Model_Stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-8B-Model_Stock-details
DreadPoor/Heart_Stolen-8B-Model_Stock
6d77987af7115c7455ddb072c48316815b018999
28.98304
apache-2.0
2
8
true
false
true
false
true
0.724453
72.445334
0.539544
34.444822
0.146526
14.652568
0.317114
8.948546
0.416229
12.361979
0.379405
31.044991
false
2024-09-09
2024-09-10
1
DreadPoor/Heart_Stolen-8B-Model_Stock (Merge)
DreadPoor_Heart_Stolen-ALT-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Heart_Stolen-ALT-8B-Model_Stock (https://huggingface.co/DreadPoor/Heart_Stolen-ALT-8B-Model_Stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-ALT-8B-Model_Stock-details
DreadPoor/Heart_Stolen-ALT-8B-Model_Stock
03d1d70cb7eb5a743468b97c9c580028df487564
27.527959
apache-2.0
2
8
true
false
true
false
true
0.718358
71.83584
0.526338
32.354424
0.135952
13.595166
0.301174
6.823266
0.4055
9.754167
0.377244
30.804891
false
2024-09-11
2024-09-11
1
DreadPoor/Heart_Stolen-ALT-8B-Model_Stock (Merge)
DreadPoor_Irina-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Irina-8B-model_stock (https://huggingface.co/DreadPoor/Irina-8B-model_stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Irina-8B-model_stock-details
DreadPoor/Irina-8B-model_stock
b282e3ab449d71a31f48b8c13eb43a4435968728
25.161035
0
8
false
true
true
false
true
0.67994
67.994034
0.523664
32.08833
0.090634
9.063444
0.284396
4.58613
0.400292
8.636458
0.35738
28.597813
false
2024-08-30
0
DreadPoor/Irina-8B-model_stock
DreadPoor_ONeil-model_stock-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/ONeil-model_stock-8B (https://huggingface.co/DreadPoor/ONeil-model_stock-8B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__ONeil-model_stock-8B-details
DreadPoor/ONeil-model_stock-8B
d4b84956211fd57b85122fe0c6f88b2cb9a9c86a
26.784851
apache-2.0
2
8
true
false
true
false
true
0.678566
67.85662
0.554834
36.412613
0.092145
9.214502
0.305369
7.38255
0.417344
10.967969
0.359874
28.874852
false
2024-07-06
2024-07-15
1
DreadPoor/ONeil-model_stock-8B (Merge)
DreadPoor_Sellen-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Sellen-8B-model_stock (https://huggingface.co/DreadPoor/Sellen-8B-model_stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Sellen-8B-model_stock-details
DreadPoor/Sellen-8B-model_stock
accde7145d81a428c782695ea61eebc608efd980
26.173645
0
8
false
true
true
false
true
0.711289
71.128938
0.523168
31.360979
0.120846
12.084592
0.274329
3.243848
0.396042
10.671875
0.356965
28.55164
false
2024-08-27
0
DreadPoor/Sellen-8B-model_stock
DreadPoor_Trinas_Nectar-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
DreadPoor/Trinas_Nectar-8B-model_stock (https://huggingface.co/DreadPoor/Trinas_Nectar-8B-model_stock) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Trinas_Nectar-8B-model_stock-details
DreadPoor/Trinas_Nectar-8B-model_stock
cb46b8431872557904d83fc5aa1b90dabeb74acc
27.270692
apache-2.0
2
8
true
false
true
false
true
0.725927
72.592721
0.525612
31.975094
0.137462
13.746224
0.286074
4.809843
0.406771
11.413021
0.361785
29.087249
false
2024-08-16
2024-08-27
1
DreadPoor/Trinas_Nectar-8B-model_stock (Merge)
EleutherAI_gpt-j-6b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTJForCausalLM
EleutherAI/gpt-j-6b (https://huggingface.co/EleutherAI/gpt-j-6b) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-j-6b-details
EleutherAI/gpt-j-6b
47e169305d2e8376be1d31e765533382721b2cc1
6.545236
apache-2.0
1,410
6
true
true
true
false
false
0.252219
25.221856
0.319104
4.912818
0.012085
1.208459
0.245805
0
0.36575
5.252083
0.124086
2.676197
true
2022-03-02
2024-08-19
0
EleutherAI/gpt-j-6b
EleutherAI_gpt-neo-1.3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
EleutherAI/gpt-neo-1.3B (https://huggingface.co/EleutherAI/gpt-neo-1.3B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-1.3B-details
EleutherAI/gpt-neo-1.3B
dbe59a7f4a88d01d1ba9798d78dbe3fe038792c8
5.32815
mit
254
1
true
true
true
false
false
0.207905
20.790503
0.303923
3.024569
0.006798
0.679758
0.255872
0.782998
0.381656
4.873698
0.116356
1.817376
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-1.3B
EleutherAI_gpt-neo-125m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
EleutherAI/gpt-neo-125m (https://huggingface.co/EleutherAI/gpt-neo-125m) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details
EleutherAI/gpt-neo-125m
21def0189f5705e2521767faed922f1f15e7d7db
4.382146
mit
177
0
true
true
true
false
false
0.190544
19.054442
0.311516
3.436739
0.004532
0.453172
0.253356
0.447427
0.359333
2.616667
0.10256
0.284427
true
2022-03-02
2024-08-10
0
EleutherAI/gpt-neo-125m
EleutherAI_gpt-neo-2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
EleutherAI/gpt-neo-2.7B (https://huggingface.co/EleutherAI/gpt-neo-2.7B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-2.7B-details
EleutherAI/gpt-neo-2.7B
e24fa291132763e59f4a5422741b424fb5d59056
6.342931
mit
418
2
true
true
true
false
false
0.258963
25.896289
0.313952
4.178603
0.005287
0.528701
0.26594
2.12528
0.355365
3.520573
0.116273
1.808141
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-2.7B
EleutherAI_gpt-neox-20b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/gpt-neox-20b (https://huggingface.co/EleutherAI/gpt-neox-20b) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neox-20b-details
EleutherAI/gpt-neox-20b
c292233c833e336628618a88a648727eb3dff0a7
5.990641
apache-2.0
518
20
true
true
true
false
false
0.258688
25.868806
0.316504
4.929114
0.006042
0.60423
0.243289
0
0.364667
2.816667
0.115525
1.72503
true
2022-04-07
2024-06-09
0
EleutherAI/gpt-neox-20b
EleutherAI_pythia-12b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/pythia-12b (https://huggingface.co/EleutherAI/pythia-12b) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-12b-details
EleutherAI/pythia-12b
35c9d7f32fbb108fb8b5bdd574eb03369d1eed49
5.93396
apache-2.0
131
12
true
true
true
false
false
0.247148
24.714757
0.317965
4.987531
0.009063
0.906344
0.246644
0
0.364698
3.78724
0.110871
1.20789
true
2023-02-28
2024-06-12
0
EleutherAI/pythia-12b
EleutherAI_pythia-160m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/pythia-160m (https://huggingface.co/EleutherAI/pythia-160m) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details
EleutherAI/pythia-160m
50f5173d932e8e61f858120bcb800b97af589f46
5.617102
apache-2.0
25
0
true
true
true
false
false
0.181552
18.155162
0.297044
2.198832
0.002266
0.226586
0.258389
1.118568
0.417938
10.675521
0.111951
1.32794
true
2023-02-08
2024-06-09
0
EleutherAI/pythia-160m
EleutherAI_pythia-2.8b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/pythia-2.8b (https://huggingface.co/EleutherAI/pythia-2.8b) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-2.8b-details
EleutherAI/pythia-2.8b
2a259cdd96a4beb1cdf467512e3904197345f6a9
5.441653
apache-2.0
28
2
true
true
true
false
false
0.217322
21.732226
0.322409
5.077786
0.006798
0.679758
0.25
0
0.348573
3.638281
0.113697
1.521868
true
2023-02-13
2024-06-12
0
EleutherAI/pythia-2.8b
EleutherAI_pythia-410m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/pythia-410m (https://huggingface.co/EleutherAI/pythia-410m) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-410m-details
EleutherAI/pythia-410m
9879c9b5f8bea9051dcb0e68dff21493d67e9d4f
5.113779
apache-2.0
21
0
true
true
true
false
false
0.219545
21.954525
0.302813
2.715428
0.003021
0.302115
0.259228
1.230425
0.357813
3.059896
0.112783
1.420287
true
2023-02-13
2024-06-09
0
EleutherAI/pythia-410m
EleutherAI_pythia-6.9b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
EleutherAI/pythia-6.9b (https://huggingface.co/EleutherAI/pythia-6.9b) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-6.9b-details
EleutherAI/pythia-6.9b
f271943e880e60c0c715fd10e4dc74ec4e31eb44
5.853254
apache-2.0
43
6
true
true
true
false
false
0.228114
22.811363
0.323229
5.881632
0.007553
0.755287
0.251678
0.223714
0.359052
3.814844
0.114694
1.632683
true
2023-02-14
2024-06-12
0
EleutherAI/pythia-6.9b
Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 (https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
328722ae96e3a112ec900dbe77d410788a526c5c
15.180945
creativeml-openrail-m
0
8
true
true
true
false
true
0.418881
41.888079
0.407495
16.875928
0.006042
0.60423
0.270973
2.796421
0.417
10.758333
0.263464
18.162677
false
2024-06-27
2024-06-30
0
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
Enno-Ai/EnnoAi-Pro-Llama-3-8B (https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details
Enno-Ai/EnnoAi-Pro-Llama-3-8B
6a5d745bdd304753244fe601e2a958d37d13cd71
12.174667
creativeml-openrail-m
0
8
true
true
true
false
true
0.319538
31.953772
0.415158
17.507545
0.001511
0.151057
0.261745
1.565996
0.407052
9.08151
0.215093
12.788121
false
2024-07-01
2024-07-08
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 (https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
cf29b8b484a909132e3a1f85ce891d28347c0d13
17.498882
creativeml-openrail-m
0
8
true
true
true
false
true
0.508257
50.825698
0.410058
16.668386
0.010574
1.057402
0.265101
2.013423
0.423573
12.313281
0.299036
22.1151
false
2024-06-26
2024-06-26
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9 (https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3.1-8B-v0.9-details
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
c740871122fd471a1a225cf2b4368e333752d74c
14.945694
apache-2.0
0
8
true
true
true
false
true
0.468915
46.89147
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
2024-08-22
2024-09-06
0
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
EnnoAi_EnnoAi-Pro-Llama-3.1-8B-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0 (https://huggingface.co/EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-Pro-Llama-3.1-8B-v1.0-details
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
c740871122fd471a1a225cf2b4368e333752d74c
14.97109
apache-2.0
0
8
true
true
true
false
true
0.470438
47.043844
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
2024-08-22
2024-09-06
0
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
Epiculous_Azure_Dusk-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
Epiculous/Azure_Dusk-v0.2 (https://huggingface.co/Epiculous/Azure_Dusk-v0.2) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Azure_Dusk-v0.2-details
Epiculous/Azure_Dusk-v0.2
ebddf1b2efbe7f9cae066d263b0991ded89c88e8
14.025651
apache-2.0
6
12
true
true
true
false
true
0.346716
34.67156
0.411972
17.396414
0.016616
1.661631
0.260906
1.454139
0.383458
6.365625
0.303441
22.604536
false
2024-09-09
2024-09-14
0
Epiculous/Azure_Dusk-v0.2
Epiculous_Crimson_Dawn-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
Epiculous/Crimson_Dawn-v0.2 (https://huggingface.co/Epiculous/Crimson_Dawn-v0.2) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Crimson_Dawn-v0.2-details
Epiculous/Crimson_Dawn-v0.2
4cceb1e25026afef241ad5325097e88eccd8f37a
14.8216
apache-2.0
7
12
true
true
true
false
true
0.310345
31.034544
0.448238
21.688249
0.02719
2.719033
0.276007
3.467562
0.415177
10.897135
0.272108
19.123079
false
2024-09-02
2024-09-05
0
Epiculous/Crimson_Dawn-v0.2
Epiculous_Violet_Twilight-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
Epiculous/Violet_Twilight-v0.2 (https://huggingface.co/Epiculous/Violet_Twilight-v0.2) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Violet_Twilight-v0.2-details
Epiculous/Violet_Twilight-v0.2
30c8bad3c1f565150afbf2fc90cacf4f45d096f6
18.527597
apache-2.0
1
12
true
false
true
false
true
0.453178
45.317757
0.461455
23.940537
0.02719
2.719033
0.26594
2.12528
0.429938
13.608854
0.311087
23.454122
false
2024-09-12
2024-09-16
0
Epiculous/Violet_Twilight-v0.2
EpistemeAI_Alpaca-Llama3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
EpistemeAI/Alpaca-Llama3.1-8B (https://huggingface.co/EpistemeAI/Alpaca-Llama3.1-8B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Alpaca-Llama3.1-8B-details
EpistemeAI/Alpaca-Llama3.1-8B
3152dfa17322dff7c6af6dbf3daceaf5db51e230
13.833989
apache-2.0
0
8
true
true
true
false
false
0.159869
15.986915
0.475526
25.935227
0.041541
4.154079
0.290268
5.369128
0.34026
6.599219
0.324634
24.959368
false
2024-09-11
2024-08-13
2
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Athena-gemma-2-2b-it_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
EpistemeAI/Athena-gemma-2-2b-it (https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details
EpistemeAI/Athena-gemma-2-2b-it
661c1dc6a1a096222e33416e099bd02b7b970405
14.269153
apache-2.0
2
2
true
true
true
false
false
0.313417
31.341729
0.426423
19.417818
0.032477
3.247734
0.268456
2.46085
0.435052
13.348177
0.242188
15.798611
false
2024-08-29
2024-09-06
2
unsloth/gemma-2-9b-it-bnb-4bit
EpistemeAI_Athena-gemma-2-2b-it-Philos_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
EpistemeAI/Athena-gemma-2-2b-it-Philos (https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it-Philos) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-Philos-details
EpistemeAI/Athena-gemma-2-2b-it-Philos
dea2b35d496bd32ed3c88d42ff3022654153f2e1
15.097481
apache-2.0
0
2
true
true
true
false
true
0.462095
46.209502
0.379478
13.212088
0.003021
0.302115
0.28104
4.138702
0.431365
12.853906
0.224817
13.868573
false
2024-09-05
2024-09-05
1
unsloth/gemma-2-2b-it-bnb-4bit
EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 (https://huggingface.co/EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athene-codegemma-2-7b-it-alpaca-v1.3-details
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3
9c26e1242a11178b53937bc0e9a744ef6141e05a
17.213317
apache-2.0
0
7
true
true
true
false
false
0.402994
40.299406
0.433192
20.873795
0.055891
5.589124
0.280201
4.026846
0.450302
14.854427
0.258727
17.636303
false
2024-09-06
2024-09-06
2
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1
EpistemeAI_FineLlama3.1-8B-Instruct_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
EpistemeAI/FineLlama3.1-8B-Instruct (https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details
EpistemeAI/FineLlama3.1-8B-Instruct
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
11.050434
llama3.1
1
14
true
true
true
false
false
0.08001
8.000993
0.455736
23.506619
0.023414
2.34139
0.280201
4.026846
0.348167
4.954167
0.311253
23.472592
false
2024-08-10
2024-08-10
0
EpistemeAI/FineLlama3.1-8B-Instruct
EpistemeAI_Fireball-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
EpistemeAI/Fireball-12B (https://huggingface.co/EpistemeAI/Fireball-12B) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details
EpistemeAI/Fireball-12B
e2ed12c3244f2502321fb20e76dfc72ad7817d6e
15.446415
apache-2.0
1
12
true
true
true
false
false
0.18335
18.335018
0.511089
30.666712
0.035498
3.549849
0.261745
1.565996
0.423635
12.521094
0.334358
26.03982
false
2024-08-20
2024-08-21
2
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-sft-v2.1
EpistemeAI_Fireball-12B-v1.13a-philosophers_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
EpistemeAI/Fireball-12B-v1.13a-philosophers (https://huggingface.co/EpistemeAI/Fireball-12B-v1.13a-philosophers) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-v1.13a-philosophers-details
EpistemeAI/Fireball-12B-v1.13a-philosophers
7fa824d4a40abca3f8c75d432ea151dc0d1d67d6
14.34016
apache-2.0
2
12
true
true
true
false
false
0.087553
8.755325
0.51027
30.336233
0.03852
3.851964
0.301174
6.823266
0.408073
9.975781
0.336686
26.298389
false
2024-08-28
2024-09-03
1
EpistemeAI/Fireball-Mistral-Nemo-12B-cot-orcas
EpistemeAI_Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200 (https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200-details
EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200
27d67626304954db71f21fec9e7fc516421274ec
20.878152
apache-2.0
0
8
true
true
true
false
false
0.457724
45.772439
0.48384
26.377774
0.108006
10.800604
0.300336
6.711409
0.394458
6.907292
0.358295
28.699394
false
2024-09-16
2024-09-16
3
unsloth/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta (https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta-details
EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta
2851384717556dd6ac14c00ed87aac1f267eb263
24.902349
apache-2.0
0
8
true
true
true
false
true
0.727401
72.740107
0.486489
26.897964
0.132175
13.217523
0.280201
4.026846
0.361938
4.275521
0.354305
28.256132
false
2024-09-12
2024-09-14
4
unsloth/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2 (https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2-details
EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2
b19336101aa5f4807d1574f4c11eebc1c1a1c34e
22.323891
apache-2.0
0
8
true
true
true
false
false
0.467316
46.731561
0.493203
28.247009
0.110272
11.02719
0.286074
4.809843
0.462365
16.995573
0.335189
26.132166
false
2024-09-14
2024-09-14
2
unsloth/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2 (https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2) · 📑 details: https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2
2cf732fbffefdf37341b946edd7995f14d3f9487
15.251223
apache-2.0
0
12
true
true
true
false
false
0.186073
18.607295
0.496777
28.567825
0.030967
3.096677
0.291946
5.592841
0.40401
9.501302
0.335273
26.141401
false
2024-08-19
2024-08-19
1
EpistemeAI/Fireball-Nemo-Base-2407-sft-v1
EpistemeAI2_Athene-codegemma-2-7b-it-alpaca-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Athene-codegemma-2-7b-it-alpaca-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2
21b31062334a316b50680e8c3a141a72e4c30b61
15.579922
apache-2.0
0
7
true
true
true
false
false
0.435118
43.511771
0.417542
18.97137
0.033988
3.398792
0.270973
2.796421
0.416969
10.38776
0.229721
14.413416
false
2024-08-26
2024-08-26
2
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1
EpistemeAI2_Fireball-12B-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-12B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-12B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-12B-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-12B-v1.2
57af42edf8232189ee99e9a21e33a0c306e3f561
15.061817
apache-2.0
1
12
true
true
true
false
false
0.135539
13.553926
0.501858
29.776014
0.033233
3.323263
0.298658
6.487696
0.417313
11.264062
0.333693
25.965943
false
2024-08-27
2024-08-28
1
EpistemeAI/Fireball-Mistral-Nemo-12B-v2b
EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos
3dcca4cf9bdd9003c8dc91f5c78cefef1d4ae0d7
22.388028
apache-2.0
1
8
true
true
true
false
false
0.49864
49.864027
0.497758
29.259226
0.108761
10.876133
0.292785
5.704698
0.427667
11.891667
0.340592
26.732417
false
2024-08-29
2024-08-29
2
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.01-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos
f97293ed5cec7fb9482b16600259967c6c923e4b
21.39091
apache-2.0
0
8
true
true
true
false
false
0.421179
42.117914
0.495611
28.628475
0.125378
12.537764
0.288591
5.145414
0.437062
13.432813
0.338348
26.483082
false
2024-09-03
2024-09-03
2
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.03-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos
6e60f783f80f7d126b8e4f2b417e14dea63d2c4f
20.09834
apache-2.0
0
8
true
true
true
false
false
0.388081
38.80814
0.495087
27.992549
0.117825
11.782477
0.278523
3.803132
0.42801
12.034635
0.335522
26.169105
false
2024-09-04
2024-09-04
2
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.04-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos
efd0c251373e1a2fa2bc8cead502c03ff6dc7c8b
20.804991
apache-2.0
0
8
true
true
true
false
false
0.40844
40.843961
0.493001
27.963798
0.102719
10.271903
0.290268
5.369128
0.437219
13.685677
0.340259
26.695479
false
2024-09-05
2024-09-05
2
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo
3e76f190b505b515479cc25e92f8229c2b05159f
21.641046
apache-2.0
0
8
true
true
true
false
false
0.486576
48.657562
0.488077
27.207177
0.117069
11.706949
0.297819
6.375839
0.393188
6.848437
0.361453
29.05031
false
2024-09-09
2024-09-09
4
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math
0b2842bddfa6c308f67eb5a20daf04536a4e6d1a
21.719108
apache-2.0
0
8
true
true
true
false
false
0.507908
50.790791
0.484702
26.901201
0.104985
10.498489
0.296141
6.152125
0.406302
7.854427
0.353059
28.117612
false
2024-09-10
2024-09-10
3
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Relection_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Relection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Relection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Relection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Relection
dc900138b4406353b7e84251bc8649d70c16f13f
20.705803
apache-2.0
0
8
true
true
true
false
false
0.395226
39.522578
0.495531
27.571611
0.113293
11.329305
0.299497
6.599553
0.404813
10.401563
0.359292
28.81021
false
2024-09-16
2024-09-16
5
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1
c57c786426123635baf6c8b4d30638d2053f4565
22.259426
apache-2.0
0
8
true
true
true
false
false
0.531638
53.163828
0.482793
26.763685
0.108761
10.876133
0.29698
6.263982
0.410302
8.454427
0.352311
28.034501
false
2024-09-13
2024-09-13
3
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Relection_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Relection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Relection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Llama-3.1-8B-Philos-Relection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Relection
4b0b75d9235886e8a947c45b94f87c5a65a81467
20.238252
apache-2.0
0
8
true
true
true
false
false
0.359605
35.960474
0.489769
27.769796
0.120091
12.009063
0.307886
7.718121
0.395729
9.632813
0.355053
28.339243
false
2024-09-17
2024-09-17
4
unsloth/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-MathMistral-Nemo-Base-2407-v2dpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-MathMistral-Nemo-Base-2407-v2dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo
6b7d851c66359f39d16da6fbcf810b816dc6e4bc
11.294454
apache-2.0
1
11
true
true
true
false
true
0.30972
30.972043
0.432764
21.145528
0.032477
3.247734
0.263423
1.789709
0.402958
8.969792
0.114777
1.641918
false
2024-08-21
2024-08-24
2
unsloth/Mistral-Nemo-Base-2407-bnb-4bit
Eric111_CatunaMayo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Eric111/CatunaMayo
23337893381293975cbcc35f75b634954fbcefaf
21.148098
apache-2.0
0
7
true
false
true
false
false
0.407416
40.741566
0.524364
33.299426
0.077039
7.703927
0.291946
5.592841
0.45399
15.348698
0.317819
24.202128
false
2024-02-15
2024-07-03
0
Eric111/CatunaMayo
Eric111_CatunaMayo-DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Eric111/CatunaMayo-DPO
6bdbe06c10d57d152dd8a79a71edd8e30135b689
21.154416
apache-2.0
0
7
true
false
true
false
false
0.421454
42.145396
0.522399
33.089952
0.073263
7.326284
0.291946
5.592841
0.445031
14.66224
0.316988
24.109781
false
2024-02-21
2024-06-27
0
Eric111/CatunaMayo-DPO
Etherll_Herplete-LLM-Llama-3.1-8b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Etherll/Herplete-LLM-Llama-3.1-8b
b3829cf437216f099c031a9ab5e4c8ec974766dd
19.588708
4
8
false
true
true
false
true
0.467191
46.71915
0.501343
28.952591
0.027946
2.794562
0.286074
4.809843
0.386
6.683333
0.348155
27.572769
false
2024-08-24
2024-08-29
1
Etherll/Herplete-LLM-Llama-3.1-8b (Merge)
Etherll_Replete-LLM-V3-Llama-3.1-8b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Etherll/Replete-LLM-V3-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Replete-LLM-V3-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Replete-LLM-V3-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Etherll/Replete-LLM-V3-Llama-3.1-8b
e79849d72f70ef74677ed81a8885403973b2470c
17.927882
4
8
false
true
true
false
true
0.526292
52.629246
0.454338
22.902455
0.000755
0.075529
0.268456
2.46085
0.351646
2.055729
0.346991
27.443484
false
2024-08-24
2024-08-26
1
Etherll/Replete-LLM-V3-Llama-3.1-8b (Merge)
Eurdem_Defne-llama3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Eurdem/Defne-llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eurdem/Defne-llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eurdem__Defne-llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Eurdem/Defne-llama3.1-8B
7832ba3066636bf4dab3e7d658c0b3ded12491ae
24.805902
llama3.1
2
8
true
true
true
false
false
0.503612
50.361153
0.532098
32.822381
0.141239
14.123867
0.296141
6.152125
0.433094
13.536719
0.386553
31.83917
false
2024-07-29
2024-08-14
0
Eurdem/Defne-llama3.1-8B
FallenMerick_Chewy-Lemon-Cookie-11B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FallenMerick/Chewy-Lemon-Cookie-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FallenMerick__Chewy-Lemon-Cookie-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FallenMerick/Chewy-Lemon-Cookie-11B
0f5d0d6d218b3ef034f58eba32d6fe7ac4c237ae
21.905256
cc-by-4.0
0
10
true
false
true
false
false
0.487524
48.752421
0.525112
33.0143
0.046073
4.607251
0.279362
3.914989
0.454552
15.952344
0.326712
25.190233
false
2024-06-06
2024-06-27
1
FallenMerick/Chewy-Lemon-Cookie-11B (Merge)
Felladrin_Llama-160M-Chat-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Felladrin/Llama-160M-Chat-v1
e7f50665676821867ee7dfad32d0ca9fb68fc6bc
4.101061
apache-2.0
14
0
true
true
true
false
true
0.157546
15.754642
0.303608
3.166756
0
0
0.25755
1.006711
0.366125
3.165625
0.113614
1.512633
false
2023-12-20
2024-07-23
1
JackFram/llama-160m
Felladrin_Minueza-32M-UltraChat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Felladrin/Minueza-32M-UltraChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Minueza-32M-UltraChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Felladrin/Minueza-32M-UltraChat
28506b99c5902d2215eb378ec91d4226a7396c49
3.848727
apache-2.0
3
0
true
true
true
false
true
0.137563
13.756278
0.294148
2.43729
0
0
0.255872
0.782998
0.374187
4.640104
0.113281
1.475694
false
2024-02-27
2024-07-23
1
Felladrin/Minueza-32M-Base
FuJhen_ft-openhermes-25-mistral-7b-irca-dpo-pairs_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__ft-openhermes-25-mistral-7b-irca-dpo-pairs-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs
24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33
19.615525
apache-2.0
0
14
true
true
true
false
true
0.542004
54.20041
0.477303
26.596861
0.001511
0.151057
0.278523
3.803132
0.417375
11.205208
0.295628
21.73648
false
2024-09-12
2024-09-12
1
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs (Merge)
FuJhen_mistral-instruct-7B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral-instruct-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral-instruct-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral-instruct-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral-instruct-7B-DPO
e0bc86c23ce5aae1db576c8cca6f06f1f73af2db
18.96659
apache-2.0
0
14
true
true
true
false
true
0.496842
49.684171
0.462391
24.925827
0.034743
3.47432
0.277685
3.691275
0.401563
9.428646
0.303358
22.595301
false
2024-09-12
2024-09-12
1
FuJhen/mistral-instruct-7B-DPO (Merge)
FuJhen_mistral_7b_v0.1_structedData_e2e_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_e2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_e2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_e2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral_7b_v0.1_structedData_e2e
7231864981174d9bee8c7687c24c8344414eae6b
10.871547
apache-2.0
0
7
true
true
true
false
false
0.172684
17.268403
0.411391
18.062424
0.002266
0.226586
0.279362
3.914989
0.372292
5.636458
0.281084
20.12042
false
2024-09-13
2024-09-13
1
FuJhen/mistral_7b_v0.1_structedData_e2e (Merge)
FuJhen_mistral_7b_v0.1_structedData_viggo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_viggo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_viggo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_viggo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral_7b_v0.1_structedData_viggo
7231864981174d9bee8c7687c24c8344414eae6b
12.302113
apache-2.0
0
14
true
true
true
false
false
0.178329
17.832906
0.452386
23.960172
0.020393
2.039275
0.283557
4.474273
0.373813
3.926563
0.294215
21.579492
false
2024-09-13
2024-09-13
1
FuJhen/mistral_7b_v0.1_structedData_viggo (Merge)
GalrionSoftworks_MN-LooseCannon-12B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/GalrionSoftworks/MN-LooseCannon-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MN-LooseCannon-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MN-LooseCannon-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
GalrionSoftworks/MN-LooseCannon-12B-v1
21.784548
6
12
false
true
true
false
true
0.541779
54.177915
0.512818
29.976062
0.064955
6.495468
0.285235
4.697987
0.413844
10.963802
0.319564
24.396055
false
2024-08-09
2024-09-05
1
GalrionSoftworks/MN-LooseCannon-12B-v1 (Merge)
GalrionSoftworks_MagnusIntellectus-12B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/GalrionSoftworks/MagnusIntellectus-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MagnusIntellectus-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MagnusIntellectus-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
GalrionSoftworks/MagnusIntellectus-12B-v1
fc83cb3eec2f8328448c5fe3cb830fc77983a6b9
21.546709
4
12
false
true
true
false
true
0.442137
44.213686
0.532301
33.262254
0.05136
5.135952
0.284396
4.58613
0.442802
15.183594
0.342088
26.898641
false
2024-08-13
2024-09-05
1
GalrionSoftworks/MagnusIntellectus-12B-v1 (Merge)
GritLM_GritLM-7B-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/GritLM/GritLM-7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
GritLM/GritLM-7B-KTO
b5c48669508c1de18c698460c187f64e90e7df44
19.147778
apache-2.0
4
7
true
true
true
false
true
0.531013
53.101327
0.485294
27.904318
0.021903
2.190332
0.297819
6.375839
0.371021
6.644271
0.268035
18.670582
false
2024-04-16
2024-08-04
0
GritLM/GritLM-7B-KTO
GritLM_GritLM-8x7B-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/GritLM/GritLM-8x7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-8x7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-8x7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
GritLM/GritLM-8x7B-KTO
938913477064fcc498757c5136d9899bb6e713ed
25.624487
apache-2.0
3
46
true
true
true
false
true
0.571405
57.140498
0.58203
40.826162
0.085347
8.534743
0.296141
6.152125
0.421656
11.673698
0.364777
29.419696
false
2024-04-17
2024-08-04
0
GritLM/GritLM-8x7B-KTO
Gryphe_Pantheon-RP-1.0-8b-Llama-3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.0-8b-Llama-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.0-8b-Llama-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.0-8b-Llama-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Gryphe/Pantheon-RP-1.0-8b-Llama-3
70a6df202c9df9abdc6928bec5a5ab47f2667aee
16.684301
apache-2.0
43
8
true
true
true
false
true
0.393252
39.325213
0.453908
23.631915
0.052115
5.21148
0.276007
3.467562
0.38324
5.504948
0.306682
22.964687
false
2024-05-08
2024-06-27
1
meta-llama/Meta-Llama-3-8B
Gryphe_Pantheon-RP-1.5-12b-Nemo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.5-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.5-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.5-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Gryphe/Pantheon-RP-1.5-12b-Nemo
00107381f05f69666772d88a1b11affe77c94a47
21.23563
apache-2.0
27
12
true
true
true
false
true
0.476308
47.630842
0.519582
31.750144
0.043807
4.380665
0.272651
3.020134
0.442031
15.053906
0.330203
25.578088
false
2024-07-25
2024-08-04
1
mistralai/Mistral-Nemo-Base-2407
Gryphe_Pantheon-RP-1.6-12b-Nemo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Gryphe/Pantheon-RP-1.6-12b-Nemo
60cf38ae0367baf314e3cce748d9a199adfea557
20.314837
apache-2.0
11
12
true
true
true
false
true
0.448057
44.805671
0.520401
31.687344
0.030967
3.096677
0.277685
3.691275
0.42876
12.928385
0.331117
25.679669
false
2024-08-18
2024-08-31
1
mistralai/Mistral-Nemo-Base-2407
Gryphe_Pantheon-RP-1.6-12b-Nemo-KTO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO
6cb6d8d9a7352d71f539ab5053987e058c090443
21.319424
apache-2.0
3
12
true
true
true
false
true
0.463619
46.361875
0.527698
33.0322
0.03852
3.851964
0.295302
6.040268
0.424792
12.165625
0.338182
26.464613
false
2024-08-28
2024-08-31
1
mistralai/Mistral-Nemo-Base-2407
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HiroseKoichi/Llama-Salad-4x8B-V3
a343915429779efbd1478f01ba1f7fd9d8d226c0
24.746468
llama3
4
24
true
false
false
false
true
0.665352
66.535238
0.524465
31.928849
0.085347
8.534743
0.302852
7.04698
0.374031
6.453906
0.351812
27.979093
false
2024-06-17
2024-06-26
0
HiroseKoichi/Llama-Salad-4x8B-V3
HuggingFaceH4_zephyr-7b-alpha_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-alpha
2ce2d025864af849b3e5029e2ec9d568eeda892d
18.5341
mit
1,092
7
true
true
true
false
true
0.519148
51.914808
0.458786
23.955291
0.015106
1.510574
0.297819
6.375839
0.394958
7.503125
0.279505
19.944962
true
2023-10-09
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-beta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-beta
b70e0c9a2d9e14bd1e812d3c398e5f313e93b473
17.716709
mit
1,570
7
true
true
true
false
true
0.495043
49.504315
0.431582
21.487542
0.024169
2.416918
0.290268
5.369128
0.392542
7.734375
0.278092
19.787973
true
2023-10-26
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-gemma-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-gemma-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-gemma-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-gemma-v0.1
03b3427d0ed07d2e0f86c0a7e53d82d4beef9540
15.778281
other
121
8
true
true
true
false
true
0.336374
33.637415
0.462374
23.751163
0.066465
6.646526
0.294463
5.928412
0.373969
4.179427
0.284741
20.526743
true
2024-03-01
2024-06-12
2
google/gemma-7b
HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-orpo-141b-A35b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1
a3be084543d278e61b64cd600f28157afc79ffd3
33.773496
apache-2.0
260
140
true
true
true
false
true
0.651089
65.108911
0.629044
47.503796
0.183535
18.353474
0.378356
17.114094
0.446521
14.715104
0.45861
39.845597
true
2024-04-10
2024-06-12
1
mistral-community/Mixtral-8x22B-v0.1
HuggingFaceTB_SmolLM-1.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B
673a07602ca1191e5bc2ddda428e2f608a0a14c0
5.425399
apache-2.0
150
1
true
true
true
false
false
0.236157
23.615673
0.318052
4.411128
0.007553
0.755287
0.241611
0
0.342094
2.128385
0.114777
1.641918
false
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-1.7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B-Instruct
0ad161e59935a9a691dfde2818df8b98786f30a7
5.138222
apache-2.0
96
1
true
true
true
false
true
0.234783
23.47826
0.288511
2.080374
0
0
0.260067
1.342282
0.348667
2.083333
0.116606
1.84508
false
2024-07-15
2024-07-18
1
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M
eec6e461571fba3e197a57c298f60b75422eae02
6.838197
apache-2.0
156
0
true
true
true
false
false
0.212476
21.247623
0.304605
3.2854
0.006798
0.679758
0.258389
1.118568
0.436604
13.342188
0.112201
1.355644
false
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M-Instruct
8ca7af58e27777cae460ad8ca3ab9db15f5c160d
4.234041
apache-2.0
85
0
true
true
true
false
true
0.15962
15.961981
0.288511
2.080374
0
0
0.264262
1.901566
0.367396
3.624479
0.116523
1.835845
false
2024-07-15
2024-07-18
1
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-360M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M
318cc630b73730bfd712e5873063156ffb8936b5
6.147596
apache-2.0
50
0
true
true
true
false
false
0.213351
21.335058
0.306452
3.284915
0.004532
0.453172
0.267617
2.348993
0.401781
8.089323
0.112367
1.374113
false
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-360M
HuggingFaceTB_SmolLM-360M-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M-Instruct
8e951de8c220295ea4f85d078c4e320df7137535
4.706784
apache-2.0
64
0
true
true
true
false
true
0.195165
19.516549
0.288511
2.080374
0
0
0.264262
1.901566
0.347177
2.897135
0.116606
1.84508
false
2024-07-15
2024-08-20
1
HuggingFaceTB/SmolLM-360M
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IDEA-CCNL/Ziya-LLaMA-13B-v1
64d931f346e1a49ea3bbca07a83137075bab1c66
3.906425
gpl-3.0
273
13
true
true
true
false
false
0.169686
16.968643
0.287703
1.463617
0
0
0.249161
0
0.375052
3.88151
0.110123
1.124778
true
2023-05-16
2024-06-12
0
IDEA-CCNL/Ziya-LLaMA-13B-v1
Intel_neural-chat-7b-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3
fc679274dfcd28a8b6087634f71af7ed2a0659c4
17.943646
apache-2.0
65
7
true
true
true
false
false
0.277797
27.779736
0.504832
30.205692
0.021903
2.190332
0.291946
5.592841
0.50549
23.019531
0.269864
18.873744
true
2023-10-25
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-1
c0d379a49c1c0579529d5e6f2e936ddb759552a8
21.004986
apache-2.0
542
7
true
true
true
false
false
0.46869
46.868974
0.505157
29.739752
0.031722
3.172205
0.290268
5.369128
0.497896
22.236979
0.267786
18.642878
true
2023-11-14
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-2
0d8f77647810d21d935ea90c66d6339b85e65a75
21.433647
apache-2.0
56
7
true
true
true
false
false
0.49884
49.883975
0.503223
30.237458
0.045317
4.531722
0.290268
5.369128
0.489521
20.056771
0.266705
18.522828
true
2023-11-21
2024-06-12
0
Intel/neural-chat-7b-v3-2
Intel_neural-chat-7b-v3-3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-3
bdd31cf498d13782cc7497cba5896996ce429f91
19.99112
apache-2.0
74
7
true
true
true
false
false
0.476259
47.625855
0.487662
27.753851
0.006798
0.679758
0.28943
5.257271
0.485958
20.578125
0.262467
18.051862
true
2023-12-09
2024-06-12
2
mistralai/Mistral-7B-v0.1
Isaak-Carter_JOSIEv4o-8b-stage1-v4_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.567377
apache-2.0
1
8
true
true
true
false
false
0.255266
25.526603
0.472497
25.787276
0.046828
4.682779
0.291946
5.592841
0.365438
6.079687
0.331616
25.735077
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Isaak-Carter_JOSIEv4o-8b-stage1-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.305979
apache-2.0
1
8
true
true
true
false
false
0.247697
24.769722
0.475807
25.919578
0.03852
3.851964
0.291107
5.480984
0.364104
6.346354
0.329205
25.467272
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Jimmy19991222_Llama-3-Instruct-8B-SimPO-v0.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2
53a517ceaef324efc3626be44140b4f18a010591
24.229596
0
8
false
true
true
false
true
0.654037
65.403684
0.498371
29.123823
0.04003
4.003021
0.314597
8.612975
0.40125
8.389583
0.3686
29.844489
false
2024-09-06
0
Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun
00c02a823b4ff1a6cfcded6085ba9630df633998
23.742176
llama3
0
8
true
true
true
false
true
0.671722
67.172214
0.48798
27.755229
0.036254
3.625378
0.294463
5.928412
0.404073
8.709115
0.363364
29.262707
false
2024-09-17
2024-09-18
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4
de8bb28ad7a9d1158f318a4461dc47ad03e6e560
22.814724
0
8
false
true
true
false
true
0.628458
62.845805
0.498609
29.329732
0.016616
1.661631
0.292785
5.704698
0.401375
9.071875
0.354471
28.274601
false
2024-09-06
0
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun
e9692d8dbe30273839763757aa9ef07a5fcf0c59
24.033145
llama3
0
8
true
true
true
false
true
0.66775
66.775046
0.494046
28.390676
0.04003
4.003021
0.306208
7.494407
0.398708
8.005208
0.365775
29.530511
false
2024-09-14
2024-09-15
1
meta-llama/Meta-Llama-3-8B-Instruct
Josephgflowers_Cinder-Phi-2-V1-F16-gguf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Cinder-Phi-2-V1-F16-gguf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Cinder-Phi-2-V1-F16-gguf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Cinder-Phi-2-V1-F16-gguf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
85629ec9b18efee31d07630664e7a3815121badf
10.855703
mit
4
2
true
true
true
false
true
0.235657
23.565695
0.439662
22.453402
0
0
0.281879
4.250559
0.343458
1.965625
0.21609
12.898936
false
2024-02-25
2024-06-26
0
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
Josephgflowers_TinyLlama-Cinder-Agent-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama-Cinder-Agent-v1
a9cd8b48bfe30f29bb1f819213da9a4c41eee67f
5.816564
mit
0
1
true
true
true
false
true
0.266956
26.695612
0.311604
3.804167
0.003776
0.377644
0.244128
0
0.339458
2.232292
0.116107
1.789672
false
2024-05-21
2024-06-26
4
Josephgflowers/TinyLlama-3T-Cinder-v1.2
Josephgflowers_TinyLlama_v1.1_math_code-world-test-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama_v1.1_math_code-world-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama_v1.1_math_code-world-test-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama_v1.1_math_code-world-test-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
6f7c2aaf0b8723bc6a1dc23a4a1ff0ec24dc11ec
1.826578
mit
0
1
true
true
true
false
false
0.007844
0.784363
0.314635
4.164017
0.009063
0.906344
0.23406
0
0.349906
3.638281
0.113198
1.46646
false
2024-06-23
2024-09-09
0
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
KSU-HW-SEC_Llama3-70b-SVA-FT-1415_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-1415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-1415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-1415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-1415
1c09728455567898116d2d9cfb6cbbbbd4ee730c
35.80453
0
70
false
true
true
false
false
0.617991
61.799137
0.665015
51.328741
0.200906
20.090634
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
2024-09-08
2024-09-08
0
KSU-HW-SEC/Llama3-70b-SVA-FT-1415
KSU-HW-SEC_Llama3-70b-SVA-FT-500_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-500
856a23f28aeada23d1135c86a37e05524307e8ed
35.613833
0
70
false
true
true
false
false
0.610522
61.05223
0.669224
51.887026
0.193353
19.335347
0.380872
17.449664
0.451146
16.993229
0.522689
46.965499
false
2024-09-08
2024-09-08
0
KSU-HW-SEC/Llama3-70b-SVA-FT-500
KSU-HW-SEC_Llama3-70b-SVA-FT-final_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-final
391bbd94173b34975d1aa2c7356977a630253b75
35.779134
0
70
false
true
true
false
false
0.616468
61.646764
0.665015
51.328741
0.200906
20.090634
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
2024-09-08
2024-09-08
0
KSU-HW-SEC/Llama3-70b-SVA-FT-final
KSU-HW-SEC_Llama3.1-70b-SVA-FT-1000step_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3.1-70b-SVA-FT-1000step-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step
b195fea0d8f350ff29243d4e88654b1baa5af79e
40.334851
0
70
false
true
true
false
false
0.723804
72.380395
0.690312
55.485365
0.296073
29.607251
0.395973
19.463087
0.459177
17.830469
0.525183
47.242538
false
2024-09-08
2024-09-08
0
KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step
Kquant03_CognitiveFusion2-4x7B-BF16_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Kquant03/CognitiveFusion2-4x7B-BF16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kquant03/CognitiveFusion2-4x7B-BF16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kquant03__CognitiveFusion2-4x7B-BF16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kquant03/CognitiveFusion2-4x7B-BF16
db45b86c462bb93db7ba4f2c3fe3517582c859a1
15.515762
apache-2.0
3
24
true
false
false
false
true
0.356657
35.6657
0.410783
17.689003
0.050604
5.060423
0.286074
4.809843
0.414552
9.952344
0.279255
19.917258
false
2024-04-06
2024-07-31
0
Kquant03/CognitiveFusion2-4x7B-BF16
Kukedlc_NeuralExperiment-7b-MagicCoder-v7.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
43ea8d27d652dc15e4d27f665c5d636a5937780b
17.917888
apache-2.0
6
7
true
true
true
false
true
0.455251
45.525096
0.398845
16.386034
0.061178
6.117825
0.296141
6.152125
0.428198
13.058073
0.282414
20.268174
false
2024-03-07
2024-07-30
0
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5