---
language:
- eng
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: dinov2-large-2024_05_27-_batch-size32_epochs150_freeze
  results: []
---

DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

- Loss: 0.1247
- F1 Micro: 0.8153
- F1 Macro: 0.7021
- Roc Auc: 0.8747
- Accuracy: 0.3144

---

# Model description

DinoVd'eau is a model built on top of the dinov2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
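The head described above could be sketched as follows. This is an illustrative sketch, not the exact DinoVdeau implementation: the layer ordering, hidden size (1024, dinov2-large's embedding dimension), dropout rate, and label count (31, the number of classes in the data table below) are assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a linear + ReLU + batch-norm + dropout head;
# sizes are assumptions, not values taken from the training code.
class ClassificationHead(nn.Module):
    def __init__(self, hidden_size: int = 1024, num_labels: int = 31, dropout: float = 0.1):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(hidden_size, hidden_size),
            nn.ReLU(),
            nn.BatchNorm1d(hidden_size),
            nn.Dropout(dropout),
            nn.Linear(hidden_size, num_labels),  # one logit per label
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # `features` are pooled encoder outputs of shape (batch, hidden_size)
        return self.head(features)

head = ClassificationHead()
logits = head(torch.randn(8, 1024))  # a batch of 8 pooled feature vectors
print(logits.shape)  # torch.Size([8, 31])
```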

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
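A minimal inference sketch with the standard transformers API is shown below. The repo id is inferred from this card's name and the 0.5 decision threshold is an assumption; per-label sigmoid thresholding is the usual decoding for multilabel models.

```python
import torch

# Repo id assumed from this card's name; adjust if the checkpoint lives elsewhere.
CHECKPOINT = "lombardata/dinov2-large-2024_05_27-_batch-size32_epochs150_freeze"

def select_labels(logits: torch.Tensor, id2label: dict, threshold: float = 0.5) -> list:
    """Multilabel decoding: keep every label whose sigmoid probability clears the threshold."""
    probs = torch.sigmoid(logits)
    return [id2label[i] for i, p in enumerate(probs) if p > threshold]

def predict(image_path: str) -> list:
    # Imported lazily so the decoding helper above stays dependency-light.
    from PIL import Image
    from transformers import AutoImageProcessor, AutoModelForImageClassification

    processor = AutoImageProcessor.from_pretrained(CHECKPOINT)
    model = AutoModelForImageClassification.from_pretrained(CHECKPOINT)
    model.eval()
    inputs = processor(images=Image.open(image_path).convert("RGB"), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]
    return select_labels(logits, model.config.id2label)

# Decoding demo on dummy logits: only the clearly positive logit survives.
demo = select_labels(torch.tensor([3.0, -3.0]), {0: "Rock", 1: "Sand"})
print(demo)  # ['Rock']
```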

---

# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | val | test | Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched | 1488 | 465 | 455 | 2408 |
| Acropore_digitised | 566 | 169 | 153 | 888 |
| Acropore_sub_massive | 147 | 48 | 48 | 243 |
| Acropore_tabular | 997 | 290 | 302 | 1589 |
| Algae_assembly | 2537 | 859 | 842 | 4238 |
| Algae_drawn_up | 368 | 121 | 131 | 620 |
| Algae_limestone | 1651 | 559 | 562 | 2772 |
| Algae_sodding | 3155 | 980 | 982 | 5117 |
| Atra/Leucospilota | 1090 | 359 | 343 | 1792 |
| Bleached_coral | 219 | 69 | 72 | 360 |
| Blurred | 190 | 63 | 67 | 320 |
| Dead_coral | 1981 | 644 | 639 | 3264 |
| Fish | 2029 | 657 | 635 | 3321 |
| Homo_sapiens | 160 | 63 | 59 | 282 |
| Human_object | 156 | 61 | 53 | 270 |
| Living_coral | 854 | 289 | 271 | 1414 |
| Millepore | 383 | 129 | 125 | 637 |
| No_acropore_encrusting | 420 | 153 | 152 | 725 |
| No_acropore_foliaceous | 204 | 44 | 38 | 286 |
| No_acropore_massive | 1017 | 345 | 343 | 1705 |
| No_acropore_solitary | 195 | 54 | 54 | 303 |
| No_acropore_sub_massive | 1383 | 445 | 428 | 2256 |
| Rock | 4469 | 1499 | 1489 | 7457 |
| Rubble | 3089 | 1011 | 1023 | 5123 |
| Sand | 5840 | 1949 | 1930 | 9719 |
| Sea_cucumber | 1413 | 445 | 436 | 2294 |
| Sea_urchins | 327 | 107 | 111 | 545 |
| Sponge | 269 | 104 | 97 | 470 |
| Syringodium_isoetifolium | 1214 | 388 | 393 | 1995 |
| Thalassodendron_ciliatum | 781 | 262 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
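In PyTorch terms, the optimizer and scheduler settings above can be sketched like this; the placeholder model and the synthetic validation losses are illustrative only.

```python
import torch

model = torch.nn.Linear(1024, 31)  # placeholder for the actual network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# Step once per epoch on the validation loss: after an improvement followed by
# six stagnant epochs (num_bad_epochs > patience), the LR is multiplied by 0.1.
for val_loss in [0.17, 0.16, 0.16, 0.16, 0.16, 0.16, 0.16, 0.16]:
    scheduler.step(val_loss)

print(optimizer.param_groups[0]["lr"])  # ~1e-4 after one reduction
```

This matches the learning-rate column in the results table below, where the rate steps down from 0.001 to 0.0001 and onward whenever the validation loss plateaus.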

## Data Augmentation

Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00

## Training results

Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
1.0 | 0.1701383888721466 | 0.232726022688209 | 0.7380235658381353 | 0.4712591871520079 | 0.001
2.0 | 0.15890291333198547 | 0.24888277758679958 | 0.7568708574323469 | 0.5722657636852799 | 0.001
3.0 | 0.15134122967720032 | 0.24785149535922998 | 0.7723932964583505 | 0.6104117366005594 | 0.001
4.0 | 0.15164224803447723 | 0.24853901684427637 | 0.7608496532472631 | 0.599745200497783 | 0.001
5.0 | 0.15243493020534515 | 0.24750773461670678 | 0.7692371752165224 | 0.5935106518877853 | 0.001
6.0 | 0.14673969149589539 | 0.24269508422138192 | 0.7718080548414739 | 0.613788061665736 | 0.001
7.0 | 0.1506606936454773 | 0.2509453420419388 | 0.7732481363152289 | 0.6309898748773293 | 0.001
8.0 | 0.14430351555347443 | 0.2609144035751117 | 0.7828014555188422 | 0.6403740520777896 | 0.001
9.0 | 0.14617429673671722 | 0.25128910278446204 | 0.781416038551835 | 0.6366498775226099 | 0.001
10.0 | 0.14414818584918976 | 0.2688209006531454 | 0.7794745970641737 | 0.6254472467805158 | 0.001
11.0 | 0.14589238166809082 | 0.2595393606050189 | 0.7780349253103302 | 0.6357434835994208 | 0.001
12.0 | 0.14458976686000824 | 0.2554142316947405 | 0.7823495795575149 | 0.638389910481932 | 0.001
13.0 | 0.14142437279224396 | 0.2561017531797869 | 0.786284091383703 | 0.6574365741219022 | 0.001
14.0 | 0.1581379920244217 | 0.24682021313166036 | 0.7766990291262137 | 0.6245865910731833 | 0.001
15.0 | 0.1447945237159729 | 0.2598831213475421 | 0.7859620485615181 | 0.6552072842486797 | 0.001
16.0 | 0.1438169628381729 | 0.2605706428325885 | 0.7853051058530511 | 0.6495757946819554 | 0.001
17.0 | 0.14359386265277863 | 0.2506015812994156 | 0.7824457675812967 | 0.6310969679900952 | 0.001
18.0 | 0.1412857472896576 | 0.2564455139223101 | 0.7848311343456975 | 0.6531950395959965 | 0.001
19.0 | 0.14079046249389648 | 0.26022688209006534 | 0.7833830386020918 | 0.6486819478687708 | 0.001
20.0 | 0.14640754461288452 | 0.26400825025782054 | 0.7775968460747342 | 0.6262168341318395 | 0.001
21.0 | 0.1412632316350937 | 0.2653832932279134 | 0.7890916719110552 | 0.6582044080070929 | 0.001
22.0 | 0.14168681204319 | 0.2543829494671708 | 0.7871090517954659 | 0.6586947128782558 | 0.001
23.0 | 0.1393543779850006 | 0.269852182880715 | 0.7863651704353696 | 0.6427873985434494 | 0.001
24.0 | 0.14052371680736542 | 0.2588518391199725 | 0.7857706852844616 | 0.6618962794412713 | 0.001
25.0 | 0.1392364352941513 | 0.2653832932279134 | 0.7897693920335429 | 0.653320279245233 | 0.001
26.0 | 0.14239099621772766 | 0.27019594362323823 | 0.7838044308632545 | 0.6529431984792132 | 0.001
27.0 | 0.1386287957429886 | 0.2671020969405294 | 0.7974886125815585 | 0.6810613208979668 | 0.001
28.0 | 0.15519200265407562 | 0.2650395324853902 | 0.7791304347826087 | 0.6474807711800876 | 0.001
29.0 | 0.14190098643302917 | 0.27019594362323823 | 0.7913651213762871 | 0.6550381793679035 | 0.001
30.0 | 0.13986903429031372 | 0.2767273977311791 | 0.7857173292428311 | 0.663185953977854 | 0.001
31.0 | 0.13765402138233185 | 0.27260226882090066 | 0.7881844380403459 | 0.6554744698272669 | 0.001
32.0 | 0.13866138458251953 | 0.2677896184255758 | 0.7914770376499792 | 0.6596978272887946 | 0.001
33.0 | 0.13930276036262512 | 0.2605706428325885 | 0.7887546855476885 | 0.6583814800932023 | 0.001
34.0 | 0.1374826431274414 | 0.2763836369886559 | 0.795303262082937 | 0.6636727922636001 | 0.001
35.0 | 0.14001137018203735 | 0.25850807837744927 | 0.7860775988902434 | 0.6442491093834092 | 0.001
36.0 | 0.13899104297161102 | 0.26916466139566864 | 0.7890085033301218 | 0.6541220211466265 | 0.001
37.0 | 0.14101693034172058 | 0.2667583361980062 | 0.788356222091162 | 0.6602790790864311 | 0.001
38.0 | 0.13849563896656036 | 0.2633207287727741 | 0.7864065343433915 | 0.6508514926081754 | 0.001
39.0 | 0.14249388873577118 | 0.26263320728772777 | 0.7819844457738655 | 0.6513021077089046 | 0.001
40.0 | 0.1512959599494934 | 0.2633207287727741 | 0.7819497946916141 | 0.6421624481517915 | 0.001
41.0 | 0.1416281908750534 | 0.27157098659333107 | 0.795353889863792 | 0.6708412782877394 | 0.0001
42.0 | 0.13480685651302338 | 0.2811962873839807 | 0.8014906832298136 | 0.6820172839356666 | 0.0001
43.0 | 0.1342025101184845 | 0.2756961155036095 | 0.8014919187733112 | 0.681931169239128 | 0.0001
44.0 | 0.1327475756406784 | 0.2811962873839807 | 0.8019789631231031 | 0.683693351140427 | 0.0001
45.0 | 0.1318245828151703 | 0.2811962873839807 | 0.8049446006284108 | 0.6900135704395078 | 0.0001
46.0 | 0.13027183711528778 | 0.28910278446201443 | 0.8063969585520062 | 0.6920134474185277 | 0.0001
47.0 | 0.12985946238040924 | 0.284977655551736 | 0.8065087538619978 | 0.6938459582689339 | 0.0001
48.0 | 0.12981055676937103 | 0.2853214162942592 | 0.8031727379553465 | 0.6917397436201066 | 0.0001
49.0 | 0.1301460713148117 | 0.2839463733241664 | 0.8081048867699644 | 0.6980761423122126 | 0.0001
50.0 | 0.1294524371623993 | 0.2829150910965968 | 0.8056895691232739 | 0.6968263757426811 | 0.0001
51.0 | 0.12989668548107147 | 0.2846338948092128 | 0.8078541374474054 | 0.6981227572539419 | 0.0001
52.0 | 0.13097986578941345 | 0.284977655551736 | 0.809621541745341 | 0.7032059573412642 | 0.0001
53.0 | 0.12910524010658264 | 0.288415262976968 | 0.8082875892525485 | 0.6952081515364695 | 0.0001
54.0 | 0.1276824176311493 | 0.2860089377793056 | 0.8055729885778838 | 0.6914506394370794 | 0.0001
55.0 | 0.12751279771327972 | 0.28979030594706084 | 0.8091508143727464 | 0.7051415507931676 | 0.0001
56.0 | 0.12798655033111572 | 0.2911653489171537 | 0.8077718065316246 | 0.6990943862949641 | 0.0001
57.0 | 0.1279618740081787 | 0.29150910965967686 | 0.8107930240210597 | 0.7001268142729874 | 0.0001
58.0 | 0.1280883550643921 | 0.290134066689584 | 0.8108946874106743 | 0.7039327958876614 | 0.0001
59.0 | 0.1287168562412262 | 0.2873839807493984 | 0.8071845383437488 | 0.699653006099352 | 0.0001
60.0 | 0.1270500272512436 | 0.28875902371949125 | 0.8103491168421926 | 0.7042073996338176 | 0.0001
61.0 | 0.1269637793302536 | 0.28944654520453766 | 0.8072888368788399 | 0.6994480698947442 | 0.0001
62.0 | 0.12639474868774414 | 0.28979030594706084 | 0.8124407826982492 | 0.7105518005302388 | 0.0001
63.0 | 0.12643341720104218 | 0.2918528704022001 | 0.8093336660843524 | 0.7042257858113937 | 0.0001
64.0 | 0.12570597231388092 | 0.2918528704022001 | 0.8119739624362535 | 0.7054117610081568 | 0.0001
65.0 | 0.12599390745162964 | 0.29322791337229287 | 0.8103770839396333 | 0.7040599127700347 | 0.0001
66.0 | 0.12674611806869507 | 0.29769680302509455 | 0.8141795311606633 | 0.7083351143800681 | 0.0001
67.0 | 0.12676431238651276 | 0.28979030594706084 | 0.8090950582963362 | 0.6998024530144022 | 0.0001
68.0 | 0.12638631463050842 | 0.2928841526297697 | 0.8127327032445482 | 0.7034736625177254 | 0.0001
69.0 | 0.12608103454113007 | 0.2952904778274321 | 0.8131967584022379 | 0.7078892431331377 | 0.0001
70.0 | 0.12582050263881683 | 0.29150910965967686 | 0.8136722606120435 | 0.7081157868651535 | 0.0001
71.0 | 0.12533149123191833 | 0.2918528704022001 | 0.8123295595405339 | 0.7044517956080781 | 1e-05
72.0 | 0.1258901059627533 | 0.2966655207975249 | 0.8159506713723581 | 0.7099295458861072 | 1e-05
73.0 | 0.12526649236679077 | 0.2949467170849089 | 0.8159496670343587 | 0.7116557450872655 | 1e-05
74.0 | 0.12490212172269821 | 0.29769680302509455 | 0.8156100747030249 | 0.7159515864206437 | 1e-05
75.0 | 0.12504002451896667 | 0.2966655207975249 | 0.8135426082669078 | 0.7082306828309269 | 1e-05
76.0 | 0.12634462118148804 | 0.2966655207975249 | 0.8099675513769865 | 0.6998917153140419 | 1e-05
77.0 | 0.1249643936753273 | 0.2966655207975249 | 0.8142915811088296 | 0.7104044870773909 | 1e-05
78.0 | 0.12509745359420776 | 0.2939154348573393 | 0.812339968613199 | 0.7076718539561497 | 1e-05
79.0 | 0.12465520948171616 | 0.29838432451014096 | 0.8147326016360423 | 0.7097766100728804 | 1e-05
80.0 | 0.12526248395442963 | 0.2990718459951873 | 0.8166140393490405 | 0.7133791911991404 | 1e-05
81.0 | 0.12510864436626434 | 0.2952904778274321 | 0.8121923983622152 | 0.705898272950067 | 1e-05
82.0 | 0.12532733380794525 | 0.29975936748023374 | 0.8150326797385622 | 0.7095032932540235 | 1e-05
83.0 | 0.12474868446588516 | 0.29597799931247853 | 0.815855206584497 | 0.7124383950303705 | 1e-05
84.0 | 0.12511858344078064 | 0.3007906497078034 | 0.8175330467926365 | 0.7138847615465347 | 1e-05
85.0 | 0.12457013875246048 | 0.2966655207975249 | 0.8132141082960754 | 0.7054571621251418 | 1e-05
86.0 | 0.1251869946718216 | 0.2946029563423857 | 0.8143732269868025 | 0.7142702846379808 | 1e-05
87.0 | 0.12492978572845459 | 0.2935716741148161 | 0.8135328455150868 | 0.7081357577756824 | 1e-05
88.0 | 0.12513719499111176 | 0.2990718459951873 | 0.815831263487927 | 0.7099379006276698 | 1e-05
89.0 | 0.12514576315879822 | 0.2963217600550017 | 0.8143914473684211 | 0.7092910188720426 | 1e-05
90.0 | 0.1244530975818634 | 0.2942591955998625 | 0.8134516195584898 | 0.7121664381657501 | 1e-05
91.0 | 0.12501013278961182 | 0.2990718459951873 | 0.8153902768123646 | 0.7106178930468596 | 1e-05
92.0 | 0.12525025010108948 | 0.2973530422825713 | 0.8163049232398094 | 0.7140173113811211 | 1e-05
93.0 | 0.12471849471330643 | 0.29872808525266414 | 0.8148661314641998 | 0.7129019083206937 | 1e-05
94.0 | 0.12515641748905182 | 0.2980405637676177 | 0.8141884924726748 | 0.7053935701419592 | 1e-05
95.0 | 0.12481416761875153 | 0.30147817119284975 | 0.8165906870726147 | 0.7134995447430972 | 1e-05
96.0 | 0.12492986023426056 | 0.2980405637676177 | 0.8160666176830762 | 0.7110442004495683 | 1e-05
97.0 | 0.12459924072027206 | 0.30147817119284975 | 0.8168590473093806 | 0.7158597011246477 | 1e-06
98.0 | 0.12447398155927658 | 0.29975936748023374 | 0.8149457415323906 | 0.707122866121441 | 1e-06
99.0 | 0.12462905794382095 | 0.30216569267789617 | 0.8165748111859562 | 0.7182970295785608 | 1e-06
100.0 | 0.12463195621967316 | 0.30147817119284975 | 0.8161644284310514 | 0.7136275002413193 | 1e-06

---

# CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

- **Emissions**: 1.562242452449767 grams of CO2
- **Source**: Code Carbon
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB

---

# Framework Versions

- **Transformers**: 4.41.1
- **Pytorch**: 2.3.0+cu121
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1