lombardata committed
Commit 6f06500
1 Parent(s): bfab170

Upload README.md

Files changed (1)
  1. README.md +269 -140
README.md CHANGED
@@ -1,164 +1,293 @@
 
 ---
- license: apache-2.0
- base_model: facebook/dinov2-large
 tags:
 - generated_from_trainer
- metrics:
- - accuracy
 model-index:
 - name: dinov2-large-2024_05_27-_batch-size32_epochs150_freeze
   results: []
 ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # dinov2-large-2024_05_27-_batch-size32_epochs150_freeze

- This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the None dataset.
- It achieves the following results on the evaluation set:
 - Loss: 0.1247
 - F1 Micro: 0.8153
 - F1 Macro: 0.7021
 - Roc Auc: 0.8747
 - Accuracy: 0.3144
- - Learning Rate: 0.0000
27
- ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

43
 The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 32
- - eval_batch_size: 32
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 150
- - mixed_precision_training: Native AMP
-
- ### Training results
-
55
- | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:------:|
- | No log | 1.0 | 273 | 0.1701 | 0.7380 | 0.4713 | 0.8226 | 0.2327 | 0.001 |
- | 0.2748 | 2.0 | 546 | 0.1589 | 0.7569 | 0.5723 | 0.8372 | 0.2489 | 0.001 |
- | 0.2748 | 3.0 | 819 | 0.1513 | 0.7724 | 0.6104 | 0.8516 | 0.2479 | 0.001 |
- | 0.1714 | 4.0 | 1092 | 0.1516 | 0.7608 | 0.5997 | 0.8311 | 0.2485 | 0.001 |
- | 0.1714 | 5.0 | 1365 | 0.1524 | 0.7692 | 0.5935 | 0.8475 | 0.2475 | 0.001 |
- | 0.1661 | 6.0 | 1638 | 0.1467 | 0.7718 | 0.6138 | 0.8414 | 0.2427 | 0.001 |
- | 0.1661 | 7.0 | 1911 | 0.1507 | 0.7732 | 0.6310 | 0.8437 | 0.2509 | 0.001 |
- | 0.1611 | 8.0 | 2184 | 0.1443 | 0.7828 | 0.6404 | 0.8537 | 0.2609 | 0.001 |
- | 0.1611 | 9.0 | 2457 | 0.1462 | 0.7814 | 0.6366 | 0.8571 | 0.2513 | 0.001 |
- | 0.1606 | 10.0 | 2730 | 0.1441 | 0.7795 | 0.6254 | 0.8489 | 0.2688 | 0.001 |
- | 0.1592 | 11.0 | 3003 | 0.1459 | 0.7780 | 0.6357 | 0.8495 | 0.2595 | 0.001 |
- | 0.1592 | 12.0 | 3276 | 0.1446 | 0.7823 | 0.6384 | 0.8601 | 0.2554 | 0.001 |
- | 0.1582 | 13.0 | 3549 | 0.1414 | 0.7863 | 0.6574 | 0.8560 | 0.2561 | 0.001 |
- | 0.1582 | 14.0 | 3822 | 0.1581 | 0.7767 | 0.6246 | 0.8491 | 0.2468 | 0.001 |
- | 0.1575 | 15.0 | 4095 | 0.1448 | 0.7860 | 0.6552 | 0.8622 | 0.2599 | 0.001 |
- | 0.1575 | 16.0 | 4368 | 0.1438 | 0.7853 | 0.6496 | 0.8571 | 0.2606 | 0.001 |
- | 0.158 | 17.0 | 4641 | 0.1436 | 0.7824 | 0.6311 | 0.8547 | 0.2506 | 0.001 |
- | 0.158 | 18.0 | 4914 | 0.1413 | 0.7848 | 0.6532 | 0.8570 | 0.2564 | 0.001 |
- | 0.1571 | 19.0 | 5187 | 0.1408 | 0.7834 | 0.6487 | 0.8529 | 0.2602 | 0.001 |
- | 0.1571 | 20.0 | 5460 | 0.1464 | 0.7776 | 0.6262 | 0.8444 | 0.2640 | 0.001 |
- | 0.1579 | 21.0 | 5733 | 0.1413 | 0.7891 | 0.6582 | 0.8563 | 0.2654 | 0.001 |
- | 0.1564 | 22.0 | 6006 | 0.1417 | 0.7871 | 0.6587 | 0.8590 | 0.2544 | 0.001 |
- | 0.1564 | 23.0 | 6279 | 0.1394 | 0.7864 | 0.6428 | 0.8533 | 0.2699 | 0.001 |
- | 0.1554 | 24.0 | 6552 | 0.1405 | 0.7858 | 0.6619 | 0.8571 | 0.2589 | 0.001 |
- | 0.1554 | 25.0 | 6825 | 0.1392 | 0.7898 | 0.6533 | 0.8568 | 0.2654 | 0.001 |
- | 0.1554 | 26.0 | 7098 | 0.1424 | 0.7838 | 0.6529 | 0.8503 | 0.2702 | 0.001 |
- | 0.1554 | 27.0 | 7371 | 0.1386 | 0.7975 | 0.6811 | 0.8670 | 0.2671 | 0.001 |
- | 0.156 | 28.0 | 7644 | 0.1552 | 0.7791 | 0.6475 | 0.8544 | 0.2650 | 0.001 |
- | 0.156 | 29.0 | 7917 | 0.1419 | 0.7914 | 0.6550 | 0.8615 | 0.2702 | 0.001 |
- | 0.1548 | 30.0 | 8190 | 0.1399 | 0.7857 | 0.6632 | 0.8503 | 0.2767 | 0.001 |
- | 0.1548 | 31.0 | 8463 | 0.1377 | 0.7882 | 0.6555 | 0.8530 | 0.2726 | 0.001 |
- | 0.1554 | 32.0 | 8736 | 0.1387 | 0.7915 | 0.6597 | 0.8614 | 0.2678 | 0.001 |
- | 0.1551 | 33.0 | 9009 | 0.1393 | 0.7888 | 0.6584 | 0.8581 | 0.2606 | 0.001 |
- | 0.1551 | 34.0 | 9282 | 0.1375 | 0.7953 | 0.6637 | 0.8637 | 0.2764 | 0.001 |
- | 0.1544 | 35.0 | 9555 | 0.1400 | 0.7861 | 0.6442 | 0.8541 | 0.2585 | 0.001 |
- | 0.1544 | 36.0 | 9828 | 0.1390 | 0.7890 | 0.6541 | 0.8567 | 0.2692 | 0.001 |
- | 0.1555 | 37.0 | 10101 | 0.1410 | 0.7884 | 0.6603 | 0.8548 | 0.2668 | 0.001 |
- | 0.1555 | 38.0 | 10374 | 0.1385 | 0.7864 | 0.6509 | 0.8525 | 0.2633 | 0.001 |
- | 0.1547 | 39.0 | 10647 | 0.1425 | 0.7820 | 0.6513 | 0.8475 | 0.2626 | 0.001 |
- | 0.1547 | 40.0 | 10920 | 0.1513 | 0.7819 | 0.6422 | 0.8500 | 0.2633 | 0.001 |
- | 0.1527 | 41.0 | 11193 | 0.1416 | 0.7954 | 0.6708 | 0.8608 | 0.2716 | 0.0001 |
- | 0.1527 | 42.0 | 11466 | 0.1348 | 0.8015 | 0.6820 | 0.8667 | 0.2812 | 0.0001 |
- | 0.1455 | 43.0 | 11739 | 0.1342 | 0.8015 | 0.6819 | 0.8665 | 0.2757 | 0.0001 |
- | 0.1416 | 44.0 | 12012 | 0.1327 | 0.8020 | 0.6837 | 0.8658 | 0.2812 | 0.0001 |
- | 0.1416 | 45.0 | 12285 | 0.1318 | 0.8049 | 0.6900 | 0.8690 | 0.2812 | 0.0001 |
- | 0.1402 | 46.0 | 12558 | 0.1303 | 0.8064 | 0.6920 | 0.8700 | 0.2891 | 0.0001 |
- | 0.1402 | 47.0 | 12831 | 0.1299 | 0.8065 | 0.6938 | 0.8709 | 0.2850 | 0.0001 |
- | 0.1387 | 48.0 | 13104 | 0.1298 | 0.8032 | 0.6917 | 0.8638 | 0.2853 | 0.0001 |
- | 0.1387 | 49.0 | 13377 | 0.1301 | 0.8081 | 0.6981 | 0.8725 | 0.2839 | 0.0001 |
- | 0.1391 | 50.0 | 13650 | 0.1295 | 0.8057 | 0.6968 | 0.8677 | 0.2829 | 0.0001 |
- | 0.1391 | 51.0 | 13923 | 0.1299 | 0.8079 | 0.6981 | 0.8713 | 0.2846 | 0.0001 |
- | 0.1374 | 52.0 | 14196 | 0.1310 | 0.8096 | 0.7032 | 0.8737 | 0.2850 | 0.0001 |
- | 0.1374 | 53.0 | 14469 | 0.1291 | 0.8083 | 0.6952 | 0.8714 | 0.2884 | 0.0001 |
- | 0.1367 | 54.0 | 14742 | 0.1277 | 0.8056 | 0.6915 | 0.8660 | 0.2860 | 0.0001 |
- | 0.1364 | 55.0 | 15015 | 0.1275 | 0.8092 | 0.7051 | 0.8706 | 0.2898 | 0.0001 |
- | 0.1364 | 56.0 | 15288 | 0.1280 | 0.8078 | 0.6991 | 0.8706 | 0.2912 | 0.0001 |
- | 0.135 | 57.0 | 15561 | 0.1280 | 0.8108 | 0.7001 | 0.8737 | 0.2915 | 0.0001 |
- | 0.135 | 58.0 | 15834 | 0.1281 | 0.8109 | 0.7039 | 0.8759 | 0.2901 | 0.0001 |
- | 0.1345 | 59.0 | 16107 | 0.1287 | 0.8072 | 0.6997 | 0.8700 | 0.2874 | 0.0001 |
- | 0.1345 | 60.0 | 16380 | 0.1271 | 0.8103 | 0.7042 | 0.8712 | 0.2888 | 0.0001 |
- | 0.134 | 61.0 | 16653 | 0.1270 | 0.8073 | 0.6994 | 0.8672 | 0.2894 | 0.0001 |
- | 0.134 | 62.0 | 16926 | 0.1264 | 0.8124 | 0.7106 | 0.8742 | 0.2898 | 0.0001 |
- | 0.1331 | 63.0 | 17199 | 0.1264 | 0.8093 | 0.7042 | 0.8697 | 0.2919 | 0.0001 |
- | 0.1331 | 64.0 | 17472 | 0.1257 | 0.8120 | 0.7054 | 0.8721 | 0.2919 | 0.0001 |
- | 0.1327 | 65.0 | 17745 | 0.1260 | 0.8104 | 0.7041 | 0.8704 | 0.2932 | 0.0001 |
- | 0.1319 | 66.0 | 18018 | 0.1267 | 0.8142 | 0.7083 | 0.8777 | 0.2977 | 0.0001 |
- | 0.1319 | 67.0 | 18291 | 0.1268 | 0.8091 | 0.6998 | 0.8703 | 0.2898 | 0.0001 |
- | 0.1319 | 68.0 | 18564 | 0.1264 | 0.8127 | 0.7035 | 0.8764 | 0.2929 | 0.0001 |
- | 0.1319 | 69.0 | 18837 | 0.1261 | 0.8132 | 0.7079 | 0.8750 | 0.2953 | 0.0001 |
- | 0.1308 | 70.0 | 19110 | 0.1258 | 0.8137 | 0.7081 | 0.8753 | 0.2915 | 0.0001 |
- | 0.1308 | 71.0 | 19383 | 0.1253 | 0.8123 | 0.7045 | 0.8733 | 0.2919 | 1e-05 |
- | 0.1294 | 72.0 | 19656 | 0.1259 | 0.8160 | 0.7099 | 0.8807 | 0.2967 | 1e-05 |
- | 0.1294 | 73.0 | 19929 | 0.1253 | 0.8159 | 0.7117 | 0.8786 | 0.2949 | 1e-05 |
- | 0.1287 | 74.0 | 20202 | 0.1249 | 0.8156 | 0.7160 | 0.8786 | 0.2977 | 1e-05 |
- | 0.1287 | 75.0 | 20475 | 0.1250 | 0.8135 | 0.7082 | 0.8756 | 0.2967 | 1e-05 |
- | 0.1282 | 76.0 | 20748 | 0.1263 | 0.8100 | 0.6999 | 0.8700 | 0.2967 | 1e-05 |
- | 0.1285 | 77.0 | 21021 | 0.1250 | 0.8143 | 0.7104 | 0.8761 | 0.2967 | 1e-05 |
- | 0.1285 | 78.0 | 21294 | 0.1251 | 0.8123 | 0.7077 | 0.8734 | 0.2939 | 1e-05 |
- | 0.1281 | 79.0 | 21567 | 0.1247 | 0.8147 | 0.7098 | 0.8761 | 0.2984 | 1e-05 |
- | 0.1281 | 80.0 | 21840 | 0.1253 | 0.8166 | 0.7134 | 0.8817 | 0.2991 | 1e-05 |
- | 0.1281 | 81.0 | 22113 | 0.1251 | 0.8122 | 0.7059 | 0.8729 | 0.2953 | 1e-05 |
- | 0.1281 | 82.0 | 22386 | 0.1253 | 0.8150 | 0.7095 | 0.8781 | 0.2998 | 1e-05 |
- | 0.1269 | 83.0 | 22659 | 0.1247 | 0.8159 | 0.7124 | 0.8778 | 0.2960 | 1e-05 |
- | 0.1269 | 84.0 | 22932 | 0.1251 | 0.8175 | 0.7139 | 0.8817 | 0.3008 | 1e-05 |
- | 0.1267 | 85.0 | 23205 | 0.1246 | 0.8132 | 0.7055 | 0.8732 | 0.2967 | 1e-05 |
- | 0.1267 | 86.0 | 23478 | 0.1252 | 0.8144 | 0.7143 | 0.8758 | 0.2946 | 1e-05 |
- | 0.1274 | 87.0 | 23751 | 0.1249 | 0.8135 | 0.7081 | 0.8754 | 0.2936 | 1e-05 |
- | 0.1263 | 88.0 | 24024 | 0.1251 | 0.8158 | 0.7099 | 0.8795 | 0.2991 | 1e-05 |
- | 0.1263 | 89.0 | 24297 | 0.1251 | 0.8144 | 0.7093 | 0.8758 | 0.2963 | 1e-05 |
- | 0.1272 | 90.0 | 24570 | 0.1245 | 0.8135 | 0.7122 | 0.8743 | 0.2943 | 1e-05 |
- | 0.1272 | 91.0 | 24843 | 0.1250 | 0.8154 | 0.7106 | 0.8780 | 0.2991 | 1e-05 |
- | 0.1275 | 92.0 | 25116 | 0.1253 | 0.8163 | 0.7140 | 0.8797 | 0.2974 | 1e-05 |
- | 0.1275 | 93.0 | 25389 | 0.1247 | 0.8149 | 0.7129 | 0.8787 | 0.2987 | 1e-05 |
- | 0.1257 | 94.0 | 25662 | 0.1252 | 0.8142 | 0.7054 | 0.8748 | 0.2980 | 1e-05 |
- | 0.1257 | 95.0 | 25935 | 0.1248 | 0.8166 | 0.7135 | 0.8800 | 0.3015 | 1e-05 |
- | 0.1271 | 96.0 | 26208 | 0.1249 | 0.8161 | 0.7110 | 0.8789 | 0.2980 | 1e-05 |
- | 0.1271 | 97.0 | 26481 | 0.1246 | 0.8169 | 0.7159 | 0.8806 | 0.3015 | 0.0000 |
- | 0.1272 | 98.0 | 26754 | 0.1245 | 0.8149 | 0.7071 | 0.8762 | 0.2998 | 0.0000 |
- | 0.126 | 99.0 | 27027 | 0.1246 | 0.8166 | 0.7183 | 0.8791 | 0.3022 | 0.0000 |
- | 0.126 | 100.0 | 27300 | 0.1246 | 0.8162 | 0.7136 | 0.8781 | 0.3015 | 0.0000 |
-
-
- ### Framework versions
-
- - Transformers 4.41.1
- - Pytorch 2.3.0+cu121
- - Datasets 2.19.1
- - Tokenizers 0.19.1
+
 ---
+ language:
+ - eng
+ license: wtfpl
 tags:
+ - multilabel-image-classification
+ - multilabel
 - generated_from_trainer
+ base_model: facebook/dinov2-large
 model-index:
 - name: dinov2-large-2024_05_27-_batch-size32_epochs150_freeze
   results: []
 ---

+ DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

 - Loss: 0.1247
 - F1 Micro: 0.8153
 - F1 Macro: 0.7021
 - Roc Auc: 0.8747
 - Accuracy: 0.3144
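The Accuracy figure sits far below the F1 scores because, for multilabel classification, Trainer-style cards typically report exact-match (subset) accuracy: an image only counts as correct if every one of its labels is predicted correctly. A pure-Python sketch on toy data (the labels below are illustrative, not the model's predictions) shows how the two metrics diverge:

```python
def subset_accuracy(y_true, y_pred):
    """Exact-match accuracy: a sample is correct only if ALL labels match."""
    hits = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return hits / len(y_true)

def f1_micro(y_true, y_pred):
    """Micro F1: pool true positives, false positives, and false negatives over all labels."""
    tp = fp = fn = 0
    for t_row, p_row in zip(y_true, y_pred):
        for t, p in zip(t_row, p_row):
            tp += t == 1 and p == 1
            fp += t == 0 and p == 1
            fn += t == 1 and p == 0
    return 2 * tp / (2 * tp + fp + fn)

# Toy multilabel targets/predictions for 3 images and 3 classes.
y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0]]
y_pred = [[1, 0, 1], [0, 1, 1], [1, 0, 0]]

print(subset_accuracy(y_true, y_pred))  # 1/3: only the first image is label-perfect
print(f1_micro(y_true, y_pred))         # 0.8: most individual labels are still right
```

With roughly thirty classes per image, a single wrong label zeroes a sample's contribution to exact-match accuracy while barely moving micro-F1, which is why 0.31 accuracy and 0.82 micro-F1 can describe the same model.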
 

+ ---
+
+ # Model description
+
+ DinoVd'eau is a model built on top of the dinov2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
+
+ The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
+
+ - **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
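The exact head dimensions are not given in the card (they live in the linked repository), but the listed layer types compose in a standard way. Below is a dependency-free sketch of what such a head computes at inference time, with toy sizes and made-up weights; dropout is the identity in eval mode, and BatchNorm1d normalizes with its running statistics:

```python
import math

def linear(x, weight, bias):
    # y_j = sum_i W[j][i] * x[i] + b[j]; weight is a list of rows of W.
    return [sum(w * v for w, v in zip(row, x)) + b for row, b in zip(weight, bias)]

def relu(x):
    return [max(0.0, v) for v in x]

def batchnorm_eval(x, mean, var, gamma, beta, eps=1e-5):
    # BatchNorm1d in eval mode: normalize with running mean/var, then scale and shift.
    return [(v - m) / math.sqrt(s + eps) * g + b
            for v, m, s, g, b in zip(x, mean, var, gamma, beta)]

def head_forward(features, params):
    h = relu(linear(features, params["w1"], params["b1"]))
    h = batchnorm_eval(h, params["mean"], params["var"], params["gamma"], params["beta"])
    # Dropout layer: identity at eval time.
    return linear(h, params["w2"], params["b2"])  # one logit per class

# Toy setup: 3-dim backbone feature -> 2 hidden units -> 2 class logits.
params = {
    "w1": [[0.1, 0.2, 0.3], [-0.3, 0.1, 0.2]], "b1": [0.0, 0.1],
    "mean": [0.0, 0.0], "var": [1.0, 1.0], "gamma": [1.0, 1.0], "beta": [0.0, 0.0],
    "w2": [[0.5, -0.5], [0.2, 0.4]], "b2": [0.0, 0.0],
}
logits = head_forward([1.0, 0.5, -0.2], params)
```

In the real model the input would be the 1024-dimensional dinov2-large embedding and the final layer would emit one logit per annotation class.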

+ ---

+ # Intended uses & limitations
+
+ You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
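Multilabel predictions are usually read out by passing each class logit through a sigmoid and keeping every class whose probability clears a threshold. The card does not state the threshold used, so 0.5 below is an assumption, and the logits are invented; the class names come from the dataset table in this card:

```python
import math

CLASSES = ["Rock", "Sand", "Fish"]  # a few of the card's annotation classes

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def decode(logits, threshold=0.5):
    """Return every class whose sigmoid probability exceeds the threshold."""
    return [name for name, z in zip(CLASSES, logits) if sigmoid(z) > threshold]

# Toy logits (not real model output): a positive logit means probability > 0.5.
print(decode([2.1, -0.7, 0.3]))  # ['Rock', 'Fish']
```

Unlike softmax classification, any number of classes (including none) can be active for a single image.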

+ ---

+ # Training and evaluation data
+
+ Details on the number of images for each class are given in the following table:
+ | Class | train | val | test | Total |
+ |:-------------------------|--------:|------:|-------:|--------:|
+ | Acropore_branched | 1488 | 465 | 455 | 2408 |
+ | Acropore_digitised | 566 | 169 | 153 | 888 |
+ | Acropore_sub_massive | 147 | 48 | 48 | 243 |
+ | Acropore_tabular | 997 | 290 | 302 | 1589 |
+ | Algae_assembly | 2537 | 859 | 842 | 4238 |
+ | Algae_drawn_up | 368 | 121 | 131 | 620 |
+ | Algae_limestone | 1651 | 559 | 562 | 2772 |
+ | Algae_sodding | 3155 | 980 | 982 | 5117 |
+ | Atra/Leucospilota | 1090 | 359 | 343 | 1792 |
+ | Bleached_coral | 219 | 69 | 72 | 360 |
+ | Blurred | 190 | 63 | 67 | 320 |
+ | Dead_coral | 1981 | 644 | 639 | 3264 |
+ | Fish | 2029 | 657 | 635 | 3321 |
+ | Homo_sapiens | 160 | 63 | 59 | 282 |
+ | Human_object | 156 | 61 | 53 | 270 |
+ | Living_coral | 854 | 289 | 271 | 1414 |
+ | Millepore | 383 | 129 | 125 | 637 |
+ | No_acropore_encrusting | 420 | 153 | 152 | 725 |
+ | No_acropore_foliaceous | 204 | 44 | 38 | 286 |
+ | No_acropore_massive | 1017 | 345 | 343 | 1705 |
+ | No_acropore_solitary | 195 | 54 | 54 | 303 |
+ | No_acropore_sub_massive | 1383 | 445 | 428 | 2256 |
+ | Rock | 4469 | 1499 | 1489 | 7457 |
+ | Rubble | 3089 | 1011 | 1023 | 5123 |
+ | Sand | 5840 | 1949 | 1930 | 9719 |
+ | Sea_cucumber | 1413 | 445 | 436 | 2294 |
+ | Sea_urchins | 327 | 107 | 111 | 545 |
+ | Sponge | 269 | 104 | 97 | 470 |
+ | Syringodium_isoetifolium | 1214 | 388 | 393 | 1995 |
+ | Thalassodendron_ciliatum | 781 | 262 | 260 | 1303 |
+ | Useless | 579 | 193 | 193 | 965 |

+ ---

+ # Training procedure

+ ## Training hyperparameters

 The following hyperparameters were used during training:
+
+ - **Number of Epochs**: 150
+ - **Learning Rate**: 0.001
+ - **Train Batch Size**: 32
+ - **Eval Batch Size**: 32
+ - **Optimizer**: Adam
+ - **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
+ - **Freeze Encoder**: Yes
+ - **Data Augmentation**: Yes
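The learning-rate column in the results table below follows directly from this schedule: the rate stays at 0.001 until validation loss stops improving for five epochs, then drops by a factor of 10, and so on. A minimal stdlib sketch of the plateau logic (min mode on validation loss; the real `torch.optim.lr_scheduler.ReduceLROnPlateau` also has `threshold`, `cooldown`, and `min_lr` options that are omitted here):

```python
class PlateauScheduler:
    """Cut the learning rate by `factor` after `patience` epochs without improvement."""

    def __init__(self, lr=0.001, patience=5, factor=0.1):
        self.lr, self.patience, self.factor = lr, patience, factor
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        if val_loss < self.best:          # improvement: reset the stale counter
            self.best = val_loss
            self.bad_epochs = 0
        else:                             # no improvement this epoch
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor    # cut the rate and start counting again
                self.bad_epochs = 0
        return self.lr

sched = PlateauScheduler(lr=0.001, patience=5, factor=0.1)
# Two improvements, then a plateau: the sixth stale epoch triggers a cut.
lrs = [sched.step(loss) for loss in [0.17, 0.15, 0.16, 0.16, 0.16, 0.16, 0.16, 0.16]]
print(lrs)  # the last entry has been cut to 0.001 * 0.1
```

This matches the card's trajectory, where the rate steps from 0.001 to 0.0001 to 1e-05 as validation loss plateaus.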
+
+ ## Data Augmentation
+
+ Data were augmented using the following transformations:
+
+ Train Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **RandomHorizontalFlip**: probability=0.25
+ - **RandomVerticalFlip**: probability=0.25
+ - **ColorJiggle**: probability=0.25
+ - **RandomPerspective**: probability=0.25
+ - **Normalize**: probability=1.00
+
+ Val Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **Normalize**: probability=1.00
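Each train transform fires independently with its listed probability (the names ColorJiggle and RandomPerspective match Kornia's augmentation API, though the card does not name the library). A dependency-free sketch of that "apply with probability p" composition, using toy flips on a nested-list stand-in for an image:

```python
import random

def maybe(transform, p):
    """Wrap a transform so it is applied with probability p."""
    return lambda img, rng: transform(img) if rng.random() < p else img

# Stand-in transforms on a 2-D list of pixel rows.
def hflip(img):
    return [row[::-1] for row in img]   # RandomHorizontalFlip

def vflip(img):
    return img[::-1]                    # RandomVerticalFlip

train_pipeline = [maybe(hflip, 0.25), maybe(vflip, 0.25)]

def augment(img, seed=None):
    rng = random.Random(seed)
    for step in train_pipeline:         # each step draws its own coin
        img = step(img, rng)
    return img

img = [[1, 2], [3, 4]]
print(maybe(hflip, 1.0)(img, random.Random(0)))  # [[2, 1], [4, 3]]: p=1.0 always fires
print(augment(img, seed=0))  # flips applied only when a draw lands below 0.25
```

The validation pipeline keeps only the deterministic steps (Resize and Normalize at probability 1.00), so evaluation is reproducible.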
+
+
+ ## Training results
+
+ Epoch-end metrics on the validation set (values rounded to four decimal places):
+
+ | Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
+ |:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
+ | 1.0 | 0.1701 | 0.2327 | 0.7380 | 0.4713 | 0.001 |
+ | 2.0 | 0.1589 | 0.2489 | 0.7569 | 0.5723 | 0.001 |
+ | 3.0 | 0.1513 | 0.2479 | 0.7724 | 0.6104 | 0.001 |
+ | 4.0 | 0.1516 | 0.2485 | 0.7608 | 0.5997 | 0.001 |
+ | 5.0 | 0.1524 | 0.2475 | 0.7692 | 0.5935 | 0.001 |
+ | 6.0 | 0.1467 | 0.2427 | 0.7718 | 0.6138 | 0.001 |
+ | 7.0 | 0.1507 | 0.2509 | 0.7732 | 0.6310 | 0.001 |
+ | 8.0 | 0.1443 | 0.2609 | 0.7828 | 0.6404 | 0.001 |
+ | 9.0 | 0.1462 | 0.2513 | 0.7814 | 0.6366 | 0.001 |
+ | 10.0 | 0.1441 | 0.2688 | 0.7795 | 0.6254 | 0.001 |
+ | 11.0 | 0.1459 | 0.2595 | 0.7780 | 0.6357 | 0.001 |
+ | 12.0 | 0.1446 | 0.2554 | 0.7823 | 0.6384 | 0.001 |
+ | 13.0 | 0.1414 | 0.2561 | 0.7863 | 0.6574 | 0.001 |
+ | 14.0 | 0.1581 | 0.2468 | 0.7767 | 0.6246 | 0.001 |
+ | 15.0 | 0.1448 | 0.2599 | 0.7860 | 0.6552 | 0.001 |
+ | 16.0 | 0.1438 | 0.2606 | 0.7853 | 0.6496 | 0.001 |
+ | 17.0 | 0.1436 | 0.2506 | 0.7824 | 0.6311 | 0.001 |
+ | 18.0 | 0.1413 | 0.2564 | 0.7848 | 0.6532 | 0.001 |
+ | 19.0 | 0.1408 | 0.2602 | 0.7834 | 0.6487 | 0.001 |
+ | 20.0 | 0.1464 | 0.2640 | 0.7776 | 0.6262 | 0.001 |
+ | 21.0 | 0.1413 | 0.2654 | 0.7891 | 0.6582 | 0.001 |
+ | 22.0 | 0.1417 | 0.2544 | 0.7871 | 0.6587 | 0.001 |
+ | 23.0 | 0.1394 | 0.2699 | 0.7864 | 0.6428 | 0.001 |
+ | 24.0 | 0.1405 | 0.2589 | 0.7858 | 0.6619 | 0.001 |
+ | 25.0 | 0.1392 | 0.2654 | 0.7898 | 0.6533 | 0.001 |
+ | 26.0 | 0.1424 | 0.2702 | 0.7838 | 0.6529 | 0.001 |
+ | 27.0 | 0.1386 | 0.2671 | 0.7975 | 0.6811 | 0.001 |
+ | 28.0 | 0.1552 | 0.2650 | 0.7791 | 0.6475 | 0.001 |
+ | 29.0 | 0.1419 | 0.2702 | 0.7914 | 0.6550 | 0.001 |
+ | 30.0 | 0.1399 | 0.2767 | 0.7857 | 0.6632 | 0.001 |
+ | 31.0 | 0.1377 | 0.2726 | 0.7882 | 0.6555 | 0.001 |
+ | 32.0 | 0.1387 | 0.2678 | 0.7915 | 0.6597 | 0.001 |
+ | 33.0 | 0.1393 | 0.2606 | 0.7888 | 0.6584 | 0.001 |
+ | 34.0 | 0.1375 | 0.2764 | 0.7953 | 0.6637 | 0.001 |
+ | 35.0 | 0.1400 | 0.2585 | 0.7861 | 0.6442 | 0.001 |
+ | 36.0 | 0.1390 | 0.2692 | 0.7890 | 0.6541 | 0.001 |
+ | 37.0 | 0.1410 | 0.2668 | 0.7884 | 0.6603 | 0.001 |
+ | 38.0 | 0.1385 | 0.2633 | 0.7864 | 0.6509 | 0.001 |
+ | 39.0 | 0.1425 | 0.2626 | 0.7820 | 0.6513 | 0.001 |
+ | 40.0 | 0.1513 | 0.2633 | 0.7819 | 0.6422 | 0.001 |
+ | 41.0 | 0.1416 | 0.2716 | 0.7954 | 0.6708 | 0.0001 |
+ | 42.0 | 0.1348 | 0.2812 | 0.8015 | 0.6820 | 0.0001 |
+ | 43.0 | 0.1342 | 0.2757 | 0.8015 | 0.6819 | 0.0001 |
+ | 44.0 | 0.1327 | 0.2812 | 0.8020 | 0.6837 | 0.0001 |
+ | 45.0 | 0.1318 | 0.2812 | 0.8049 | 0.6900 | 0.0001 |
+ | 46.0 | 0.1303 | 0.2891 | 0.8064 | 0.6920 | 0.0001 |
+ | 47.0 | 0.1299 | 0.2850 | 0.8065 | 0.6938 | 0.0001 |
+ | 48.0 | 0.1298 | 0.2853 | 0.8032 | 0.6917 | 0.0001 |
+ | 49.0 | 0.1301 | 0.2839 | 0.8081 | 0.6981 | 0.0001 |
+ | 50.0 | 0.1295 | 0.2829 | 0.8057 | 0.6968 | 0.0001 |
+ | 51.0 | 0.1299 | 0.2846 | 0.8079 | 0.6981 | 0.0001 |
+ | 52.0 | 0.1310 | 0.2850 | 0.8096 | 0.7032 | 0.0001 |
+ | 53.0 | 0.1291 | 0.2884 | 0.8083 | 0.6952 | 0.0001 |
+ | 54.0 | 0.1277 | 0.2860 | 0.8056 | 0.6915 | 0.0001 |
+ | 55.0 | 0.1275 | 0.2898 | 0.8092 | 0.7051 | 0.0001 |
+ | 56.0 | 0.1280 | 0.2912 | 0.8078 | 0.6991 | 0.0001 |
+ | 57.0 | 0.1280 | 0.2915 | 0.8108 | 0.7001 | 0.0001 |
+ | 58.0 | 0.1281 | 0.2901 | 0.8109 | 0.7039 | 0.0001 |
+ | 59.0 | 0.1287 | 0.2874 | 0.8072 | 0.6997 | 0.0001 |
+ | 60.0 | 0.1271 | 0.2888 | 0.8103 | 0.7042 | 0.0001 |
+ | 61.0 | 0.1270 | 0.2894 | 0.8073 | 0.6994 | 0.0001 |
+ | 62.0 | 0.1264 | 0.2898 | 0.8124 | 0.7106 | 0.0001 |
+ | 63.0 | 0.1264 | 0.2919 | 0.8093 | 0.7042 | 0.0001 |
+ | 64.0 | 0.1257 | 0.2919 | 0.8120 | 0.7054 | 0.0001 |
+ | 65.0 | 0.1260 | 0.2932 | 0.8104 | 0.7041 | 0.0001 |
+ | 66.0 | 0.1267 | 0.2977 | 0.8142 | 0.7083 | 0.0001 |
+ | 67.0 | 0.1268 | 0.2898 | 0.8091 | 0.6998 | 0.0001 |
+ | 68.0 | 0.1264 | 0.2929 | 0.8127 | 0.7035 | 0.0001 |
+ | 69.0 | 0.1261 | 0.2953 | 0.8132 | 0.7079 | 0.0001 |
+ | 70.0 | 0.1258 | 0.2915 | 0.8137 | 0.7081 | 0.0001 |
+ | 71.0 | 0.1253 | 0.2919 | 0.8123 | 0.7045 | 1e-05 |
+ | 72.0 | 0.1259 | 0.2967 | 0.8160 | 0.7099 | 1e-05 |
+ | 73.0 | 0.1253 | 0.2949 | 0.8159 | 0.7117 | 1e-05 |
+ | 74.0 | 0.1249 | 0.2977 | 0.8156 | 0.7160 | 1e-05 |
+ | 75.0 | 0.1250 | 0.2967 | 0.8135 | 0.7082 | 1e-05 |
+ | 76.0 | 0.1263 | 0.2967 | 0.8100 | 0.6999 | 1e-05 |
+ | 77.0 | 0.1250 | 0.2967 | 0.8143 | 0.7104 | 1e-05 |
+ | 78.0 | 0.1251 | 0.2939 | 0.8123 | 0.7077 | 1e-05 |
+ | 79.0 | 0.1247 | 0.2984 | 0.8147 | 0.7098 | 1e-05 |
+ | 80.0 | 0.1253 | 0.2991 | 0.8166 | 0.7134 | 1e-05 |
+ | 81.0 | 0.1251 | 0.2953 | 0.8122 | 0.7059 | 1e-05 |
+ | 82.0 | 0.1253 | 0.2998 | 0.8150 | 0.7095 | 1e-05 |
+ | 83.0 | 0.1247 | 0.2960 | 0.8159 | 0.7124 | 1e-05 |
+ | 84.0 | 0.1251 | 0.3008 | 0.8175 | 0.7139 | 1e-05 |
+ | 85.0 | 0.1246 | 0.2967 | 0.8132 | 0.7055 | 1e-05 |
+ | 86.0 | 0.1252 | 0.2946 | 0.8144 | 0.7143 | 1e-05 |
+ | 87.0 | 0.1249 | 0.2936 | 0.8135 | 0.7081 | 1e-05 |
+ | 88.0 | 0.1251 | 0.2991 | 0.8158 | 0.7099 | 1e-05 |
+ | 89.0 | 0.1251 | 0.2963 | 0.8144 | 0.7093 | 1e-05 |
+ | 90.0 | 0.1245 | 0.2943 | 0.8135 | 0.7122 | 1e-05 |
+ | 91.0 | 0.1250 | 0.2991 | 0.8154 | 0.7106 | 1e-05 |
+ | 92.0 | 0.1253 | 0.2974 | 0.8163 | 0.7140 | 1e-05 |
+ | 93.0 | 0.1247 | 0.2987 | 0.8149 | 0.7129 | 1e-05 |
+ | 94.0 | 0.1252 | 0.2980 | 0.8142 | 0.7054 | 1e-05 |
+ | 95.0 | 0.1248 | 0.3015 | 0.8166 | 0.7135 | 1e-05 |
+ | 96.0 | 0.1249 | 0.2980 | 0.8161 | 0.7110 | 1e-05 |
+ | 97.0 | 0.1246 | 0.3015 | 0.8169 | 0.7159 | 1e-06 |
+ | 98.0 | 0.1245 | 0.2998 | 0.8149 | 0.7071 | 1e-06 |
+ | 99.0 | 0.1246 | 0.3022 | 0.8166 | 0.7183 | 1e-06 |
+ | 100.0 | 0.1246 | 0.3015 | 0.8162 | 0.7136 | 1e-06 |
+
+
+ ---
+
+ # CO2 Emissions
+
+ The estimated CO2 emissions for training this model are documented below:
+
+ - **Emissions**: 1.562242452449767 grams of CO2
+ - **Source**: Code Carbon
+ - **Training Type**: fine-tuning
+ - **Geographical Location**: Brest, France
+ - **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
+
+
+ ---
+
+ # Framework Versions
+
+ - **Transformers**: 4.41.1
+ - **Pytorch**: 2.3.0+cu121
+ - **Datasets**: 2.19.1
+ - **Tokenizers**: 0.19.1