lombardata committed
Commit
ad2a4c1
1 Parent(s): 06d25aa

Model save

Files changed (2)
  1. README.md +116 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,116 @@
+ ---
+ license: apache-2.0
+ base_model: facebook/dinov2-large
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: dinov2-large-linearhead-2024_03_06-with_data_aug_batch-size32_epochs93_freeze
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # dinov2-large-linearhead-2024_03_06-with_data_aug_batch-size32_epochs93_freeze
+
+ This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
+ It achieves the following results on the evaluation set (see the metric-computation sketch after this list):
+ - Loss: 0.0975
+ - F1 Micro: 0.8499
+ - F1 Macro: 0.8118
+ - ROC AUC: 0.9047
+ - Accuracy: 0.5522
+ - Learning Rate: 0.0000
+
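+ The reported F1 Micro, F1 Macro, ROC AUC and Accuracy are consistent with a multi-label classification setup. As a minimal sketch only (the 0.5 decision threshold and the placeholder arrays below are assumptions, not values taken from this card), such metrics are typically computed like this with scikit-learn:
+
+ ```python
+ import numpy as np
+ from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
+
+ # Placeholder arrays of shape (num_samples, num_labels): sigmoid probabilities
+ # and binary ground-truth indicators. Real evaluation data is not shown here.
+ probs = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.4]])
+ labels = np.array([[1, 0, 1], [0, 1, 0]])
+
+ preds = (probs >= 0.5).astype(int)  # assumed decision threshold
+
+ f1_micro = f1_score(labels, preds, average="micro")
+ f1_macro = f1_score(labels, preds, average="macro")
+ roc_auc = roc_auc_score(labels, probs, average="micro")
+ subset_accuracy = accuracy_score(labels, preds)  # exact-match (subset) accuracy
+
+ print(f1_micro, f1_macro, roc_auc, subset_accuracy)
+ ```
+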
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
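+ No usage details are provided in this card. As a rough illustration only, the sketch below shows how a transformers image-classification checkpoint of this kind is commonly loaded; the repo id, the use of `AutoModelForImageClassification`, and the sigmoid/0.5-threshold post-processing are assumptions, not statements about this specific model.
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import AutoImageProcessor, AutoModelForImageClassification
+
+ # Assumed repo id, built from the model name above; adjust to the actual Hub path.
+ repo_id = "lombardata/dinov2-large-linearhead-2024_03_06-with_data_aug_batch-size32_epochs93_freeze"
+
+ processor = AutoImageProcessor.from_pretrained(repo_id)
+ model = AutoModelForImageClassification.from_pretrained(repo_id)
+ model.eval()
+
+ image = Image.open("example.jpg")  # placeholder input image
+ inputs = processor(images=image, return_tensors="pt")
+
+ with torch.no_grad():
+     logits = model(**inputs).logits
+
+ # Multi-label post-processing is assumed here (sigmoid + 0.5 threshold),
+ # based on the micro/macro F1 metrics reported in this card.
+ probs = torch.sigmoid(logits)[0]
+ predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
+ print(predicted)
+ ```
+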
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a hedged mapping to a Trainer configuration is sketched after this list):
+ - learning_rate: 5e-05
+ - train_batch_size: 32
+ - eval_batch_size: 32
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 93
+
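+ As a reference only, these values map onto a transformers `TrainingArguments` object roughly as follows; the output directory is a placeholder, and any option not listed above is left at its default:
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="dinov2-large-linearhead",  # placeholder output path
+     learning_rate=5e-5,
+     per_device_train_batch_size=32,
+     per_device_eval_batch_size=32,
+     seed=42,
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     lr_scheduler_type="linear",
+     num_train_epochs=93,
+ )
+ ```
+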
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
+ |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:------:|
+ | No log | 1.0 | 274 | 0.1280 | 0.7998 | 0.7183 | 0.8765 | 0.4798 | 0.001 |
+ | 0.1493 | 2.0 | 548 | 0.1210 | 0.8095 | 0.7528 | 0.8793 | 0.5080 | 0.001 |
+ | 0.1493 | 3.0 | 822 | 0.1156 | 0.8119 | 0.7609 | 0.8667 | 0.5237 | 0.001 |
+ | 0.1156 | 4.0 | 1096 | 0.1167 | 0.8240 | 0.7881 | 0.9000 | 0.5003 | 0.001 |
+ | 0.1156 | 5.0 | 1370 | 0.1073 | 0.8367 | 0.7939 | 0.8968 | 0.5254 | 0.001 |
+ | 0.1069 | 6.0 | 1644 | 0.1082 | 0.8319 | 0.7918 | 0.8911 | 0.5341 | 0.001 |
+ | 0.1069 | 7.0 | 1918 | 0.1065 | 0.8367 | 0.7811 | 0.8963 | 0.5390 | 0.001 |
+ | 0.1026 | 8.0 | 2192 | 0.1071 | 0.8364 | 0.8001 | 0.8959 | 0.5351 | 0.001 |
+ | 0.1026 | 9.0 | 2466 | 0.1101 | 0.8346 | 0.7925 | 0.9113 | 0.5111 | 0.001 |
+ | 0.1 | 10.0 | 2740 | 0.1074 | 0.8345 | 0.7808 | 0.8973 | 0.5320 | 0.001 |
+ | 0.0964 | 11.0 | 3014 | 0.1079 | 0.8375 | 0.7967 | 0.8985 | 0.5292 | 0.001 |
+ | 0.0964 | 12.0 | 3288 | 0.1070 | 0.8353 | 0.7908 | 0.8951 | 0.5341 | 0.001 |
+ | 0.0949 | 13.0 | 3562 | 0.1060 | 0.8371 | 0.7926 | 0.8987 | 0.5296 | 0.001 |
+ | 0.0949 | 14.0 | 3836 | 0.1035 | 0.8443 | 0.7987 | 0.9007 | 0.5438 | 0.001 |
+ | 0.0926 | 15.0 | 4110 | 0.1099 | 0.8363 | 0.8004 | 0.9060 | 0.5118 | 0.001 |
+ | 0.0926 | 16.0 | 4384 | 0.1086 | 0.8327 | 0.7991 | 0.8886 | 0.5355 | 0.001 |
+ | 0.0911 | 17.0 | 4658 | 0.1084 | 0.8333 | 0.7967 | 0.8952 | 0.5209 | 0.001 |
+ | 0.0911 | 18.0 | 4932 | 0.1083 | 0.8358 | 0.7976 | 0.8968 | 0.5344 | 0.001 |
+ | 0.0902 | 19.0 | 5206 | 0.1129 | 0.8301 | 0.7799 | 0.8829 | 0.5233 | 0.001 |
+ | 0.0902 | 20.0 | 5480 | 0.1033 | 0.8464 | 0.8107 | 0.9065 | 0.5400 | 0.001 |
+ | 0.0896 | 21.0 | 5754 | 0.1091 | 0.8375 | 0.8014 | 0.9018 | 0.5233 | 0.001 |
+ | 0.0881 | 22.0 | 6028 | 0.1040 | 0.8412 | 0.7987 | 0.8995 | 0.5383 | 0.001 |
+ | 0.0881 | 23.0 | 6302 | 0.1090 | 0.8385 | 0.7908 | 0.9012 | 0.5278 | 0.001 |
+ | 0.0874 | 24.0 | 6576 | 0.1078 | 0.8338 | 0.7961 | 0.8917 | 0.5313 | 0.001 |
+ | 0.0874 | 25.0 | 6850 | 0.1054 | 0.8455 | 0.8077 | 0.9023 | 0.5501 | 0.001 |
+ | 0.0864 | 26.0 | 7124 | 0.1085 | 0.8346 | 0.7913 | 0.8860 | 0.5348 | 0.001 |
+ | 0.0864 | 27.0 | 7398 | 0.0994 | 0.8486 | 0.8134 | 0.9040 | 0.5487 | 0.0001 |
+ | 0.0793 | 28.0 | 7672 | 0.0989 | 0.8495 | 0.8123 | 0.9039 | 0.5532 | 0.0001 |
+ | 0.0793 | 29.0 | 7946 | 0.0986 | 0.8485 | 0.8107 | 0.9028 | 0.5511 | 0.0001 |
+ | 0.0751 | 30.0 | 8220 | 0.0986 | 0.8510 | 0.8188 | 0.9080 | 0.5501 | 0.0001 |
+ | 0.0751 | 31.0 | 8494 | 0.0990 | 0.8488 | 0.8139 | 0.9034 | 0.5539 | 0.0001 |
+ | 0.0753 | 32.0 | 8768 | 0.0983 | 0.8510 | 0.8181 | 0.9048 | 0.5505 | 0.0001 |
+ | 0.0748 | 33.0 | 9042 | 0.0987 | 0.8494 | 0.8110 | 0.9018 | 0.5539 | 0.0001 |
+ | 0.0748 | 34.0 | 9316 | 0.0980 | 0.8501 | 0.8117 | 0.9045 | 0.5515 | 0.0001 |
+ | 0.0748 | 35.0 | 9590 | 0.0981 | 0.8502 | 0.8133 | 0.9064 | 0.5505 | 0.0001 |
+ | 0.0748 | 36.0 | 9864 | 0.0984 | 0.8507 | 0.8130 | 0.9045 | 0.5536 | 0.0001 |
+ | 0.0745 | 37.0 | 10138 | 0.0983 | 0.8507 | 0.8145 | 0.9067 | 0.5484 | 0.0001 |
+ | 0.0745 | 38.0 | 10412 | 0.0986 | 0.8486 | 0.8107 | 0.9011 | 0.5546 | 0.0001 |
+ | 0.0749 | 39.0 | 10686 | 0.0986 | 0.8491 | 0.8140 | 0.9029 | 0.5508 | 0.0001 |
+ | 0.0749 | 40.0 | 10960 | 0.0982 | 0.8487 | 0.8114 | 0.9002 | 0.5553 | 0.0001 |
+ | 0.0752 | 41.0 | 11234 | 0.0976 | 0.8505 | 0.8131 | 0.9058 | 0.5508 | 1e-05 |
+ | 0.0734 | 42.0 | 11508 | 0.0977 | 0.8500 | 0.8128 | 0.9046 | 0.5515 | 1e-05 |
+ | 0.0734 | 43.0 | 11782 | 0.0975 | 0.8498 | 0.8118 | 0.9053 | 0.5515 | 1e-05 |
+ | 0.0736 | 44.0 | 12056 | 0.0976 | 0.8495 | 0.8118 | 0.9046 | 0.5522 | 1e-05 |
+ | 0.0736 | 45.0 | 12330 | 0.0975 | 0.8503 | 0.8119 | 0.9053 | 0.5508 | 1e-05 |
+ | 0.0731 | 46.0 | 12604 | 0.0976 | 0.8498 | 0.8119 | 0.9046 | 0.5511 | 1e-05 |
+ | 0.0731 | 47.0 | 12878 | 0.0975 | 0.8500 | 0.8115 | 0.9046 | 0.5518 | 1e-05 |
+ | 0.0736 | 48.0 | 13152 | 0.0975 | 0.8505 | 0.8141 | 0.9052 | 0.5511 | 1e-05 |
+ | 0.0736 | 49.0 | 13426 | 0.0975 | 0.8504 | 0.8144 | 0.9053 | 0.5518 | 1e-05 |
+ | 0.073 | 50.0 | 13700 | 0.0975 | 0.8502 | 0.8138 | 0.9052 | 0.5518 | 0.0000 |
+ | 0.073 | 51.0 | 13974 | 0.0975 | 0.8499 | 0.8123 | 0.9049 | 0.5515 | 0.0000 |
+ | 0.0732 | 52.0 | 14248 | 0.0975 | 0.8500 | 0.8119 | 0.9049 | 0.5515 | 0.0000 |
+ | 0.0732 | 53.0 | 14522 | 0.0975 | 0.8499 | 0.8118 | 0.9047 | 0.5522 | 0.0000 |
+
+
+ ### Framework versions
+
+ - Transformers 4.36.2
+ - PyTorch 2.1.0+cu118
+ - Datasets 2.14.5
+ - Tokenizers 0.15.0
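+
+ For closer reproduction of the results above, it can help to match these library versions; a small, optional check (the import names used are the standard ones for these packages) is sketched below:
+
+ ```python
+ import datasets
+ import tokenizers
+ import torch
+ import transformers
+
+ # Versions reported by this card; a mismatch does not necessarily prevent
+ # loading the model, but exact reproduction is safest with matching versions.
+ expected = {
+     "transformers": "4.36.2",
+     "torch": "2.1.0+cu118",
+     "datasets": "2.14.5",
+     "tokenizers": "0.15.0",
+ }
+ installed = {
+     "transformers": transformers.__version__,
+     "torch": torch.__version__,
+     "datasets": datasets.__version__,
+     "tokenizers": tokenizers.__version__,
+ }
+ for name, version in expected.items():
+     print(f"{name}: expected {version}, installed {installed[name]}")
+ ```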
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:832effd6b1672bb1edd2066f8d243f4edaf6b9b6d237ef74fb77b1ebc5e62d7c
+ oid sha256:d96f1bb739bca07ec7660d51ddd59d2c14144542e840b22e1f631614d18cb795
  size 1217722832