mchl-labs committed on
Commit
6503af4
1 Parent(s): 5881e5e

Update README.md

Files changed (1)
  1. README.md +36 -1
README.md CHANGED
@@ -4,4 +4,39 @@ language:
  - it
  - en
  library_name: transformers
- ---
+ ---
+
+ # Stambecco 🦌: Italian Instruction-following LLaMA Model
+
+ Stambecco is an Italian instruction-following model based on the [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) model.
+ It comes in two sizes: 7b and 13b parameters.
+
+ It is trained on an Italian version of the [GPT-4-LLM](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM) dataset, a dataset of instruction-following data generated by `GPT-4`.
+
+ This repo contains a low-rank adapter for LLaMA-7b.
+
+ For more information, please visit [the project's website](https://github.com/mchl-labs/stambecco).
+
+
+ ### 💪 Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0003
+ - train_batch_size: 4
+ - eval_batch_size: 8
+ - gradient_accumulation_steps: 32
+ - total_train_batch_size: 128
+ - optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 100
+ - num_epochs: 10
+ - mixed_precision_training: Native AMP
+ - LoRA R: 8
+ - LoRA target modules: q_proj, v_proj
+
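As a minimal sketch, the listed values imply the following relationships. The effective batch size follows directly from the per-device batch size and gradient accumulation; the learning-rate curve below is an assumption based on the listed `lr_scheduler_type: linear` and warmup steps (the real total step count depends on the dataset size and is illustrative here):

```python
# Relationships implied by the hyperparameters above.
learning_rate = 3e-4               # learning_rate
warmup_steps = 100                 # lr_scheduler_warmup_steps
train_batch_size = 4               # per-device batch size
gradient_accumulation_steps = 32

# Effective batch size = per-device batch * gradient accumulation steps
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128, matching the reported total_train_batch_size

def lr_at(step, total_steps):
    """Linear warmup to learning_rate, then linear decay to 0 (assumed schedule)."""
    if step < warmup_steps:
        return learning_rate * step / warmup_steps
    return learning_rate * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, `lr_at(50, total_steps)` is half the peak learning rate during warmup, and the rate decays linearly to zero at the final step.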
+
+ ## Intended uses & limitations
+
+ **Usage and License Notices**: As with [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca), Stambecco is intended and licensed for research use only. The models should not be used outside of research purposes.
+
+ Please note that it is quite possible that the model output contains biased, conspiratorial, offensive, or otherwise inappropriate and potentially harmful content.
+ The model is intended for **research purposes only** and should be used with caution at your own risk. **Production usage is not allowed.**