charioteer committed
Commit
a17616e
1 Parent(s): 9de5aac

Update README.md


Add the training parameters section.

Files changed (1):
  1. README.md +36 -0
README.md CHANGED
@@ -24,6 +24,42 @@ pipeline_tag: text-generation
  - Initializing the DPO Trainer and training the model
  - Saving the finetuned model and tokenizer

+ ## Training Parameters
+
+ This section outlines the key training parameters used to fine-tune Microsoft's Phi-2 model with Direct Preference Optimization (DPO) on the `argilla/distilabel-intel-orca-dpo-pairs` dataset, producing the Neural-phi2 model. An illustrative code sketch mapping these parameters onto the corresponding APIs follows the list below.
+
+ - **SFT Model Name**: `phi2-sft-alpaca_loraemb-right-pad`
+ - **New Model Name**: `Neural-phi2-v2`
+ - **Dataset**: `argilla/distilabel-intel-orca-dpo-pairs`
+ - **Tokenizer**: Custom tokenizer created from the `phi2-sft-alpaca_loraemb-right-pad` model
+ - **Quantization Config**:
+   - `load_in_4bit=True`
+   - `bnb_4bit_quant_type="nf4"`
+   - `bnb_4bit_compute_dtype=torch.float16`
+ - **LoRA Config**:
+   - `r=16`
+   - `lora_alpha=64`
+   - `lora_dropout=0.05`
+   - `bias="none"`
+   - `task_type="CAUSAL_LM"`
+   - `target_modules=["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"]`
+ - **Training Arguments**:
+   - `per_device_train_batch_size=1`
+   - `gradient_accumulation_steps=8`
+   - `gradient_checkpointing=True`
+   - `learning_rate=5e-7`
+   - `lr_scheduler_type="linear"`
+   - `max_steps=500`
+   - `optim="paged_adamw_32bit"`
+   - `warmup_steps=100`
+   - `bf16=True`
+   - `report_to="wandb"`
+ - **DPO Trainer**:
+   - `loss_type="sigmoid"`
+   - `beta=0.1`
+   - `max_prompt_length=768`
+   - `max_length=1024`
+
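+ The sketch below shows roughly how these parameters map onto the `transformers`, `peft`, and `trl` APIs. It is a minimal illustration rather than the exact training script: `output_dir`, `trust_remote_code`, and the dataset preprocessing are assumptions, and older TRL releases (pre-`DPOConfig`) accept `beta`, `loss_type`, `max_prompt_length`, and `max_length` directly on `DPOTrainer` as shown here, while newer releases move them onto `DPOConfig`.
+
+ ```python
+ import torch
+ from datasets import load_dataset
+ from peft import LoraConfig
+ from transformers import (AutoModelForCausalLM, AutoTokenizer,
+                           BitsAndBytesConfig, TrainingArguments)
+ from trl import DPOTrainer
+
+ # Quantization Config: load the SFT model in 4-bit NF4 with fp16 compute
+ bnb_config = BitsAndBytesConfig(
+     load_in_4bit=True,
+     bnb_4bit_quant_type="nf4",
+     bnb_4bit_compute_dtype=torch.float16,
+ )
+
+ model = AutoModelForCausalLM.from_pretrained(
+     "phi2-sft-alpaca_loraemb-right-pad",
+     quantization_config=bnb_config,
+     trust_remote_code=True,  # assumption: early Phi-2 checkpoints needed this
+ )
+ tokenizer = AutoTokenizer.from_pretrained("phi2-sft-alpaca_loraemb-right-pad")
+
+ # LoRA Config: adapters on the attention and MLP projections of Phi-2
+ peft_config = LoraConfig(
+     r=16,
+     lora_alpha=64,
+     lora_dropout=0.05,
+     bias="none",
+     task_type="CAUSAL_LM",
+     target_modules=["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"],
+ )
+
+ # Training Arguments (output_dir is an assumption, taken from the new model name)
+ training_args = TrainingArguments(
+     output_dir="Neural-phi2-v2",
+     per_device_train_batch_size=1,
+     gradient_accumulation_steps=8,
+     gradient_checkpointing=True,
+     learning_rate=5e-7,
+     lr_scheduler_type="linear",
+     max_steps=500,
+     optim="paged_adamw_32bit",
+     warmup_steps=100,
+     bf16=True,
+     report_to="wandb",
+ )
+
+ # DPOTrainer expects "prompt"/"chosen"/"rejected" columns; mapping the raw
+ # dataset into that format is omitted here (assumption about preprocessing).
+ dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train")
+
+ dpo_trainer = DPOTrainer(
+     model,
+     ref_model=None,  # with a PEFT model, TRL uses the base weights as the implicit reference
+     args=training_args,
+     train_dataset=dataset,
+     tokenizer=tokenizer,
+     peft_config=peft_config,
+     beta=0.1,
+     loss_type="sigmoid",
+     max_prompt_length=768,
+     max_length=1024,
+ )
+
+ # Train, then save the finetuned model and tokenizer
+ dpo_trainer.train()
+ dpo_trainer.save_model("Neural-phi2-v2")
+ tokenizer.save_pretrained("Neural-phi2-v2")
+ ```
+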
  ## Intended Use

  The Neural-phi2 model is intended to be used as a general-purpose language model for a variety of natural language processing tasks, such as text generation, summarization, and question answering. It may be particularly useful in applications where the model needs to generate coherent and contextually appropriate responses, such as in chatbots or virtual assistants.