---
license: apache-2.0
tags:
- merge
---
SLERP merge of cookinai/CatMacaroni-Slerp and mncai/mistral-7b-dpo-v5.

The mergekit .yaml configuration used for the merge:
```yaml
slices:
  - sources:
      - model: cookinai/CatMacaroni-Slerp
        layer_range: [0, 32]
      - model: mncai/mistral-7b-dpo-v5
        layer_range: [0, 32]
merge_method: slerp
base_model: mncai/mistral-7b-dpo-v5
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: float16
```
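In this config, `t` is the SLERP interpolation factor: t = 0 keeps the base model's weights and t = 1 the other model's, with the value lists varying t across layer depth for the attention and MLP tensors and 0.5 used for all remaining tensors. After running the merge (e.g. with `mergekit-yaml config.yml <output-dir>`), the result loads like any Mistral-7B checkpoint. A minimal sketch with transformers, assuming a local output directory; the path below is a placeholder, not a published repo id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: point at the local mergekit output directory
# (or the published Hugging Face repo id for this merge).
model_id = "./Slerp-CM-mist-dpo"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # merge was done in float16 (see dtype above)
    device_map="auto",    # requires accelerate; drop for CPU-only loading
)

prompt = "If a train travels 60 miles in 1.5 hours, what is its average speed?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```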
The models were chosen to achieve a mix of strong performance on reasoning benchmarks such as GSM8K and on conversational tasks.