---
base_model:
  - Sao10K/Fimbulvetr-11B-v2
  - TheDrummer/Moistral-11B-v3
  - Himitsui/MedMitsu-Instruct-11B
  - Himitsui/Kaiju-11B
  - migtissera/Synthia-v3.0-11B
  - jeiku/Re-Host_Limarp_Mistral
library_name: transformers
tags:
  - mergekit
  - merge
  - solar
  - llama
  - not-for-all-audiences
---

# SyntheticMoist-v2

RP model built on the Solar 11B architecture. Higher density plus the LimaRP component led to better performance. Use the Alpaca or Vicuna prompt format.
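
A minimal usage sketch with transformers and an Alpaca-style prompt. The repository id, instruction text, and sampling settings below are placeholders, not part of this card; substitute the actual repo where the merged weights are hosted.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/SyntheticMoist-11B-v2"  # placeholder: substitute the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Alpaca-style prompt, one of the two formats recommended above.
prompt = (
    "### Instruction:\n"
    "Write the next reply in the roleplay as the ship's navigator.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```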


Thanks to mradermacher for the quants!

## Quants

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with Sao10K/Fimbulvetr-11B-v2 as the base.
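
For intuition, here is a minimal PyTorch sketch of the DARE step only, not mergekit's actual implementation: each fine-tuned model's delta from the base is randomly dropped with probability 1 - density and the surviving entries are rescaled by 1/density, after which TIES-style sign election combines the deltas. The function name `dare_prune` and the per-tensor framing are illustrative.

```python
import torch

def dare_prune(base: torch.Tensor, finetuned: torch.Tensor, density: float) -> torch.Tensor:
    """Illustrative DARE step: drop delta entries, rescale survivors by 1/density."""
    delta = finetuned - base                                  # task vector relative to the base model
    keep = torch.bernoulli(torch.full_like(delta, density))  # keep each entry with prob = density
    return base + (delta * keep) / density                   # rescaling preserves the expected delta
```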

### Models Merged

The following models were included in the merge:

- TheDrummer/Moistral-11B-v3
- Himitsui/MedMitsu-Instruct-11B
- Himitsui/Kaiju-11B
- migtissera/Synthia-v3.0-11B + jeiku/Re-Host_Limarp_Mistral

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Himitsui/MedMitsu-Instruct-11B
    parameters:
      weight: 0.13
      density: 0.60
  - model: Himitsui/Kaiju-11B
    parameters:
      weight: 0.22
      density: 0.73
  - model: migtissera/Synthia-v3.0-11B+jeiku/Re-Host_Limarp_Mistral
    parameters:
      weight: 0.28
      density: 0.80
  - model: TheDrummer/Moistral-11B-v3
    parameters:
      weight: 0.37
      density: 0.85
merge_method: dare_ties
base_model: Sao10K/Fimbulvetr-11B-v2
parameters:
  int8_mask: true
dtype: bfloat16
```
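
To reproduce the merge, the YAML above can be saved to a file and run through mergekit. The sketch below invokes the mergekit-yaml CLI via Python's subprocess; the config filename and output directory are placeholders, and optional flags (GPU use, tokenizer handling) are left at their defaults.

```python
import subprocess

# Assumes mergekit is installed and the YAML configuration above is saved as config.yaml.
subprocess.run(
    ["mergekit-yaml", "config.yaml", "./SyntheticMoist-11B-v2"],  # output dir is a placeholder
    check=True,
)
```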