---
base_model:
- IntervitensInc/Mistral-Nemo-Base-2407-chatml
- nbeerbower/mistral-nemo-bophades-12B
- nbeerbower/mistral-nemo-wissenschaft-12B
- elinas/Chronos-Gold-12B-1.0
- Fizzarolli/MN-12b-Sunrose
- nbeerbower/mistral-nemo-gutenberg-12B-v4
- anthracite-org/magnum-12b-v2.5-kto
library_name: transformers
tags:
- mergekit
- merge
---
![Made with NovelAI](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1/resolve/main/magmell.png)

*[Welcome, brave one; you've come a long mile.](https://www.youtube.com/watch?v=dgGEuC1F3oE)*

[Official GGUFs](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1-GGUF)

[More from mradermacher](https://huggingface.co/mradermacher/MN-12B-Mag-Mell-R1-GGUF/tree/main)

# MN-12B-Mag-Mell-R1

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

A multi-stage SLERP merge, DARE-TIES'd together. Intended to be a general-purpose "Best of Nemo" model for any fictional, creative use case. Inspired by hyper-merges like [Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) and [Umbral Mind](https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B).

Mag Mell is composed of 3 intermediate parts:
- Hero (RP, kink/trope coverage): [Chronos Gold](https://huggingface.co/elinas/Chronos-Gold-12B-1.0), [Sunrose](https://huggingface.co/Fizzarolli/MN-12b-Sunrose).
- Monk (intelligence, groundedness): [Bophades](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B), [Wissenschaft](https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B).
- Deity (prose, flair): [Gutenberg v4](https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4), [Magnum 2.5 KTO](https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto).

I've been dreaming about this merge since Nemo tunes started coming out in earnest. In our testing, Mag Mell demonstrates worldbuilding capabilities unlike any model in its class, comparable to old adventuring models like Tiefighter, and prose with minimal "slop" (not bad for no finetuning), frequently devising electrifying metaphors that left us consistently astonished.

Use ChatML formatting; a minimal usage sketch is included at the end of this card. Early testing versions had a tendency to leak tokens, but this should be more or less hammered out. I don't want to toot my own bugle, though; I'm really proud of how this came out, but please leave your feedback, good or bad.

Special thanks as usual to Toaster for his feedback and Fizz for helping fund compute, as well as the KoboldAI Discord for their resources.

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [IntervitensInc/Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml) as the base.
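If the method names are opaque: the sketch below is a rough, illustrative rendering of the DARE-TIES idea on raw tensors, not mergekit's actual implementation. The function names and toy tensors are hypothetical, and details like weight normalization differ in the real thing; it only shows the core steps of randomly dropping and rescaling each fine-tune's delta from the base (DARE), then combining the deltas with a TIES-style sign election. The densities and weights mirror the Mag Mell config further down.

```python
# Illustrative DARE-TIES sketch (hypothetical helper names; not mergekit code).
import torch

def dare(delta: torch.Tensor, density: float) -> torch.Tensor:
    """DARE step: keep each element with probability `density`, rescale by 1/density."""
    mask = torch.rand_like(delta) < density
    return delta * mask / density

def dare_ties(base: torch.Tensor, finetunes: list[torch.Tensor],
              densities: list[float], weights: list[float]) -> torch.Tensor:
    # Weighted, sparsified task vectors (deltas from the base model).
    deltas = [dare(ft - base, d) * w
              for ft, d, w in zip(finetunes, densities, weights)]
    stacked = torch.stack(deltas)                  # (n_models, ...)
    sign = torch.sign(stacked.sum(dim=0))          # TIES: elect a sign per parameter
    agree = (torch.sign(stacked) == sign).float()  # keep only deltas agreeing with it
    merged = (stacked * agree).sum(0) / agree.sum(0).clamp(min=1)
    return base + merged

# Toy usage: merge three random "fine-tunes" onto a zero base.
base = torch.zeros(4, 4)
fts = [base + torch.randn(4, 4) * 0.1 for _ in range(3)]
print(dare_ties(base, fts, densities=[0.7, 0.9, 0.5], weights=[0.5, 1.0, 0.7]))
```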
### Models Merged

The following models were included in the merge:
* IntervitensInc/Mistral-Nemo-Base-2407-chatml
* nbeerbower/mistral-nemo-bophades-12B
* nbeerbower/mistral-nemo-wissenschaft-12B
* elinas/Chronos-Gold-12B-1.0
* Fizzarolli/MN-12b-Sunrose
* nbeerbower/mistral-nemo-gutenberg-12B-v4
* anthracite-org/magnum-12b-v2.5-kto

### Configuration

The following YAML configurations were used to produce this model:

#### Monk:

```yaml
models:
  - model: nbeerbower/mistral-nemo-bophades-12B
  - model: nbeerbower/mistral-nemo-wissenschaft-12B
merge_method: slerp
base_model: nbeerbower/mistral-nemo-bophades-12B
parameters:
  t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
dtype: bfloat16
tokenizer_source: base
```

#### Hero:

```yaml
models:
  - model: elinas/Chronos-Gold-12B-1.0
  - model: Fizzarolli/MN-12b-Sunrose
merge_method: slerp
base_model: elinas/Chronos-Gold-12B-1.0
parameters:
  t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
dtype: bfloat16
tokenizer_source: base
```

#### Deity:

```yaml
models:
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v4
  - model: anthracite-org/magnum-12b-v2.5-kto
merge_method: slerp
base_model: nbeerbower/mistral-nemo-gutenberg-12B-v4
parameters:
  t: [0, 0.1, 0.2, 0.25, 0.25, 0.2, 0.1, 0]
dtype: bfloat16
tokenizer_source: base
```

#### Mag Mell:

```yaml
models:
  - model: monk
    parameters:
      density: 0.7
      weight: 0.5
  - model: hero
    parameters:
      density: 0.9
      weight: 1
  - model: deity
    parameters:
      density: 0.5
      weight: 0.7
merge_method: dare_ties
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer_source: base
```

`In Irish mythology, Mag Mell (modern spelling: Magh Meall, meaning 'delightful plain') is one of the names for the Celtic Otherworld, a mythical realm achievable through death and/or glory... Never explicitly stated in any surviving mythological account to be an afterlife; rather, it is usually portrayed as a paradise populated by deities, which is occasionally visited by some adventurous mortals. In its island guise, it was visited by various legendary Irish heroes and monks, forming the basis of the adventure myth or echtrae...`
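As promised above, here is a minimal inference sketch. It assumes the standard `transformers` generation API and that the repo's tokenizer ships a ChatML chat template (the base is the ChatML-converted Nemo base, so it should); if the template is missing, format prompts manually with `<|im_start|>`/`<|im_end|>` markers. The messages and sampling settings are placeholders, not recommendations.

```python
# Minimal usage sketch (assumes a recent transformers, torch, and enough VRAM for bf16).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "inflatebot/MN-12B-Mag-Mell-R1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

# ChatML-style messages; apply_chat_template uses the tokenizer's bundled template.
messages = [
    {"role": "system", "content": "You are a vivid, grounded storyteller."},
    {"role": "user", "content": "Open a scene in a rain-soaked port town."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=300, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```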