160b - Megac4ai-Command-R-Plus

#39
by SabinStargem - opened

This is a self-merge of CR+. With any luck, it will make the model even brighter.

https://huggingface.co/nitky/Megac4ai-command-r-plus/tree/main

Let's see how it fares in my queue then :)

mradermacher changed discussion status to closed

I forgot to mention that current llama.cpp apparently has no working CR+ support; I don't know whether it will affect this model.

Yup, it's affected. We'll have to wait for a fix:

raise NotImplementedError("BPE pre-tokenizer was not recognized - update get_vocab_base_pre()")
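For context, llama.cpp's conversion script identifies a model's BPE pre-tokenizer by tokenizing a fixed probe string and hashing the resulting token ids; if the hash isn't in its table of known pre-tokenizers, it raises exactly this error until support for the new tokenizer is added upstream. Below is a minimal sketch of that detection idea, assuming the Hugging Face `transformers` API; the probe text and the hash table are placeholders, not the actual values used by llama.cpp.

```python
# Sketch: fingerprint a BPE pre-tokenizer by hashing how it encodes a fixed
# probe string. An unknown hash means the pre-tokenizer has no entry yet and
# conversion has to stop. Probe text and hash table are placeholders, not the
# real values from llama.cpp's convert script.
from hashlib import sha256
from transformers import AutoTokenizer

KNOWN_PRE_TOKENIZERS = {
    # "sha256-of-token-ids": "pre-tokenizer name"  (one entry per supported model)
}

def get_vocab_base_pre(model_dir: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    probe = "Hello world! 123 \n\t ümlaut"  # fixed probe text (placeholder)
    chkhsh = sha256(str(tokenizer.encode(probe)).encode()).hexdigest()
    try:
        return KNOWN_PRE_TOKENIZERS[chkhsh]
    except KeyError:
        raise NotImplementedError(
            "BPE pre-tokenizer was not recognized - update get_vocab_base_pre()"
        )

if __name__ == "__main__":
    print(get_vocab_base_pre("nitky/Megac4ai-command-r-plus"))
```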

I'm manually converting it with an older version of llama.cpp. Hope the result is ok...
