---
license: mit
pipeline_tag: text-generation
language:
- sv
- en
tags:
- pretrained
widget:
- text: "Jag tycker att det är roligt med"
---
# Model Card for Mistral-7B-v0.1-flashback-v2

Mistral-7B-v0.1-flashback-v2 continues the pretraining of the base Mistral-7B-v0.1 model on roughly 40 GB of forum threads from the Swedish website flashback.org. It is a full finetune (all weights updated), trained for one epoch.
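As a text-generation model, it can be loaded with 🤗 Transformers like any other causal LM. A minimal usage sketch follows; note that the repo id below is an assumption based on the model name, so substitute the actual Hugging Face path where the model is hosted:

```python
# Sketch: load the model and complete a Swedish prompt.
# NOTE: the repo id is an assumption; replace it with the real
# Hugging Face repository path for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mistral-7B-v0.1-flashback-v2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision fits a 7B model on one ~24 GB GPU
    device_map="auto",
)

# Prompt taken from the widget example in the card metadata.
prompt = "Jag tycker att det är roligt med"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is a base (pretrained) model rather than an instruction-tuned one, it is best prompted with text to be continued, as above, rather than with questions or chat-style instructions.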