---
license: mit
pipeline_tag: text-generation
language:
- sv
- en
tags:
- pretrained
widget:
- text: "Jag tycker att det är roligt med"
---
# 🐈‍⬛ Mistral-7B-v0.1-flashback-v2
![](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2/resolve/main/flashcat.png?download=true)
Mistral-7B-v0.1-flashback-v2 continues the pretraining of the base Mistral-7B-v0.1 model on roughly 40GB of forum threads from the Swedish website flashback.org.
It is a full finetune, trained for one epoch.
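
## Usage
Since this is a standard causal language model checkpoint, it can presumably be loaded with the Hugging Face `transformers` library. The sketch below reuses the widget prompt from the metadata; the dtype and device settings are assumptions, so adjust them to your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "timpal0l/Mistral-7B-v0.1-flashback-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits your GPU; fall back to float16/float32 otherwise
    device_map="auto",
)

# Swedish prompt from the model card's widget example
prompt = "Jag tycker att det är roligt med"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```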