---
license: mit
pipeline_tag: text-generation
language:
  - sv
  - en
tags:
  - pretrained
widget:
  - text: "Jag tycker att det är roligt med"
---

# Model Card for Mistral-7B-v0.1-flashback-v2

Mistral-7B-v0.1-flashback-v2 is a continued pretraining of the base Mistral-7B-v0.1 model on roughly 40 GB of forum threads from the Swedish website flashback.org.
It is a full finetune (all parameters updated) trained for one epoch.
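A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id passed as `model_id` is an assumption (the card does not state the full hub path), and the generation parameters are illustrative defaults, not values recommended by the authors. The imports are placed inside the function so the sketch can be defined without `transformers` installed.

```python
def generate(prompt: str, model_id: str = "Mistral-7B-v0.1-flashback-v2") -> str:
    """Continue a Swedish prompt with the model.

    NOTE: model_id is a placeholder -- replace it with the actual
    hub repository path for this model.
    """
    # Imports kept local so merely defining this sketch does not
    # require transformers/torch to be installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Sampling parameters are illustrative, not tuned for this model.
    outputs = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        temperature=0.7,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate("Jag tycker att det är roligt med")` (the widget prompt above) would return the prompt continued in Swedish.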