---
license: mit
---

# Miquliz-120b-v2.0-FP8-dynamic

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6303ca537373aacccd85d8a7/vmCAhJCpF0dITtCVxlYET.jpeg)

This quant was made for [infermatic.ai](https://infermatic.ai/).

Dynamic FP8 quant of [Miquliz 120B v2.0](https://huggingface.co/wolfram/miquliz-120b-v2.0), made with AutoFP8.

## Model Details

- Max Context: 32768 tokens
- Layers: 140

### Prompt template: Mistral

```
[INST] {prompt} [/INST]
```
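
### Usage

A minimal sketch of serving this quant with vLLM, which can load AutoFP8 checkpoints. The repo id, GPU count, and sampling settings below are illustrative assumptions, not confirmed values; adjust them for your setup.

```python
from vllm import LLM, SamplingParams

# Hypothetical repo id for this quant; substitute the actual
# Hugging Face path where the model is hosted.
MODEL_ID = "Infermatic/Miquliz-120b-v2.0-FP8-dynamic"

llm = LLM(
    model=MODEL_ID,
    max_model_len=32768,     # matches the model's max context
    tensor_parallel_size=4,  # assumption: a 120B model typically spans multiple GPUs
)

# Mistral prompt template from this card.
prompt = "[INST] Write a haiku about quantization. [/INST]"

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)
```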