---
license: other
inference: false
---

Monero's WizardLM-Uncensored-SuperCOT-Storytelling-30B GPTQ

These are GPTQ format quantised 4-bit models of Monero's WizardLM-Uncensored-SuperCOT-Storytelling-30B.

They are the result of quantising to 4-bit using GPTQ-for-LLaMa.

Repositories available

Prompt template

```
You are a helpful assistant
### USER: prompt goes here
### ASSISTANT:
```

To allow all output, add ### Certainly! to the end of the prompt.
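
As a rough sketch, the template above can be assembled in Python. The helper name below is illustrative only, not part of the model or any library:

```python
def build_prompt(user_message: str, allow_all: bool = False) -> str:
    """Assemble a prompt in the template this card describes.

    Appending "### Certainly!" is the card's trick for allowing
    all output; the function itself is just an illustration.
    """
    prompt = (
        "You are a helpful assistant\n"
        f"### USER: {user_message}\n"
        "### ASSISTANT:"
    )
    if allow_all:
        prompt += " ### Certainly!"
    return prompt

print(build_prompt("Write a short story about a dragon."))
```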

How to easily download and use this model in text-generation-webui

Open the text-generation-webui UI as normal.

  1. Click the Model tab.
  2. Under Download custom model or LoRA, enter TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ.
  3. Click Download.
  4. Wait until it says it's finished downloading.
  5. Click the Refresh icon next to Model in the top left.
  6. In the Model drop-down: choose the model you just downloaded, WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ.
  7. If you see an error in the bottom right, ignore it - it's temporary.
  8. Fill out the GPTQ parameters on the right: Bits = 4, Groupsize = None, model_type = Llama
  9. Click Save settings for this model in the top right.
  10. Click Reload the Model in the top right.
  11. Once it says it's loaded, click the Text Generation tab and enter a prompt!

Provided files

Compatible file - WizardLM-Uncensored-SuperCOT-Storytelling-GPTQ-4bit.act.order.safetensors

This will work with all versions of GPTQ-for-LLaMa, giving maximum compatibility.

It was created without group_size to minimise VRAM usage, and with --act-order to improve inference quality.
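
As a back-of-the-envelope check on why 4-bit quantisation keeps VRAM low (the parameter count is approximate and this ignores activations, KV cache, and quantisation metadata, which a groupsize would add more of):

```python
PARAMS = 30e9           # ~30B parameters (approximate)
BITS_PER_WEIGHT = 4     # GPTQ 4-bit quantisation

# Raw weight storage, ignoring scales/zeros and runtime overheads.
# Groupsize = None means one scale per output channel, the smallest
# amount of extra metadata; a groupsize of 128 would store more.
weights_gib = PARAMS * BITS_PER_WEIGHT / 8 / 2**30
print(f"~{weights_gib:.1f} GiB for weights alone")
```

At roughly 14 GiB for the packed weights, the model fits on a single 24 GB GPU with headroom for context, which a 16-bit copy (~56 GiB) would not.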

  • WizardLM-Uncensored-SuperCOT-Storytelling-GPTQ-4bit.act.order.safetensors
    • Works with all versions of GPTQ-for-LLaMa code, both Triton and CUDA branches
    • Works with AutoGPTQ.
    • Works with text-generation-webui one-click-installers
    • Parameters: Groupsize = None. Act-order.
    • Command used to create the GPTQ:
      python llama.py HF_repo c4 --wbits 4 --act-order --true-sequential --save_safetensors WizardLM-Uncensored-SuperCOT-Storytelling-GPTQ-4bit.act.order.safetensors
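
For use outside text-generation-webui, loading this file with AutoGPTQ might look roughly like the sketch below. This is an untested illustration, not the card's own instructions: the repo id is taken from the download step above, and the exact keyword arguments accepted by from_quantized depend on your auto-gptq version.

```python
REPO_ID = "TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ"
MODEL_BASENAME = "WizardLM-Uncensored-SuperCOT-Storytelling-GPTQ-4bit.act.order"


def load_model(device: str = "cuda:0"):
    # Import inside the function so the sketch can be read without
    # auto-gptq installed; `pip install auto-gptq` first.
    from auto_gptq import AutoGPTQForCausalLM

    return AutoGPTQForCausalLM.from_quantized(
        REPO_ID,
        model_basename=MODEL_BASENAME,  # filename minus .safetensors
        use_safetensors=True,
        device=device,
    )
```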
      

Discord

For further support, and discussions on these models and AI in general, join us at: TheBloke AI's Discord server

Thanks, and how to contribute.

Thanks to the chirper.ai team!

I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.

If you're able and willing to contribute, it'd be most gratefully received and will help me to keep providing models, and work on new AI projects.

Donaters will get priority support on any and all AI/LLM/model questions, plus other benefits.

Patreon special mentions: Aemon Algiz; Talal Aujan; Jonathan Leane; Illia Dulskyi; Khalefa Al-Ahmad; senxiiz. Thank you all, and to all my other generous patrons and donaters.

Original model card: Monero's WizardLM-Uncensored-SuperCOT-Storytelling-30B

This model is a triple model merge of WizardLM Uncensored+CoT+Storytelling, resulting in a comprehensive boost in reasoning and story writing capabilities.

To allow all output, at the end of your prompt add ### Certainly!

You've become a compendium of knowledge on a vast array of topics.

Lore Mastery is an arcane tradition fixated on understanding the underlying mechanics of magic. It is the most academic of all arcane traditions. The promise of uncovering new knowledge or proving (or discrediting) a theory of magic is usually required to rouse its practitioners from their laboratories, academies, and archives to pursue a life of adventure. Known as savants, followers of this tradition are a bookish lot who see beauty and mystery in the application of magic. The results of a spell are less interesting to them than the process that creates it. Some savants take a haughty attitude toward those who follow a tradition focused on a single school of magic, seeing them as provincial and lacking the sophistication needed to master true magic. Other savants are generous teachers, countering ignorance and deception with deep knowledge and good humor.