Can't use downloaded model

#11 by philgrey - opened

from langchain.llms import LlamaCpp  # LlamaCpp wrapper from LangChain (langchain_community.llms in newer releases)

llm = LlamaCpp(
    temperature=0,
    model_path="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    max_tokens=32,
    stop=["Q:", "\n"],
)

My script is shown above. I downloaded the model using the following script:

from ctransformers import AutoModelForCausalLM  # the model_file/model_type arguments match the ctransformers API
downloader = AutoModelForCausalLM.from_pretrained("TheBloke/Mistral-7B-Instruct-v0.1-GGUF", model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf", model_type="llama")
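For reference, here is a minimal sketch of how to check where that download actually ends up. It assumes the file is fetched through huggingface_hub into the standard Hugging Face cache (hf_hub_download returns the cached file's absolute path and reuses an already-downloaded copy):

# Sketch: locate the downloaded GGUF file in the local Hugging Face cache.
from huggingface_hub import hf_hub_download

cached_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    filename="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
)
print(cached_path)  # an absolute path that could be passed to LlamaCpp(model_path=...)

If the file lives only in that cache and not next to the script, that would explain why the relative path is not found.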

I've saved the GGUF model in the same folder as my script, but I get the following error:

Exception has occurred: ValidationError
1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: mistral-7b-instruct-v0.1.Q4_K_M.gguf. Received error Model path does not exist: mistral-7b-instruct-v0.1.Q4_K_M.gguf (type=value_error)
  File "E:\work\Daily\11_3\dragon\ConvM\llama.py", line 22, in <module>
    llm = LlamaCpp(
pydantic.v1.error_wrappers.ValidationError: 1 validation error for LlamaCpp
__root__
  Could not load Llama model from path: mistral-7b-instruct-v0.1.Q4_K_M.gguf. Received error Model path does not exist: mistral-7b-instruct-v0.1.Q4_K_M.gguf (type=value_error)
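The error says the relative path cannot be found, so for completeness here is a small sketch (under the assumption that relative paths are resolved against the current working directory rather than the folder containing llama.py) that checks both locations and builds an absolute path:

# Sketch: check which directory the relative model path is resolved against.
import os
from pathlib import Path

model_file = "mistral-7b-instruct-v0.1.Q4_K_M.gguf"
script_dir = Path(__file__).resolve().parent

print("current working directory:", os.getcwd())
print("found relative to cwd:", Path(model_file).exists())
print("found next to the script:", (script_dir / model_file).exists())

# If the file really sits next to the script, an absolute path removes the ambiguity:
model_path = str(script_dir / model_file)

Passing that absolute string as model_path to LlamaCpp should at least rule out a working-directory mismatch.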
