Error downloading model

#13 opened by pr12431

ValueError: Unrecognized configuration class <class 'transformers_modules.microsoft.Phi-3-mini-4k-instruct-onnx.a484edc37e8f7b425a5aefa25e35905769963681.configuration_phi3.Phi3Config'> to build an AutoTokenizer.

How can I solve this? I just tried the out-of-the-box download code:

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct-onnx", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("microsoft/Phi-3-mini-4k-instruct-onnx", trust_remote_code=True)

Microsoft org

The AutoModelForCausalLM class loads PyTorch models only, not ONNX models. You can use Hugging Face's Optimum to load the ONNX models through the same Hugging Face APIs you're used to. For the tokenizer, instead of passing microsoft/Phi-3-mini-128k-instruct-onnx to AutoTokenizer.from_pretrained, download the files locally and point it at one of the sub-folders within the microsoft/Phi-3-mini-128k-instruct-onnx repo. Here is an example of how to specify the path.
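A minimal sketch of that approach, using Optimum's ORTModelForCausalLM in place of AutoModelForCausalLM. The sub-folder name below (cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4) is an assumption based on the 4k repo's published CPU variant; pick whichever variant in the repo matches your hardware.

```python
# Sketch: load a Phi-3 ONNX variant via Optimum instead of AutoModelForCausalLM.
# Assumes `optimum[onnxruntime]` and `huggingface_hub` are installed.

REPO_ID = "microsoft/Phi-3-mini-4k-instruct-onnx"
# Assumed variant sub-folder; check the repo's file listing for other options.
SUBFOLDER = "cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4"


def local_model_path(local_dir: str, subfolder: str = SUBFOLDER) -> str:
    """Path to the sub-folder holding the ONNX model and tokenizer files."""
    return f"{local_dir}/{subfolder}"


def load_phi3_onnx():
    # Imports kept local so the sketch is readable without optimum installed.
    from huggingface_hub import snapshot_download
    from optimum.onnxruntime import ORTModelForCausalLM
    from transformers import AutoTokenizer

    # Download only the chosen variant, then point both loaders at the
    # sub-folder (this is what AutoTokenizer needs instead of the repo id).
    local_dir = snapshot_download(REPO_ID, allow_patterns=[f"{SUBFOLDER}/*"])
    path = local_model_path(local_dir)
    tokenizer = AutoTokenizer.from_pretrained(path)
    model = ORTModelForCausalLM.from_pretrained(path)
    return tokenizer, model
```

Calling load_phi3_onnx() fetches just the selected variant's files and loads both tokenizer and model from the local sub-folder path rather than the repo id.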

pr12431 changed discussion status to closed
pr12431 changed discussion status to open
kvaishnavi changed discussion status to closed
