Base_model showing up as finetune

#1
by bartowski - opened

@julien-c any reason this shows as a finetune? Is it by chance because it's a .safetensors file, so it assumes it must be a tune?

Is there any way to do this differently or is it exclusive to GGUF?

```yaml
base_model: Replete-AI/Replete-LLM-Qwen2-7b
```

The Hub will infer the type of relationship from the current model to the base model ("adapter", "merge", "quantized", "finetune"), but you can also set it explicitly if needed: `base_model_relation: quantized`, for instance.
https://huggingface.co/docs/hub/model-cards#specifying-a-base-model
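Putting the two fields from the docs together, a minimal model-card front matter that sets the relation explicitly might look like this (the repo name is the one from this thread; the rest is a sketch):

```yaml
---
base_model: Replete-AI/Replete-LLM-Qwen2-7b
# Set explicitly when the Hub can't infer the relation on its own:
base_model_relation: quantized
---
```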

I guess it wasn't able to infer the type on its own. Does it use the file types to do that? Or the title?
This is all very new functionality; if it's broken, they will fix it.

Oh damn @Nelathan that's a perfect find, thank you!

Since there's no model in the main branch, it wasn't able to infer the type. I'm not sure using branches for the different quants is the best way. Maybe consider using just folders?

Folders also mess with things; people will struggle with downloads (and likely end up downloading five quant sizes in one command). I think branches have served me reasonably well.
