Some errors with this model

#1 opened by lsx666

Hi there!
I have just tried the pure transformers demo in my notebook, but I get three errors (I have already upgraded transformers to the latest version, by the way).
The first is the same error as in llava-interleave: https://huggingface.co/llava-hf/llava-interleave-qwen-7b-hf/discussions/1#669959bfd77a8b28705bde27

The second is that the processor.apply_chat_template call raises an error like this (part of the full traceback):

--> 991 return self.tokenizer.apply_chat_template(
    992     conversation, chat_template=chat_template, tokenize=tokenize, **kwargs
    993 )
...
--> 939 raise rewrite_traceback_stack(source=source)

File <template>:4, in top-level template code()

TypeError: can only concatenate str (not "list") to str
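
For context, the failing call comes from the demo's chat-templating step, which builds the conversation in the usual transformers multimodal format. A rough sketch of that step (the model id and prompt text here are placeholders, not necessarily the exact demo code):

from transformers import AutoProcessor

model_id = "llava-hf/llava-onevision-qwen2-7b-ov-hf"  # placeholder id for this model
processor = AutoProcessor.from_pretrained(model_id)

# "content" is a list of typed entries, as in the other llava-hf demos.
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "What is shown in this image?"},
            {"type": "image"},
        ],
    },
]

# This is the call that raises the TypeError above: the processor delegates to
# the tokenizer, and without a multimodal chat_template the tokenizer's default
# template tries to concatenate the list-valued "content" into a string.
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)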

I solved the first two errors by manually adding a chat_template and changing the prompt format, but I cannot easily solve the third one. Here it is (part of the full traceback):

File ~/.conda/envs/llava/lib/python3.10/site-packages/transformers/image_processing_utils.py:41, in BaseImageProcessor.__call__(self, images, **kwargs)
     39 def __call__(self, images, **kwargs) -> BatchFeature:
     40     """Preprocess an image or a batch of images."""
---> 41     return self.preprocess(images, **kwargs)
...
   (...)
    744         input_data_format=input_data_format,
    745     )

KeyError: 'shortest_edge'

I think it is because the code at https://github.com/huggingface/transformers/blob/481e15604a0527fec3cbdcccd350f9374a9116b1/src/transformers/models/llava_next/image_processing_llava_next.py#L723 expects a "shortest_edge" key, while the "size" entry in the model's preprocessor_config.json is:

"size": {
  "height": 384,
  "width": 384
}

which has no "shortest_edge" key. Besides, if the visual input is video-like, is it necessary to execute the image_patches = self.get_image_patches() call at all? There is no such function in the llava-next-video implementation; should we add some if/else here? Lastly, here is my modified demo code, with lines 17, 32, and 33 added:

[Screenshot of the modified demo code: image.png]
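
Would overriding the size dict be an acceptable stop-gap until this is handled in the library? Something like the following (a rough sketch continuing from the snippet above; it only sidesteps the KeyError and may not reproduce the preprocessing the model was trained with):

from PIL import Image

# Placeholder test image; any local image reproduces the error.
image = Image.open("example.jpg")

# Replace the {"height": ..., "width": ...} size dict from preprocessor_config.json
# with the {"shortest_edge": ...} form that image_processing_llava_next.py expects.
processor.image_processor.size = {"shortest_edge": 384}

# With the override in place, this call no longer hits KeyError: 'shortest_edge'.
inputs = processor(images=image, text=prompt, return_tensors="pt")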

Thank you!

Llava Hugging Face org

Right, this was added just yesterday and didn't make it into transformers yet! I opened a PR today (https://github.com/huggingface/transformers/pull/32673), and you can install from that branch until it is merged.
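
For example, installing directly from the PR ref with pip should pull in the new code until the merge (this relies on pip's generic git-ref support, so adjust the ref syntax if your setup differs):

pip install git+https://github.com/huggingface/transformers.git@refs/pull/32673/head

Alternatively, check out the branch locally and run pip install -e . from the transformers source tree.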
