Feature request
I would like to be able to convert this model, which is based on the Qwen 2.5 VL architecture, using Optimum. Right now, the export fails with:
ValueError: Trying to export a qwen2_5_vl model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type qwen2_5_vl to be supported natively in the ONNX export.
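For context, the error above comes from a plain export attempt, roughly the following (the model id is shown as a placeholder for the actual Qwen 2.5 VL based checkpoint I linked):

```bash
optimum-cli export onnx --model <qwen2_5_vl_based_model> qwen2_5_vl_onnx/
```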
I read the documentation, but I have no idea how I'd go about setting up the custom ONNX config for this architecture.
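From what I could piece together, the mechanism is to subclass `OnnxConfig` and pass instances to `main_export` via `custom_onnx_configs`. Below is my rough, untested sketch of that mechanism only: `Qwen25VLOnnxConfig` is a hypothetical name, the declared inputs/outputs are text-only placeholders (a real config would also need the vision inputs and past key values), and I don't know which submodel keys or task the export would actually expect for this architecture.

```python
# Untested sketch of the custom_onnx_configs mechanism from the Optimum docs.
# Qwen25VLOnnxConfig is hypothetical: optimum ships no ONNX config for
# qwen2_5_vl, which is exactly what this issue is requesting.
from typing import Dict

from transformers import AutoConfig

from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.base import OnnxConfig
from optimum.utils import DummyTextInputGenerator, NormalizedTextConfig


class Qwen25VLOnnxConfig(OnnxConfig):
    # Placeholders: a real config would also need the vision inputs
    # (pixel_values, image grid info) and past_key_values handling.
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig
    DUMMY_INPUT_GENERATOR_CLASSES = (DummyTextInputGenerator,)

    @property
    def inputs(self) -> Dict[str, Dict[int, str]]:
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
        }

    @property
    def outputs(self) -> Dict[str, Dict[int, str]]:
        return {"logits": {0: "batch_size", 1: "sequence_length"}}


# Stand-in for the actual downstream checkpoint I want to export.
model_id = "Qwen/Qwen2.5-VL-3B-Instruct"
config = AutoConfig.from_pretrained(model_id)

main_export(
    model_id,
    output="qwen2_5_vl_onnx",
    task="text-generation",  # unsure which task Optimum expects for a VL model
    custom_onnx_configs={
        # "model" is the key used for single-component exports in the docs;
        # a VL model may need to be split into several submodels instead.
        "model": Qwen25VLOnnxConfig(config, task="text-generation"),
    },
)
```

Even if this is roughly the right shape, I don't know how to describe the vision tower and the merged image/text inputs correctly, which is why native support would be very helpful.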
Motivation
Qwen 2.5 VL is a SOTA architecture that is already being used in downstream models (see my example), so it is worth supporting.
Your contribution
I can help with research, but I don't have enough experience with this codebase or with ML code in general to contribute a PR myself.