MX3 support for Yolo-v11-nano-pose

Hi memryX-Team,

I want to use the Yolo-v11-nano-pose on the MX3. When converting the model I’m getting the error:

Converting Model : (Done)
memryx.errors.OperatorError: Matmul ('/model.10/m/m.0/attn/MatMul') with axes=[3, 2] is not supported. Using autocrop (--autocrop) might help.

Activating autocrop results in the same error.

I tried simplify=True when exporting to ONNX from PyTorch, without success.

I exported the YOLO model to ONNX using Ultralytics 8.3.225, and I'm using compiler version 2.1.0.

Additionally, I am facing a similar error when converting the classification model, which is listed as supported in the Model Explorer.

Could you provide guidance on how to resolve these issues (should I use a specific version?), and are there any plans to add support for the Yolo-v11-nano-pose model?

Hi @paspf

Maybe the required extension is missing. Did you follow the instructions at: Neural Compiler Extensions — MemryX Developer Hub?


Hi @shashi,

I was not aware of the extensions flag; it's now compiling with:

mx_nc -m yolo11n-pose.onnx --extensions Yolov10 --autocrop

Thank you!


Thank you! @shashi and @paspf

Yes — YOLOv11 models are supported out of the box, and you can compile them using our Neural Compiler Extensions.

We also have MXA-optimized YOLOv11 models available in our Model Explorer. If you're curious, definitely take a look at the MXA-optimized YOLOv11 nano pose 640.
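For anyone landing here later, a sketch of the end-to-end workflow discussed in this thread. It assumes the Ultralytics package (with its `yolo` CLI) and the MemryX SDK are installed, and that the `yolo11n-pose.pt` weights are available locally:

```shell
# Export the pose model to ONNX with graph simplification enabled
# (equivalent to model.export(format="onnx", simplify=True) in Python)
yolo export model=yolo11n-pose.pt format=onnx simplify=True

# Compile for the MX3 with the YOLOv10 compiler extension and autocrop,
# as suggested above; without --extensions the attention MatMul fails
mx_nc -m yolo11n-pose.onnx --extensions Yolov10 --autocrop
```

The key point is the `--extensions Yolov10` flag: autocrop alone does not resolve the unsupported MatMul error.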
