NanoOWL on Jetson AGX Orin 32GB Issue

Hello @dusty_nv,

Thank you for updating the Jetson Gen AI Lab demos! I tried to run this demo on my Jetson AGX Orin 32GB device:

However, it runs much more slowly than the demo video shown on that page.

Could you please guide me further?

Best Regards,
Lakshantha

Hi @lakshantha.d,

Thanks for reaching out!

This looks much slower than expected.

Just to check, do you have jetson_clocks enabled? Also, which power mode are you running in (you can check with sudo nvpmodel -q)?
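
For reference, these are the stock JetPack commands for checking and setting this (mode numbers differ between Jetson modules, but on AGX Orin MAXN is normally mode 0):

sudo nvpmodel -q           # print the current power mode
sudo nvpmodel -m 0         # switch to MAXN
sudo jetson_clocks         # lock the clocks at their maximum
sudo jetson_clocks --show  # show the current clock settings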

Best,
John

Hi,

Can you try to build the TensorRT engine on your Jetson?

Once inside the container, please run the following:

python3 -m nanoowl.build_image_encoder_engine \
    data/owl_image_encoder_patch32.engine

Then re-run the demo:

cd examples/tree_demo
python3 tree_demo.py ../../data/owl_image_encoder_patch32.engine
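
If rebuilding the engine alone does not change anything, it may also help to time the predictor on its own, to see whether the TensorRT engine is slow or whether the time is going into the camera/streaming side of the tree demo. Below is a rough sketch based on the usage example in the NanoOWL README; the test image and prompts are just placeholders, and it assumes you run it from the repo root inside the container:

import time
import PIL.Image
from nanoowl.owl_predictor import OwlPredictor

# Load the predictor with the TensorRT image encoder engine built above
predictor = OwlPredictor(
    "google/owlvit-base-patch32",
    image_encoder_engine="data/owl_image_encoder_patch32.engine",
)

image = PIL.Image.open("assets/owl_glove_small.jpg")  # placeholder test image
text = ["an owl", "a glove"]                          # placeholder prompts
text_encodings = predictor.encode_text(text)          # encode the prompts once

# Warm up, then time repeated predictions to estimate frames per second
for _ in range(5):
    predictor.predict(image=image, text=text, text_encodings=text_encodings, threshold=0.1)

num_runs = 50
start = time.perf_counter()
for _ in range(num_runs):
    predictor.predict(image=image, text=text, text_encodings=text_encodings, threshold=0.1)
print(f"~{num_runs / (time.perf_counter() - start):.1f} FPS")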

Yes, jetson_clocks is enabled and the device is running in MAXN mode.

I have done this as well, but there is no change.