Different behavior (NOT good) of yolov11n on Jetson Orin Nano Super

For detailed info, please check the GitHub issue: Different behavior (NOT good) of yolov11n on Jetson Orin Nano Super · Issue #19134 · ultralytics/ultralytics · GitHub

Also, we find that JetPack 6.1 is automatically upgraded to JetPack 6.2, so it’s difficult for us to lock the system to JetPack 6.1.

Is there any way to lock JetPack so it does not auto-upgrade?


Hi,

Could you share more about how “JetPack 6.1 is automatically upgraded to JetPack 6.2”?
Do you have a custom kernel, and was the kernel updated to JetPack 6.2?

Thanks.

I just flashed L4T 36.4.0, ran apt update/upgrade, and installed JetPack, DeepStream, etc. Then jtop showed it is L4T 36.4.3 and JetPack is 6.2.

Everything is from NVIDIA, with no custom scripts/kernels/drivers, and it’s a Jetson Orin Nano 8GB devkit.

Hi,

“apt upgrade” will automatically upgrade the BSP to the latest.

If you don’t want to use JetPack 6.2 and prefer to stay on JetPack 6.1,
please run “apt update” only; that is enough to refresh the source list.
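If you want extra insurance that a stray “apt upgrade” won’t pull in the new BSP, a common approach (a sketch, not an official NVIDIA recommendation; verify the package names on your own board with `dpkg -l | grep nvidia-l4t` first) is to hold the L4T/JetPack packages:

```shell
# Hold every installed nvidia-l4t-* package so `apt upgrade` skips it.
dpkg -l | awk '$1 == "ii" && $2 ~ /^nvidia-l4t-/ {print $2}' | xargs -r sudo apt-mark hold

# Also hold the JetPack meta-package.
sudo apt-mark hold nvidia-jetpack

# To see what is held, and to release the holds later:
# apt-mark showhold
# apt-mark showhold | xargs -r sudo apt-mark unhold
```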

Thanks.

OK, then it might have the above issue: Different behavior (NOT good) of yolov11n on Jetson Orin Nano Super · Issue #19134 · ultralytics/ultralytics · GitHub

Software part of jetson-stats 4.3.1 - (c) 2024, Raffaello Bonghi
Model: NVIDIA Jetson Orin Nano Developer Kit - Jetpack 6.2 [L4T 36.4.3]
NV Power Mode[0]: 15W
Serial Number: [XXX Show with: jetson_release -s XXX]
Hardware:
 - P-Number: p3767-0005
 - Module: NVIDIA Jetson Orin Nano (Developer kit)
Platform:
 - Distribution: Ubuntu 22.04 Jammy Jellyfish
 - Release: 5.15.148-tegra
jtop:
 - Version: 4.3.1
 - Service: Active
Libraries:
 - CUDA: 12.6.85
 - cuDNN: 9.3.0.75
 - TensorRT: 10.3.0.30
 - VPI: 3.2.4
 - OpenCV: 4.11.0 - with CUDA: YES

--------------------------------
NVIDIA SDK:
DeepStream C/C++ SDK version: 7.1
    jetson-inference version: c038530 (dirty)
        jetson-utils version: 6d5471c

--------------------------------
Python Environment:
Python 3.10.12
    GStreamer:                   YES (1.20.3)
  NVIDIA CUDA:                   YES (ver 12.6, CUFFT CUBLAS FAST_MATH)
         OpenCV version: 4.11.0  CUDA True
           YOLO version: 8.3.75
         PYCUDA version: 2024.1.2
          Torch version: 2.5.0a0+872d972e41.nv24.08
    Torchvision version: 0.20.0a0+afc54f7
 DeepStream SDK version: 1.2.0
onnxruntime     version: 1.20.1
onnxruntime-gpu version: 1.20.0

--------------------------------
FPV Environment:
jetson-fpv version: 8c895d3 dirty
    WFB-ng version: 25.1.25.81795
    MSPOSD version: c28d645 20250217_163159

Hi,

Would you mind simply describing the issue you have here?

The comment indicates the compatibility issue in input/output.
Are you facing some issues related to the image/video encoder/decoder?

Thanks.

Recently, we upgraded our Jetson Orin Nano 8GB DevKit to JetPack 6.2, which is supposed to give a big performance boost, from 40 TOPS to 67 TOPS.

From our tests, we see issues compared to the results from JetPack 5.1.4: many object detection bounding boxes are lost on the test video here:

$ cat test.txt
simulation rtp streaming source:
video-viewer file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 rtp://@:5600 --input-loop=-1 --headless

test yolo with 11n model
python3 ./utils/yolo.py rtp://@:5600
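To quantify the “lost bounding boxes” rather than eyeballing the two videos, one simple approach is to log per-frame detection counts from each JetPack version and diff them. A minimal sketch (the `dropped_frames` helper and the sample count lists are hypothetical; real counts would come from running the same model and video on each system):

```python
def dropped_frames(baseline, current, min_drop=1):
    """Indices of frames where the detection count fell by at least min_drop
    relative to the baseline run (e.g. JetPack 5.1.4 vs JetPack 6.2)."""
    if len(baseline) != len(current):
        raise ValueError("both runs must cover the same number of frames")
    return [i for i, (b, c) in enumerate(zip(baseline, current)) if b - c >= min_drop]

# Example: per-frame bbox counts from two runs of the same video.
jp5_counts = [4, 4, 5, 3, 4]
jp6_counts = [4, 1, 5, 0, 4]
print(dropped_frames(jp5_counts, jp6_counts))  # → [1, 3]
```

This turns the accuracy complaint into a concrete list of frame indices that can be attached to the GitHub issue.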

For detailed info, please check the GitHub issue: Different behavior (NOT good) of yolov11n on Jetson Orin Nano Super · Issue #19134 · ultralytics/ultralytics · GitHub

Hi,

We will try to reproduce this issue locally.
At the same time, do you have an output from JetPack 5.1.4 that we can verify against when reproducing?

Thanks.

It used to work on JP5.1.4

Hi,

Would you mind sharing the video for JetPack 5.1.4 with us?
Since this is an accuracy drop issue, this helps us to verify if we reproduce the same issue as your environment or not.

Thanks.

The YOLO team has verified JP5.x/JP6.1 in their official document: NVIDIA Jetson documentation

There is a simple Python script and my JP6.2 test video above, FYI. Maybe someone with the expertise can find a clue.

Hi,

Do you have an output video that was generated on JetPack 5.1.4?
Above you have shared a result for JetPack 6.2. Do you also have one for JetPack 5?

Thanks.

No, I’m sorry. As there is little support for JP5.x compared to JP6.x, we are switching from JP5.x to JP6.x.

BTW, it should be OK; otherwise I would have reported it while we were still using JP5.1.4. And the YOLO dev team also verified the results on JP5.x.

Hi,

Thanks for your patience and sorry for the delay.

We have checked the YOLO detection accuracy on JetPack 6.2 (TensorRT backend).
It looks quite good with the Ultralytics CLI tool.

Could you also give it a check?
Follow the instructions here to generate yolo11n.engine and run:

$ yolo predict model=yolo11n.engine source="/opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4"

Thanks.


@AastaLLL Are you using a virtual environment? What YOLO version are you using?

We have checked that the yolo CLI command doesn’t work on JP6.1.

We are using native Linux, NOT a virtual environment, trying to maximize performance, thanks.

What’s the difference from the YOLO guide?

Hi,

No, we tried this on JetPack 6.2 natively.
Below are the steps for your reference:

$ wget https://pypi.jetson-ai-lab.dev/jp6/cu126/+f/5f9/67f920de3953f/torchvision-0.20.0-cp310-cp310-linux_aarch64.whl#sha256=5f967f920de3953f2a39d95154b1feffd5ccc06b4589e51540dc070021a9adb9
$ pip3 install torchvision-0.20.0-cp310-cp310-linux_aarch64.whl 
$ wget http://jetson.webredirect.org/jp6/cu126/+f/5cf/9ed17e35cb752/torch-2.5.0-cp310-cp310-linux_aarch64.whl#sha256=5cf9ed17e35cb7523812aeda9e7d6353c437048c5a6df1dc6617650333049092
$ pip3 install torch-2.5.0-cp310-cp310-linux_aarch64.whl 
$ pip3 install 'numpy<2'
$ pip3 install onnx
$ pip3 install onnxslim
$ wget https://nvidia.box.com/shared/static/i7n40ki3pl2x57vyn4u7e9asyiqlnl7n.whl -O onnxruntime_gpu-1.17.0-cp310-cp310-linux_aarch64.whl
$ pip3 install onnxruntime_gpu-1.17.0-cp310-cp310-linux_aarch64.whl 
$ pip3 install ultralytics
$ yolo export model=yolo11n.pt format=engine
$ yolo predict model=yolo11n.engine source="/opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4"

Then you can find the output at ./runs/detect/predict/sample_1080p_h264.avi.
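One pitfall worth checking after these steps: `pip3 install ultralytics` can pull in a plain CPU-only torch wheel from PyPI that silently replaces the NVIDIA build installed above. A quick sanity check (a sketch; the `.nv` local-version convention is inferred from the version strings shown earlier in this thread, e.g. `2.5.0a0+872d972e41.nv24.08`):

```python
def is_jetson_build(version: str) -> bool:
    """True if a torch/torchvision version string carries NVIDIA's Jetson
    local-version tag (the part after '+', e.g. '872d972e41.nv24.08')."""
    local = version.partition("+")[2]  # '' if there is no '+' in the string
    return ".nv" in local or local.startswith("nv")

# On the device you would pass torch.__version__ instead of a literal.
print(is_jetson_build("2.5.0a0+872d972e41.nv24.08"))  # True  (Jetson wheel)
print(is_jetson_build("2.5.0"))                       # False (plain PyPI wheel)
```

If this returns False, reinstalling the torch wheel from the Jetson index after installing ultralytics restores the GPU build.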
Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.