Jetson AGX Orin – How to Build a Fully GPU-Only GStreamer Pipeline (UYVY/NV16 Input Fails to Play)

We’re trying to run a fully GPU-only camera pipeline on Jetson AGX Orin (JetPack 5), using nvv4l2camerasrc and nvvidconv with zero CPU fallback. The goal is to keep the entire pipeline in memory:NVMM, and avoid CPU-based videoconvert or memory copies.

Camera Capabilities

According to v4l2-ctl, our camera supports the following formats (a programmatic cross-check is sketched after the list):

  • UYVY
  • NV16
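
For reference, the same information can be queried straight from the driver with the V4L2 VIDIOC_ENUM_FMT ioctl. This is only a minimal sketch; it assumes /dev/video0 is the capture node, the same device used in the pipeline below.

// Minimal sketch: enumerate capture formats directly from the driver
// (same information that v4l2-ctl prints). Assumes /dev/video0.
#include <cstdio>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main() {
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open /dev/video0"); return 1; }

    v4l2_fmtdesc desc{};
    desc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    for (desc.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &desc) == 0; ++desc.index) {
        // Print the FourCC (e.g. UYVY, NV16) and the driver's description.
        char fourcc[5] = {
            (char)(desc.pixelformat & 0xff),
            (char)((desc.pixelformat >> 8) & 0xff),
            (char)((desc.pixelformat >> 16) & 0xff),
            (char)((desc.pixelformat >> 24) & 0xff),
            '\0'
        };
        printf("%s  %s\n", fourcc, (const char *)desc.description);
    }
    close(fd);
    return 0;
}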

nvvidconv Capabilities

Our installed nvvidconv plugin supports the following formats (on both video/x-raw(memory:NVMM) and video/x-raw caps; a caps-dump sketch follows the list):

  • YUV: UYVY, YUY2, NV12, NV16, NV24, YVYU, I420, Y42B, Y444
  • RGB: RGBA, BGRx
  • Grayscale: GRAY8
  • High bit-depth: P010_10LE, etc.
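
For reference, this is roughly how we read those caps. It is only a sketch, equivalent to looking at the output of gst-inspect-1.0 nvvidconv; the factory name is the only input, and the same snippet works for nvv4l2camerasrc.

// Minimal sketch: print the pad-template caps advertised by nvvidconv.
#include <gst/gst.h>

int main(int argc, char **argv) {
    gst_init(&argc, &argv);

    GstElementFactory *factory = gst_element_factory_find("nvvidconv");
    if (!factory) {
        g_printerr("nvvidconv factory not found\n");
        return 1;
    }

    const GList *templates = gst_element_factory_get_static_pad_templates(factory);
    for (const GList *l = templates; l != nullptr; l = l->next) {
        GstStaticPadTemplate *tmpl = static_cast<GstStaticPadTemplate *>(l->data);
        GstCaps *caps = gst_static_pad_template_get_caps(tmpl);
        gchar *caps_str = gst_caps_to_string(caps);
        g_print("%s pad '%s':\n%s\n\n",
                tmpl->direction == GST_PAD_SINK ? "sink" : "src",
                tmpl->name_template, caps_str);
        g_free(caps_str);
        gst_caps_unref(caps);
    }

    gst_object_unref(factory);
    return 0;
}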

We expect the following pipeline to run fully in GPU memory:
pipeline << "nvv4l2camerasrc device=/dev/video0 ! "
<< "video/x-raw(memory:NVMM), width=1920, height=1080, format=UYVY ! "
<< "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! "
<< "tee name=t ";
pipeline << "t. ! queue ! "
<< "appsink name=raw_sink emit-signals=false sync=false ";
(Side note: we split the pipeline with a tee because we also need to add compressed and dewarped branches later.)
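
For completeness, here is a simplified sketch of how we launch the assembled string (the launch_pipeline wrapper and error handling are trimmed down for this post; the bus read at the end is only there to print which element reports the failure):

// Simplified sketch: parse the pipeline string, try to reach PLAYING, and
// print the first ERROR message from the bus if the state change fails.
// Assumes gst_init() has already been called.
#include <gst/gst.h>
#include <string>

static void launch_pipeline(const std::string &pipeline_str) {
    GError *error = nullptr;
    GstElement *pipe = gst_parse_launch(pipeline_str.c_str(), &error);
    if (!pipe) {
        g_printerr("gst_parse_launch failed: %s\n", error->message);
        g_clear_error(&error);
        return;
    }

    if (gst_element_set_state(pipe, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        GstBus *bus = gst_element_get_bus(pipe);
        GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_SECOND, GST_MESSAGE_ERROR);
        if (msg) {
            GError *err = nullptr;
            gchar *dbg = nullptr;
            gst_message_parse_error(msg, &err, &dbg);
            g_printerr("Error from %s: %s\n%s\n",
                       GST_OBJECT_NAME(msg->src), err->message, dbg ? dbg : "");
            g_clear_error(&err);
            g_free(dbg);
            gst_message_unref(msg);
        }
        gst_object_unref(bus);
    }

    // ... main loop and appsink handling would go here ...
    gst_element_set_state(pipe, GST_STATE_NULL);
    gst_object_unref(pipe);
}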

However:

  • gst_parse_launch() fails with a not-negotiated error, or
  • the pipeline does not transition to PLAYING

Our Questions

  1. What’s the correct way to build a GPU-only pipeline when the camera supports UYVY and NV16, but the pipeline refuses to start?
  2. Is it safer to request format=NV16 from the camera instead of UYVY?
  3. How do we confirm whether nvv4l2camerasrc supports UYVY + memory:NVMM input on our Jetson platform?
  4. Are there known limitations on nvv4l2camerasrc formats that override what v4l2-ctl reports?
  5. What’s the best fallback strategy that still preserves NVMM and avoids CPU memory?

We’ve tried several permutations, but haven’t been able to get the pipeline running in full GPU mode. Any help would be greatly appreciated!


Hello,

Thanks for visiting the NVIDIA Developer forums.

Your topic will be best served in the Jetson category, so I have moved this post there for better visibility.

Cheers,
Tom

Hi,
Please check the suggestion in
Profiling CPU/GPU Usage in ROS 2 Camera Node on Jetson - #5 by DaneLLL

Since you need to send CPU buffers in BGR to the appsink, you have to convert the NVMM buffers to CPU buffers, so some CPU usage is expected.

The nvv4l2camerasrc plugin supports UYVY. It is open source, so you can customize it to support YUYV (named YUY2 in GStreamer). We have never tried NV16, so you may give it a try, but the result is unknown. We would suggest using a YUV422 format such as UYVY or YUYV.
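
As an untested sketch of what that looks like with your pipeline (resolution, caps, and element names here are illustrative, not confirmed for your sensor): the capture stays in NVMM up to the tee, and only the appsink branch is copied to CPU memory for the BGRx-to-BGR conversion.

// Untested sketch: UYVY stays in NVMM through nvvidconv and the tee;
// only the appsink branch is copied to CPU memory and converted to BGR.
std::ostringstream pipeline;
pipeline << "nvv4l2camerasrc device=/dev/video0 ! "
         << "video/x-raw(memory:NVMM), width=1920, height=1080, format=UYVY ! "
         << "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! "
         << "tee name=t ";
pipeline << "t. ! queue ! "
         << "nvvidconv ! video/x-raw, format=BGRx ! "    // NVMM -> CPU copy happens here
         << "videoconvert ! video/x-raw, format=BGR ! "  // BGRx -> BGR on CPU
         << "appsink name=raw_sink emit-signals=false sync=false ";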