CUDA Illegal memory access using zedsrc + zeddemux + videoconvert + nvvideoconvert plugin

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson AGX Orin
• DeepStream Version: GStreamer 1.20.3 | DeepStreamSDK 7.1.0
• JetPack Version (valid for Jetson only): 6.1 (Ubuntu 22.04 LTS, CUDA 12.6)
• TensorRT Version: 10.3.0
• Issue Type( questions, new requirements, bugs): bug
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing):

gst-launch-1.0 zedsrc stream-type=4 camera-id=0 \
! zeddemux name=demux \
demux.src_left ! queue \
! videoconvert \
! nvvideoconvert \
! 'video/x-raw(memory:NVMM),format=NV12' \
! fakesink async=false \
demux.src_aux ! videoconvert \
! 'video/x-raw,format=GRAY16_LE' \
! fakesink

"Requirement details" (new-feature vs. bug clarification)

| Field | Answer |
|---|---|
| Problem type | Bug (not a new feature). The goal is simply to feed the left RGB image as an NV12/NVMM surface into nvstreammux, exactly as with any V4L2 camera, while continuing to pull 16-bit depth on the host through the auxiliary pad. |
| Affected modules | zedsrc, zeddemux, nvvideoconvert → nvstreammux on JetPack 6.x |
| Expected behaviour | The pipeline should behave as it did on JetPack 5.1.2 (CUDA 11.4), where the exact same graph runs for hours without errors. |
| Actual behaviour | CUDA illegal memory access inside nvbufsurftransform_copy.cpp when the upstream buffer originates from the ZED SDK's CUDA context. The crash is reproducible with a single camera and no inference. |
| Functional description (what we need fixed) | Safe interop between the ZED SDK's GPU buffers and DeepStream's NVBufSurface allocator on Jetson Orin under CUDA 12.4. Ideally nvstreammux (or upstream nvvideoconvert) should accept the surface, or expose an allocator API so zedsrc can write directly into NVMM without an intermediate copy. |

Key excerpts from the sanitizer log (full trace attached): cuda_traceback

========= Program hit cudaErrorIllegalAddress (700)
nvbufsurftransform_copy.cpp:341 => Failed in mem copy
cudaMemcpy2DAsync               ← first fault inside nvvideoconvert thread
...
cudaCreateTextureObject(...) at /builds/sl/ZEDKit/lib/src/sl_core/utils/util.cu:482
gst_zedsrc_fill → sl::Camera::grab → StereoDispGpuSL

What we already tried

| Attempt | Result |
|---|---|
| Single camera only | Fails the same way, so it is not multi-camera contention. |
| Force video/x-raw,format=NV12 without NVMM | No crash, but DeepStream cannot consume system-memory buffers. |
| Set GST_CUDA_DEVICE_ID=0, CUDA_VISIBLE_DEVICES=0 | No change. |
| Added sync=0 / async-handling=true on queues | No change. |
| Rebuilt the ZED SDK 4.1.2 sample with CUDA 12.4 | Same error, so it does not appear to be specific to the latest SDK version. |
| Talked with ZED support | Redirected here. |
| Going straight zeddemux → streammux | "Internal stream data error" from zedsrc. |
| Avoiding videoconvert (zeddemux → nvvideoconvert → streammux) | "Internal stream data error" from zedsrc. |
| Changed resolution and batch size | No change. |

Hypothesis

  • Two different CUDA contexts appear in the back‑trace (ZED SDK vs. GStreamer/DeepStream).
  • The NVMM buffer hosting the RGB left image is likely copied (cudaMemcpy2DAsync) inside nvbufsurftransform_copy.cpp after the CPU pad push. When the allocator comes from zedsrc the surface may be missing proper pitch/stride alignment for NV12 on Orin (64‑byte requirement?).
  • Immediately afterwards the ZED SDK creates/destroys a texture object on the same stream, and the device raises the same illegal address on cudaCreateTextureObject, hinting that the earlier overrun corrupted global memory.
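The pitch part of this hypothesis can be illustrated with plain arithmetic. This is only a sketch: the 64- and 256-byte alignment values are assumptions for illustration, and the authoritative pitch on a real system is whatever NvBufSurface reports for the plane.

```python
# Sketch of the pitch-mismatch hypothesis. The alignment values (64/256)
# are assumptions; the real pitch comes from the NvBufSurface plane params.

def aligned_pitch(width_bytes: int, alignment: int) -> int:
    """Round a row size up to the next multiple of `alignment`."""
    return (width_bytes + alignment - 1) // alignment * alignment

width = 1920          # left-image width from the negotiated caps
y_row = width         # NV12 Y plane: 1 byte per pixel

# If the producer allocated rows with 64-byte alignment but the consumer
# assumes 256-byte alignment, every row copy reads past the allocation.
producer_pitch = aligned_pitch(y_row, 64)    # 1920 (already 64-aligned)
consumer_pitch = aligned_pitch(y_row, 256)   # 2048

overshoot_per_row = consumer_pitch - producer_pitch
print(producer_pitch, consumer_pitch, overshoot_per_row)  # 1920 2048 128
```

With a 1920-pixel-wide Y plane the mismatch is 128 bytes per row, which is exactly the kind of overrun that a later cudaMemcpy2DAsync or texture-object access would trip over.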

Questions to the community / NVIDIA team

  1. Is the memory layout produced by zedsrc compatible with nvvideoconvert → nvstreammux on JetPack 6.x?
    a. Are there known alignment or pitch constraints for NV12(NVMM) on Orin that third‑party sources must satisfy?
  2. Can the DeepStream allocator be forced upstream? (so that zedsrc writes directly into an NVBUF surface coming from nvstreammux and eliminates the copy).
  3. Recommended way to inter‑operate ZED SDK and DeepStream streammux in JetPack 6.x?

We don't know. The ZED SDK is not provided by NVIDIA.

There are alignment and pitch constraints on Jetson devices, but they have nothing to do with the non-NVIDIA component.

No. Actually, if the ZED plugin outputs normal system-memory buffers, there is no impact.

We don't know what the ZED plugin does with CUDA, so we can't comment. Maybe you need to replace the ZED source with videotestsrc first to debug the DeepStream part. What is the zedsrc+zeddemux output format? YUYV, I420, NV12, RGB, …?

The zeddemux plugin outputs the RGB channel with format: video/x-raw, format=BGRA

Knowing that, what would be the ideal way to stream that data to streammux?

The error seems to appear in nvbufsurftransform; which of my plugins uses that?

Can I stream directly from videoconvert to streammux without going through nvvideoconvert?
Or could I copy the surfaces coming from zeddemux, so that nvbufsurftransform doesn't hit the illegal memory access? Maybe zeddemux is dropping that data. I don't know what else to try.

Here’s the documentation of zeddemux: zeddemux Element | stereolabs/zed-gstreamer | DeepWiki

The pipeline looks fine.

nvvideoconvert uses it.

No. nvstreammux doesn’t accept system memory buffers.

You may refer to /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-appsrc-test for how to feed outside RGB data into the pipeline.

Please try this first.
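For orientation only: deepstream-appsrc-test is a C application, but the pipeline it builds can be sketched as a gst_parse_launch()-style description. This is a hedged sketch, not the sample itself; the format and dimensions mirror the caps negotiated in this thread, and the element properties (batch-size, is-live, the `m.sink_0` request pad name) are assumptions for illustration.

```python
# Hypothetical sketch of an appsrc-based DeepStream feed, modeled loosely
# on deepstream-appsrc-test. Only the pipeline description is built here;
# pushing buffers would be done through the appsrc element at runtime.

def appsrc_pipeline_desc(width: int = 1920, height: int = 1200,
                         fps: int = 15) -> str:
    # Caps matching the zeddemux src_left output seen in this thread.
    caps = (f"video/x-raw,format=BGRA,width={width},height={height},"
            f"framerate={fps}/1")
    return (
        f"nvstreammux name=m batch-size=1 width={width} height={height} "
        "! fakesink "
        f"appsrc name=src is-live=true format=time caps={caps} "
        "! nvvideoconvert "
        "! video/x-raw(memory:NVMM),format=NV12 "
        "! m.sink_0"
    )

print(appsrc_pipeline_desc())
```

The application would then push BGRA frames into the appsrc (in C via gst_app_src_push_buffer(), in Python via the "push-buffer" action signal); the buffer timestamping details are shown in the referenced C sample.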

I have tried that and it works, so the error seems to be caused by zedsrc + zeddemux. But I need those plugins to extract the depth from my stereo cameras.

I've tried removing nvvideoconvert and the pipeline works properly, so the error seems to be there; but, as you said, I need that plugin to stream the frames to streammux.

Removing videoconvert, so that it goes directly zeddemux → nvvideoconvert, does not work either.

What should I do?

Have you set the output format and resolution exactly the same as the output of zedsrc+zeddemux?

Can you check the caps after videoconvert? What is the output format?

If zedsrc+zeddemux outputs standard RGBA video, videoconvert is not necessary. Please check the actual output format of zedsrc+zeddemux.

Here you can see all caps from the command executed:
gst-launch-1.0 -v zedsrc stream-type=4 camera-id=0 \
! zeddemux name=demux \
demux.src_left ! queue \
! videoconvert \
! nvvideoconvert \
! 'video/x-raw(memory:NVMM),format=NV12' \
! fakesink async=false \
demux.src_aux ! videoconvert \
! 'video/x-raw,format=GRAY16_LE' \
! fakesink
Setting pipeline to PAUSED …
Setting depth_mode to NEURAL
Pipeline is live and does not need PREROLL …
Redistribute latency…
/GstPipeline:pipeline0/GstZedSrc:zedsrc0.GstPad:src: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)2400, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstZedDemux:demux.GstPad:src_left: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstZedDemux:demux.GstPad:src_aux: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert1.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)GRAY16_LE, colorimetry=(string)1:4:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)GRAY16_LE, colorimetry=(string)1:4:7:1
/GstPipeline:pipeline0/GstFakeSink:fakesink1.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)GRAY16_LE, colorimetry=(string)1:4:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)GRAY16_LE, colorimetry=(string)1:4:7:1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert1.GstPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1
/GstPipeline:pipeline0/GstZedDemux:demux.GstPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)2400, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1

Seems that:
zeddemux src_left → caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1

videoconvert → caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1

The point that confuses me is that the error is raised some time after execution starts, maybe 10 seconds later or maybe 5 minutes. So if it were a caps mismatch, it wouldn't work from the start, am I right?

Where is videoconvert0? There are two videoconvert elements in your pipeline.

You can try with videotestsrc, format BGRA, resolution 1920x1200, to work with the DeepStream pipeline.

Sorry, it is here:

/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)RGBA, colorimetry=(string)sRGB
/GstPipeline:pipeline0/Gstnvvideoconvert:nvvideoconvert0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12, block-linear=(boolean)false, nvbuf-memory-type=(string)nvbuf-mem-surface-array, gpu-id=(int)0, colorimetry=(string)2:3:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12, block-linear=(boolean)false, nvbuf-memory-type=(string)nvbuf-mem-surface-array, gpu-id=(int)0, colorimetry=(string)2:3:7:1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12, block-linear=(boolean)false, nvbuf-memory-type=(string)nvbuf-mem-surface-array, gpu-id=(int)0, colorimetry=(string)2:3:7:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)NV12, block-linear=(boolean)false, nvbuf-memory-type=(string)nvbuf-mem-surface-array, gpu-id=(int)0, colorimetry=(string)2:3:7:1
/GstPipeline:pipeline0/Gstnvvideoconvert:nvvideoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1, format=(string)RGBA, colorimetry=(string)sRGB
New clock: GstSystemClock
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)1920, height=(int)1200, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)sRGB, framerate=(fraction)15/1

OK. You can try with videotestsrc, format BGRA, resolution 1920x1200, to work with the DeepStream pipeline.

With videotestsrc it does work properly; command used:
gst-launch-1.0 -v videotestsrc ! 'video/x-raw,width=1920,height=1200,format=BGRA,framerate=15/1' ! queue ! videoconvert ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! fakesink async=false

So now?

Can you add a capsfilter after videoconvert and before nvvideoconvert? The caps can be 'video/x-raw(memory:NVMM),format=I420'.

Command:
gst-launch-1.0 zedsrc stream-type=4 camera-id=0 ! zeddemux name=demux demux.src_left ! queue ! videoconvert ! 'video/x-raw(memory:NVMM),format=I420' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! fakesink async=false demux.src_aux ! videoconvert ! 'video/x-raw,format=GRAY16_LE' ! fakesink
WARNING: erroneous pipeline: could not link videoconvert0 to nvvideoconvert0

Result:
videoconvert0 can't handle caps video/x-raw(memory:NVMM), format=(string)I420

This is a GStreamer plugin; (memory:NVMM) is not supported. Please remove "(memory:NVMM)".

The same error appears. Specifically:
Program hit cudaErrorInvalidValue (error 1) due to "invalid argument" on CUDA API call to cudaMemcpy2DAsync.
/dvs/git/dirty/git-master_linux/nvutils/nvbufsurftransform/nvbufsurftransform_copy.cpp:341: => Failed in mem copy

Command used:
compute-sanitizer --tool memcheck gst-launch-1.0 zedsrc stream-type=4 camera-id=0 ! zeddemux name=demux demux.src_left ! queue ! videoconvert ! 'video/x-raw,format=I420' ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! fakesink async=false demux.src_aux ! videoconvert ! 'video/x-raw,format=GRAY16_LE' ! fakesink

Can you remove the demux.src_aux ! videoconvert ! 'video/x-raw,format=GRAY16_LE' ! fakesink branch first?