Failed in mem copy

Hello,

I am experiencing the same issue with JetPack 6.2 and DeepStream 7.1 on a Jetson AGX Orin.

I am using the following GStreamer pipeline, with a Decklink card providing UYVY input.

gst-launch-1.0 \
decklinkvideosrc device-number=0 profile=two-sub-devices-half connection=sdi mode=1080p25 video-format=8bit-yuv \
! queue leaky=2 max-size-bytes=0 \
! 'video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=25/1' \
! nvvideoconvert \
! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=(string)NV12, framerate=25/1' \
! nvv4l2h265enc bitrate=4000000 iframeinterval=25 control-rate=constant_bitrate profile=Main maxperf-enable=true preset-level=UltraFastPreset \
! tee name=videoTee \
mpegtsmux name=mpegtsMux pmt-interval=9000 bitrate=38400000 \
! rtpmp2tpay ! udpsink ttl=64 host=127.0.0.1 port=5004 \
videoTee. ! queue ! h265parse ! mpegtsMux.

In my case the pipeline runs for about 10 minutes before the mem copy error occurs, and the H.265 bitstream also appears to be corrupted.

If I don't use the DeepStream plugins, the pipeline works just fine.

Could you try adding this plugin with the following properties, as in the pipeline I attached?

nvvideoconvert compute-hw=1 nvbuf-memory-type=3
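Applied to the Decklink pipeline above, the suggestion would look like this (a sketch only, not tested on your exact hardware; keep your other element settings as they are):

```shell
# Same pipeline as above, with the suggested compute-hw and
# nvbuf-memory-type properties set on nvvideoconvert
gst-launch-1.0 \
  decklinkvideosrc device-number=0 profile=two-sub-devices-half connection=sdi mode=1080p25 video-format=8bit-yuv \
  ! queue leaky=2 max-size-bytes=0 \
  ! 'video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=25/1' \
  ! nvvideoconvert compute-hw=1 nvbuf-memory-type=3 \
  ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=(string)NV12, framerate=25/1' \
  ! nvv4l2h265enc bitrate=4000000 iframeinterval=25 control-rate=constant_bitrate profile=Main maxperf-enable=true preset-level=UltraFastPreset \
  ! tee name=videoTee \
  mpegtsmux name=mpegtsMux pmt-interval=9000 bitrate=38400000 \
  ! rtpmp2tpay ! udpsink ttl=64 host=127.0.0.1 port=5004 \
  videoTee. ! queue ! h265parse ! mpegtsMux.
```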

I have the same problem using appsrc with JetPack 6.2 and DeepStream 7.1 on an Orin Nano device.
The code works with DeepStream 5.1 and 6.2.
With DeepStream 7.1 the error occurs randomly:
Need Data/dvs/git/dirty/git-master_linux/nvutils/nvbufsurftransform/nvbufsurftransform_copy.cpp:341: => Failed in mem copy

ERROR: Failed to make stream wait on event, cuda err_no:700, err_str:cudaErrorIllegalAddress
ERROR: Preprocessor transform input data failed., nvinfer error:NVDSINFER_CUDA_ERROR
0:09:50.744161066 24827 0xaaaaac9da000 WARN nvinfer gstnvinfer.cpp:1420:gst_nvinfer_input_queue_loop:<primary_gie> error: Failed to queue input batch for inferencing
ERROR from primary_gie: Failed to queue input batch for inferencing
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1420): gst_nvinfer_input_queue_loop (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie
ERROR from tracking_tracker: Failed to submit input to tracker
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvtracker2/gstnvtracker.cpp(794): gst_nv_tracker_submit_input_buffer (): /GstPipeline:pipeline/GstBin:tracking_bin/GstNvTracker:tracking_tracker

I added compute-hw=1 nvbuf-memory-type=4, but the error continues to occur.

Could you try to set nvbuf-memory-type=3?

I got this error:
Need Data/dvs/git/dirty/git-master_linux/nvutils/nvbufsurftransform/nvbufsurftransform.cpp:4543: => Surface type not supported for transformation NVBUF_MEM_CUDA_UNIFIED

0:00:10.994674404 58214 0xaaaaac9d89e0 ERROR nvvideoconvert gstnvvideoconvert.c:4255:gst_nvvideoconvert_transform: buffer transform failed

I'm using an Orin Nano device. I think NVBUF_MEM_CUDA_UNIFIED is not supported here.
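One way to see which nvbuf-memory-type values the plugin actually exposes on this board is to inspect the property's enum (assuming a standard DeepStream install where gst-inspect-1.0 can find nvvideoconvert; run it on the Jetson itself, since the supported set is hardware-dependent):

```shell
# Print the nvbuf-memory-type property description and its enum values
# as reported by the nvvideoconvert build on this device
gst-inspect-1.0 nvvideoconvert | grep -A 10 "nvbuf-memory-type"
```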

Could you attach your whole pipeline? Or you can try the pipeline I attached before on your board.

GST_DEBUG=3 gst-launch-1.0 \
nvurisrcbin uri=file:///opt/nvidia/deepstream/deepstream-7.1/samples/streams/sample_1080p_h264.mp4 ! m.sink_0 \
nvstreammux name=m batch-size=1 width=1200 height=1920 \
! nvvideoconvert compute-hw=1 nvbuf-memory-type=3 \
! x264enc ! h264parse ! mp4mux ! fakesink

Hi, I have downgraded to JetPack 6.1 and my pipeline is working now. It does look like there is a compatibility issue between JetPack 6.2 and DeepStream 7.1.

Regards, Mick

Hi,
Did just the downgrade resolve the issue?

Yes, according to our guide, the DeepStream 7.1 compatible version is JetPack 6.1. We will fix this issue in the next release; for now, you can use the workaround pipeline I provided earlier to resolve it.

Hi yuweiw,

Is there a patch available to fix nvvideoconvert, or do we have to wait until the next release of DeepStream?

Any idea when the new release will arrive?


Sorry, we will not post any release date on the forum. For now, you can use the workaround attached in post #24.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.