Dmabuf_fd -1 mapped entry NOT found

Hi,

I am running a pipeline that uses nvstreammux. It works well for a few minutes, but after some time it throws the errors below:

nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…

My Environment:
1> Hardware: Jetson AGX Xavier developer kit
2> JetPack: 4.6.1
3> GStreamer: stock version bundled with JetPack 4.6.1
4> OS: Ubuntu 18.04
5> Camera modules: IMX390 RCM (7 cameras total)

My Pipeline:

gst-launch-1.0 \
nvarguscamerasrc sensor-id=0 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_0 \
nvarguscamerasrc sensor-id=1 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_1 \
nvarguscamerasrc sensor-id=2 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_2 \
nvarguscamerasrc sensor-id=3 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_3 \
nvarguscamerasrc sensor-id=4 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_4 \
nvarguscamerasrc sensor-id=5 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_5 \
nvarguscamerasrc sensor-id=6 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_6 \
nvstreammux name=m batch-size=7 live-source=1 width=1920 height=1080 ! nvv4l2vp9enc maxperf-enable=true bitrate=64000000 ! rtpvp9pay mtu=1400 ! udpsink host=192.168.50.100 port=5000 sync=false

Description:
My pipeline captures input from 7 camera sources using nvarguscamerasrc.
The objective is to transmit them over Ethernet using a single port (for example, port 5000).
Hence I am trying to mux all input streams into a single stream.
I am trying to use the nvstreammux element but am struggling to configure it properly.

Without nvstreammux my pipeline works flawlessly and transmits output over UDP, with each stream on a different port. Now I am muxing them to send everything on a single port.

My Objective: Low-latency data transmission

Problem:
The above pipeline runs well for some time (roughly 5-10 minutes), but after that it throws the errors shown in the log below.

GST_ARGUS: Running with following settings:
Camera index = 3
Camera mode = 3
Output Stream W = 1936 H = 1096
seconds to Run = 0
Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success

Help Needed:
1> How do I solve these errors?
2> Can anyone advise on how to implement the mux effectively? It is not reliable in the pipeline at this stage.
3> Can anyone share or point me to a good online doc on configuring nvstreammux, for better understanding? I am already working from the NVIDIA docs.

Any help will be appreciated.

Hi,
The nvstreammux plugin is for DeepStream SDK use cases. The pipeline you are running does not look correct. Please share your use case so that we can suggest next steps.

Hi @DaneLLL,

Here is my use case:

I need to capture streams from 7 cameras (IMX390), mux them together, and transmit over UDP.

As my prime focus is to keep latency as low as possible, I am trying to keep the number of elements in my pipeline to a minimum.
I am also trying to use elements that load the GPU rather than the CPU.
That's why I selected nvstreammux for this application.

Considering that the prime focus is low latency while muxing 7 camera feeds together, can you recommend a better solution than nvstreammux?

Please let me know if you need any other info from my end; I'll provide it as soon as possible.

Regards

Hi,
Would you like to stream each source individually, or composite the sources into a single video plane and then encode/stream out? The nvstreammux plugin works with nvmultistreamtiler, like:

... ! nvstreammux ! nvmultistreamtiler ! nvv4l2vp9enc ! ...

The sources are composited into a single plane and then fed into the encoder.
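
For illustration, a minimal two-source sketch of that pattern might look like the following (untested; the batch-size, tiler geometry, and bitrate are placeholder values, and nvstreammux/nvmultistreamtiler/nvvideoconvert require the DeepStream SDK to be installed on top of JetPack):

# Hypothetical two-camera example: batch with nvstreammux, composite the
# batch with nvmultistreamtiler, then encode and stream out.
$ gst-launch-1.0 \
    nvarguscamerasrc sensor-id=0 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! m.sink_0 \
    nvarguscamerasrc sensor-id=1 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! m.sink_1 \
    nvstreammux name=m batch-size=2 live-source=1 width=1920 height=1080 ! \
    nvmultistreamtiler rows=1 columns=2 width=3840 height=1080 ! \
    nvvideoconvert ! 'video/x-raw(memory:NVMM), format=NV12' ! \
    nvv4l2vp9enc maxperf-enable=true bitrate=8000000 ! rtpvp9pay ! udpsink host=192.168.50.100 port=5000 sync=false

The nvvideoconvert stage is included because the tiler typically outputs RGBA in NVMM memory, while the encoder expects NV12.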

Hi @DaneLLL,
You got me right.
I want to composite all 7 camera streams into a single stream and then encode it before sending it out over Ethernet.

Composite is the right word. Thanks for helping me understand that; I was not aware of it. I thought mux was the right word and had been searching with the keyword mux.

Is there any element you recommend for compositing into a single video stream?

I also need to de-composite (if that's the right word) at the receiving end. Any recommendations for a de-compositing element?

So on the receiving end, this is what I am trying to do:
Receive from UDP → de-composite into 7 video streams (demux) → dewarp each video stream → perform stitching → display on tiles

Hi @DaneLLL,

I misunderstood your response. We are not compositing all camera feeds into a single video plane. As you correctly said, I am trying to transmit each camera stream individually.

Let me try again to explain my use case:

Hardware:
1> Jetson AGX Xavier developer kit
2> 16-camera interface board from D3 Engineering
3> 7 fisheye camera modules (IMX390 RCM)

Objectives:
Transmit 7 camera video streams from the machine to a command station and display them with minimum latency. The operator drives the machine based on these video feeds, so latency plays a vital role.

Objective 1: Transmission end: transmit all 7 camera streams (without audio) individually over UDP port 5000.

Objective 2: Receiving end: receive all 7 camera streams via UDP port 5000 and display all 7 live camera video streams (without audio) on multiple displays.

Use case description:

Transmission:
On the transmission side I have captured multiple camera streams, and now I want to mux them into a single video stream to transmit on UDP port 5000.

Reception: On the receiving end, we want to demux so that we get each camera stream back separately.

So the pipeline is: demux → dewarp (fisheye) → tiler → display on multiple displays

Current issue / scenario:

I am currently facing an issue with how to effectively implement nvstreammux on the transmission side.
I am not sure whether this is the right GStreamer element to use, or whether you can suggest a better element to merge all camera feeds into a single stream.

Hi,
Generally, we separate the streams onto different port numbers, like:

$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! queue ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=192.168.50.100 port=5000 sync=0
$ gst-launch-1.0 videotestsrc is-live=1 pattern=1 ! video/x-raw,width=1280,height=720 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! queue ! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 ! h264parse ! rtph264pay ! udpsink host=192.168.50.100 port=5001 sync=0

Please check if you can run it like this. For optimal performance, please also execute these steps:

  1. Run $ sudo nvpmodel -m 0 and $ sudo jetson_clocks
  2. Set maxperf-enable=1 on the v4l2 encoder
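
On the receiving side, each port then gets its own small depay/decode pipeline. As a rough sketch for one of the H264 test streams above (the caps string, payload number, and sink choice are assumptions, not taken from this thread):

# Hypothetical receiver for the port-5000 H264 test stream; repeat with
# port=5001 for the second stream, and so on.
$ gst-launch-1.0 udpsrc port=5000 caps='application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96, clock-rate=(int)90000' ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink sync=0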

Hi @DaneLLL,

I think it works for me now. Thanks for your help!
My single-command pipeline is working now. Below is the working pipeline:

gst-launch-1.0 -e \
nvarguscamerasrc sensor-id=0 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5000 sync=false \
nvarguscamerasrc sensor-id=1 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5001 sync=false \
nvarguscamerasrc sensor-id=2 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5002 sync=false \
nvarguscamerasrc sensor-id=3 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5003 sync=false \
nvarguscamerasrc sensor-id=4 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5004 sync=false \
nvarguscamerasrc sensor-id=5 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5005 sync=false \
nvarguscamerasrc sensor-id=6 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5006 sync=false \
nvarguscamerasrc sensor-id=7 tnr-strength=1 tnr-mode=2 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! queue ! nvv4l2vp9enc maxperf-enable=true bitrate=4000000 ! rtpvp9pay mtu=9000 ! udpsink host=10.0.0.173 port=5007 sync=false
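
For reference, a matching per-port receiver for one of these VP9 streams might look roughly like this on another Jetson (a sketch; the caps string and sink are assumptions, and a non-Jetson command station would use its own decoder and sink instead):

# Hypothetical receiver for the port-5000 VP9 stream; repeat per port.
$ gst-launch-1.0 udpsrc port=5000 caps='application/x-rtp, media=(string)video, encoding-name=(string)VP9, payload=(int)96, clock-rate=(int)90000' ! rtpvp9depay ! nvv4l2decoder ! nvoverlaysink sync=false

Also note that mtu=9000 on the sender only helps if jumbo frames are enabled on every network hop in between; otherwise the packets will be IP-fragmented, and a smaller value such as mtu=1400 is the safer choice.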

