Hi,
I am running a pipeline that uses nvstreammux. It works well for a few minutes, but after some time it throws the errors below:
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
My Environment:
1> Hardware: Jetson Xavier AGX development board
2> Jetpack: 4.6.1
3> GStreamer: default version that ships with JetPack 4.6.1
4> OS: Ubuntu 18.04
5> Camera modules: IMX390rcm (7 cameras total)
My Pipeline:
gst-launch-1.0 \
nvarguscamerasrc sensor-id=0 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_0 \
nvarguscamerasrc sensor-id=1 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_1 \
nvarguscamerasrc sensor-id=2 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_2 \
nvarguscamerasrc sensor-id=3 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_3 \
nvarguscamerasrc sensor-id=4 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_4 \
nvarguscamerasrc sensor-id=5 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_5 \
nvarguscamerasrc sensor-id=6 bufapi-version=TRUE ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! m.sink_6 \
nvstreammux name=m batch-size=7 live-source=1 width=1920 height=1080 ! nvv4l2vp9enc maxperf-enable=true bitrate=64000000 ! rtpvp9pay mtu=1400 ! udpsink host=192.168.50.100 port=5000 sync=false
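Since the seven camera branches differ only in sensor-id, generating the command string programmatically avoids copy-paste mistakes when changing caps or camera count. A minimal Python sketch that reproduces the pipeline above (the helper name and defaults are illustrative, not part of any NVIDIA API):

```python
# Build the gst-launch-1.0 command for N camera branches feeding nvstreammux.
# Caps, encoder, and sink settings mirror the pipeline above.

CAPS = ("'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, "
        "format=(string)NV12, framerate=(fraction)30/1'")

def build_pipeline(num_cameras=7, host="192.168.50.100", port=5000):
    # One nvarguscamerasrc branch per sensor, each linked to its own mux pad.
    branches = [
        f"nvarguscamerasrc sensor-id={i} bufapi-version=TRUE ! {CAPS} ! m.sink_{i}"
        for i in range(num_cameras)
    ]
    mux = (f"nvstreammux name=m batch-size={num_cameras} live-source=1 "
           "width=1920 height=1080")
    tail = ("nvv4l2vp9enc maxperf-enable=true bitrate=64000000 ! "
            f"rtpvp9pay mtu=1400 ! udpsink host={host} port={port} sync=false")
    # Join with shell line continuations so the command stays readable.
    return "gst-launch-1.0 \\\n  " + " \\\n  ".join(branches + [f"{mux} ! {tail}"])

print(build_pipeline())
```

This also makes it easy to test with fewer cameras (e.g. `build_pipeline(num_cameras=2)`) while isolating whether the failure scales with the number of sources.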
Description:
My pipeline captures input from 7 camera sources using nvarguscamerasrc.
The objective is to transmit them over Ethernet using a single port (for example, port 5000).
Hence I am trying to mux all input streams into a single stream.
I am trying to use the nvstreammux element but am struggling to configure it properly.
Without nvstreammux my pipeline works flawlessly and transmits output over UDP, with each camera stream on a different port. Now I am muxing them to send everything on a single port.
My Objective: Low-latency data transmission
Problem:
The pipeline above runs well for some time (roughly 5-10 minutes), but after that it throws the errors shown in the log below:
GST_ARGUS: Running with following settings:
Camera index = 3
Camera mode = 3
Output Stream W = 1936 H = 1096
seconds to Run = 0
Frame Rate = 29.999999
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
nvbuf_utils: Can not get HW buffer from FD… Exiting…
CONSUMER: Done Success
Help Needed:
1> How can I solve these errors?
2> Can anyone advise on how to use nvstreammux effectively? In its current state it is not reliable in my pipeline.
3> Can anyone share a link to good online documentation on configuring nvstreammux, for better understanding? I am already referring to the NVIDIA docs.
Any help will be appreciated.