jetson.utils.videoSource.Capture() latency issue

I’m having (quite serious) latency issues with the Capture() method of the videoSource object. I’ve opened an issue about it on GitHub, but there has been no response (yet). I’m sorry if I come across as impatient, but I’d like to draw some attention to it.

Is there anyone who might have an answer to this? Am I the only one experiencing this issue?

In short: whether sourcing from RTSP or a local file (on SSD), the Capture() method takes between 80 and 130 ms to complete, which makes it impossible to get anything close to 25 fps for video. GStreamer (with the same pipeline) on the command line works fine.
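For reference, this is roughly how I’m measuring it (a minimal sketch; the stream URI is a placeholder):

    import time
    import jetson.utils

    input = jetson.utils.videoSource("rtsp://CAMERA-URI")  # placeholder URI

    while True:
        t0 = time.perf_counter()
        img = input.Capture()   # this call alone takes 80-130 ms
        print("Capture() took %.1f ms" % ((time.perf_counter() - t0) * 1000))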

Any help would be greatly appreciated.

Hi @willemvdkletersteeg, please see my reply here on GitHub:

Thank you, Dusty. I also tried to discuss this latency/performance issue a few months ago (in February, I think) and the response back then was likewise to look into DeepStream. It was too complicated to rebuild our entire video-capture and inferencing loop, including the processing of the metadata; DeepStream just works very differently in that regard.

I think, now that the application we’ve worked on for months is practically useless because of this one latency issue, we have no choice but to go the rebuilding route… But before we start from scratch I’d like to at least try to keep as much of what we already have intact.

I have now built a drop-in replacement for the jetson.utils.videoSource object that builds a DeepStream/GStreamer pipeline in the constructor and offers a .Capture() method, just like jetson.utils, to “capture” the next frame. This works and also seems to offer great performance. But I use an appsink for this, which returns a GstSample/GstBuffer object in Python, and I would like to somehow feed this into the jetson.inference.detectNet detector that we have (and need, because our entire application feeds on the output of this detector).
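The skeleton looks roughly like the sketch below (simplified; the class name, pipeline string and appsink settings are just illustrative, and Capture() returns the raw sample pieces rather than a cudaImage, which is exactly my problem):

    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst

    Gst.init(None)

    class GstVideoSource:
        """Illustrative stand-in for jetson.utils.videoSource."""
        def __init__(self, input_uri):
            # hardware-decode and convert to RGBA for the appsink
            self.pipeline = Gst.parse_launch(
                f"filesrc location={input_uri[5:]} ! qtdemux ! h264parse ! "
                f"nvv4l2decoder ! nvvidconv ! video/x-raw,format=RGBA ! "
                f"appsink name=mysink max-buffers=4 drop=true")
            self.appsink = self.pipeline.get_by_name('mysink')
            self.pipeline.set_state(Gst.State.PLAYING)

        def Capture(self):
            # block until the next decoded frame is available
            sample = self.appsink.emit('pull-sample')
            return sample.get_buffer(), sample.get_caps()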

Is there a somewhat easy way to encapsulate GstBuffer data into a cudaImage object in Python that the detector accepts? Can I use an existing class/constructor for this somewhere? Otherwise I’ll have to build my own, I suppose.

No, there isn’t; the memory essentially needs to be copied from the GstBuffer into the cudaImage object (which is currently done in gstDecoder.cpp / gstCamera.cpp). It seems likely this is where the additional latency is coming from, because it’s a CPU memcpy().

What I’ve been working on is using NVMM memory, so that the CPU doesn’t need to perform any memcpy. So far it seems to show promising results in terms of utilization and latency. I should have a code update checked in within a day or two, and will let you know so you can try it out.

That’s a bummer. I have the GStreamer pipeline working with NVMM memory and it’s really fast (130 fps with an HD H.264 file). It’s just that I need to “cast” the GstBuffer data into a cudaImage capsule to do anything useful with it. Still trying to figure this out.

Good to hear that you’re also working on a solution - that would be even better (for other users as well)! Python lacks direct memory handling, so I suppose a solution in the shared library itself is potentially more performant.

I had also been digging a little further, and the culprit is this line in gstDecoder.cpp:

	// wait until a new frame is recieved
	if( !mWaitEvent.Wait(timeout) )
		return false;

Most lines in the capture() function take a few nanoseconds to complete, but this mWaitEvent.Wait() call takes anywhere from 80,000 to 120,000 ns to return, which is ages. But you’ve probably figured this out as well by now ;)

I have hit a dead end with my solution.

As far as I can tell there is no way to use the appsink with NVMM memory, as the reported size of the GstBuffer (in the GstSample) that the appsink produces is way too small. It’s probably garbage anyway.

If I remove the “(memory:NVMM)” from the pipeline:

filesrc location={input_uri[5:]} ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=RGBA ! appsink

I can grab samples perfectly well. I can transform them into a numpy array and then pass that array to jetson.utils.cudaFromNumpy():

import numpy
import jetson.utils

# buf and caps come from the appsink sample:
#   sample = appsink.emit('pull-sample')
#   buf, caps = sample.get_buffer(), sample.get_caps()
np_array = numpy.ndarray(
    (caps.get_structure(0).get_value('height'),
     caps.get_structure(0).get_value('width'),
     4),                                        # RGBA = 4 channels
    buffer=buf.extract_dup(0, buf.get_size()),  # copies into CPU memory
    dtype=numpy.uint8)
return jetson.utils.cudaFromNumpy(np_array)     # copies back to the GPU

It works, but it totally defeats the purpose: my entire pipeline is in NVMM memory, then the buffer has to be copied to CPU memory for the appsink and the numpy conversion, and then back to GPU memory again for the inferencing. Totally inefficient; not a viable solution.

I hope your solution works! Can’t wait to find out.

That’s because that GstBuffer is simply a descriptor of the NVMM memory handle, and the nvbuf_utils / EGL APIs need to be used to map it into CUDA. These are C/C++ APIs, so they aren’t accessible from Python - which isn’t a problem for me, because my underlying implementation is in C++ (which gets exposed via my Python extension modules).
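For context, the mapping path looks roughly like the sketch below. This is not the actual jetson-utils code, just an illustration of the nvbuf_utils / EGL / CUDA interop calls involved; error handling, CUDA context setup, and resource cleanup are omitted, and mapNVMM is a hypothetical name:

    #include <gst/gst.h>
    #include <nvbuf_utils.h>   // ExtractFdFromNvBuffer(), NvEGLImageFromFd()
    #include <cudaEGL.h>       // cuGraphicsEGLRegisterImage(), CUeglFrame

    // map the NVMM memory behind a GstBuffer into a CUDA-accessible pointer
    void* mapNVMM( GstBuffer* gstBuffer, EGLDisplay display )
    {
        GstMapInfo map;
        if( !gst_buffer_map(gstBuffer, &map, GST_MAP_READ) )
            return NULL;

        // the mapped bytes are an NvBuffer descriptor, not pixel data,
        // so extract the dmabuf file descriptor it refers to
        int fd = -1;
        if( ExtractFdFromNvBuffer(map.data, &fd) != 0 )
            return NULL;

        // wrap the dmabuf in an EGLImage and register it with CUDA
        EGLImageKHR eglImage = NvEGLImageFromFd(display, fd);
        CUgraphicsResource resource = NULL;
        cuGraphicsEGLRegisterImage(&resource, eglImage, CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE);

        // retrieve the CUDA-mapped frame (plane 0, e.g. the Y plane of NV12)
        CUeglFrame eglFrame;
        cuGraphicsResourceGetMappedEglFrame(&eglFrame, resource, 0, 0);

        gst_buffer_unmap(gstBuffer, &map);
        return eglFrame.frame.pPitch[0];
    }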

Anyhow, I’ve committed the initial changes to gstDecoder here:

I recommend that you re-clone/re-build from source. These changes currently only cover gstDecoder (so receiving RTP/RTSP and reading video files); I am going to refactor this so it works with gstCamera as well. It will be good to know if it works/improves things for you. I don’t think you should need to make any modifications to the command line. After re-building, you should see that the printed pipeline now specifies NVMM memory.
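If you haven’t rebuilt from source before, the usual steps look something like this (adjust paths/branch to your setup):

    git clone --recursive https://github.com/dusty-nv/jetson-inference
    cd jetson-inference
    mkdir build && cd build
    cmake ../
    make -j$(nproc)
    sudo make install
    sudo ldconfig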

Yes I’m aware of the architecture and the limitations of Python in this regard. That’s why I was hoping you’d be able to work on a solution ;) So thank you! I’m going to check it out right now.

You are my hero! You have actually improved performance by about 500%: the call to videoSource.Capture() now takes approximately 16 ms, where it used to be between 80 and 130 ms. Excellent work, man. It’s also quite a big commit, I saw; I thought it would’ve been a few small changes. Thank you, thank you.

There is still one small problem, I think. It seems to continually try to release/free a memory buffer that isn’t there (anymore). When running the pipeline, I get a continuous stream of:

nvbuf_utils: dmabuf_fd 1120 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1104 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1108 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1114 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1096 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1106 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1116 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1112 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...
nvbuf_utils: dmabuf_fd 1110 mapped entry NOT found
nvbuf_utils: NvReleaseFd Failed... Exiting...

Etc. etc.

File sources seem to work great (apart from the stream of errors reported above), but I’m having trouble with RTSP sources. They are 25 fps streams, but the Capture() method now takes about 85 ms on average to return. Maybe the console output below gives you a clue what’s going on?

[gstreamer] gstDecoder -- creating decoder for IPADDRESS
Opening in BLOCKING MODE
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
[gstreamer] gstDecoder -- discovered video resolution: 1920x1080  (framerate 25.000000 Hz)
[gstreamer] gstDecoder -- discovered video caps:  video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)main, pixel-aspect-ratio=(fraction)189/190, width=(int)1920, height=(int)1080, framerate=(fraction)25/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
[gstreamer] gstDecoder -- pipeline string:
[gstreamer] rtspsrc location=rtsp://IPADDRESS:9554/camera2 latency=2000 ! queue ! rtph264depay ! h264parse ! omxh264dec ! video/x-raw(memory:NVMM) ! appsink name=mysink
[video]  created gstDecoder from rtsp://IPADDRESS:9554/camera2
------------------------------------------------
gstDecoder video options:
------------------------------------------------
  -- URI: rtsp://IPADDRESS:9554/camera2
     - protocol:  rtsp
     - location:  IPADDRESS
     - port:      9554
  -- deviceType: ip
  -- ioType:     input
  -- codec:      h264
  -- width:      1920
  -- height:     1080
  -- frameRate:  25.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[gstreamer] opening gstDecoder for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse1
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay1
[gstreamer] gstreamer changed state from NULL to READY ==> queue0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse1
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay1
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_structure_get_string: assertion 'structure != NULL' failed

(python3:18199): GStreamer-CRITICAL **: 11:51:26.919: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 1920x1088 (x 11), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 1080 
[gstreamer] gstDecoder -- onPreroll()
[gstreamer] gstreamer changed state from NULL to READY ==> manager
[gstreamer] gstreamer changed state from READY to PAUSED ==> manager
[gstreamer] gstreamer changed state from NULL to READY ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from NULL to READY ==> rtpsession2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpsession2
[gstreamer] gstreamer changed state from NULL to READY ==> funnel4
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel4
[gstreamer] gstreamer changed state from NULL to READY ==> funnel5
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel5
[gstreamer] gstreamer changed state from NULL to READY ==> rtpstorage2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpstorage2
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from NULL to READY ==> rtpsession3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpsession3
[gstreamer] gstreamer changed state from NULL to READY ==> funnel6
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel6
[gstreamer] gstreamer changed state from NULL to READY ==> funnel7
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel7
[gstreamer] gstreamer changed state from NULL to READY ==> rtpstorage3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpstorage3
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpstorage3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpsession3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel6
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel7
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpstorage2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpsession2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel4
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel5
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> manager
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> rtpptdemux2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpptdemux2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpptdemux2
[gstreamer] gstreamer changed state from NULL to READY ==> rtpjitterbuffer2
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpjitterbuffer2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpjitterbuffer2
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from NULL to READY ==> rtpptdemux3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpptdemux3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpptdemux3
[gstreamer] gstreamer changed state from NULL to READY ==> rtpjitterbuffer3
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpjitterbuffer3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpjitterbuffer3
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)";
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)370200, maximum-bitrate=(uint)370200, bitrate=(uint)347940;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)370200, maximum-bitrate=(uint)10087200, bitrate=(uint)1233327;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)203400, maximum-bitrate=(uint)10087200, bitrate=(uint)1147500;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)1074769;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)1019742;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)970066;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)928200;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)889447;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)861922;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)831705;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)807000;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)782257;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)761727;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)741686;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)724641;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)708312;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)679096;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10087200, bitrate=(uint)654020;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)202000, maximum-bitrate=(uint)10148000, bitrate=(uint)949135;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)925218;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)904024;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)886117;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)868422;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)837172;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)812369;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)787960;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)766116;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)745777;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)727540;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10148000, bitrate=(uint)711053;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10289000, bitrate=(uint)891431;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10289000, bitrate=(uint)865441;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10289000, bitrate=(uint)844283;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10289000, bitrate=(uint)825252;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)10289000, bitrate=(uint)809006;
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)183800, maximum-bitrate=(uint)46902400, bitrate=(uint)1557160;
[gstreamer] gstDecoder -- map buffer size was less than max size (1008 vs 3110400)
[gstreamer] gstDecoder recieve caps:  video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1920, height=(int)1080, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)189/190, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
[gstreamer] gstDecoder -- recieved first frame, codec=h264 format=nv12 width=1920 height=1080 size=3110400
[gstreamer] gstDecoder -- recieved NVMM memory
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(Main\ Profile\)", minimum-bitrate=(uint)150000, maximum-bitrate=(uint)46902400, bitrate=(uint)1534464;
RingBuffer -- allocated 4 buffers (6220800 bytes each, 24883200 bytes total)

I have had to remove some private information (IP address/camera URI).

These messages come from within the nvbuf_utils library (presumably at some other place in the pipeline), and I get them too sometimes - they seem fine to ignore.

Have you tried running with --input-rtsp-latency=0? The increased latency seems likely to come from the rtspsrc element (e.g. networking), since that is the only difference in the pipeline vs. the file-based video playback.

These messages come from within the nvbuf_utils library (presumably at some other place in the pipeline), and I get them too sometimes - they seem fine to ignore.

I get about a dozen of these messages per second and it’s totally flooding the console. Our application sometimes instantiates 2, 3 or even 4 streams at once. Other than piping stderr and stdout to /dev/null, is there a way to silence these warnings/errors?
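(The bluntest workaround I can think of is redirecting the process-level stderr, something like the sketch below, but that hides all native errors, which I’d rather not do:)

    import os

    # redirect fd 2 (the C-level stderr) to /dev/null; redirecting
    # sys.stderr in Python would not catch prints from native libraries
    # like nvbuf_utils
    devnull = os.open(os.devnull, os.O_WRONLY)
    os.dup2(devnull, 2)
    os.close(devnull)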

Have you tried running with --input-rtsp-latency=0? The increased latency seems likely to come from the rtspsrc element (e.g. networking), since that is the only difference in the pipeline vs. the file-based video playback.

Yes, I’ve tried that. No difference… I’m beginning to suspect that some (or all?) of our camera streams are actually below 25 fps, even though they “report” 25 fps in the metadata. If I play the exact same RTSP stream in ffplay, it also reports 25 fps (just like GStreamer does during init), but the video looks stuttery anyway. I think we’ll have to look for a solution elsewhere, knowing that the current pipeline is as fast as it can be.

Unfortunately these printouts don’t seem to be coming from my code, so I’m unable to silence them. Even with --log-level=silent they are still printed (meaning they don’t come from my code). Sorry about that.

OK, update on this topic - NVMM support for CSI/V4L2 cameras has been integrated into jetson-utils here:

https://github.com/dusty-nv/jetson-utils/commit/b38357bbe33640613acb7616fd7e675adbeaab2a
