Nvcompositor slow

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) AGX Xavier
• DeepStream Version 6.3
• JetPack Version (valid for Jetson only) 5.1.2
• TensorRT Version 8.5.2 / CUDA 11.4
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) Question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I built out my pipeline with six camera streams, each going from a GstBin containing nvv4l2src into an nvstreammux, where inference is done. Previously I would then send the muxed output to an nvmultistreamtiler and got 65 fps no problem the entire way, including at inference for each of the six cameras (so 390 fps of total inference).

I replaced the tiler with nvstreamdemux and an nvcompositor. I have six src pads from the demux going to six sink pads on the compositor, and as long as I keep the final composited image fairly small I get great performance. Once I push the image past ~1920x720, the final output fps drops way down, to less than 1 fps. Inference upstream is unaffected. The resolution/performance tradeoff is unnoticeable until around this resolution, and then it plummets. tegrastats shows my entire system, with the pipeline running, consuming less than 3 GB of RAM with plenty of processor overhead. Looking at my network monitor, the amount of data being sent doesn't change noticeably when I adjust the image size.

I’ll keep troubleshooting, but does anyone have any pointers? I’ve included my pipeline.
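For reference, a minimal sketch of the demux-to-compositor wiring described above (two streams shown; the pad names follow the nvstreamdemux/nvcompositor request-pad convention, and the positions, sizes, and downstream elements are illustration values, not the actual configuration):

```shell
# Illustrative fragment only - the upstream source/mux stage is elided, and
# all layout values are assumptions. Each nvcompositor sink pad is placed
# with its own xpos/ypos/width/height pad properties.
gst-launch-1.0 \
  nvstreamdemux name=demux \
  nvcompositor name=comp \
    sink_0::xpos=0   sink_0::ypos=0 sink_0::width=960 sink_0::height=540 \
    sink_1::xpos=960 sink_1::ypos=0 sink_1::width=960 sink_1::height=540 \
  demux.src_0 ! queue ! comp.sink_0 \
  demux.src_1 ! queue ! comp.sink_1 \
  comp. ! nvvideoconvert ! fakesink
```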

Can you show the CPU load and GPU load when you run with the 1920x720 nvcompositor resolution?

The graph image is too small to be visible. Can you upload the original PNG picture?

I'm having some trouble figuring out how to upload the full image. The forum keeps shrinking my uploads. I've uploaded a zip of the image. If that doesn't work, I put it on ibb as well: 0-00-00-249703083-pipeline-graph hosted at ImgBB

Here’s tegrastats:
03-20-2024 05:48:24 RAM 2493/31002MB (lfb 6435x4MB) SWAP 0/15501MB (cached 0MB) CPU [26%@1190,18%@1190,14%@1190,13%@1190,10%@1190,13%@1190,19%@1190,23%@1190] EMC_FREQ 0% GR3D_FREQ 43% AUX@34.5C CPU@37.5C thermal@35.7C Tboard@32C AO@34C GPU@35.5C iwlwifi@46C Tdiode@38C PMIC@50C

I'm thinking I may need a capsfilter? I've started working on that.
0.00.00.249703083-pipeline_graph.zip (700.9 KB)
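A capsfilter may well matter here: nvcompositor can negotiate either NVMM (device) memory or plain system-memory `video/x-raw`, and if the demux branches fall back to system memory, every frame gets copied out of device memory and composited on the CPU, which would scale badly with output resolution. A hedged sketch of pinning one branch to NVMM RGBA (element placement and names are assumptions, not the poster's actual graph):

```shell
# Hypothetical fragment: keep the demux-to-compositor path in NVMM memory.
# Quote the caps string so the shell doesn't interpret the parentheses.
... demux.src_0 ! nvvideoconvert ! \
  'video/x-raw(memory:NVMM),format=RGBA' ! comp.sink_0 ...
```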

I've been playing with my bitrate. I took a look at the QoS on the receiver side of my RTP stream; below is the last message. Two things I just noticed: (1) live is set to false, which I'm assuming should be true since these are live streams, and (2) after cranking the bitrate down by a factor of 10 (to 800,000) I'm dropping about 78% of my frames. It seems like no matter how low I go on bandwidth I drop about this many frames. When I was just using the tiler, a bitrate of 8,000,000 was fine. I didn't have time to adjust the capsfilters, but I'm wondering if that matters? I need to sign off for the day but will pick back up tomorrow. Any pointers?

Message { ptr: 0x5aa9f5a83340, type: "qos", seqnum: 1579, src: Some("avdec_h265-0"), structure: Some(GstMessageQOS { live: (gboolean) FALSE, running-time: (guint64) 26151466028, stream-time: (guint64) 26151466028, timestamp: (guint64) 26151466028, duration: (guint64) 18446744073709551615, jitter: (gint64) 14160491, proportion: (gdouble) 0.999874, quality: (gint) 1000000, format: (GstFormat) ((GstFormat) GST_FORMAT_BUFFERS), processed: (guint64) 1227, dropped: (guint64) 933 }) }
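For what it's worth, the drop ratio can be read straight off the `processed`/`dropped` counters in the GstMessageQOS structure; for the particular message quoted above they work out to about 43%, so the 78% figure was presumably taken from a different message:

```shell
# Drop ratio from the GstMessageQOS counters in the message above
processed=1227
dropped=933
echo "dropped $(( dropped * 100 / (processed + dropped) ))% of frames so far"
```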

Yes. "live-source=1" is needed on nvstreammux for camera sources.

Please make sure the "batched-push-timeout" property of nvstreammux is set correctly for your cameras. DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums
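A common rule of thumb for this property is one frame interval in microseconds, so the mux pushes a partial batch rather than stalling on a slow source; at the 65 fps mentioned earlier that works out to roughly:

```shell
# One frame interval in microseconds at 65 fps
echo $(( 1000000 / 65 ))
# e.g. (property values other than the timeout are illustrative assumptions):
#   nvstreammux batch-size=6 live-source=1 batched-push-timeout=15384 ...
```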

Another possible bottleneck may be the video encoder. You can check encoder performance with the "Jetson Power GUI".

I'm starting to try everything out. I set live-source=1 on nvstreammux, and I see that my first two QoS messages come across as live, but all the following ones show false. Why might this be? The first two are from the video sink, and all the following are from avdec_h265. I'm not sure where to set live-source on the receiver side - I don't see it on avdec_h265 or any of the other elements.

Message { ptr: 0x7325480018b0, type: "qos", seqnum: 117, src: Some("autovideosink0-actual-sink-xvimage"), structure: Some(GstMessageQOS { live: (gboolean) TRUE, running-time: (guint64) 6640830649, stream-time: (guint64) 6640830649, timestamp: (guint64) 6640830649, duration: (guint64) 40000000, jitter: (gint64) 52672174, proportion: (gdouble) -1.000000, quality: (gint) 1000000, format: (GstFormat) ((GstFormat) GST_FORMAT_BUFFERS), processed: (guint64) 0, dropped: (guint64) 1 }) }
PLAYING
Message { ptr: 0x732548001a30, type: "qos", seqnum: 120, src: Some("autovideosink0-actual-sink-xvimage"), structure: Some(GstMessageQOS { live: (gboolean) TRUE, running-time: (guint64) 6657497315, stream-time: (guint64) 6657497315, timestamp: (guint64) 6657497315, duration: (guint64) 40000000, jitter: (gint64) 72436330, proportion: (gdouble) -1.000000, quality: (gint) 1000000, format: (GstFormat) ((GstFormat) GST_FORMAT_BUFFERS), processed: (guint64) 0, dropped: (guint64) 2 }) }
Message { ptr: 0x58687696c340, type: "qos", seqnum: 122, src: Some("avdec_h265-0"), structure: Some(GstMessageQOS { live: (gboolean) FALSE, running-time: (guint64) 6674150175, stream-time: (guint64) 6674150175, timestamp: (guint64) 6674150175, duration: (guint64) 18446744073709551615, jitter: (gint64) 128219800, proportion: (gdouble) 2.185849, quality: (gint) 1000000, format: (GstFormat) ((GstFormat) GST_FORMAT_BUFFERS), processed: (guint64) 3, dropped: (guint64) 1 }) }

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

What do you mean? The pipeline graph you posted here is an RTSP server (sender).

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.