Nvivafilter multiple pipelines memory leak

Hello,

Here is my environment:
# R32 (release), REVISION: 7.2, GCID: 30192233, BOARD: t210ref, EABI: aarch64, DATE: Wed Apr 20 21:34:48 UTC 2022

Issue description.
Three pipelines using nvivafilter leak memory. The leak is significant: about 1 MB per minute.

memory_usage_leak

Here is a sample source code to reproduce the issue.
test_leak.txt (4.8 KB)
As soon as I reduce the number of pipelines from 3 to 2 (by commenting out start(2);), or remove nvivafilter from the pipelines, the issue goes away.

memory_usage_no_leak

Here is source code sample with nvivafilter removed from the pipeline.
test_no_leak.txt (4.5 KB)
To compile the samples: g++ -Wall -std=c++11 test_leak.cpp -o test $(pkg-config --cflags --libs gstreamer-app-1.0) -ldl
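The attached test_leak.txt is not inlined in this thread. Based on the elements mentioned later (nvv4l2camerasrc, UYVY 1080p30 caps, nvivafilter, NVENC and nv3dsink), one of the three pipelines can be roughly approximated with gst-launch-1.0; the exact element properties below are assumptions, not the attachment's contents:

```shell
# Hypothetical gst-launch-1.0 approximation of ONE of the three pipelines.
# Element choice is inferred from the thread; properties are assumptions.
gst-launch-1.0 nvv4l2camerasrc device=/dev/video0 ! \
  'video/x-raw(memory:NVMM), width=1920, height=1080, format=UYVY, framerate=30/1' ! \
  nvvidconv ! \
  nvivafilter customer-lib-name=libnvsample_cudaprocess.so cuda-process=true ! \
  'video/x-raw(memory:NVMM), format=NV12' ! tee name=t \
  t. ! queue ! nv3dsink \
  t. ! queue ! nvv4l2h264enc ! fakesink
```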
Can reproduce this issue on TX2 as well.

The stream and encoder show no issues until all available memory is consumed, at which point things start to crash randomly due to the lack of free system memory.

Please advise.
Thank you.

Hi,
We will set up r32.7.4 to replicate the issue. It would be great if you could upgrade to r32.7.4 and give it a try.

Hi @DaneLLL,
I wasn’t able to reproduce it in r32.7.4.
The only difference is that I had to use videotestsrc (it was eating memory on our r32.7.2 as well) instead of nvv4l2camerasrc, since we have custom drivers and I would like to avoid re-compiling the kernel for r32.7.4 and making a new image as much as I can.
I’ll try bare r32.7.2 as well and will let you know the results.

One more observation: as soon as I unplug the HDMI cable from the r32.7.4 unit with the test application running, memory consumption starts to creep up the same way as it does on the r32.7.2 unit.
We are not using HDMI port by default, we have a DSI screen connected.
Could it be the contributing factor?

Update: Confirmed on r32.7.2: as long as I run the test app on the HDMI screen standalone, there is no memory leak.
As soon as I disconnect the HDMI screen with the application running, the memory leak shows up right away.

Hi,

Can I make sure how you ran the app?
For example, is any library other than GStreamer required?

We are testing on TX2 with r32.7.4 and an e3333 camera, but we got a GST_STATE_CHANGE_FAILURE error in the end.
We would also like to know how you recorded the memory usage to get the diagram.

Hi @DaveYYY
Can I make sure how you ran the app? - Sure, here are the steps:

  1. export DISPLAY=:0
  2. ./test

And here is the output:

Opening in BLOCKING MODE
complete
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
Opening in BLOCKING MODE
complete
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
Opening in BLOCKING MODE
complete
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 77, Level = 0
H264: Profile = 77, Level = 0
H264: Profile = 77, Level = 0

Is any library other than GStreamer required? - No, nothing else; as you can see, this only requires GStreamer:
g++ -Wall -std=c++11 test_leak.cpp -o test $(pkg-config --cflags --libs gstreamer-app-1.0) -ldl

Update: I was wrong about nvivafilter being the reason for the memory leak.
Here is why.
Somehow, removing nvivafilter blocks 1 out of 3 pipelines randomly (no errors, it just gets “stuck”), and that prevents the memory leak from happening. I didn’t spot it before because the blocked pipeline keeps displaying its last frame, and I had to put something moving in front of the camera to notice it. As I’ve mentioned above, 2 pipelines do run without memory leaks.

How did you record the memory usage to get the diagram? - Sure:

  1. Script for collecting memory usage (simple bash script).
    mem_usage_script.txt (116 Bytes)
  2. Script for plotting using gnuplot (you will have to install gnuplot)
    plot_memory.txt (206 Bytes)
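The attached scripts are not inlined in this thread; a minimal equivalent of the sampling script (my own sketch, not the attachment's exact contents) could look like this:

```shell
#!/bin/bash
# Sample used system memory (MB) at a fixed interval and log "sample_index used_MB".
# Hypothetical stand-in for mem_usage_script.txt; adjust SAMPLES/INTERVAL as needed.
OUT=mem_usage.log
SAMPLES=3
INTERVAL=0.2
: > "$OUT"
for i in $(seq 1 "$SAMPLES"); do
    used=$(free -m | awk '/^Mem:/ {print $3}')   # third column of "Mem:" row = used MB
    echo "$i $used" >> "$OUT"
    sleep "$INTERVAL"
done
cat "$OUT"
```

The resulting log can then be plotted with something along the lines of `gnuplot -e 'set terminal png; set output "mem.png"; plot "mem_usage.log" using 1:2 with lines'`.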

Is it somehow related to the DSI output? Probably yes. DSI has a 60 Hz refresh rate while the HDMI monitor has 144 Hz. When I’m running 3 pipelines simultaneously I do see frame drops on all nv3dsinks: instead of 30 FPS I get only 20, which is 60 / 3, similar to this post:

And it also explains why everything works well with up to 2 pipelines: we are requesting 30 fps, and 60 / 2 = 30, so it can keep up.

Is it all tied to the screen refresh rate?
If I fork pipelines like this:

    pid_t pid = fork();

    if (pid < 0) {
        std::cerr << "Fork failed.\n";
        return 1;
    }

    if (pid == 0) {
        gst_init (&argc, &argv);
        start(0);
        _exit(0); // defensive: keep the child from falling through and forking again if start() ever returns
    }

    pid = fork();

    if (pid < 0) {
        std::cerr << "Fork failed.\n";
        return 1;
    }

    if (pid == 0) {
        gst_init (&argc, &argv);
        start(1);
        _exit(0);
    }

    pid = fork();

    if (pid < 0) {
        std::cerr << "Fork failed.\n";
        return 1;
    }

    if (pid == 0) {
        gst_init (&argc, &argv);
        start(2);
        _exit(0);
    }

Instead of:

gst_init (&argc, &argv);

start(0);
start(1);
start(2);

The issue goes away.

Any ideas would be appreciated!

Update:
Since the pipelines work correctly in separate processes and the memory leak only occurs within a single process,
I’ve searched the forum and found these 2 topics.

I did set nv3dsink sync=false and it fixed the memory leak. But it is not acceptable, since the recording lags behind by up to a couple of seconds in this case.
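For reference, sync is a standard property of GStreamer sinks; the pipeline around it below is illustrative, not our actual one:

```shell
# Illustrative only: disable clock synchronisation on the sink.
# With sync=false the sink renders frames as soon as they arrive instead of
# waiting for their timestamps, which removes the backpressure from the slow
# 60 Hz display - but also explains the multi-second recording lag reported above.
gst-launch-1.0 videotestsrc ! nv3dsink sync=false
```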

Splitting the pipelines into different processes is not an option either, because it would require a huge amount of work and a change to the current architecture.

So it looks like, in the case of 3 pipelines within the same process, all pipelines share some kind of resource which degrades, lags behind, and loses frames. And this lag happens only when the DSI screen (60 Hz) is enabled; it doesn’t happen on the HDMI (144 Hz) screen standalone.

If you guys have any ideas it would be great!

Hi,

Can you please make a summary of the current situation?

I think this paragraph is somewhat self-contradictory.
You said nvivafilter was not the reason for the memory leak, but you also said removing it from the pipelines prevented the memory leak?

How about the lag between DSI monitor/HDMI monitor?
I don’t quite get it. Is it a performance issue, or also a memory leak?

@DaveYYY

Please find the answers below.

You said nvivafilter was not the reason for memory leak, but you also said moving it out from 1 of the 3 pipelines prevented memory leak?

You are partially correct. Removing nvivafilter from all 3 pipelines solves the memory leak, but introduces a new issue: 1 out of the 3 pipelines gets blocked. It is visible on the nv3dsink output, which keeps displaying the last available frame and won’t update. So it only “fixes” the memory leak by introducing another issue.

How about the lag between DSI monitor/HDMI monitor?

The memory leak happens on the 60 Hz DSI monitor, along with degraded 20 fps performance across the 3 pipelines.
The memory leak doesn’t happen on the 144 Hz HDMI monitor, and all 3 pipelines run at 30 fps.

I don’t quite get it. Is it about performance issue or also memory leak?

The memory leak is a consequence of the degraded performance.

Here is the summary:

  1. 3 pipelines running, including nvivafilter, on a 60 Hz DSI screen: each pipeline gets only 20 fps. The memory leak is present.
  2. 3 pipelines running without nvivafilter on a 60 Hz DSI screen: one of the pipelines blocks randomly (I hadn’t spotted this when I opened the topic). The remaining 2 pipelines run correctly at 30 fps each. No memory leak.

My assumption is:
Something in the pipeline (probably nv3dsink) is limited to a 20 Hz update rate when 3 pipelines run at the same time within the same process on a screen with a 60 Hz refresh rate. Throughput is throttled, which leads to the memory leak.

This issue is complex.
Sorry if something is not clear.
Please let me know if more clarification is needed.
I think it would be appropriate to change the title after we get to the root cause of the issue.

Hi,

Sorry for coming back late. Thanks for the clarification.
However, we currently don’t have a DSI monitor for testing, so it’s hard to proceed with further debugging if the issue only happens on DSI monitors.
We are also loaded with other tasks, so please keep using the HDMI monitor if it works fine, and we’ll get back to you when we are available and have a DSI monitor for testing.

Hi @DaveYYY

Unfortunately I can’t use HDMI monitor as a solution.
DSI is the primary monitor.

You don’t need a DSI monitor to reproduce the issue.
All you need is an HDMI monitor that supports 60 Hz. I was able to reproduce the issue by switching the HDMI monitor’s refresh rate to 1920x1080@60Hz instead of 1920x1080@144Hz.

A temporary workaround is to dynamically unlink and then re-link the nv3dsink branch when needed at runtime.
Our design allows such an approach.

Looks like the root cause is that 3 instances of nv3dsink running simultaneously within the same app on a screen with a 60 Hz refresh rate (DSI or HDMI, it doesn’t matter) can’t run faster than 20 Hz each, thus slowing down the whole pipeline. A side effect is the memory leak with the pipeline I’ve provided as a sample.

I think the title should be renamed, because nvivafilter is not the real cause of the memory leak.

Hi,

I ran your code again on TX2+r32.7.4+e3333 camera, but I did not get anything from the camera on the screen.
I just had 3 windows showing the wallpaper; is that expected?

Hi @DaveYYY,

No it is not supposed to look like that.
It is supposed to look like this.

Sorry, I’m not familiar with e3333 camera modules.
Do they support the “UYVY” color format?

You might need to update this section of code:

	// Set source filters and link
	caps = gst_caps_new_simple( "video/x-raw",
								"width", G_TYPE_INT, 1920,
								"height", G_TYPE_INT, 1080,
								"format", G_TYPE_STRING, "UYVY",
								"framerate", GST_TYPE_FRACTION, 30, 1,
								"interlace-mode", G_TYPE_STRING, "progressive",
								NULL );

to a color format supported by the e3333 module (you can list the supported formats with `v4l2-ctl --list-formats-ext -d /dev/video0`).

Here is the demonstration of the issue. Everything is the same for both tests except screen refresh rate.

144 Hz sample - no issues.
xrandr --output HDMI-0 --mode 1920x1080 --rate 143.98

60 Hz sample - dropped frames; the video is “choppy” and the needle jumps from place to place.
xrandr --output HDMI-0 --mode 1920x1080 --rate 60

Hi,
Are you able to try nvegltransform ! nveglglessink? We would like to confirm whether the issue is specific to nv3dsink.

@DaneLLL
I’ve tested nveglglessink as requested.
nvegltransform ! nveglglessink exhibits the same behavior as nv3dsink: missed frames and “choppy” output.