Nvv4l2av1enc memory leak

Hi,

We’re experiencing a memory leak when using the V4L2 AV1 encoder in GStreamer (nvv4l2av1enc). The leak occurs when the video caps change, for example when halving the framerate or resolution. Available memory shrinks until the application crashes. Here is a minimal reproducible example in Rust:

main.rs

use std::time::Duration;
use gst::glib::GString;
use gst::prelude::{ElementExt, GstBinExtManual, GstObjectExt, ObjectExt, PadExtManual};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    gst::init()?;

    let source_element = gst::ElementFactory::make("videotestsrc")
        .name("videotestsrc")
        .build()?;

    let caps = gst_video::VideoCapsBuilder::new()
        .width(1920)
        .height(1536)
        .framerate(gst::Fraction::new(15, 1))
        .build();

    let camera_capsfilter = gst::ElementFactory::make("capsfilter")
        .name("camera_capsfilter")
        .property("caps", &caps)
        .build()?;

    let videoconvert = gst::ElementFactory::make("nvvidconv")
        .name("hw_vidconv")
        .build()?;

    let videorate = gst::ElementFactory::make("videorate")
        .name("videorate")
        .property("drop-only", true)
        .property("skip-to-first", true)
        .build()?;

    let queue = gst::ElementFactory::make("queue")
        .name("only_queue")
        .property("max-size-bytes", 0u32)
        .property("max-size-time", 0u64)
        .property("max-size-buffers", 4u32)
        .build()?;

    let caps2 = gst_video::VideoCapsBuilder::new()
        .features(["memory:NVMM"])
        .width(1920)
        .height(1536)
        .framerate(gst::Fraction::new(15, 1))
        .build();

    let capsfilter_for_framerate = gst::ElementFactory::make("capsfilter")
        .name("framerate_capsfilter")
        .property("caps", &caps2)
        .build()?;

    let nvv4l2av1enc = gst::ElementFactory::make("nvv4l2av1enc").build()?;

    let fakesink = gst::ElementFactory::make("fakesink")
        .name("fakesink")
        .build()?;

    tokio::spawn(toggle_caps(capsfilter_for_framerate.clone()));

    let pipeline = gst::Pipeline::new();

    pipeline.add_many([
        &source_element,
        &camera_capsfilter,
        &videoconvert,
        &videorate,
        &queue,
        &capsfilter_for_framerate,
        &nvv4l2av1enc,
        &fakesink,
    ])?;

    gst::Element::link_many([
        &source_element,
        &camera_capsfilter,
        &videoconvert,
        &videorate,
        &queue,
        &capsfilter_for_framerate,
        &nvv4l2av1enc,
        &fakesink,
    ])?;

    pipeline.set_state(gst::State::Playing)?;

    let bus = pipeline.bus().unwrap();

    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Eos(_) => {
                println!("End of stream");
                break;
            }
            gst::MessageView::Error(err) => {
                eprintln!(
                    "Error from {}: {} ({})",
                    err.src().map(|s| s.path_string()).unwrap_or_else(|| GString::from("unknown")),
                    err.error(),
                    err.debug().unwrap_or_else(|| GString::from("no debug info"))
                );
                break;
            }
            _ => (),
        }
    }

    Ok(())
}

async fn toggle_caps(capsfilter: gst::Element) {
    let mut current_fps = 15;
    let mut current_width = 1920;
    let mut current_height = 1536;

    loop {
        tokio::time::sleep(Duration::from_millis(500)).await;

        let caps = gst_video::VideoCapsBuilder::new()
            .features(["memory:NVMM"])
            .width(current_width)
            .height(current_height)
            .framerate(gst::Fraction::new(current_fps, 1))
            .build();

        println!("Setting caps: {}x{} @ {}fps", current_width, current_height, current_fps);
        capsfilter.set_property("caps", &caps);

        if current_height == 1536 {
            current_height = 768;
            current_width = 960;
        } else {
            current_height = 1536;
            current_width = 1920;
        }

        current_fps = if current_fps == 15 { 7 } else { 15 };
    }
}

Cargo.toml

[package]
name = "min-reproducable-example"
version = "0.1.0"
edition = "2024"

[dependencies]
anyhow = "1.0.98"
gst = { package = "gstreamer", git = "https://gitlab.freedesktop.org/gstreamer/gstreamer-rs.git", tag = "0.23.5", features = ["v1_24"] }
gst-video = { package = "gstreamer-video", git = "https://gitlab.freedesktop.org/gstreamer/gstreamer-rs.git", tag = "0.23.5", features = ["v1_24"] }
tokio = { version = "1.46.1", features = ["rt", "rt-multi-thread", "macros", "time"] }

Each time we set the caps, we see a change in EMC bandwidth. The leak also occurs when using the H.264 HW encoder.

L4T version: 36.4.0
GStreamer version: 1.24.8

Any tips or suggestions would be greatly appreciated!

*** Please note that this reply is generated by LLM automatically ***

The issue you’re experiencing with the V4L2 AV1 encoder in GStreamer, specifically the nvv4l2av1enc element, is a memory leak that occurs when the video caps change, such as halving the framerate or resolution. Since you also observe it with the H.264 hardware (HW) encoder, the problem is not unique to the AV1 encoder. Given the context, let’s break down the potential causes and suggestions for troubleshooting and resolving the issue.

Understanding the Components Involved

  1. GStreamer: A pipeline-based multimedia framework that allows you to create complex multimedia workflows. It’s widely used for video and audio processing.
  2. nvv4l2av1enc: A GStreamer plugin for encoding video in AV1 format using NVIDIA hardware acceleration. It utilizes the V4L2 (Video for Linux 2) API, which provides a standard interface for video capture and output devices.
  3. V4L2: A Linux kernel module for video capture and output. It’s crucial for interacting with video devices, including those that provide hardware acceleration for video encoding and decoding.
  4. Jetson Orin NX: A series of NVIDIA Jetson modules designed for AI computing and edge devices. They come with a Linux distribution (L4T) that supports various multimedia functionalities, including hardware-accelerated video encoding and decoding.

Potential Causes of the Memory Leak

  1. Incompatible or Outdated Software: Ensure that your L4T version (36.4.0) and GStreamer version (1.24.8) are compatible and up to date. Memory leaks are sometimes resolved by updating to the latest versions of the software.
  2. Incorrect Usage of GStreamer Elements: The way you set up your GStreamer pipeline might be causing the memory leak. Double-check the pipeline configuration, especially how you handle the changing of video caps.
  3. Hardware Limitations: The Jetson Orin NX, like any hardware, has its limitations. Excessive changing of video caps might be pushing the hardware beyond its optimal operating conditions, leading to memory leaks.
  4. Driver Issues: The V4L2 driver or NVIDIA’s hardware acceleration drivers might have bugs or inefficiencies that cause memory leaks under certain conditions.

Troubleshooting Steps

  1. Update Software: Ensure your system, including L4T and GStreamer, is updated to the latest versions.
  2. Simplify the Pipeline: Try to simplify your GStreamer pipeline to the bare minimum required for your application. This can help isolate if the issue is with a specific element or configuration.
  3. Monitor System Resources: Use tools like top, htop, or sysdig to monitor system resources (CPU, memory, etc.) while running your application. This can provide insights into how resources are being utilized and if there’s a clear indication of a memory leak.
  4. Check for Known Issues: Look for known issues or bugs reported by the community or NVIDIA that might be related to your problem. Sometimes, workarounds or patches are available.
  5. Debugging Tools: Utilize GStreamer’s debugging tools and options (such as the GST_DEBUG environment variable) to get more detailed logs about what’s happening within your pipeline.
  6. Contact NVIDIA Support: If the issue persists and you’ve tried all troubleshooting steps, consider reaching out to NVIDIA’s support forums or official support channels. They might be able to provide more specific guidance or patches for known issues.
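Regarding step 5, GST_DEBUG takes a global log level plus optional per-category overrides. As a small sketch, the helper below composes such a value; the specific choice (warnings globally, verbose level 6 for the `v4l2*` categories) is an illustrative assumption, not a recommendation from this thread:

```rust
/// Compose a GST_DEBUG value: a global log level plus per-category overrides,
/// e.g. run the repro as `GST_DEBUG=2,v4l2*:6 cargo run 2> gst.log`.
fn gst_debug_value(global: u8, overrides: &[(&str, u8)]) -> String {
    let mut value = global.to_string();
    for (category, level) in overrides {
        // GST_DEBUG entries are comma-separated "category:level" pairs;
        // wildcards like "v4l2*" are standard GST_DEBUG syntax.
        value.push_str(&format!(",{category}:{level}"));
    }
    value
}

fn main() {
    println!("{}", gst_debug_value(2, &[("v4l2*", 6)])); // prints 2,v4l2*:6
}
```

Setting the variable in the shell before launching the repro should surface detailed per-element logging from the V4L2 encoder elements.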

Comparison with Other Encoders

You’ve mentioned that the issue also occurs with the H.264 HW encoder. This suggests that the problem might not be specific to the AV1 encoder but could be related to how GStreamer or the V4L2 API handles changes in video caps, or it could be a more general issue with the Jetson Orin NX’s hardware acceleration under L4T.

Comparing the behavior of different encoders (like nvv4l2h264enc for H.264) might provide additional insights. If the memory leak is consistent across different encoders, it points more towards a systemic issue rather than a problem specific to the nvv4l2av1enc component.
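To compare encoders in the repro above, only the factory name passed to `ElementFactory::make` needs to change. A tiny helper sketch; note that only nvv4l2av1enc and nvv4l2h264enc appear in this thread, so the H.265 name is an assumption to verify with gst-inspect-1.0:

```rust
/// Map a codec name to the corresponding Jetson V4L2 encoder factory, so the
/// repro can be re-run against another hardware encoder by changing one string.
fn nv_encoder_factory(codec: &str) -> Option<&'static str> {
    match codec {
        "av1" => Some("nvv4l2av1enc"),   // encoder from the repro
        "h264" => Some("nvv4l2h264enc"), // H.264 HW encoder mentioned above
        "h265" => Some("nvv4l2h265enc"), // assumed name; verify with gst-inspect-1.0
        _ => None,
    }
}

fn main() {
    // The returned name is what the repro's ElementFactory::make() would take.
    println!("{:?}", nv_encoder_factory("h264")); // prints Some("nvv4l2h264enc")
}
```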

In conclusion, resolving the memory leak issue will likely require a combination of updating software, carefully examining the GStreamer pipeline configuration, monitoring system resources, and potentially working with NVIDIA support to address any underlying bugs or limitations in the hardware or software stack.

*** Please give the thumbs up if you get answers you like. Or provide feedback to help us improve the answer. ***

Hi,
Runtime resolution change is not supported. Please destroy the current pipeline and re-initialize it with new resolution.

Hi @DaneLLL, what do you mean exactly by not supported? Resolution and framerate changes work fine during runtime; it’s just that they result in a memory leak. Destroying and re-initializing the pipeline is not really a viable option for us. The product we work on relies heavily on upscaling/downscaling/sampling of the video stream. We don’t seem to be the only people with this problem. Is there any possibility of having someone look into this?

@DaneLLL, friendly ping!

Hi,
Runtime resolution change is not validated, so it may have potential issues such as memory leak or instability.

Hi @DaneLLL, long shot, but this is a very important feature for us. Is there any way this can be validated, or can we request it as a feature?

Hi emiel4,

In my experience, a video file with multiple resolutions is not a common thing, so encoders are designed to encode frames at the same resolution throughout the encoding session.

Have you tried a different solution? If you can provide us with some details about your use case, I will try to help you find a more viable option.

Hi sarper,
It’s not really a video file. We have an application that livestreams over WebRTC; depending on the available bandwidth, we upscale/downscale/upsample/downsample the video stream (our mitigations). So the video is encoded according to the available bitrate plus these mitigations.

I don’t have experience with this specific use case; however, I have an idea.

I think you need a slightly more complex solution where you build multiple pipelines that run together. You would need an input pipeline, multiple encoding pipelines (one per resolution), and an output pipeline where you can adjust the fps with the videorate element.

This seems to me the best option going forward, but you need some implementation to switch between pipelines, and you need to make sure buffers (frames) are transferred between pipelines seamlessly. There may be more constraints that you encounter during implementation.
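As a sketch of the tier-selection part of such a design, assuming one pre-built encoding pipeline per resolution tier; the bitrate thresholds below are illustrative assumptions, reusing the two resolutions from the repro:

```rust
/// Pick a (width, height, fps) tier from the measured bandwidth in kbit/s,
/// so the application can route frames to the matching pre-built encoding
/// pipeline instead of renegotiating caps on a single encoder.
fn pick_tier(available_kbps: u32) -> (i32, i32, i32) {
    match available_kbps {
        0..=1500 => (960, 768, 7),      // low tier: downscaled + downsampled
        1501..=4000 => (1920, 1536, 7), // mid tier: full size, reduced fps
        _ => (1920, 1536, 15),          // full tier from the repro
    }
}

fn main() {
    println!("{:?}", pick_tier(1000)); // prints (960, 768, 7)
}
```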

If you have more questions about design choices or how-to’s, let me know.


Hi,
Please apply the solution to enable runtime resolution change:
Memory Leak in NVENC encoders on Jetson Orin NX/AGX for jetpack 6.2 [L4T 36.4.3] - #15 by DaneLLL

Hi @DaneLLL, we just validated this; it works like a charm! Thank you very much!


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.