1. When using the JETSON_MULTIMEDIA_API, setMaxPerfMode(1) is set during encoding, but it does not appear to take effect: the encoder power fluctuates whether or not it is configured. The power mode has already been set to MAXN. (A sketch of the call site is included after these questions for reference.)
2. How can we make the encoder always run at maximum power to reduce encoding latency?
3. Why is the latency when encoding a severely distorted image significantly higher than for an undistorted image? For the same 1080p resolution, encoding an undistorted image takes about 15 ms, while encoding a fisheye image takes about 110 ms.
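For reference, here is a minimal sketch of how setMaxPerfMode(1) is typically wired in, modeled on the 01_video_encode sample. The resolution and pixel formats are placeholder assumptions, and buffer setup, the encode loop, and most error handling are elided:

```cpp
// Minimal sketch (modeled on 01_video_encode) of where setMaxPerfMode(1)
// is normally called: on the NvVideoEncoder, after the plane formats are
// set and before buffers are requested and streaming starts.
#include <iostream>
#include <linux/videodev2.h>
#include "NvVideoEncoder.h"

int main()
{
    NvVideoEncoder *enc = NvVideoEncoder::createVideoEncoder("enc0");
    if (!enc)
    {
        std::cerr << "Could not create encoder" << std::endl;
        return -1;
    }

    // Plane formats first, as in the sample (1080p H.264 assumed here).
    enc->setCapturePlaneFormat(V4L2_PIX_FMT_H264, 1920, 1080, 2 * 1024 * 1024);
    enc->setOutputPlaneFormat(V4L2_PIX_FMT_YUV420M, 1920, 1080);

    // Request maximum NVENC clock for this session (what --max-perf does in
    // the sample). Issued before requesting buffers / starting the streams.
    if (enc->setMaxPerfMode(1) < 0)
        std::cerr << "Error while setting encoder to max perf mode" << std::endl;

    // ... setupPlane() for both planes, setStreamStatus(true), and the
    // qBuffer/dqBuffer encode loop would follow here ...

    delete enc;
    return 0;
}
```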
This is the tegrastats log. Is the encoder showing 99% utilization because there is no frame-rate limit when reading input images from the file? The difference from 01_video_encode is that my project feeds images at 30 fps.
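For illustration, a minimal sketch of what pacing file input to a fixed 30 fps looks like; read_frame() and feed_to_encoder() are hypothetical stand-ins for the project's real file I/O and the encoder's output-plane queue calls:

```cpp
// Sketch: throttle file-based input to ~30 fps so frames are not fed to the
// encoder as fast as the file can be read.
#include <chrono>
#include <thread>

static int frames_left = 300;              // stand-in for "frames in the file"
static bool read_frame() { return frames_left-- > 0; }
static void feed_to_encoder() { /* e.g. qBuffer on the encoder output plane */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const auto frame_interval = std::chrono::microseconds(1000000 / 30);

    auto next_deadline = clock::now();
    while (read_frame())
    {
        feed_to_encoder();

        // Sleep until the next 30 fps deadline instead of feeding frames
        // back-to-back.
        next_deadline += frame_interval;
        std::this_thread::sleep_until(next_deadline);
    }
    return 0;
}
```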
I may not have expressed myself clearly. When I feed images to the encoder at 30 fps, the encoder power fluctuates and the encoding latency is not at its minimum. When I feed images at 10 fps, the encoding latency is even longer than at 30 fps.
Is it possible to keep the encoder running at full power so that encoding latency stays at a stable minimum regardless of the input?
Hi,
NVENC is up and running at the maximum clock, so setting --max-perf should take effect. You may run the following script to enable most hardware engines at maximum performance: