OpenMAX API for Rendering on TX2

I’m trying to render video from a camera with minimal latency on the TX2 L4T 32.1. I have gotten my latency to around 85 ms using the Argus pipeline with EGL for rendering, but I need to go lower.

My software matches the latency of the following gstreamer pipeline (as well as MMAPI sample 13):

gst-launch-1.0 nvarguscamerasrc ! nvegltransform ! nveglglessink

However, the following gstreamer pipeline is about 10-15 ms faster:

gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink

From what I can see, the main difference is the rendering method. In the faster pipeline, nvoverlaysink appears to render using OpenMAX. This leads me to believe I could get better latency by using the OpenMAX API in place of EGL.

I don’t see any samples, nor does the OpenMAX API appear to come installed on the TX2. Is it possible for me to use this API? Perhaps somebody can explain the significant latency difference between those pipelines. In my code the NvEglRenderer->render call takes about 11 ms. Maybe there is a way to improve render time with EGL?


We support GStreamer and tegra_multimedia_api; the OpenMAX API is not supported.
If NvEglRenderer works well for your case, you can use tegra_multimedia_api in your implementation.
On r32.1, you may also try nv3dsink and nvdrmvideosink.
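For reference, the NvEglRenderer path looks roughly like the sketch below, following the pattern used in the tegra_multimedia_api samples. The window name, resolution, and offsets are illustrative values, and the code assumes the Jetson L4T headers and a valid DMA buffer fd from the capture pipeline, so it only builds on-target:

```cpp
// Minimal sketch: render one dmabuf frame with NvEglRenderer
// (tegra_multimedia_api). Values and error handling are illustrative.
#include "NvEglRenderer.h"

int render_frame(int dmabuf_fd)
{
    // Create a 1920x1080 render window named "renderer0" at offset (0, 0).
    NvEglRenderer *renderer =
        NvEglRenderer::createEglRenderer("renderer0", 1920, 1080, 0, 0);
    if (!renderer)
        return -1;

    renderer->setFPS(30);        // optional: pace presentation
    renderer->render(dmabuf_fd); // queue the captured frame for display

    delete renderer;
    return 0;
}
```

In a real capture loop the renderer would be created once and render() called per frame; creating it per frame as above is only to keep the sketch self-contained.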
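The suggested sinks can be tried quickly from the command line. These pipelines are illustrative and need a Jetson with a camera attached; nvdrmvideosink in particular must run from a console outside X:

```shell
# Render through the 3D pipeline
gst-launch-1.0 nvarguscamerasrc ! nv3dsink

# Render directly via DRM/KMS (run without X)
gst-launch-1.0 nvarguscamerasrc ! nvdrmvideosink
```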

Thanks. I was hoping I could use the OpenMAX API since the render time seemed a bit better, but it looks like I’ll be working with NvEglRenderer or EGL + OpenGL.