Hello, is NvEGLRenderer performing YUV to RGB color space conversion?

Hello,

In the 00_video_decode example, is NvEglRenderer performing YUV-to-RGB color space conversion?

If not, is the YUV data texture-mapped as-is in NvEglRenderer?

Thank you.

Hi,
NvEglRenderer supports NvBufferColorFormat_NV12 and NvBufferColorFormat_ABGR32. If you send NvBufferColorFormat_NV12 buffers to the renderer, the low-level display controller does the conversion.
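
For reference, here is a minimal sketch (not the exact sample code) of handing a decoded NV12 dmabuf fd to the renderer. The window name and frame dimensions below are placeholder assumptions; in the real 00_video_decode sample they come from the decoder:

```cpp
#include "NvEglRenderer.h"  // from jetson_multimedia_api

// Hypothetical dimensions; the real sample takes them from the decoder caps.
static const int WIDTH  = 1920;
static const int HEIGHT = 1080;

int render_one_frame(int nv12_dmabuf_fd)
{
    // Create an on-screen EGL window at (0, 0).
    NvEglRenderer *renderer =
        NvEglRenderer::createEglRenderer("renderer0", WIDTH, HEIGHT, 0, 0);
    if (!renderer)
        return -1;

    // The renderer takes the dmabuf fd directly. For NV12 buffers the
    // YUV-to-RGB conversion happens downstream in the display hardware,
    // not in application code.
    renderer->render(nv12_dmabuf_fd);

    delete renderer;
    return 0;
}
```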

What is the low-level display controller?

If I don’t use NvEglRenderer and instead use GLFW and GLEW on my own, can I still use the low-level display controller?

Isn’t the low-level display controller /nvhost-vic?

Thank you.

Before calling NvBufferTransform, I forced the output_pixformat of the decoded buffer to ARGB so that the transform produces a buffer with RGB values.
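
For context, this is roughly what that conversion step looks like with nvbuf_utils. It is only a sketch: the create/transform parameter values here are assumptions, and the real sample derives width and height from the decoder's capture plane.

```cpp
#include "nvbuf_utils.h"
#include <cstring>

// Allocate an RGBA-ordered destination buffer (ABGR32) and blit the decoded
// buffer into it. decoded_fd, width and height are assumed to come from the
// decoder's capture plane.
int convert_to_rgba(int decoded_fd, int width, int height, int *out_fd)
{
    NvBufferCreateParams create_params;
    memset(&create_params, 0, sizeof(create_params));
    create_params.width = width;
    create_params.height = height;
    create_params.layout = NvBufferLayout_Pitch;
    create_params.colorFormat = NvBufferColorFormat_ABGR32;
    create_params.payloadType = NvBufferPayload_SurfArray;
    create_params.nvbuf_tag = NvBufferTag_NONE;

    if (NvBufferCreateEx(out_fd, &create_params) != 0)
        return -1;

    NvBufferTransformParams transform_params;
    memset(&transform_params, 0, sizeof(transform_params));
    transform_params.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    transform_params.transform_filter = NvBufferTransform_Filter_Smart;

    // The color conversion runs on the VIC hardware engine.
    return NvBufferTransform(decoded_fd, *out_fd, &transform_params);
}
```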


Is it possible to create an RGB buffer like this and render it using OpenGL? (without using NvEglRenderer)

Thank you.

Hello,

① Set the output pixel format to ARGB32

② In ARGB32 format, dump_dmabuf is called only once (presumably because ARGB32 is a single-plane format, so the per-plane loop runs once).

③ In ARGB32 format, write the data to the file (see the sketch below).

The file written that way contains RGB pixel values.
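
For reference, the write step can look roughly like the dump_dmabuf helper in the samples. This is only a sketch, assuming a single-plane, pitch-linear ABGR32/ARGB32 buffer:

```cpp
#include "nvbuf_utils.h"
#include <fstream>

// Map plane 0 of a pitch-linear ABGR32 DMA buffer to the CPU and write it
// row by row, skipping the pitch padding at the end of each row.
int dump_rgba_dmabuf(int dmabuf_fd, std::ofstream &out)
{
    NvBufferParams params;
    if (NvBufferGetParams(dmabuf_fd, &params) != 0)
        return -1;

    void *psrc_data = NULL;
    if (NvBufferMemMap(dmabuf_fd, 0, NvBufferMem_Read, &psrc_data) != 0)
        return -1;

    // Make sure the CPU view is coherent with what the hardware wrote.
    NvBufferMemSyncForCpu(dmabuf_fd, 0, &psrc_data);

    for (unsigned int row = 0; row < params.height[0]; ++row)
    {
        out.write((char *)psrc_data + row * params.pitch[0],
                  params.width[0] * 4 /* 4 bytes per ABGR32 pixel */);
    }

    NvBufferMemUnMap(dmabuf_fd, 0, &psrc_data);
    return 0;
}
```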

Is it okay to do this?

Thank you.

Hi,
The display controller is a hardware block. Please check CHAPTER 24: DISPLAY CONTROLLER in the TX1 TRM:
https://developer.nvidia.com/embedded/downloads#?search=trm

On Jetson platforms, pure OpenGL may not work properly. We suggest using EGL as demonstrated in NvEglRenderer.
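
For anyone trying the route without NvEglRenderer, here is a sketch of the EGL interop path: wrap the dmabuf fd in an EGLImage with NvEGLImageFromFd and bind it to an external GL texture. An initialized EGLDisplay and a current GL ES context are assumed, and this is not the exact code inside NvEglRenderer:

```cpp
#include "nvbuf_utils.h"   // NvEGLImageFromFd / NvDestroyEGLImage
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Wrap a DMA buffer in an EGLImage and bind it to an external GL texture.
GLuint texture_from_dmabuf(EGLDisplay egl_display, int dmabuf_fd)
{
    EGLImageKHR egl_image = NvEGLImageFromFd(egl_display, dmabuf_fd);
    if (egl_image == EGL_NO_IMAGE_KHR)
        return 0;

    static PFNGLEGLIMAGETARGETTEXTURE2DOESPROC glEGLImageTargetTexture2DOES =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
    // The texture samples directly from the DMA buffer; no CPU copy is made.
    glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES, egl_image);

    // Destroy the EGLImage when the texture is no longer needed:
    // NvDestroyEGLImage(egl_display, egl_image);
    return tex;
}
```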

Is the above process the correct way to write the RGB data to a file?

Are you sure that the above psrc_data contains data that has been successfully decoded and converted to RGB?

If we allow our customers to use RGB data decoded on the Nano, can we provide it through the above process?

Is this the DISPLAY CONTROLLER? (I have circled it in red in the Tegra X1 block diagram.)

Hi,
Yes, Figure 1: Tegra X1 Processor Block Diagram shows all the hardware blocks, and the red circle is the display controller.
