Is it possible to convert from YUV444 to I420 using NvBufferTransform?

Hi,

We have a MIPI-CSI capture device which gives us an AYUV image (bits 31-24 alpha, bits 23-16 Y, bits 15-8 U, bits 7-0 V). We have been able to capture it and now need to transform it into something nvv4l2h265enc can manage.

Is the layout of NvBufferColorFormat_YUV444 the same as what we are receiving? Can NvBufferTransform transform it into I420 or NV12?

We have tried adding GST_VIDEO_FORMAT_AYUV/NvBufferColorFormat_YUV444 to nvvidconv but calling NvBufferTransform results in:

Caught SIGSEGV
#0  0x0000007f9790fa04 in poll () from /lib/libc.so.6
#1  0x0000007f97a2d54c in ?? () from /usr/lib/libglib-2.0.so.0
#2  0x0000007f97a2d924 in g_main_loop_run () from /usr/lib/libglib-2.0.so.0
#3  0x0000007f97bc3b0c in gst_bus_poll () from /usr/lib/libgstreamer-1.0.so.0
#4  0x000000559071ea08 in ?? ()
#5  0x000000559071d878 in ?? ()
#6  0x0000007f97872110 in __libc_start_main () from /lib/libc.so.6
#7  0x000000559071df18 in ?? ()
Spinning.  Please run 'gdb gst-launch-1.0 6777' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

Is this caused by NvBufferTransform not being able to transform YUV444 to I420 or NV12, or maybe we are doing something wrong before calling NvBufferTransform?

Best regards, Dani.

hello doquiros,

may I know what transform parameters you used?
you may also check the MMAPI samples, e.g. 01_video_encode, for reference. Thanks.

We are using these params.

transform_params.transform_flag = 0;
transform_params.transform_filter = NvBufferTransform_Filter_Nearest; // I think this is ignored because transform_flag = 0
transform_params.transform_flip = NvBufferTransform_None;             // I think this is ignored because transform_flag = 0
transform_params.src_rect.top = 0;
transform_params.src_rect.left = 0;
transform_params.src_rect.width = 0;
transform_params.src_rect.height = 0;
transform_params.dst_rect.top = 0;
transform_params.dst_rect.left = 0;
transform_params.dst_rect.width = 0;
transform_params.dst_rect.height = 0;
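
For context, the generic pattern we understood from nvbuf_utils.h for a full-frame transform is sketched below: create the destination buffer in the target colour format and call NvBufferTransform with only the filter flag set. This is only a simplified sketch, not our actual code; convert_to_nv12, the dmabuf fds and the dimensions are placeholders, and we are not claiming this works for the YUV444 source in question.

/* Rough sketch based on nvbuf_utils.h (Jetson Multimedia API).
 * src_dmabuf_fd, width and height come from our capture setup. */
#include <string.h>
#include "nvbuf_utils.h"

static int convert_to_nv12(int src_dmabuf_fd, int width, int height, int *dst_dmabuf_fd)
{
    NvBufferCreateParams create_params;
    NvBufferTransformParams transform_params;

    /* Destination buffer in the format the encoder accepts (NV12). */
    memset(&create_params, 0, sizeof(create_params));
    create_params.width = width;
    create_params.height = height;
    create_params.payloadType = NvBufferPayload_SurfArray;
    create_params.layout = NvBufferLayout_Pitch;
    create_params.colorFormat = NvBufferColorFormat_NV12;
    create_params.nvbuf_tag = NvBufferTag_VIDEO_CONVERT;
    if (NvBufferCreateEx(dst_dmabuf_fd, &create_params) != 0)
        return -1;

    /* Full-frame conversion: only the filter flag is set, no flip/crop. */
    memset(&transform_params, 0, sizeof(transform_params));
    transform_params.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    transform_params.transform_filter = NvBufferTransform_Filter_Smart;

    return NvBufferTransform(src_dmabuf_fd, *dst_dmabuf_fd, &transform_params);
}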

After looking at the examples in tegra_multimedia_api, I think that NvBufferColorFormat_YUV444 is a planar format, with one plane holding the frame's Y samples, another the U samples, and a last one the V samples. So it seems that it's not possible to do the conversion from our packed AYUV input using NvBufferTransform.
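
If we ever had to feed a planar YUV444 buffer ourselves, the CPU repacking would look roughly like the sketch below. It only shows the bit unpacking described above; it writes tightly packed planes and ignores the per-plane pitch a real NvBuffer surface would have (that would need to come from NvBufferGetParams), so treat it as an illustration, not working NvBuffer code.

#include <stdint.h>
#include <stddef.h>

/* Unpack 32-bit AYUV words (bits 31-24 A, 23-16 Y, 15-8 U, 7-0 V)
 * into three full-resolution planes, as a planar YUV444 layout expects.
 * Plane pitch is assumed equal to the width here. */
static void ayuv_to_yuv444_planar(const uint32_t *ayuv,
                                  uint8_t *y_plane, uint8_t *u_plane, uint8_t *v_plane,
                                  size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++) {
        uint32_t px = ayuv[i];
        y_plane[i] = (px >> 16) & 0xFF; /* Y */
        u_plane[i] = (px >>  8) & 0xFF; /* U */
        v_plane[i] =  px        & 0xFF; /* V */
        /* Alpha (bits 31-24) is dropped. */
    }
}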

For the moment we will use GStreamer's videoconvert. But for the future I would like to know: is nvarguscamerasrc able to convert our 32-bit packed AYUV input to I420 or another format compatible with nvvidconv (maybe NV24 or RGB888)?

Thanks.
Best regards, Dani.

hello doquiros,

please refer to 12_camera_v4l2_cuda to capture frames via V4L2 and use the CUDA engine to process the buffer.
