How to correctly use the nvvidconv GStreamer plugin with GRAY16 video?

Video was streamed with the following GStreamer pipeline:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw,format=GRAY16_LE,width=1024,height=768,framerate=50/1' ! videoconvert ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! rtph264pay name=pay0 ! udpsink host=10.0.21.125 port=5004

The stream was viewed on a PC with VLC, but the image was darker (lower brightness/contrast) than it comes from /dev/video0. It is probably an incorrect conversion from grayscale to color.

Hi,
The hardware converter does not support GRAY16_LE. You may try this command and check whether the YUV420 data is good:

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=66 ! 'video/x-raw,format=GRAY16_LE,width=1024,height=768,framerate=50/1' ! videoconvert ! video/x-raw,format=I420 ! multifilesink location=dump_%03d.yuv
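
To check the dumped frames, one option is a short OpenCV program that loads one raw I420 frame and converts it to BGR for viewing. This is only a sketch, assuming 1024x768 frames and the dump_000.yuv file name produced by the command above:

// inspect_i420.cpp - sketch: view one raw I420 frame dumped by multifilesink
// (assumes 1024x768 frames and the dump_000.yuv file name from the pipeline above)
#include <opencv2/opencv.hpp>
#include <fstream>
#include <vector>
#include <iostream>

int main()
{
    const int W = 1024, H = 768;
    const size_t i420_size = W * H * 3 / 2;   // Y plane plus quarter-size U and V planes

    std::ifstream f("dump_000.yuv", std::ios::binary);
    std::vector<unsigned char> buf(i420_size);
    if (!f.read(reinterpret_cast<char*>(buf.data()), buf.size()))
    {
        std::cerr << "Could not read dump_000.yuv" << std::endl;
        return -1;
    }

    // Wrap the raw I420 data as a (H*3/2) x W single-channel Mat and convert to BGR.
    cv::Mat i420(H * 3 / 2, W, CV_8UC1, buf.data());
    cv::Mat bgr;
    cv::cvtColor(i420, bgr, cv::COLOR_YUV2BGR_I420);

    cv::imwrite("dump_000.png", bgr);         // save as PNG for easy inspection on a PC
    return 0;
}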

If the dumped data looks as expected and is not darker, the following command should work:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw,format=GRAY16_LE,width=1024,height=768,framerate=50/1' ! videoconvert ! video/x-raw,format=I420 ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! rtph264pay name=pay0 ! udpsink host=10.0.21.125 port=5004

Please give it a try.
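
On the PC side, instead of VLC, a short OpenCV receiver can also be used to inspect the decoded frames. This is only a sketch, assuming OpenCV with GStreamer support and the libav H.264 decoder plugin on the PC, and the sender defaults from the pipeline above (port 5004, payload type 96):

// pc_receiver.cpp - sketch: view the H.264/RTP stream on the PC with OpenCV
#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>

int main()
{
    const std::string pipeline =
        "udpsrc port=5004 caps=\"application/x-rtp,media=video,"
        "encoding-name=H264,payload=96\" ! rtph264depay ! h264parse ! "
        "avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1";

    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened())
    {
        std::cerr << "Failed to open the receive pipeline" << std::endl;
        return -1;
    }

    cv::Mat frame;
    while (cap.read(frame))
    {
        cv::imshow("received", frame);   // compare visually against the raw /dev/video0 image
        if (cv::waitKey(1) == 27)        // ESC to quit
            break;
    }
    return 0;
}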

For the first pipeline the YUV420 data is good.
But running the second one gives a "redscale" image: the brightness is OK, but in places where it should be white it is red.

Hi,
Please try with videotestsrc and check if the issue is present.

gst-launch-1.0 -v videotestsrc is-live=1 ! 'video/x-raw,format=GRAY16_LE,width=1024,height=768,framerate=50/1' ! videoconvert ! video/x-raw,format=I420 ! nvvidconv ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 insert-vui=1 ! rtph264pay name=pay0 ! udpsink host=10.0.21.125 port=5004

And please share the release version you are using.

With videotestsrc the issue is not present.

The board was flashed with JetPack 5.0.1.

$ cat /etc/nv_tegra_release
R35 (release), REVISION: 1.0, GCID: 31346300, BOARD: t186ref, EABI: aarch64, DATE: Thu Aug 25 18:41:45 UTC 2022

$ v4l2-ctl --device /dev/video0 --all

Driver Info:
        Driver name      : test
        Card type        : V4L2 PCI test
        Bus info         : PCI: testTV
        Driver version   : 5.10.104
        Capabilities     : 0x85200001
                Video Capture
                Read/Write
                Streaming
                Extended Pix Format
                Device Capabilities
        Device Caps      : 0x05200001
                Video Capture
                Read/Write
                Streaming
                Extended Pix Format
Priority: 2
Video input : 0 (testTV: ok)
Format Video Capture:
        Width/Height      : 1024/768
        Pixel Format      : 'Y16 ' (16-bit Greyscale)
        Field             : None
        Bytes per Line    : 2048
        Size Image        : 1572864
        Colorspace        : Raw
        Transfer Function : Default (maps to None)
        YCbCr/HSV Encoding: Default (maps to ITU-R 601)
        Quantization      : Default (maps to Full Range)
        Flags             :
Streaming Parameters Video Capture:
        Frames per second: invalid (0/0)
        Read buffers     : 0

Hi,
It seems specific to the v4l2 source. Please execute the command and share the YUV files for reference:

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=10 ! 'video/x-raw,format=GRAY16_LE,width=1024,height=768,framerate=50/1' ! videoconvert ! video/x-raw,format=I420 ! multifilesink location=dump%03d.yuv

DaneLLL, here are the raw YUV files plus the corresponding images in PNG format.
test_view.zip (3.1 MB)

To grab the images, the V4L2 subsystem was used together with cv::VideoWriter and a GStreamer pipeline:

#include "cap_cv4l2.h"
#include "opencv2/opencv.hpp"
#include <opencv2/core/core.hpp>

using namespace cv;
using namespace std;

VideoCapture cap{"videotestsrc ! appsink", CAP_GSTREAMER };

VideoWriter out("appsrc ! videoconvert ! nvvidconv ! nvv4l2h264enc ! rtph264pay name=pay0  ! udpsink host=10.0.21.125 port=5004", CAP_GSTREAMER, 0, 50, Size(1024, 768), true);

#define VIDEO_WIDTH  1024
#define VIDEO_HEIGHT 768

static cv::Mat src_image(cv::Size(VIDEO_WIDTH, VIDEO_HEIGHT), CV_16UC1);
static cv::Mat pp_image(cv::Size(VIDEO_WIDTH, VIDEO_HEIGHT), CV_8UC1);
static cv::Mat dst_image(cv::Size(VIDEO_WIDTH, VIDEO_HEIGHT), CV_8UC3);
//-------------------------------------------------------------------------------------------------------------------------------------
int main( int argc, char** argv )
{
    open_device("/dev/video0");
  
    if(!cap.isOpened()) 
    {
        cout << "Videocapture not opened" << endl;
        exit(-1);
    }
    
    for(;;)
    {
        read_frame(src_image, 1);

        src_image.convertTo(pp_image, CV_8UC1, 1.0/256);
        cvtColor(pp_image, dst_image, cv::COLOR_GRAY2BGR);     
           
        out.write(dst_image);       
    
    }       
}

The image produced with this code is OK. It seems the videoconvert plugin handles my video source incorrectly.

Hi,
Is it possible that the source is in GRAY16_BE? Probably the color is wrong due to the wrong big or little endian byte order.

As an experiment we swapped the most significant byte with the least significant one. As a result the video image was ruined, which proves that the source endianness is fine.
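
For reference, a minimal sketch of such a byte-swap test; the frame here is synthetic and only illustrative, in the real test the swap was applied to frames read from /dev/video0:

// endian_test.cpp - sketch: swap the two bytes of every 16-bit sample to test endianness
#include <opencv2/opencv.hpp>
#include <cstdint>

// Swap high and low byte of every pixel in a CV_16UC1 frame, in place.
static void swap_gray16_bytes(cv::Mat& frame)
{
    CV_Assert(frame.type() == CV_16UC1);
    for (int y = 0; y < frame.rows; ++y)
    {
        uint16_t* row = frame.ptr<uint16_t>(y);
        for (int x = 0; x < frame.cols; ++x)
            row[x] = static_cast<uint16_t>((row[x] >> 8) | (row[x] << 8));
    }
}

int main()
{
    // Illustrative frame; in the real test this is a GRAY16 frame from the camera.
    cv::Mat frame(cv::Size(1024, 768), CV_16UC1, cv::Scalar(1000));
    swap_gray16_bytes(frame);
    // If the image looks ruined after the swap, the original little-endian
    // interpretation of the source was correct.
    return 0;
}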

Hi,
You may consider using OpenCV since it works fine. If you would like to link v4l2src ! videoconvert, please check why the videoconvert plugin cannot convert the source to I420 correctly. You may dump the frames generated by videotestsrc for comparison.
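
One possible way to narrow this down is to check the chroma planes of the frames dumped earlier: for a pure grayscale source converted to I420, the U and V planes should be uniform at about 128, and any other values would explain a color cast. A minimal sketch, assuming 1024x768 frames and the dump000.yuv file name from the earlier command:

// check_chroma.cpp - sketch: inspect the U/V planes of a dumped I420 frame
#include <opencv2/core.hpp>
#include <fstream>
#include <vector>
#include <iostream>

int main()
{
    const int W = 1024, H = 768;
    const size_t y_size = W * H;                 // Y plane
    const size_t c_size = (W / 2) * (H / 2);     // each chroma plane

    std::ifstream f("dump000.yuv", std::ios::binary);
    std::vector<unsigned char> buf(y_size + 2 * c_size);
    if (!f.read(reinterpret_cast<char*>(buf.data()), buf.size()))
    {
        std::cerr << "Could not read dump000.yuv" << std::endl;
        return -1;
    }

    cv::Mat u(H / 2, W / 2, CV_8UC1, buf.data() + y_size);
    cv::Mat v(H / 2, W / 2, CV_8UC1, buf.data() + y_size + c_size);

    double umin, umax, vmin, vmax;
    cv::minMaxLoc(u, &umin, &umax);
    cv::minMaxLoc(v, &vmin, &vmax);

    // For a grayscale source both planes should be ~128 everywhere.
    std::cout << "U mean " << cv::mean(u)[0] << " range [" << umin << ", " << umax << "]\n"
              << "V mean " << cv::mean(v)[0] << " range [" << vmin << ", " << vmax << "]\n";
    return 0;
}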

Yes, DaneLLL. We will work around this issue with the help of OpenCV.
Thank you for your assistance.
