Green tint video

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) TX2
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4.1 [L4T 32.4.4]
• TensorRT Version 7.1.3.0

Hi!
I am trying to save video with a GStreamer pipeline.
If I use videoconvert I get normal video:
gst-launch-1.0 pylonsrc imageformat=Mono8 ! videoconvert ! omxh264enc ! matroskamux ! filesink location=/mnt/detections/ex.mkv


But if I use nvvideoconvert I get a green-tinted video:
gst-launch-1.0 pylonsrc imageformat=Mono8 ! nvvideoconvert ! omxh264enc ! matroskamux ! filesink location=/mnt/detections/ex.mkv

What could be the problem?

Hi,
We have deprecated the omx plugins. Please use nvv4l2h264enc instead, and try nvvidconv:

gst-launch-1.0 pylonsrc imageformat=Mono8 ! nvvidconv ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=/mnt/detections/ex.mkv

Hi, @DaneLLL! Thanks for the answer.
I want to use this pipeline in my DS app.
If I use nvvidconv I get the error gst_nvvconv_transform: NvBufferTransform not supported.
If I use nvvideoconvert I get a green-tinted video.

If I run my DS app with this pipeline:

pylonsrc -> nvvideoconvert -> capsfilter -> nvstreammux -> nvinfer -> nvvideoconvert -> capsfilter -> omxh264enc -> matroskamux -> tcpserversink

I get video with a green tint and a resolution of 3640x2048 instead of 2448x2048.

#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>
#include "gstnvdsmeta.h"

#define MAX_DISPLAY_LEN 64

#define PGIE_CLASS_ID_VEHICLE 0
#define PGIE_CLASS_ID_PERSON 2

/* The muxer output resolution must be set if the input streams will be of
 * different resolution. The muxer will scale all the input frames to this
 * resolution. */
#define MUXER_OUTPUT_WIDTH 1920
#define MUXER_OUTPUT_HEIGHT 1080

/* Muxer batch formation timeout, for e.g. 40 millisec. Should ideally be set
 * based on the fastest source's framerate. */
#define MUXER_BATCH_TIMEOUT_USEC 40000

gint frame_number = 0;
gchar pgie_classes_str[4][32] = { "Vehicle", "TwoWheeler", "Person",
  "Roadsign"
};

static gboolean
bus_call (GstBus * bus, GstMessage * msg, gpointer data)
{
  GMainLoop *loop = (GMainLoop *) data;
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_EOS:
      g_print ("End of stream\n");
      g_main_loop_quit (loop);
      break;
    case GST_MESSAGE_ERROR:{
      gchar *debug;
      GError *error;
      gst_message_parse_error (msg, &error, &debug);
      g_printerr ("ERROR from element %s: %s\n",
          GST_OBJECT_NAME (msg->src), error->message);
      if (debug)
        g_printerr ("Error details: %s\n", debug);
      g_free (debug);
      g_error_free (error);
      g_main_loop_quit (loop);
      break;
    }
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  GMainLoop *loop = NULL;
  GstElement *pipeline = NULL, *source = NULL, *h264parser = NULL,
      *decoder = NULL, *streammux = NULL, *sink = NULL, *pgie = NULL, *nvvidconv = NULL,
      *nvosd = NULL, *encoder = NULL, *muxer = NULL, *nvvidconv_sink = NULL, *parser = NULL;

  GstBus *bus = NULL;
  guint bus_watch_id;
  GstPad *osd_sink_pad = NULL;

  /* Standard GStreamer initialization */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* Create gstreamer elements */
  /* Create Pipeline element that will form a connection of other elements */
  pipeline = gst_pipeline_new ("dstest1-pipeline");

  encoder = gst_element_factory_make ("nvv4l2h264enc", "h264-encoder");
  if (!encoder) {
    g_printerr ("encoder could not be created. Exiting.\n");
    return -1;
  }
  muxer = gst_element_factory_make ("matroskamux", "matroskamux");
  if (!muxer) {
    g_printerr ("muxer could not be created. Exiting.\n");
    return -1;
  }
  parser = gst_element_factory_make ("h264parse", "h264parse");
  if (!parser) {
    g_printerr ("parser could not be created. Exiting.\n");
    return -1;
  }

  /* Source element for reading from the file */
  source = gst_element_factory_make ("pylonsrc", "file-source");
  g_object_set (G_OBJECT (source), "imageformat", "Mono8", NULL);

  /* These two elements are left over from the deepstream-test1 sample and
   * are not linked into this pipeline. */
  h264parser = gst_element_factory_make ("h264parse", "h264-parser");
  decoder = gst_element_factory_make ("nvv4l2decoder", "nvv4l2-decoder");

  /* Create nvstreammux instance to form batches from one or more sources. */
  streammux = gst_element_factory_make ("nvstreammux", "stream-muxer");

  if (!pipeline || !streammux) {
    g_printerr ("One element could not be created. Exiting.\n");
    return -1;
  }

  /* Use nvinfer to run inferencing on decoder's output,
   * behaviour of inferencing is set through config file */
  pgie = gst_element_factory_make ("nvinfer", "primary-nvinference-engine");

  /* Convert the camera output to NVMM memory for nvstreammux */
  nvvidconv = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter");
  /* Convert the inference output for the encoder */
  nvvidconv_sink = gst_element_factory_make ("nvvideoconvert", "nvvideo-converter_sink");

  /* OSD element (created here but not linked into this pipeline) */
  nvosd = gst_element_factory_make ("nvdsosd", "nv-onscreendisplay");

  sink = gst_element_factory_make ("tcpserversink", "nvvideo-renderer");
  g_object_set (G_OBJECT (sink), "host", "10.10.10.10", NULL);
  g_object_set (G_OBJECT (sink), "port", 8888, NULL);

  g_object_set (G_OBJECT (streammux), "batch-size", 1, NULL);

  g_object_set (G_OBJECT (streammux), "width", MUXER_OUTPUT_WIDTH, "height",
      MUXER_OUTPUT_HEIGHT,
      "batched-push-timeout", MUXER_BATCH_TIMEOUT_USEC, NULL);

  /* Set all the necessary properties of the nvinfer element,
   * the necessary ones are : */
  g_object_set (G_OBJECT (pgie),
      "config-file-path", "dstest1_pgie_config.txt", NULL);

  /* we add a message handler */
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
  gst_object_unref (bus);

  GstElement* filter_src = gst_element_factory_make ("capsfilter", "filter_src");
  GstCaps* caps_filter_src = gst_caps_from_string(
			"video/x-raw(memory:NVMM),"
			"format=NV12,"
			"width=2448,"
			"height=2048,"
			"framerate=35/1");
  g_object_set(G_OBJECT(filter_src), "caps", caps_filter_src, NULL);
  gst_caps_unref(caps_filter_src);

  GstElement* filter_sink = gst_element_factory_make("capsfilter", "filter_sink");
  GstCaps* caps_filter_sink = gst_caps_from_string(
  		"video/x-raw(memory:NVMM),"
  		"width=2448,"
  		"height=2048,"
  		"framerate=35/1");
  g_object_set(G_OBJECT(filter_sink), "caps", caps_filter_sink, NULL);
  gst_caps_unref(caps_filter_sink);

  GstPad *sinkpad, *srcpad;
  gchar pad_name_sink[16] = "sink_0";
  gchar pad_name_src[16] = "src";

  gst_bin_add_many (
		  GST_BIN (pipeline),
		  source,
		  nvvidconv,
		  filter_src,
		  streammux,
		  pgie,
		  nvvidconv_sink,
		  filter_sink,
		  encoder,
		  parser,
		  muxer,
		  sink, NULL);

  sinkpad = gst_element_get_request_pad (streammux, pad_name_sink);
  if (!sinkpad) {
    g_printerr ("Streammux request sink pad failed. Exiting.\n");
    return -1;
  }

  srcpad = gst_element_get_static_pad (filter_src, pad_name_src);
  if (!srcpad) {
    g_printerr ("Failed to get src pad of filter_src. Exiting.\n");
    return -1;
  }

  if (gst_pad_link (srcpad, sinkpad) != GST_PAD_LINK_OK) {
      g_printerr ("Failed to link filter_src to stream muxer. Exiting.\n");
      return -1;
  }

  gst_object_unref (sinkpad);
  gst_object_unref (srcpad);

  /* we link the elements together:
   * pylonsrc -> nvvideoconvert -> capsfilter -> nvstreammux -> nvinfer ->
   * nvvideoconvert -> capsfilter -> encoder -> parser -> muxer -> sink */

  if (!gst_element_link_many (source, nvvidconv, filter_src, NULL)) {
    g_printerr ("Elements could not be linked: 1a. Exiting.\n");
    return -1;
  }
  if (!gst_element_link_many (streammux, pgie, nvvidconv_sink, NULL)) {
    g_printerr ("Elements could not be linked: 1b. Exiting.\n");
    return -1;
  }
  if (!gst_element_link_many (nvvidconv_sink, filter_sink, encoder, parser, muxer, sink, NULL)) {
    g_printerr ("Elements could not be linked: 1c. Exiting.\n");
    return -1;
  }
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait till pipeline encounters an error or EOS */
  g_print ("Running...\n");
  g_main_loop_run (loop);

  /* Out of the main loop, clean up nicely */
  g_print ("Returned, stopping playback\n");
  gst_element_set_state (pipeline, GST_STATE_NULL);
  g_print ("Deleting pipeline\n");
  gst_object_unref (GST_OBJECT (pipeline));
  g_source_remove (bus_watch_id);
  g_main_loop_unref (loop);
  return 0;
}

Hi,
Is your source in GRAY8 format? We tried the pipeline and don't observe the issue:

$ gst-launch-1.0 videotestsrc num-buffers=100 ! video/x-raw,format=GRAY8 ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=a.mp4

The pylonsrc plugin looks specific to certain cameras. We would need your help to share a pipeline that reproduces the issue with generic plugins such as videotestsrc.

Hi,
Yes, my source is in GRAY8 format.

This pipeline works fine for me.

But if I change videotestsrc to pylonsrc I get a green-tinted video again:
gst-launch-1.0 pylonsrc imageformat=Mono8 ! video/x-raw,format=GRAY8 ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=a.mp4

pylonsrc is a plugin specific to Basler cameras. We are using the plugin from this repo.

If I add video/x-raw,format=GRAY8 ! videoconvert ! video/x-raw,format=NV12 before nvvideoconvert in my DS app, everything works fine:
gst-launch-1.0 pylonsrc imageformat=Mono8 ! "video/x-raw,format=GRAY8" ! videoconvert ! "video/x-raw,format=NV12" ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=a.mp4
Is there a more efficient solution?

Hi,
It looks like the UV plane has wrong values in this case. Please run:

gst-launch-1.0 pylonsrc imageformat=Mono8 ! video/x-raw,format=GRAY8 ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' ! dsexample ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=a.mp4

You can access the buffers in the dsexample plugin. Try applying the logic from nvbuff_do_clearchroma().

nvbuff_do_clearchroma() is in

/usr/src/jetson_multimedia_api/samples/12_camera_v4l2_cuda

and is implemented with the NvBuffer APIs. You can do the same with the NvBufSurface APIs.

Hi @DaneLLL
I was able to replicate the problem with videotestsrc. When the resolution is increased from 640x480 to 960x720, a green tint appears. This is not seen on x86 with DeepStream 5.1.

Pipeline for Jetson

gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,format=GRAY8 ! nvvideoconvert ! nvv4l2h264enc bitrate=4000000 maxperf-enable=1 preset-level=3 profile=4 ! h264parse ! matroskamux ! filesink location=640x480.mp4

works fine, but the pipeline

gst-launch-1.0 videotestsrc ! video/x-raw,width=960,height=720,format=GRAY8 ! nvvideoconvert ! nvv4l2h264enc bitrate=4000000 maxperf-enable=1 preset-level=3 profile=4 ! h264parse ! matroskamux ! filesink location=960x720.mp4

gives a green tint.

Pipeline for x86

gst-launch-1.0 videotestsrc ! video/x-raw,width=960,height=720,format=GRAY8 ! nvvideoconvert ! nvv4l2h264enc bitrate=4000000 profile=4 ! h264parse ! matroskamux ! filesink location=960x720.mp4

For our camera on Jetson

gst-launch-1.0 pylonsrc exposure=2000 autogain=continuous ! nvvideoconvert ! 'video/x-raw(memory:NVMM),width=612,height=512' ! nvv4l2h264enc bitrate=4000000 maxperf-enable=1 preset-level=3 profile=4 ! h264parse ! matroskamux ! tcpserversink port=8554 host=0.0.0.0 sync=false

does not give a green tint, but note it downscales: the camera's native resolution is 2448x2048, not 612x512.

Hi @ilya.zvezdin
As a quick solution, please leverage the dsexample plugin as suggested in
Green tint video - #9 by DaneLLL

This solution does not work. I tried to fix the UV components by writing 0x80 to the UV plane, but I got a faded image.

Hi,
Please set colorimetry=bt601 on the source pad of the nvvideoconvert plugin:

$ gst-launch-1.0 videotestsrc num-buffers=300 ! video/x-raw,width=960,height=720,format=GRAY8 ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=I420,colorimetry=bt601' ! nvoverlaysink

Thanks @DaneLLL, it works