Delay difference of a live video stream between the gst-launch-1.0 command and an appsink callback

Hello everyone, I have encountered a confusing problem on my Jetson TX1 (GStreamer 1.8, Qt 5.5, OpenCV 3.2 built with GStreamer support).
First, I tested "gst-launch-1.0 rtspsrc location=rtsp://admin:admin12345@192.168.1.64:554/h264/ch33/main/av_stream latency=0 ! decodebin ! videoconvert ! xvimagesink sync=false" in a terminal and got real-time video from my IP camera.
However, when I use GStreamer in a Qt project to display frames with OpenCV, the following code, which uses an appsink callback, does not give real-time video and the delay accumulates over time. This is unacceptable for my real-time project. Could someone help me? Best wishes to you all!

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <stdlib.h>
#include <deque>   // for std::deque used below
#include <QTime>
#include <QDebug>

#include "opencv2/opencv.hpp"

using namespace cv;

#define CAPS "video/x-raw,format=BGR,framerate=25/1,width=1280,height=720"

// TODO: use synchronized deque
GMainLoop *loop;
std::deque<Mat> frameQueue;
int live_flag = 0;
int quit_flag = 0;
int sum_time=0;

GstFlowReturn new_preroll(GstAppSink *appsink, gpointer data)
{
    g_print ("Got preroll!\n");
    return GST_FLOW_OK;
}

GstFlowReturn new_sample(GstAppSink *appsink, gpointer data)
{
    QTime time;
    time.start();
    static int framecount = 0;
    framecount++;

    static int width=0, height=0 ;

    GstSample *sample = gst_app_sink_pull_sample(appsink);
    GstCaps *caps = gst_sample_get_caps(sample);
    GstBuffer *buffer = gst_sample_get_buffer(sample);
    static GstStructure *s;
    const GstStructure *info = gst_sample_get_info(sample);
    // ---- get width and height
    if(framecount==1)
    {
        if(!caps)
        {
            g_print("Could not get image info from filter caps");
            exit(-11);
        }

        s = gst_caps_get_structure(caps,0);
        gboolean res = gst_structure_get_int(s, "width", &width);
        res |= gst_structure_get_int(s, "height", &height);
        if(!res)
        {
            g_print("Could not get image width and height from filter caps");
            exit(-12);
        }
        g_print("Image size: %d\t%d\n",width,height);
    }


    // ---- Read frame and convert to opencv format ---------------
    GstMapInfo map;
    gst_buffer_map (buffer, &map, GST_MAP_READ);

    // convert gstreamer data to OpenCV Mat, you could actually
    // resolve height / width from caps...

    Mat frame(Size(width, height), CV_8UC3, (char*)map.data, Mat::AUTO_STEP);

    // this lags pretty badly even when grabbing frames from webcam
    //Mat edges;
    //cvtColor(frame, edges, CV_RGB2GRAY);
    //GaussianBlur(edges, edges, Size(7,7), 1.5, 1.5);
    //Canny(edges, edges, 0, 30, 3);
    imshow("stream", frame);

    //char key = cv::waitKey(10);
    //if(key!=-1) quit_flag = 1;


    gst_buffer_unmap(buffer, &map);

    // ------------------------------------------------------------

    // print dot every 30 frames
    if (framecount%30 == 0) {
        g_print (".");
    }

    // show caps on first frame
    if (framecount == 1) {
        g_print ("%s\n", gst_caps_to_string(caps));
    }

    gst_sample_unref (sample);
    sum_time =time.elapsed();
    qDebug()<<"time:"<<sum_time<<"\n";
    return GST_FLOW_OK;
}

static gboolean my_bus_callback (GstBus *bus, GstMessage *message, gpointer data)
{
    g_print ("Got %s message from %s\n", GST_MESSAGE_TYPE_NAME (message), GST_OBJECT_NAME (message->src));
    switch (GST_MESSAGE_TYPE (message))
    {
            case GST_MESSAGE_ERROR:
            {
                    GError *err;
                    gchar *debug;

                    gst_message_parse_error (message, &err, &debug);
                    g_print ("Error from %s: %s\n", GST_OBJECT_NAME (message->src), err->message);
                    g_error_free (err);
                    g_free (debug);
                    break;
            }
            case GST_MESSAGE_EOS:
                    /* end-of-stream */
                    quit_flag = 1;
                    break;
            case GST_MESSAGE_STATE_CHANGED:
                    GstState oldstate, newstate;
                    gst_message_parse_state_changed(message, &oldstate, &newstate, NULL);
                    g_print ("Element %s changed state from %s to %s.\n",
                    GST_OBJECT_NAME (message->src),
                            gst_element_state_get_name (oldstate),
                            gst_element_state_get_name (newstate));
                    break;
            default:
                    /* unhandled message */
                    break;
    }
    /* we want to be notified again the next time there is a message
    * on the bus, so returning TRUE (FALSE means we want to stop watching
    * for messages on the bus and our callback should not be called again)
    */
    return TRUE;
}

int main (int argc, char *argv[])
{
    GError *error = NULL;

    GstElement *pipeline, *sink;
    GstStateChangeReturn state_ret;

    GstSample *sample;

    gst_init (&argc, &argv);

    gchar *descr = g_strdup(
        "rtspsrc location=rtsp://admin:admin12345@192.168.1.64:554/h264/ch33/main/av_stream latency=0 ! "
        "decodebin ! "
        "videoconvert ! "
        "appsink name=sink sync=false"
    );

//    gchar *descr = g_strdup(
//              "rtspsrc location=\"rtsp://admin:admin12345@192.168.1.64:554/h264/ch33/main/av_stream\" latency=0 ! "
//                    "decodebin ! "
//                    "videoconvert ! "
//                    "xvimagesink name=sink sync=true"
//                );

    pipeline = gst_parse_launch (descr, &error);

    if (error != NULL)
    {
        g_print ("could not construct pipeline: %s\n", error->message);
        g_error_free (error);
        exit (-1);
    }

    /* get sink */
    sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

    /*set to pause*/
    state_ret = gst_element_set_state(pipeline, GST_STATE_PAUSED);

    switch(state_ret)
    {
        case GST_STATE_CHANGE_FAILURE:
            g_print ("failed to play the file\n");
            exit (-2);
        case GST_STATE_CHANGE_NO_PREROLL:
            /* for live sources, we need to set the pipeline to PLAYING before we can
            * receive a buffer. */
            g_print ("live source detected\n");
            live_flag = 1;
            break;
        default:
            break;
    }

    gst_app_sink_set_emit_signals((GstAppSink*)sink, true);
    gst_app_sink_set_drop((GstAppSink*)sink, true);
    gst_app_sink_set_max_buffers((GstAppSink*)sink, 1);
    GstAppSinkCallbacks callbacks = { NULL, new_preroll, new_sample };
    gst_app_sink_set_callbacks (GST_APP_SINK(sink), &callbacks, NULL, NULL);

    GstBus *bus;
    guint bus_watch_id;
    bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
    bus_watch_id = gst_bus_add_watch (bus, my_bus_callback, NULL);
    gst_object_unref (bus);

    gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);

    namedWindow("stream",1);

    loop = g_main_loop_new(NULL, false);
    g_main_loop_run(loop);

    cv::destroyWindow("stream");
    g_print ("Going to end of main!\n");
    gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
    gst_object_unref (GST_OBJECT (pipeline));

    return 0;
}

Hi Sulli,
Here are two suggestions for your use case:
[1] Because OpenCV mainly leverages the CPU, please try to run the system at max performance (run ~/jetson_clocks.sh).
[2] If the BGRx format works for you, you can use nvvidconv:

gst-launch-1.0 filesrc location=videoplayback.mp4 ! decodebin ! nvvidconv ! 'video/x-raw,format=BGRx' ! ximagesink

It is a hardware converter and should bring better performance.
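
For reference, a minimal sketch of how that suggestion could be carried over to the appsink pipeline from the original post. The RTSP URL and the latency/sync settings are copied from the question; the nvvidconv caps string and the BGRx-to-BGR conversion with cvtColor are assumptions on my part, not code confirmed in this thread.

gchar *descr = g_strdup(
    "rtspsrc location=rtsp://admin:admin12345@192.168.1.64:554/h264/ch33/main/av_stream latency=0 ! "
    "decodebin ! "
    "nvvidconv ! "                 // hardware-accelerated colorspace conversion
    "video/x-raw,format=BGRx ! "   // 4-byte BGRx in system memory
    "appsink name=sink sync=false"
);

// Inside new_sample(): BGRx is 4 bytes per pixel, so map the buffer as CV_8UC4
// and drop the padding channel before handing the frame to OpenCV.
Mat frameBGRx(Size(width, height), CV_8UC4, (char*)map.data, Mat::AUTO_STEP);
Mat frameBGR;
cvtColor(frameBGRx, frameBGR, COLOR_BGRA2BGR);

With this pipeline the CPU-based videoconvert element is no longer needed, which is where the expected performance gain comes from.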

Hello DaneLLL, thank you for your reply! However, I could not find ~/jetson_clocks.sh on my TX1. I don't know why. Could you please help me?

Hi Sulli,
It is in rootfs/home/ubuntu
https://developer.nvidia.com/embedded/dlc/l4t-sample-root-filesystem-24-2-1

Hello DaneLLL, I don't even have rootfs. But your nvvidconv suggestion did work and fixed my problem! Thank you very much!!!

FYI, the “rootfs” is from the driver package on the PC flash host. “rootfs” is a subdirectory of “Linux_for_Tegra” (which is from unpacking the driver package), typically populated by the sample rootfs and updated by the apply_binaries.sh script. Whatever is in the rootfs at the time of flash will become the Jetson file system (slightly modified in boot files, otherwise verbatim).

This is very clear. Thank you, linuxdev.

Hello DaneLLL, I have found rootfs/home/ubuntu/jetson_clocks.sh on my host PC. However, I still don't understand how to run it: when I ran ./jetson_clocks.sh on my host PC (not the TX1), it failed.

Hello linuxdev! Thank you~ I have found jetson_clocks.sh on my host PC with your help. However, when I ran the script to max out the TX1's clocks, it failed.

Hi Sulli, did you run it with 'sudo'?

Hi DaneLLL, yes, I ran jetson_clocks.sh with 'sudo'. The errors were as follows:

./jetson_clocks.sh: line 196: echo: write error: Invalid argument
./jetson_clocks.sh: line 197: echo: write error: Invalid argument
cat: /sys/devices/system/cpu/cpuquiet/tegra_cpuquiet/enable: No such file or directory
./jetson_clocks.sh: line 167: /sys/devices/system/cpu/cpuquiet/tegra_cpuquiet/enable: No such file or directory
cat: /sys/kernel/debug/clock/override.gbus/min: No such file or directory
cat: /sys/kernel/debug/clock/override.gbus/max: No such file or directory
cat: /sys/kernel/debug/clock/override.gbus/rate: No such file or directory
cat: /sys/kernel/debug/clock/override.gbus/state: No such file or directory
./jetson_clocks.sh: line 218: /sys/kernel/debug/clock/override.gbus/rate: No such file or directory
./jetson_clocks.sh: line 219: /sys/kernel/debug/clock/override.gbus/state: No such file or directory
Error: Failed to max GPU frequency!
cat: /sys/kernel/debug/clock/override.emc/min: No such file or directory
cat: /sys/kernel/debug/clock/override.emc/max: No such file or directory
cat: /sys/kernel/debug/clock/override.emc/rate: No such file or directory
cat: /sys/kernel/debug/clock/override.emc/state: No such file or directory
./jetson_clocks.sh: line 244: /sys/kernel/debug/clock/override.emc/rate: No such file or directory
./jetson_clocks.sh: line 245: /sys/kernel/debug/clock/override.emc/state: No such file or directory
Can't access Fan!

Hi Sulli, it works on r24.2.1 + the sample root file system. It looks like your rootfs is different, so the script cannot be applied.

Hello DaneLLL, I recovered my TX1 with JetPack 2.3. I don't know why my TX1 is different from yours.

jetson_clocks.sh is actually run from the Jetson, not from the PC. One can refer to the file structure from the rootfs directory of the PC to see what will be on the actual Jetson. If the file is present in rootfs of the PC, then it will be mirrored on the Jetson.

Hello linuxdev, thank you for your reply! However, it's strange that I can find jetson_clocks.sh on my host PC with your suggestion, but I can't find the file on my TX1 with

sudo locate jetson_clocks.sh

or

sudo find / -name jetson_clocks.sh

commands. I can't figure out why. Please help me~

If your host was used to flash the Jetson, and if at the time of flash the “/where/ever/it/is/Linux_for_Tegra/rootfs/home/ubuntu/jetson_clocks.sh” exists on your host, then the Jetson will contain the “jetson_clocks.sh” in “/home/ubuntu/jetson_clocks.sh”. If the Jetson does not contain this while the PC host does, then flash either never took place, or flash was truncated by lack of file system space on the host PC (run “df -H -T” and see if relevant partitions had space). The point of having that file on the host is that the host created the entire root file system on itself in order to copy it to the Jetson during flash.

On the Jetson do you have “/home/ubuntu/jetson_clocks.sh”? FYI, locate won’t see this unless you ran updatedb and also will not show this if the updatedb config does not look at the directory (there’s no reason it wouldn’t look there, but updatedb and locate are not installed by default on a Jetson, so there may be extra steps before this works as expected).

Hello linuxdev! I don't have "/home/ubuntu/jetson_clocks.sh" on the Jetson. I am sure I have enough space on the host PC. So do I now have to reflash the Jetson to speed up the CPU?

Hi Sulli, please copy jetson_clocks.sh from the host PC to the Jetson TX1, and execute 'sudo ./jetson_clocks.sh' on the Jetson TX1.

Hi DaneLLL, I have copied jetson_clocks.sh from the host PC to the Jetson TX1, but when I executed 'sudo ./jetson_clocks.sh' on the Jetson TX1, it failed with "sudo ./jetson_clocks.sh: command not found", even though I can see the file in that directory.

Hi Sulli,
Please do not run the script; instead, try to set it up directly on the TX1:

sudo su
echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
cat /sys/kernel/debug/clock/gbus/max > /sys/kernel/debug/clock/override.gbus/rate
echo 1 > /sys/kernel/debug/clock/override.gbus/state