Hardware-accelerated video encoding with gstreamer

I am using a TX1 with L4T R24.2.1. I would like to encode video using gstreamer, and take advantage of the GPU to encode the video in high resolution and high quality.

From the user manual, there are two examples available which I could use:

h264 encoding:

gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink location=test.mp4 -e

This works well and appears to be hardware-accelerated, as the CPU load stays low even at high resolutions. (Side question: is there a simple way to tell whether the GPU is being used?)

h265 encoding:

gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! omxh265enc ! filesink location=test.h265 -e

This seems to work, as it produces a 'test.h265' file of typical video size. However, the file does not seem to be recognized as a video: when I play it in VLC, nothing happens, and no codec is detected.

Is this normal? Is there an additional step needed in the pipeline to create a standard AVI or MKV file?

Yes. In the home directory there is an executable called 'tegrastats'; it prints the status of the TX1, including GPU usage, to the terminal.
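For reference, a typical invocation looks like this (the path and the exact output fields vary with the L4T release, so treat this as a sketch):

```shell
# Print TX1 utilization roughly once per second; stop with Ctrl+C.
# The GR3D field is GPU load; clock lines for the dedicated video
# engines (names vary by release) indicate when those blocks are active.
sudo ~/tegrastats
```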

One thing to note, though, is that the TX1 has dedicated video encoders; I'm not sure whether gstreamer uses those or whether encoding is accelerated on the GPU itself.

Hi ericleib,
Please run
gst-launch-1.0 videotestsrc num-buffers=90 ! 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! omxh265enc ! 'video/x-h265, stream-format=byte-stream' ! filesink location=test.h265

And for playback
gst-launch-1.0 filesrc location=test.h265 ! h265parse ! omxh265dec ! nvoverlaysink

Please mux into MKV for playback in VLC
gst-launch-1.0 videotestsrc num-buffers=90 ! 'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! omxh265enc ! matroskamux ! filesink location=test.mkv
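If you'd rather verify the MKV on the board itself, a playback pipeline along these lines should work (a sketch built from the decode command above, not verified):

```shell
# Demux the MKV and decode with the HW decoder (assumes H.265 content)
gst-launch-1.0 filesrc location=test.mkv ! matroskademux ! h265parse ! \
    omxh265dec ! nvoverlaysink
```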

Thanks jazza for sharing the information about tegrastats.

Thank you! It works just fine & fast.

How can I convert .ts → .mkv, or play a .ts file with gst?
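In case it helps, remuxing without re-encoding can be sketched as follows (this assumes the .ts carries H.264; swap h264parse/omxh264dec for the h265 variants otherwise — untested sketch):

```shell
# .ts -> .mkv without re-encoding (sketch)
gst-launch-1.0 filesrc location=in.ts ! tsdemux ! h264parse ! \
    matroskamux ! filesink location=out.mkv

# Direct playback of a .ts with the HW decoder (sketch)
gst-launch-1.0 filesrc location=in.ts ! tsdemux ! h264parse ! \
    omxh264dec ! nvoverlaysink
```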

Hi DaneLLL,
I compiled a sample on the TX1; the file is gst-plugins-base-2.8.0/tests/examples/snapshot/snapshot.c.
When I run it, the sample hangs.

GST_STATE_CHANGE_SUCCESS----------------
GST_STATE_CHANGE_ASYNC-GST_STATE_PAUSED--------
NvMMLiteOpen : Block : BlockType = 261
TVMR: NvMMLiteTVMRDecBlockOpen: 7907: NvMMLiteBlockOpen
NvMMLiteBlockCreate : Block : BlockType = 261
TVMR: cbBeginSequence: 1223: BeginSequence 3840x1088, bVPR = 0
TVMR: LowCorner Frequency = 345000
TVMR: cbBeginSequence: 1622: DecodeBuffers = 7, pnvsi->eCodec = 4, codec = 0
TVMR: cbBeginSequence: 1693: Display Resolution : (3840x1080)
TVMR: cbBeginSequence: 1694: Display Aspect Ratio : (3840x1080)
TVMR: cbBeginSequence: 1762: ColorFormat : 5
TVMR: cbBeginSequence:1767 ColorSpace = NvColorSpace_YCbCr709_ER
TVMR: cbBeginSequence: 1904: SurfaceLayout = 3
TVMR: cbBeginSequence: 2005: NumOfSurfaces = 14, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
TVMR: cbBeginSequence: 2007: BeginSequence ColorPrimaries = 1, TransferCharacteristics = 1, MatrixCoefficients = 1
Allocating new output: 3840x1088 (x 14), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3464: Send OMX_EventPortSettingsChanged : nFrameWidth = 3840, nFrameHeight = 1088
failed to play the file= GST_STATE_PAUSED========
GST_STATE_CHANGE_SUCCESS----------------
GST_STATE_CHANGE_ASYNC-GST_STATE_PLAYING--------
failed to play the file= GST_STATE_PLAYING========
GST_STATE_CHANGE_SUCCESS----------------
aaaaaaaaaaaaaaaaa----------------
TVMR: FrameRate = 55
TVMR: NVDEC LowCorner Freq = (576000 * 1024)

I find it hangs at 'g_signal_emit_by_name (sink, "pull-preroll", &sample, NULL);'. Can you give me some advice?
By the way, the code is:

/* GStreamer snapshot example
 * Copyright (C) <2007> Wim Taymans <wim.taymans@gmail.com>
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <gst/gst.h>
#include <gtk/gtk.h>

#include <stdlib.h>

#define CAPS "video/x-raw,format=RGB,width=160,pixel-aspect-ratio=1/1"

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *sink;
  gint width, height;
  GstSample *sample;
  gchar *descr;
  GError *error = NULL;
  GdkPixbuf *pixbuf;
  gint64 duration, position;
  GstStateChangeReturn ret;
  gboolean res;
  GstMapInfo map;

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <uri>\n Writes snapshot.png in the current directory\n",
        argv[0]);
    exit (-1);
  }

  /* create a new pipeline */
 // descr =
    //  g_strdup_printf ("uridecodebin uri=%s ! videoconvert ! videoscale ! "
    //  " appsink name=sink caps=\"" CAPS "\"", argv[1]);

  /* note: the original string was missing the " ! " link between videoscale
   * and appsink, so appsink was never linked and pull-preroll blocked forever */
  descr =
      g_strdup_printf ("filesrc location=%s ! qtdemux name=demux demux.video_0 ! "
      "queue ! h264parse ! omxh264dec ! videoconvert ! videoscale ! "
      "appsink name=sink caps=\"" CAPS "\"", argv[1]);

  pipeline = gst_parse_launch (descr, &error);

  if (error != NULL) {
    g_print ("could not construct pipeline: %s\n", error->message);
    g_error_free (error);
    exit (-1);
  }

  /* get sink */
  sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

  /* step through READY, PAUSED and PLAYING explicitly so we can see which
   * transition fails */
  ret = gst_element_set_state (pipeline, GST_STATE_READY);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file----------------\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    case GST_STATE_CHANGE_ASYNC:
      g_print ("GST_STATE_CHANGE_ASYNC----------------\n");
      ret = gst_element_get_state (pipeline, NULL, NULL, 5 * GST_SECOND);
      if (ret == GST_STATE_CHANGE_FAILURE) {
        g_print ("failed to play the file===============\n");
        exit (-1);
      }
      break;  /* without this break we fall through and also print SUCCESS */
    case GST_STATE_CHANGE_SUCCESS:
      g_print ("GST_STATE_CHANGE_SUCCESS----------------\n");
    default:
      break;
  }

  ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file--GST_STATE_PAUSED----\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    case GST_STATE_CHANGE_ASYNC:
      g_print ("GST_STATE_CHANGE_ASYNC-GST_STATE_PAUSED--------\n");
      ret = gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
      if (ret == GST_STATE_CHANGE_FAILURE) {
        g_print ("failed to play the file= GST_STATE_PAUSED========\n");
        exit (-1);  /* continuing after a failed preroll would hang later */
      }
      break;  /* without this break we fall through and also print SUCCESS */
    case GST_STATE_CHANGE_SUCCESS:
      g_print ("GST_STATE_CHANGE_SUCCESS----------------\n");
    default:
      break;
  }

  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file--GST_STATE_PLAYING----\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    case GST_STATE_CHANGE_ASYNC:
      g_print ("GST_STATE_CHANGE_ASYNC-GST_STATE_PLAYING--------\n");
      ret = gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
      if (ret == GST_STATE_CHANGE_FAILURE) {
        g_print ("failed to play the file= GST_STATE_PLAYING========\n");
        exit (-1);  /* continuing after a failed state change would hang later */
      }
      break;  /* without this break we fall through and also print SUCCESS */
    case GST_STATE_CHANGE_SUCCESS:
      g_print ("GST_STATE_CHANGE_SUCCESS----------------\n");
    default:
      break;
  }




  /* get the duration */
  gst_element_query_duration (pipeline, GST_FORMAT_TIME, &duration);

  if (duration != -1)
    /* we have a duration, seek to 5% */
    position = duration * 5 / 100;
  else
    /* no duration, seek to 1 second, this could EOS */
    position = 1 * GST_SECOND;

  /* seek to the a position in the file. Most files have a black first frame so
   * by seeking to somewhere else we have a bigger chance of getting something
   * more interesting. An optimisation would be to detect black images and then
   * seek a little more */
  gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
      GST_SEEK_FLAG_KEY_UNIT | GST_SEEK_FLAG_FLUSH, position);

  /* get the preroll buffer from appsink; this blocks until appsink really
   * prerolls */
g_print ("aaaaaaaaaaaaaaaaa----------------\n");
  g_signal_emit_by_name (sink, "pull-preroll", &sample, NULL);
  g_print ("bbbbbbbbbbbbbbbbbbbbbbbbb----------------\n");
  gst_object_unref (sink);

  /* if we have a buffer now, convert it to a pixbuf. It's possible that we
   * don't have a buffer because we went EOS right away or had an error. */
  if (sample) {
    GstBuffer *buffer;
    GstCaps *caps;
    GstStructure *s;

    /* get the snapshot buffer format now. We set the caps on the appsink so
     * that it can only be an rgb buffer. The only thing we have not specified
     * on the caps is the height, which is dependant on the pixel-aspect-ratio
     * of the source material */
    caps = gst_sample_get_caps (sample);
    if (!caps) {
      g_print ("could not get snapshot format\n");
      exit (-1);
    }
    s = gst_caps_get_structure (caps, 0);

    /* we need to get the final caps on the buffer to get the size */
    res = gst_structure_get_int (s, "width", &width);
    res |= gst_structure_get_int (s, "height", &height);
    if (!res) {
      g_print ("could not get snapshot dimension\n");
      exit (-1);
    }

    /* create pixmap from buffer and save, gstreamer video buffers have a stride
     * that is rounded up to the nearest multiple of 4 */
    buffer = gst_sample_get_buffer (sample);
    gst_buffer_map (buffer, &map, GST_MAP_READ);
    pixbuf = gdk_pixbuf_new_from_data (map.data,
        GDK_COLORSPACE_RGB, FALSE, 8, width, height,
        GST_ROUND_UP_4 (width * 3), NULL, NULL);

    /* save the pixbuf */
    gdk_pixbuf_save (pixbuf, "snapshot.png", "png", &error, NULL);
    gst_buffer_unmap (buffer, &map);
  } else {
    g_print ("could not make snapshot\n");
  }

  /* cleanup and exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  exit (0);
}


Hi 934271405,
I don’t have experience in running the app. Other users may share experience.

From the log it looks like you are running decoding. Please refer to user guide and try the commands.
https://developer.nvidia.com/embedded/dlc/l4t-accelerated-gstreamer-guide-28-1

Alright, the commands work well, but the sample still has some problems. Anyway, thanks.