H264 encoding from RGB source

I’m trying to get a working GStreamer pipeline to encode an RGB or RGBA source to H264 video. As input, I’ve created a raw RGBA or RGB file.

gst-launch-1.0 filesrc location=vid-20211114_211850.rgb ! video/x-raw, format=RGBA,width=2880, height=1440, framerate=30/1 ! nvvidconv ! video/x-raw, format=NV12 ! omxh264enc ! qtmux ! filesink location=test.mp4 -e

When I run this, though, I get:

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 40

(gst-launch-1.0:27205): GStreamer-CRITICAL **: 22:45:42.107: gst_segment_to_running_time: assertion 'segment->format == format' failed
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

For bonus points, I’d love a similar pipeline that goes straight to video output. Let me know where I went wrong. Thanks!

Hi,
filesrc outputs an unframed byte stream, so a caps filter alone does not tell downstream elements how to split it into frames. Please try the videoparse plugin (format=11 is RGBA in GstVideoFormat), like:

gst-launch-1.0 filesrc location=vid-20211114_211850.rgb ! videoparse format=11 width=2880 height=1440 ! nvvidconv ! video/x-raw, format=NV12 ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4 -e

That worked for the most part, but I had to make one change to put the buffer into NVMM memory for nvv4l2h264enc:

gst-launch-1.0 filesrc location=vid-20211114_211850.rgb ! videoparse format=11 width=2880 height=1440 framerate=30/1 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4 -e

What’s the difference between using nvv4l2h264enc and omxh264enc? Also, is there any real performance hit from using videoparse over just feeding it straight? Ultimately, I need this to work in real time, replacing the filesrc with an appsrc. If you have any suggestions, it would be very much appreciated.

Hi,
We have deprecated the omx plugins. Please use the v4l2 plugins instead, such as nvv4l2h264enc, nvv4l2h265enc, and nvv4l2decoder.

With appsrc the pipeline may run like:

appsrc ! video/x-raw,format=RGBA,width=2880,height=1440 ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test.mp4

And performance may be capped by the copy from the CPU buffer to the NVMM buffer in

... ! video/x-raw,format=RGBA,width=2880,height=1440 ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! ...

Please execute sudo nvpmodel -m 0 and sudo jetson_clocks to run the CPU cores at maximum clocks.

Why nvpmodel -m 0 instead of nvpmodel -m 2?

Thanks!

Hi,
You may try both settings. All power modes are listed in the developer guide. One mode is 2 CPU cores @ 1.9 GHz, and the other is 6 CPU cores @ 1.4 GHz.

Thanks for that link! I’ll give them a try. I’m going to have a lot going on at once. I’ll have to see how many are CPU bound and how many are GPU.

I was able to get the following line to work well and produce a good mp4 file when feeding in raw RGBA frames:

gst-launch-1.0 filesrc location=$1 ! videoparse format=11 width=2880 height=1440 framerate=30/1 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nvv4l2h264enc bitrate=16000000 profile=4 ! h264parse ! qtmux ! filesink location=$2 -e

I then moved over to C and went with this line:

#define GST_ENC_PIPELINE   "appsrc name=srcEncode ! " \
                           "videoparse format=11 width=2880 height=1440 framerate=30/1 ! " \
                           "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! " \
                           "nvv4l2h264enc bitrate=16000000 profile=4 ! " \
                           "h264parse ! qtmux ! filesink location=%s"

When I do this, the encoding doesn’t produce any errors, but the mp4 is not playable by mplayer:

[dvvideo @ 0x7f7da90ab8]could not find dv frame profile
Error while decoding frame!
V:   1.6  48/ 48  0%  0%  0.0% 0 0 

From other errors I’ve seen, it appears that the PTS isn’t being passed through and/or encoded. Here’s my loop:

   while (this_vod.recording) {
      sem_wait(&this_vod.p_mutex);
      //printf("Creating new encoding buffer.\n");
      buffer = gst_buffer_new_wrapped(this_vod.rgb_out_pixels[this_vod.buffer_num],
                                      OUTPUT_WIDTH*2 * RGB_OUT_SIZE * OUTPUT_HEIGHT);
      if (buffer == NULL) {
         printf("Failure to allocate new buffer for encoding.\n");
         break;
      }
      buffer->pts = gst_clock_get_time(sys_clock);
      buffer->duration = gst_util_uint64_scale(1, GST_SECOND, 30);
      buffer->offset = count++;

      //printf("Feeding the buffer (%lu, %lu)...\n", buffer->offset, buffer->pts);

      /* push the buffer into the appsrc */
      g_signal_emit_by_name (srcEncode, "push-buffer", buffer, &ret);

      sem_post(&this_vod.r_mutex);
      gst_buffer_unref(buffer);

      if (ret != GST_FLOW_OK) {
         printf("GST_FLOW error while pushing buffer: %d\n", ret);
         break;
      }
   }

Can you give me any insight on what I may be doing wrong on this one? For the record, if I shorten the pipeline, I can dump RGBA frames just fine through appsrc and filesink.

Thanks!

One more data point:
If I change my pipeline to the following:

#define GST_ENC_PIPELINE   "appsrc name=srcEncode ! " \
                           "video/x-raw, width=(int)2880, height=(int)1440, format=(string)RGBA, framerate=(fraction)30/1 ! " \
                           "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! " \
                           "nvv4l2h264enc bitrate=16000000 profile=4 ! " \
                           "filesink location=%s"

(I removed the muxing.)

The h264 encoding appears to be where the issue is:

Starting playback...
[h264 @ 0x7fa2c88ab8]No start code is found.
[h264 @ 0x7fa2c88ab8]Error splitting the input into NAL units.
Error while decoding frame!
.
.
.

And the file is not empty. It’s 7.2MB.

Hi,
To generate a valid mp4, the pipeline needs to receive EoS at termination. Please add -e to the gst-launch command and check if it helps. If that does not resolve the issue, please try matroskamux to generate mkv files instead.

Thank you again! I double-checked my pipeline teardown and I think I was adding an extra step and/or killing the application too soon.

   g_signal_emit_by_name (srcEncode, "end-of-stream", &ret);
   //gst_element_set_state (pipeline, GST_STATE_NULL);
   gst_object_unref (pipeline);

I commented out the middle line, and I’m no longer closing too soon afterwards. I also verified that mkv output works as well.

Hi,
You may refer to this sample to wait for EoS message:
GStreamer freeze when using qtmux and NVIDIA-accelerated h264/h265 encoding - #7 by DaneLLL

I’m going to share my complete video encoding thread. You can see the mutexes I use to protect the shared data between this thread and the writing thread. If you see any errors, @DaneLLL, please let me know. This is working very well for me at the moment and running in real time. My only issue is that it seems to be encoding at twice the framerate I’m hoping for. I’m not sure if this is a feeding issue, though, since playback does report 30 FPS as it should.

Please ignore anything that’s just not cleaned up yet. This is still a WIP.

#ifdef MKV_OUT
#define GST_ENC_PIPELINE   "appsrc name=srcEncode ! " \
                           "video/x-raw, width=(int)2880, height=(int)1440, format=(string)RGBA, framerate=(fraction)30/1 ! " \
                           "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! " \
                           "nvv4l2h264enc bitrate=16000000 profile=4 ! " \
                           "h264parse ! matroskamux ! filesink location=%s"
#else
#define GST_ENC_PIPELINE   "appsrc name=srcEncode ! " \
                           "video/x-raw, width=(int)2880, height=(int)1440, format=(string)RGBA, framerate=(fraction)30/1 ! " \
                           "nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! " \
                           "nvv4l2h264enc bitrate=16000000 profile=4 ! " \
                           "h264parse ! qtmux ! filesink location=%s"
#endif


void *video_encoding_thread(void *arg)
{
   time_t r_time;
   struct tm *l_time = NULL;
   char datetime[16];

   gchar descr[1024];
   GstElement *pipeline = NULL, *srcEncode = NULL;
   GError *error = NULL;

   GstFlowReturn ret = GST_FLOW_ERROR;
   GstBuffer *buffer = NULL;

   GstClock *sys_clock = NULL;
   GstClockTime current_time = 0;
   guint64 count = 0;

   struct timespec timeout;
   GstBus *bus = NULL;

   time(&r_time);
   l_time = localtime(&r_time);
   strftime(datetime, sizeof(datetime), "%Y%m%d_%H%M%S", l_time);

#ifdef MKV_OUT
   snprintf(this_vod.filename, sizeof(this_vod.filename), "vid-%s.mkv", datetime);
#else
   snprintf(this_vod.filename, sizeof(this_vod.filename), "vid-%s.mp4", datetime);
#endif

   sys_clock = gst_system_clock_obtain();
   current_time = gst_clock_get_time(sys_clock);

   g_snprintf(descr, 1024, GST_ENC_PIPELINE, this_vod.filename);
   pipeline = gst_parse_launch(descr, &error);
   if (error != NULL) {
      SDL_Log("could not construct pipeline \"%s\": %s\n", descr, error->message);
      g_error_free(error);
      return NULL;
   }

   /* get the appsrc element from the parsed pipeline */
   srcEncode = gst_bin_get_by_name(GST_BIN(pipeline), "srcEncode");
   gst_element_set_state(pipeline, GST_STATE_PLAYING);

   while (this_vod.recording) {
      /* absolute timeout ~50 ms from now, as sem_timedwait() expects */
      clock_gettime(CLOCK_REALTIME, &timeout);
      timeout.tv_nsec += 50000000;
      if (timeout.tv_nsec >= 1000000000L) {
         timeout.tv_sec++;
         timeout.tv_nsec -= 1000000000L;
      }
      if (sem_timedwait(&this_vod.p_mutex, &timeout) == 0) {
         //printf("Creating new encoding buffer.\n");
         buffer = gst_buffer_new_wrapped(this_vod.rgb_out_pixels[this_vod.buffer_num],
                                         OUTPUT_WIDTH * 2 * RGB_OUT_SIZE * OUTPUT_HEIGHT);
         if (buffer == NULL) {
            printf("Failure to allocate new buffer for encoding.\n");
            break;
         }
         //buffer->pts = gst_clock_get_time(sys_clock);
         buffer->duration = gst_util_uint64_scale(1, GST_SECOND, 30);
         current_time += buffer->duration;
         buffer->pts = current_time;
         buffer->offset = count++;

         //printf("Feeding the buffer (%lu, %lu)...\n", buffer->offset, buffer->pts);

         /* push the buffer into the appsrc */
         g_signal_emit_by_name(srcEncode, "push-buffer", buffer, &ret);

         sem_post(&this_vod.r_mutex);
         gst_buffer_unref(buffer);

         if (ret != GST_FLOW_OK) {
            printf("GST_FLOW error while pushing buffer: %d\n", ret);
            break;
         }
      }
   }
   sem_trywait(&this_vod.p_mutex);
   sem_post(&this_vod.r_mutex);

   /* signal EoS and wait for it to drain through the pipeline */
   g_signal_emit_by_name(srcEncode, "end-of-stream", &ret);
   bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
   gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);

   gst_element_set_state((GstElement *) pipeline, GST_STATE_NULL);
   gst_object_unref(pipeline);

   return NULL;
}

Thanks again @DaneLLL !
