Problems with gstreamer video decoding pipeline

The following GStreamer pipeline used to work for me on a Jetson TX2:

rtspsrc location=rtsp://127.0.0.1:554 latency=0 ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! capsfilter name=myfilter ! videorate ! appsink

On the Xavier, the same pipeline gives an extremely low frame rate (1-2 fps) and the frames arrive with a huge delay (several seconds).

I also have problems getting nvvidconv to work properly. I want to use it to scale frames dynamically. Therefore I am modifying the capsfilter with caps like

video/x-raw,width=(int)1280,pixel-aspect-ratio=1/1

programmatically. When the caps are hardcoded in the pipeline, the image seems to be scaled correctly; leaving them unset and changing them later seems to crop the image to the desired size instead.
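
Roughly, the runtime update looks like this (a minimal sketch of what my code does; the element name "myfilter" and the caps values are just the ones from the pipeline above):

    GstCaps *caps;
    GstElement *myfilter;

    /* Build the new caps describing the desired output size */
    caps = gst_caps_from_string ("video/x-raw,width=(int)1280,pixel-aspect-ratio=1/1");

    /* Look up the capsfilter by name and replace its caps while the pipeline is running */
    myfilter = gst_bin_get_by_name (GST_BIN (pipeline), "myfilter");
    g_object_set (G_OBJECT (myfilter), "caps", caps, NULL);

    gst_caps_unref (caps);
    gst_object_unref (myfilter);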

How do I get the pipeline to work and scale frames correctly?

Hi,
I have verified the case below on r31.0.2:
https://devtalk.nvidia.com/default/topic/1014789/jetson-tx1/-the-cpu-usage-cannot-down-use-cuda-decode-/post/5188538/#5188538

The server command is the same and the client command is:

$ export GST_DEBUG=fpsdisplaysink:5
$ export RTSP_PATH=rtsp://127.0.0.1:8554/test
$ gst-launch-1.0 rtspsrc location="$RTSP_PATH" ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw ! fpsdisplaysink video-sink=fakesink

The frame rate is as expected:

0:00:02.951215189  9134   0x7f300050a0 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 24.363359

We don't see your server command. Please share it for reference.

I tested this using a 1080p60 IP camera. The frame rate seems to be correct. But I still have the problem that nvvidconv is cropping the frames instead of scaling them. Using “videoconvert ! videoscale” instead seems to scale the images correctly, but with a big performance loss. When using nvvidconv, do I really need to recreate the pipeline every time the scaling changes?

Hi,
Please share the pipelines with ‘nvvidconv’ and ‘videoconvert ! videoscale’ so that we can compare the difference. Also share the bitstream of your 1080p60 IP camera. It can be dumped out with the following pipeline:

$ gst-launch-1.0 rtspsrc location="$RTSP_PATH" ! rtph264depay ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=dump.h264

I have made some more tests now.
I captured a short clip with the IP camera using your command and uploaded it to OneDrive: https://1drv.ms/u/s!Ag-3nd82CvQGgtA-s7P2AfKGzVWDbA
The clip, about 28 seconds long, shows an orange box drawn on a whiteboard.

fps results with nvvidconv:

gst-launch-1.0 filesrc location=/home/nvidia/orange_box.h264  ! h264parse ! omxh264dec ! nvvidconv ! capsfilter name=myfilter ! fpsdisplaysink video-sink=fakesink
Setting pipeline to PAUSED ...
0:00:00.082219630 20151   0x557250fb90 DEBUG         fpsdisplaysink fpsdisplaysink.c:440:fps_display_sink_start:<fpsdisplaysink0> Use text-overlay? 1
Pipeline is PREROLLING ...
NvMMLiteOpen : Block : BlockType = 261 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 1920x1088 (x 12), ThumbnailMode = 0
Over-riding video dimension with display dimensionOPENMAX: HandleNewStreamFormat: 3528: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 1080 
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.725443052 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 63.827697
0:00:00.725671317 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 63.827697
0:00:01.241881225 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 60.027993
0:00:01.758534767 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 60.000499
0:00:02.275202261 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 59.999956
0:00:03.291870684 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 59.998594
0:00:03.811451769 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 59.656546
0:00:07.896464218 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 59.468631
0:00:24.217420579 20151   0x7f84004230 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 58.958522
Got EOS from element "pipeline0".
Execution ended after 0:00:28.599883262
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

And with videoconvert and videoscale:

gst-launch-1.0 filesrc location=/home/nvidia/orange_box.h264  ! h264parse ! omxh264dec ! videoconvert ! videoscale ! capsfilter name=myfilter ! fpsdisplaysink video-sink=fakesink
Setting pipeline to PAUSED ...
0:00:00.111518962 20210   0x55bd576e00 DEBUG         fpsdisplaysink fpsdisplaysink.c:440:fps_display_sink_start:<fpsdisplaysink0> Use text-overlay? 1
Pipeline is PREROLLING ...
NvMMLiteOpen : Block : BlockType = 261 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 1920x1088 (x 12), ThumbnailMode = 0
Over-riding video dimension with display dimensionOPENMAX: HandleNewStreamFormat: 3528: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 1080 
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.917794076 20210   0x55bd5616d0 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 20.241339
0:00:00.917858527 20210   0x55bd5616d0 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 20.241339
Got EOS from element "pipeline0".
Execution ended after 0:01:19.097998375
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Something seemed to be wrong here, so I tested it again with xvimagesink instead of fakesink:

gst-launch-1.0 filesrc location=/home/nvidia/orange_box.h264  ! h264parse ! omxh264dec ! videoconvert ! videoscale ! capsfilter name=myfilter ! fpsdisplaysink video-sink=xvimagesink
Setting pipeline to PAUSED ...
0:00:00.116792114 23451   0x5581ce8290 DEBUG         fpsdisplaysink fpsdisplaysink.c:440:fps_display_sink_start:<fpsdisplaysink0> Use text-overlay? 1
Pipeline is PREROLLING ...
NvMMLiteOpen : Block : BlockType = 261 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 1920x1088 (x 12), ThumbnailMode = 0
Over-riding video dimension with display dimensionOPENMAX: HandleNewStreamFormat: 3528: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 1080 
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.842384404 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 37.509964
0:00:00.842563707 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 37.509964
0:00:01.359533497 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:376:display_current_fps:<fpsdisplaysink0> Updated min-fps to 32.872263
0:00:01.874145783 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 38.864176
0:00:02.394129392 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 42.313083
0:00:02.911496639 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 42.518870
0:00:03.427646235 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 42.623505
0:00:03.943478058 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 42.649369
0:00:04.460642928 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 48.346054
0:00:08.043390074 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 50.225791
0:00:21.892359646 23451   0x7f68004c00 DEBUG         fpsdisplaysink fpsdisplaysink.c:372:display_current_fps:<fpsdisplaysink0> Updated max-fps to 50.548627
Got EOS from element "pipeline0".
Execution ended after 0:00:28.583115284
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

I also wrote a C++ script to reproduce the scaling problem, but somehow I can't post it here without getting:
devtalk.nvidia.com - Access Denied
Error code 15
This request was blocked by the security rules

Hi,
Please zip it and upload like
https://devtalk.nvidia.com/default/topic/1043500/jetson-tx2/deinterlacing-problem-in-tx2/post/5294986/#5294986

Hi d klose,
Does it show the cropping issue when comparing

gst-launch-1.0 filesrc location=/home/nvidia/orange_box.h264  ! h264parse ! omxh264dec ! nvvidconv ! nvoverlaysink

and

gst-launch-1.0 filesrc location=/home/nvidia/orange_box.h264  ! h264parse ! omxh264dec ! videoconvert ! videoscale ! xvimagesink

It is expected that you see a low frame rate with ‘videoconvert ! videoscale’ because these are purely CPU-based elements from 3rdparty. What we would like to clarify is the video cropping. It looks to be a 1920x1080 video file, so nvvidconv should correctly show you 1920x1080.

I tested both pipelines and did not see any cropping issues. I have now attached the script for reproducing the scaling issue.
ScalingIssue.zip (1.32 KB)

Hi d klose,

Your problem is that you are changing the caps after you set the pipeline state to “GST_STATE_PLAYING”, and nvvidconv does not support changing the caps live.

You can solve this problem in three ways:

  1. By setting the pipeline to PLAYING only after you have set the caps:

       /* Build the pipeline */
       pipeline = gst_parse_launch ("filesrc location=/home/nvidia/orange_box.h264 ! h264parse ! omxh264dec ! nvvidconv ! capsfilter name=myfilter ! fpsdisplaysink video-sink=xvimagesink", NULL);

       /* Do not start playing yet; the caps are set first */
       //gst_element_set_state (pipeline, GST_STATE_PLAYING);

       /* Wait and change the scaling */
       sleep (5);
       caps = gst_caps_new_simple ("video/x-raw",
                                   "width", G_TYPE_INT, 1280,
                                   "height", G_TYPE_INT, 720, NULL);
       myfilter = gst_bin_get_by_name (GST_BIN (pipeline), "myfilter");
       g_object_set (G_OBJECT (myfilter), "caps", caps, NULL);
       gst_caps_unref (caps);

       /* Start playing after the caps have been changed */
       gst_element_set_state (pipeline, GST_STATE_PLAYING);

  2. By restarting the pipeline after changing the caps:

       /* Build the pipeline */
       pipeline = gst_parse_launch ("filesrc location=/home/nvidia/orange_box.h264 ! h264parse ! omxh264dec ! nvvidconv ! capsfilter name=myfilter ! fpsdisplaysink video-sink=xvimagesink", NULL);

       /* Start playing */
       gst_element_set_state (pipeline, GST_STATE_PLAYING);

       /* Wait and change the scaling */
       sleep (5);
       caps = gst_caps_new_simple ("video/x-raw",
                                   "width", G_TYPE_INT, 1280,
                                   "height", G_TYPE_INT, 720, NULL);
       myfilter = gst_bin_get_by_name (GST_BIN (pipeline), "myfilter");
       g_object_set (G_OBJECT (myfilter), "caps", caps, NULL);
       gst_caps_unref (caps);

       /* Restart the pipeline after the caps have been changed */
       gst_element_set_state (pipeline, GST_STATE_READY);
       gst_element_set_state (pipeline, GST_STATE_PLAYING);

  3. If you can't stop the pipeline and really need to change the caps at runtime, I would suggest using another element. We at RidgeRun offer a plugin called GstPTZR that can perform pan, tilt, zoom and rotation on a stream. The element is accelerated with OpenGL and supports live caps renegotiation. For more information on this product, please check https://developer.ridgerun.com/wiki/index.php?title=Xavier/RidgeRun_Products/GstPTZR
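
     As a rough sketch of what a live change with such an element could look like (the element name "ptzr" and the "zoom" property below are assumptions for illustration only; please check the wiki page above for the actual element and property names):

       /* Build a pipeline with the PTZR element in place of the capsfilter (element name assumed) */
       pipeline = gst_parse_launch ("filesrc location=/home/nvidia/orange_box.h264 ! h264parse ! omxh264dec ! nvvidconv ! ptzr name=myptzr ! fpsdisplaysink video-sink=xvimagesink", NULL);
       gst_element_set_state (pipeline, GST_STATE_PLAYING);

       /* While the pipeline keeps running, adjust the (assumed) zoom property instead of renegotiating caps */
       sleep (5);
       myptzr = gst_bin_get_by_name (GST_BIN (pipeline), "myptzr");
       g_object_set (G_OBJECT (myptzr), "zoom", 1.5, NULL);
       gst_object_unref (myptzr);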