Thanks for your answer. I was able to run this example successfully.
I am quite new to video encoding/decoding pipelines, and I am trying to decode an '.avi' file. Would these samples be able to decode such files? I want a raw '.yuv' file out of them.
However, when I input another .mp4 video, it does not work and shows errors. The pipeline and the corresponding output are as follows:
GST_DEBUG=3 gst-launch-1.0 filesrc location=/home/nvidia/DJI_0021_480.MP4 ! qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! nveglglessink
0:00:00.062383648 8931 0x627440 WARN omx gstomx.c:2836:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/nvidia/.config:/etc/xdg/xdg-ubuntu:/usr/share/upstart/xdg:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
Setting pipeline to PAUSED ...
0:00:00.111006848 8931 0x627440 WARN basesrc gstbasesrc.c:3489:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
0:00:00.111425472 8931 0x626230 WARN qtdemux qtdemux.c:2651:qtdemux_parse_trex:<demux> failed to find fragment defaults for stream 1
0:00:00.111513120 8931 0x627440 WARN structure gststructure.c:1935:priv_gst_structure_append_to_gstring: No value transform to serialize field 'display' of type 'GstEGLDisplay'
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
0:00:00.111630944 8931 0x626230 WARN basesrc gstbasesrc.c:2396:gst_base_src_update_length:<filesrc0> processing at or past EOS
0:00:00.112433568 8931 0x626280 FIXME videodecoder gstvideodecoder.c:946:gst_video_decoder_drain_out:<omxh264dec-omxh264dec0> Sub-class should implement drain()
NvMMLiteOpen : Block : BlockType = 261
TVMR: NvMMLiteTVMRDecBlockOpen: 7907: NvMMLiteBlockOpen
NvMMLiteBlockCreate : Block : BlockType = 261
TVMR: TVMRBufferProcessing: 5668: video_parser_parse Unsupported Codec
Event_BlockError from 0BlockH264Dec : Error code - e3040
Sending error event from 0BlockH264Dec0:00:00.117813504 8931 0x7f7c001720 ERROR omx gstomx.c:504:EventHandler:<omxh264dec-omxh264dec0> decode got error: Format not detected (0x80001020)
TVMR: NvMMLiteTVMRDecDoWork: 6430: TVMR Video Dec Unsupported Stream
0:00:00.117890848 8931 0x626280 ERROR omx gstomx.c:276:gst_omx_component_handle_messages:<omxh264dec-omxh264dec0> decode got error: Format not detected (0x80001020)
0:00:00.117919840 8931 0x626280 ERROR omx gstomx.c:1293:gst_omx_port_acquire_buffer:<omxh264dec-omxh264dec0> Component decode is in error state: Format not detected
0:00:00.117958208 8931 0x626280 WARN omxvideodec gstomxvideodec.c:3743:gst_omx_video_dec_handle_frame:<omxh264dec-omxh264dec0> error: OpenMAX component in error state Format not detected (0x80001020)
0:00:00.117951104 8931 0x7f7c002d90 ERROR omx gstomx.c:1293:gst_omx_port_acquire_buffer:<omxh264dec-omxh264dec0> Component decode is in error state: Format not detected
0:00:00.118007168 8931 0x7f7c002d90 WARN omxvideodec gstomxvideodec.c:2822:gst_omx_video_dec_loop:<omxh264dec-omxh264dec0> error: OpenMAX component in error state Format not detected (0x80001020)
0:00:00.118094624 8931 0x626230 WARN qtdemux qtdemux.c:5520:gst_qtdemux_loop:<demux> error: streaming stopped, reason error
ERROR: from element /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0: GStreamer encountered a general supporting library error.
0:00:00.118153504 8931 0x626230 WARN queue gstqueue.c:992:gst_queue_handle_sink_event:<queue0> error: Internal data flow error.
0:00:00.118185920 8931 0x626230 WARN queue gstqueue.c:992:gst_queue_handle_sink_event:<queue0> error: streaming task paused, reason error (-5)
Additional debug info:
/dvs/git/dirty/git-master_linux/external/gstreamer/gst-omx/omx/gstomxvideodec.c(2822): gst_omx_video_dec_loop (): /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0:
OpenMAX component in error state Format not detected (0x80001020)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Caught SIGSEGV
TVMR: TVMRBufferProcessing: 5668: video_parser_parse Unsupported Codec
Event_BlockError from 0BlockH264Dec : Error code - e3040
Blocking error event from 0BlockH264DecTVMR: NvMMLiteTVMRDecDoWork: 6430: TVMR Video Dec Unsupported Stream
[... the three log lines above repeat several more times ...]
#0 0x0000007f8e260130 in pthread_join (threadid=547785761280,
#1 0x0000007f8e316e40 in ?? () from /lib/aarch64-linux-gnu/libglib-2.0.so.0
#2 0x0000000000000011 in ?? ()
Spinning. Please run 'gdb gst-launch-1.0 8931' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
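(For the '.yuv' goal above, this is the kind of pipeline I had in mind; an untested sketch on my side, assuming nvvidconv can copy the decoder output out of NVMM memory, and with qtdemux swapped for avidemux when the input is an '.avi' container:)

```shell
# Untested sketch: decode an H.264 stream and dump raw I420 frames to out.yuv.
# For an .avi container, qtdemux would be replaced by avidemux.
gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! h264parse ! \
    omxh264dec ! nvvidconv ! 'video/x-raw, format=I420' ! \
    filesink location=out.yuv
```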
Another question is related to the appsink. When I use gst-launch with autovideosink/nveglglessink, I am able to view the content properly. However, when I use the same pipeline in the jetson-inference code, the output video is corrupted. The following two links point to the respective screenshots.
Thanks for your reply. As mentioned, I was able to test another MP4 video file.
Now that you mention it, I am inclined to believe that the file is indeed corrupted, since I cannot play it with the VLC player either. That could be because I created this video after processing a file with OpenCV on my Mac. I will take another shot at creating the file on the Jetson itself.
By the way, do you have any pointers on the second question? Maybe dusty_nv has some ideas? I am not sure whether I can tag people on this forum.
I would like to think that I am not getting it, judging by the following log line:
Allocating new output: 3840x2160 (x 13), ThumbnailMode = 0
Is there any other way to verify this with your test code (attached here)?
Moreover, with this pipeline too, I am getting similar results in the appsink of the inference code I have. I merely modified the launch string in the inference code.
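Since I keep editing the launch string by hand, here is a small helper sketch I put together for myself (my own illustration, not part of the inference code; the element names are simply taken from the pipelines above and may need adjusting for a given JetPack version):

```python
def make_decode_launch(path: str, use_nvmm: bool = False, fmt: str = "RGBA") -> str:
    """Assemble a gst-launch-style string for an appsink decode pipeline.

    path     -- location of the input MP4 file
    use_nvmm -- if True, request NVMM (device) memory caps after nvvidconv
    fmt      -- output pixel format requested from nvvidconv
    """
    caps = f"video/x-raw{'(memory:NVMM)' if use_nvmm else ''}, format={fmt}"
    return (f"filesrc location={path} ! qtdemux ! h264parse ! omxh264dec "
            f"! nvvidconv ! {caps} ! appsink name=mysink")
```

This just makes it harder for me to forget the caps string when switching between system-memory and NVMM variants.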
Thanks for all your support. After some 'discovery and inspection' (using gst-discoverer/gst-inspect!) of the different files I have, I have come to the conclusion that there is nothing wrong with my pipeline (again, thanks for your suggestions too).
The reason I believe so is that the following pipeline works fine:
and it works because videoconvert can produce the 'RGB' format, and the inference code then calls a function to convert RGB to RGBA.
Now, with 'nvvidconv', I have the advantage of getting RGBA frames from the pipeline itself, so I may not need to pass them through the function that converts RGB to RGBA.
This is my initial intuition, and I hope it works.
If there are any good sources to read about the different pixel formats, please feel free to suggest them. As always, you have been great!
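For my own understanding, the RGB-to-RGBA step can be sketched in NumPy like this (my illustration, not the inference code's actual implementation):

```python
import numpy as np

def rgb_to_rgba(rgb: np.ndarray, alpha: int = 255) -> np.ndarray:
    """Append a constant (opaque by default) alpha channel to an HxWx3 RGB image."""
    h, w, _ = rgb.shape
    a = np.full((h, w, 1), alpha, dtype=rgb.dtype)
    return np.concatenate([rgb, a], axis=-1)
```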
My initial intuition was correct. On further inspection of the inference code I am using, I found an implementation of the conversion from NV12 to RGBA.
Thus, I changed my pipeline to the following (and it works as expected!):
Is any further optimization possible?
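For reference, my mental model of that NV12-to-RGBA conversion is the following rough NumPy sketch (BT.601 coefficients, nearest-neighbour chroma upsampling; this is my own illustration, not the actual implementation in the inference code):

```python
import numpy as np

def nv12_to_rgba(nv12: np.ndarray, width: int, height: int) -> np.ndarray:
    """Convert a flat NV12 buffer (Y plane + interleaved UV plane) to HxWx4 RGBA."""
    y = nv12[:width * height].reshape(height, width).astype(np.float32)
    uv = nv12[width * height:].reshape(height // 2, width // 2, 2).astype(np.float32)
    # Upsample half-resolution chroma to full resolution (nearest neighbour).
    u = np.repeat(np.repeat(uv[..., 0], 2, axis=0), 2, axis=1) - 128.0
    v = np.repeat(np.repeat(uv[..., 1], 2, axis=0), 2, axis=1) - 128.0
    # BT.601 full-range YUV -> RGB.
    r = y + 1.402 * v
    g = y - 0.344 * u - 0.714 * v
    b = y + 1.772 * u
    rgba = np.stack([r, g, b, np.full_like(y, 255.0)], axis=-1)
    return np.clip(rgba, 0, 255).astype(np.uint8)
```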
I tried using nvivafilter as you suggested earlier, but when I test the pipeline with gst-launch, I get corrupted output, as shown in the attachment (corrupt.png).
In general, what is the difference between video/x-raw(memory:NVMM) and plain video/x-raw? Does the former keep the buffers in device memory and the latter in host memory?
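(My current, possibly wrong, understanding: buffers with (memory:NVMM) caps live in the device's DMA memory, which appsink cannot map directly, so nvvidconv would need to copy them into system memory first; something like this untested fragment:)

```shell
# Untested sketch: drop the (memory:NVMM) caps after nvvidconv so that
# appsink receives plain system-memory buffers it can map.
... ! omxh264dec ! nvvidconv ! 'video/x-raw, format=RGBA' ! appsink
```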
Related to question 2: when I try to use video/x-raw(memory:NVMM) in the appsink pipeline above, I get a black image. Could you comment on what the issue might be? I am using the following pipeline as a reference: