nvvidconv plugin and v4l2 camera problem :(

This was in regards to my issue posted previously on this thread where running:

gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=true

Wouldn’t work because nvvidconv would always throw:

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2943): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)

But simply replacing the nvvidconv element with the software-based videoconvert worked fine (apart from the slowdown, since it is software based). So I knew the problem was in the link between the v4l2src and nvvidconv elements.

In my scenario I am using a custom video4linux2 driver I wrote for my custom hardware, but it is essentially a minimally modified version of the ov5693_v4l2 driver included with the SDK. And as I mentioned, yavta works fine with this driver, as does the GStreamer pipeline with videoconvert, so I know my driver is fine.

The only thing that fixed it was setting io-mode to rw. I can run the exact same pipelines side by side with only that one change, and it is the only thing that makes a difference. Again, I still do not know why.

hi x1tester62,

I think we are trying to debug different issues.
There are three styles of driver we have used:

  1. soc_camera based NVIDIA driver, like ov5693_v4l2
  2. vivi driver
  3. UVC driver

For the 1st one, the only such device I have is a sensor that outputs UYVY at a maximum resolution of 480p, and it works fine in my experiment. But that is not representative enough to clarify the issue; we should at least try with a 1080p video device.

ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video2 ! 'video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nvhdmioverlaysink -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstNvHDMIOverlaySink-nvhdmioverlaysink:nvhdmioverlaysink-nvhdmioverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:03.050257394
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$

For the 2nd one, amitev and I have always used this virtual driver, as it is common in Linux, requires no hardware, and supports 1080p.
For the 3rd one, all the USB cameras I have output YUYV, which is outside the sink capabilities of nvvidconv.

What I can tell is that these drivers behave differently when used with v4l2src and nvvidconv.

Understood. I am definitely using a soc_camera based driver.

I too am having this same problem: a soc_camera based driver with v4l2src results in the nvvidconv error.

@x1tester62 what changes did you make to use io-mode=rw? I tried specifying this on the command line and got:

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: The driver of device ‘/dev/video1’ does not support the IO method 1

Update: after modifying the vi driver to add the RW capability, I can now use nvvidconv with my soc_camera driver:

tegra_camera/common.c
cap->capabilities = V4L2_CAP_VIDEO_CAPTURE | V4L2_CAP_STREAMING | V4L2_CAP_READWRITE;

What are the adverse implications/side effects of adding READWRITE to the capabilities?

Cool to hear. I didn’t have to add that to my capabilities for it to work, though. That’s interesting.

I too would like to know the side effects and also why this is fixing the issue.

I just ran a test dumping the v4l2src output to /dev/null, with and without io-mode=rw:

gst-launch-1.0 -v v4l2src device=/dev/video1 io-mode=rw ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! filesink location=/dev/null

With io-mode=rw, "top" reports about 11% CPU usage.
Without specifying io-mode, "top" reports 0% CPU usage.

Makes sense.

So the optimal solution would still be to get dmabuf and/or userptr working properly with nvvidconv. I still don’t know why those don’t work…

I am curious why v4l2src in rw mode works in your case, as soc_camera only declares support for mmap and userptr:

q->io_modes = VB2_MMAP | VB2_USERPTR;

Even if you add the capability explicitly, it should not actually be operating in true rw I/O mode; more likely it just helps pass a capability query in v4l2src.
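To illustrate the point, here is a sketch of what a driver would have to do for read() I/O to really be backed by videobuf2, rather than just reporting the bit. This is a hypothetical fragment, not something I have tested on the soc_camera path: the queue would need to opt in with VB2_READ, and the file ops would need to forward read() to vb2.

```c
/* Hypothetical driver-side sketch: making rw I/O real under videobuf2
 * takes more than OR-ing V4L2_CAP_READWRITE into the querycap reply. */
q->io_modes = VB2_MMAP | VB2_USERPTR | VB2_READ;  /* let vb2 emulate read() */

/* ...and the v4l2 file_operations must route read() to vb2, e.g.: */
static const struct v4l2_file_operations my_fops = {
        .owner = THIS_MODULE,
        .read  = vb2_fop_read,
        /* .open, .release, .mmap, .poll, .unlocked_ioctl as before */
};
```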

As I don’t have a >=1080p YUV sensor on hand, could you guys please help try whether nvcamerasrc works with nvvidconv in this case?

I honestly don’t know. Here are the outputs for the different io-modes. I am using GStreamer 1.6.0 that I compiled from source, as shown here:

Factory Details:
  Rank                     primary (256)
  Long-name                Video (video4linux2) Source
  Klass                    Source/Video
  Description              Reads frames from a Video4Linux2 device
  Author                   Edgard Lima <edgard.lima@indt.org.br>, Stefan Kost <ensonic@users.sf.net>

Plugin Details:
  Name                     video4linux2
  Description              elements for Video 4 Linux
  Filename                 /home/root/gst_1.6.0/out/lib/gstreamer-1.0/libgstvideo4linux2.so
  Version                  1.6.0
  License                  LGPL
  Source module            gst-plugins-good
  Source release date      2015-09-25
  Binary package           GStreamer Good Plug-ins source release
  Origin URL               Unknown package origin

RW - Works fine; I Ctrl-C-ed after 8 seconds to stop it.

gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true io-mode=1 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)30/1" !   queue ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080,format=I420, framerate=30/1' !  omxh265enc bitrate=500000 control-rate=2 ! h265parse ! avmux_mpegts ! udpsink host=224.0.0.3 port=5057
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
===== NVENC blits (mode: 1) into block linear surfaces =====
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:08.711559003
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

MMAP

gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true io-mode=2 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)30/1" !   queue ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080,format=I420, framerate=30/1' !  omxh265enc bitrate=500000 control-rate=2 ! h265parse ! avmux_mpegts ! udpsink host=224.0.0.3 port=5057 --gst-debug=3
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:02.181019754  4953   0x14bc90 FIXME           videoencoder gstvideoencoder.c:657:gst_video_encoder_setcaps:<omxh265enc-omxh265enc0> GstVideoEncoder::reset() is deprecated
Framerate set to : 30 at NvxVideoEncoderSetParameter0:00:02.181165168  4953   0x14bc90 WARN              omxh265enc gstomxh265enc.c:114:gst_omx_h265_enc_set_format:<omxh265enc-omxh265enc0> Setting profile/level not supported by component
NvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
0:00:02.204898917  4953   0x14b6c0 WARN          v4l2bufferpool gstv4l2bufferpool.c:748:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:02.368595052  4953   0x14b6c0 WARN                 basesrc gstbasesrc.c:2943:gst_base_src_loop:<v4l2src0> error: Internal data flow error.
0:00:02.369372068  4953   0x14b6c0 WARN                 basesrc gstbasesrc.c:2943:gst_base_src_loop:<v4l2src0> error: streaming task paused, reason error (-5)
0:00:02.370948131  4953   0x14b6c0 WARN                   queue gstqueue.c:968:gst_queue_handle_sink_event:<queue0> error: Internal data flow error.
0:00:02.371548119  4953   0x14b6c0 WARN                   queue gstqueue.c:968:gst_queue_handle_sink_event:<queue0> error: streaming task paused, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2943): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:01.981959786
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

USERPTR

gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true io-mode=3 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)30/1" !   queue ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080,format=I420, framerate=30/1' !  omxh265enc bitrate=500000 control-rate=2 ! h265parse ! avmux_mpegts ! udpsink host=224.0.0.3 port=5057
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: No downstream pool to import from.
Additional debug info:
gstv4l2object.c(3846): gst_v4l2_object_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
When importing DMABUF or USERPTR, we need a pool to import from
Execution ended after 0:00:01.792497733
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...


Using MMAP without nvvidconv (using videoconvert instead) works, as shown:

gst-launch-1.0 v4l2src device=/dev/video0 do-timestamp=true io-mode=2 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)30/1" !   queue ! videoconvert ! 'video/x-raw, width=1920, height=1080,format=I420, framerate=30/1' !  omxh265enc bitrate=500000 control-rate=2 ! h265parse ! avmux_mpegts ! udpsink host=224.0.0.3 port=5057
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:02.135440279  5029   0x14b950 FIXME           videoencoder gstvideoencoder.c:657:gst_video_encoder_setcaps:<omxh265enc-omxh265enc0> GstVideoEncoder::reset() is deprecated
Framerate set to : 30 at NvxVideoEncoderSetParameter0:00:02.135576110  5029   0x14b950 WARN              omxh265enc gstomxh265enc.c:114:gst_omx_h265_enc_set_format:<omxh265enc-omxh265enc0> Setting profile/level not supported by component
NvMMLiteOpen : Block : BlockType = 8
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
===== NVENC blits (mode: 1) into block linear surfaces =====
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:08.120022804
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hi x1tester62,

Could you please help try a relatively low resolution, such as 480p?

Just tried filesrc and videotestsrc; both work fine.

filesrc

gst-launch-1.0 videotestsrc num-buffers=100 ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)UYVY, framerate=(fraction)30/1' ! filesink location=test_1920x1080_UYVY.yuv -v 
gst-launch-1.0 filesrc location=test_1920x1080_UYVY.yuv ! videoparse width=1920 height=1080 format=5 framerate=30 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -v

videotestsrc

gst-launch-1.0 videotestsrc num-buffers=100 ! 'video/x-raw, width=(int)1920, height=(int)1080, format=(string)UYVY, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -v

Hey nVConan.

Both of those work for me as well, as I previously mentioned, but since videotestsrc uses such a different buffering system for its frames, I don’t know whether it is comparable.

I will try to get a lower-resolution source, but it may take me some time since HD is the priority currently.

I am excited to inform you guys that we have figured out the problem and fixed it. You can expect the fix in a future release, probably the next one.
The cause is that our nvvidconv does not fully support the v4l2 memory type.

Glad to hear I wasn’t going crazy Conan lol. Thanks a ton.

Will the fix be documented? I don’t see myself upgrading past 23-2 for some time, for stability’s sake, but I may want to implement this fix.

Hi

Great to hear that nvvidconv will be updated to support different io-modes.
Can we also expect it to support newer versions of GStreamer?

I think for h265parse we need at least GStreamer 1.6.0, but support for 1.8.0 would be even better.

Yes, it will be documented in the release doc.

It should be compatible with newer versions of GStreamer. You can refer to our multimedia user guide to build your own GStreamer and give it a try.

http://developer.download.nvidia.com/embedded/L4T/r24_Release_v1.0/Docs/L4T_Tegra_X1_Multimedia_User_Guide_Release_24.1.pdf?autho=1467358769_c25ce8aab99e52f4f2899a5012709aba&file=L4T_Tegra_X1_Multimedia_User_Guide_Release_24.1.pdf

You can also follow this wiki to compile and use newer GStreamer versions:

https://developer.ridgerun.com/wiki/index.php?title=Compile_gstreamer_on_tegra_X1

-David

Will this fix make it to the TK1 as well?
I tried the 21.5 release but still see this problem there.