After all the problems I had and solved with the CSI cameras, I finally managed to get a working driver for the ov5640 camera using the V4L2 framework. What is cool about this camera is that it can output full HD (1920x1080) @ 30 fps in UYVY format. All I want from the TX1 now is to compress the video stream and possibly stream it over the network.
All this should be fine, except that the nvvidconv plugin makes it impossible: I cannot get it to work with the output from the camera.
So I verify the camera is working with this pipeline:
BOARD:
gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)60/1' ! videoconvert ! 'video/x-raw, format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001
PC:
CAPS=...
To get CAPS, run the BOARD command with the -v option and copy the udpsink0 caps (remove the surrounding quotes).
gst-launch-1.0 --gst-debug=0 udpsrc port=5001 ! $CAPS ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=true
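For reference, the copied CAPS string should look roughly like the line below. This is only an illustration; the actual values (including sprop-parameter-sets) have to be taken from your own -v output.
CAPS='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96' # example only, copy the real caps from udpsink0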
But this uses software conversion, and the frame rate stays low (despite one of the cores sitting at 100% usage).
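To put a number on the frame rate, I think the network part can be dropped and fpsdisplaysink (from gst-plugins-bad, assuming it is present on the image) used to print the measured fps of just the capture + videoconvert path:
gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! videoconvert ! 'video/x-raw, format=(string)NV12' ! fpsdisplaysink text-overlay=false video-sink=fakesink
With -v it should report "current: N fps" lines from fpsdisplaysink0.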
I can also check that nvvidconv is doing something:
BOARD:
gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001
This is also working.
But when I replace videoconvert with nvvidconv, I get a mysterious error:
gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:00.092171955
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Any idea where the problem might be?
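In case it helps, these are the extra checks I can run and post output for (assuming v4l2-ctl from v4l-utils is installed on the board):
# confirm what the driver actually advertises for UYVY at 1920x1080
v4l2-ctl -d /dev/video0 --list-formats-ext
# rerun just the v4l2src + nvvidconv part with verbose v4l2 logging, sink replaced by fakesink
GST_DEBUG=2,v4l2*:5 gst-launch-1.0 v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink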