OpenCV streaming issue

hi,

I am developing my own camera for the Xavier NX. I can use v4l2-ctl to grab an image with the command below:
$ v4l2-ctl --set-fmt-video=width=1280,height=1024,pixelformat=RG8 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=1 --stream-to=test.raw -d /dev/video0
I can also increase --stream-count to grab multiple frames.
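For example, a sketch of the same capture with a higher frame count (same device and format assumed; the output filename is arbitrary):

```shell
# Same v4l2-ctl capture as above, but grabbing 10 frames into one raw file.
v4l2-ctl --set-fmt-video=width=1280,height=1024,pixelformat=RG8 \
         --set-ctrl bypass_mode=0 \
         --stream-mmap --stream-count=10 --stream-to=test10.raw -d /dev/video0
```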

But I can't use OpenCV to grab a streaming image. Here is the test code:

#include "opencv2/core.hpp"
#include "opencv2/imgproc.hpp"
#include "opencv2/highgui.hpp"
#include "opencv2/videoio.hpp"
#include <iostream>

using namespace cv;
using namespace std;

#define VIDEO_DEV 0

int main()
{
    cout << "Built with OpenCV " << CV_VERSION << endl;

    Mat image;
    VideoCapture capture;

    capture.open(VIDEO_DEV);
    if (capture.isOpened())
    {
        cout << "Capture is opened" << endl;

        for (;;)
        {
            capture >> image;
            printf("Image size: W=%d, H=%d\n", image.cols, image.rows);

            if (image.empty()) {
                cout << "image is empty" << endl;
                break;
            }

            imshow("Sample", image);
            if (waitKey(10) >= 0)
                break;
        }
    }
    else
    {
        cout << "No capture" << endl;
        image = Mat::zeros(480, 640, CV_8UC1);
        imshow("Sample", image);
        waitKey(0);
    }

    return 0;
}

Here is console output when running the above code:

Built with OpenCV 4.1.1
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (1757) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src0 reported: Internal data stream error.
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (886) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
Capture is opened
Image size: W=1280, H=1024

Here is the output from the serial debug console:

[ 6835.095879] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 128, err_data 131072
[ 6837.748735] tegra194-vi5 15c10000.vi: no reply from camera processor
[ 6837.748896] tegra194-vi5 15c10000.vi: uncorr_err: request timed out after 2500 ms
[ 6837.749076] tegra194-vi5 15c10000.vi: err_rec: attempting to reset the capture channel
[ 6837.752841] tegra194-vi5 15c10000.vi: err_rec: successfully reset the capture channel
[ 6837.820134] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 134, err_data 131072
[ 6837.853334] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 128, err_data 131072
[ 6840.564528] tegra194-vi5 15c10000.vi: no reply from camera processor
[ 6840.564688] tegra194-vi5 15c10000.vi: uncorr_err: request timed out after 2500 ms
[ 6840.564869] tegra194-vi5 15c10000.vi: err_rec: attempting to reset the capture channel
[ 6840.566593] tegra194-vi5 15c10000.vi: err_rec: successfully reset the capture channel
[ 6840.610837] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 134, err_data 131072

Any suggestions or advice would be a great help.

Thanks in advance.

Hal.

@lr75021 have you gotten a gstreamer pipeline to work? e.g., gst-launch-1.0?

Hi,
RG8 looks to be a Bayer format. Please refer to the sensor driver programming documentation:
https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide%2Fcamera_sensor_prog.html%23wwconnect_header

When the driver and device tree are ready, you can capture by running

$ gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink

And then you can try this Python sample:

hi DaneLLL,

I tried gst-launch-1.0
$ gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink

but the output is like this:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:557 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.355366777
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

ls /dev can find /dev/video0 and /dev/video1, and I can use v4l2-ctl to capture images.

Thanks,

hal.

When running the above Python code, it returns the error below:

Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:557 No cameras available
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1

Same issue; it can't find the camera.

Here is other output for your reference:

$ v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'RGGB'
	Name        : 8-bit Bayer RGRG/GBGB
		Size: Discrete 1280x1024
			Interval: Discrete 0.033s (30.000 fps)

$ media-ctl -p
Media controller API version 0.1.0

Media device information

driver tegra194-vi5
model NVIDIA Tegra Video Input Device
serial
bus info
hw revision 0x3
driver version 0.0.0

Device topology

  • entity 1: sc130gs 9-0030 (1 pad, 1 link)
    type V4L2 subdev subtype Sensor flags 0
    device node name /dev/v4l-subdev0
    pad0: Source
    [fmt:SRGGB8_1X8/1280x1024 field:none colorspace:srgb]
    -> "15a00000.nvcsi--2":0 [ENABLED]

  • entity 3: 15a00000.nvcsi--2 (2 pads, 2 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev1
    pad0: Sink
    <- "sc130gs 9-0030":0 [ENABLED]
    pad1: Source
    -> "vi-output, sc130gs 9-0030":0 [ENABLED]

  • entity 6: vi-output, sc130gs 9-0030 (1 pad, 1 link)
    type Node subtype V4L flags 0
    device node name /dev/video0
    pad0: Sink
    <- "15a00000.nvcsi--2":1 [ENABLED]

  • entity 18: sc130gs 10-0030 (1 pad, 1 link)
    type V4L2 subdev subtype Sensor flags 0
    device node name /dev/v4l-subdev2
    pad0: Source
    [fmt:SRGGB8_1X8/1280x1024 field:none colorspace:srgb]
    -> "15a00000.nvcsi--1":0 [ENABLED]

  • entity 20: 15a00000.nvcsi--1 (2 pads, 2 links)
    type V4L2 subdev subtype Unknown flags 0
    device node name /dev/v4l-subdev3
    pad0: Sink
    <- "sc130gs 10-0030":0 [ENABLED]
    pad1: Source
    -> "vi-output, sc130gs 10-0030":0 [ENABLED]

  • entity 23: vi-output, sc130gs 10-0030 (1 pad, 1 link)
    type Node subtype V4L flags 0
    device node name /dev/video1
    pad0: Sink
    <- "15a00000.nvcsi--1":1 [ENABLED]

Thanks,

hal.

Hi,
It looks like the driver is not ready; some settings may be missing in the device tree. Please follow the sensor driver programming guide:
https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide%2Fcamera_sensor_prog.html%23

hi,

I have compared my code line by line with the imx219 driver code, but didn't find anything to modify.
Is there any other command I can use to debug?
Your help would be greatly appreciated.

Hal.

The device tree must have a problem, given the message below. Maybe the devname in tegra_camera_platform does not match what the kernel driver reports (sc130gs 9-0030).

Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:557 No cameras available

hello lr75021,

the port binding and sensor default configuration should be correct, since you're able to enable the camera stream with standard v4l2 controls and grab several images.

please refer to the Camera Architecture Stack. Device tree (i.e. DT) properties are important configuration when you enable the camera via libargus; the camera pipeline's initial configuration is defined by DT properties.

It is usually related to pixel clock settings; please check the Sensor Pixel Clock section and examine your sensor pixel clock calculation.
Please also check the Debug Tips chapter for ways to check your sensor driver.
thanks

Additional note: This would just be a workaround until you get it through argus.

Maybe this simple GStreamer pipeline could work for your case (debayering on the CPU will add some load, though):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-bayer, format=rggb, width=1280, height=1024, framerate=30/1 ! bayer2rgb ! videoconvert ! xvimagesink

hi Honey_Patouceul,

Thanks for the advice. It works using this command line.

hal

hi JerryChang,

Thanks for your advice. I will look into pixel clock and let you know later.

As I can use gst-launch-1.0 for streaming now, I wonder whether I just need to set the parameters correctly in OpenCV.

Best,

Hal.

Hi,
If you can run

$ gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink

Please try the sample:
OpenCV Video Capture with GStreamer doesn't work on ROS-melodic
You would need to modify width and height accordingly.

hi DaneLLL,

here is output:

$ gst-launch-1.0 -v nvarguscamerasrc ! nvoverlaysink

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:557 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.095911760
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Looks like the same error.

Please advise if there is anything else I can check.

Thanks,

hal.

Hi,
It is still an issue in your sensor driver. For driver programming, the documentation has clear information and we have shared guidance/tips. If you are not able to investigate further, you could consider using existing camera modules from our partners. We have a partner list; please take a look.

Not sure, but you may have your driver provide RG10 (10-bit Bayer) so the ISP can be used, as with IMX219.
Otherwise, sacrifice one core for debayering as in my previous pipeline. You could try adding queue plugins, such as (not tested, you might try yourself):

cv::VideoCapture capture("v4l2src device=/dev/video0 ! video/x-bayer, format=rggb, width=1280, height=1024, framerate=30/1 ! bayer2rgb ! queue ! videoconvert ! video/x-raw, format=BGR ! queue ! appsink", cv::CAP_GSTREAMER);

Hi Honey_Patouceul,

Thanks for looking into my case.

I have tried the above command and here is the output log.


Built with OpenCV 4.1.1
Command:
v4l2src device=/dev/video0 ! video/x-bayer, format=rggb, width=1280, height=1024, framerate=30/1 ! bayer2rgb ! queue ! videoconvert ! video/x-raw, format=BGR ! queue ! appsink
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (1757) handleMessage OpenCV | GStreamer warning: Embedded video playback halted; module v4l2src1 reported: Internal data stream error.
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (886) open OpenCV | GStreamer warning: unable to start pipeline
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created
Capture is opened
Image size: W=1280, H=1024
Image size: W=1280, H=1024

[ 2366.906651] sc130gs 9-0030: sc130gs_set_mode: mode index:0
[ 2367.147868] sc130gs 9-0030: sc130gs_start_streaming.
[ 2367.225872] sc130gs 9-0030: sc130gs_stop_streaming.
[ 2367.276405] sc130gs 9-0030: sc130gs_set_mode: mode index:0
[ 2367.317731] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 128, err_data 131072
[ 2367.517623] sc130gs 9-0030: sc130gs_start_streaming.
[ 2369.875597] tegra194-vi5 15c10000.vi: no reply from camera processor
[ 2369.875789] tegra194-vi5 15c10000.vi: uncorr_err: request timed out after 2500 ms
[ 2369.875942] tegra194-vi5 15c10000.vi: err_rec: attempting to reset the capture channel
[ 2369.878853] tegra194-vi5 15c10000.vi: err_rec: successfully reset the capture channel
[ 2369.942343] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 134, err_data 131072
[ 2369.975637] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 128, err_data 131072
[ 2372.691584] tegra194-vi5 15c10000.vi: no reply from camera processor
[ 2372.691750] tegra194-vi5 15c10000.vi: uncorr_err: request timed out after 2500 ms
[ 2372.691940] tegra194-vi5 15c10000.vi: err_rec: attempting to reset the capture channel
[ 2372.694628] tegra194-vi5 15c10000.vi: err_rec: successfully reset the capture channel
[ 2372.733040] tegra194-vi5 15c10000.vi: corr_err: discarding frame 0, flags: 134, err_data 131072
[ 2372.745384] sc130gs 9-0030: sc130gs_stop_streaming.

It looks like it is running and trying to grab images frame by frame, but no image comes back.
I am not sure where the problem is; please advise.

Best,

Hal.

Be sure no other software is already using the camera.
You may add the control bypass_mode=0 to avoid interfering with the Argus daemon:

cv::VideoCapture capture("v4l2src device=/dev/video0 extra-controls=s,bypass_mode=0 ! video/x-bayer, format=rggb, width=1280, height=1024, framerate=30/1 ! bayer2rgb ! queue ! videoconvert ! video/x-raw, format=BGR ! queue ! appsink", cv::CAP_GSTREAMER);

Even if this works, it would just be a workaround.
You'd better fix your camera driver or DT settings as Jerry mentioned, so that you can use HW debayering with Argus.

hi Honey_Patouceul,

Thanks for advice again.
I fixed a bug in the driver, and now it is working in OpenCV.

P.S.
This issue is resolved now.
I will create another ticket to discuss argus driver.

Best,

hal.