Jetson Xavier NX and MIPI Arducam GStreamer error using nvvidconv and NVMM

Hi, I have been trying to get GStreamer to work in Python, but I can't get the conversion to work.
This:
gst-launch-1.0 v4l2src ! 'video/x-raw, width=(int)1600, height=(int)1300, framerate=(fraction)120/1, format=(string)GRAY8' ! videoconvert ! xvimagesink -ev
opens the stream and shows video. The same in Python does not convert, though. I got it to display at 20 fps by doing:
gst_str = ('v4l2src device=/dev/video0 ! video/x-raw, width=(int)1600, '
           'height=(int)1300, format=(string)GRAY8, framerate=(fraction)60/1 ! '
           'videoconvert ! video/x-raw, format=BGRx ! videoconvert ! appsink')

This works with warning:

[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1

, but I think that for my TensorFlow project I need GStreamer working with NVMM and nvvidconv. I assume I need to utilise the NVMM memory?

So I try to write the gst like so:
gst_str = ('v4l2src device=/dev/video0 ! '
           'video/x-raw(memory:NVMM), '
           'width=(int)1600, height=(int)1300, '
           'format=(string)GRAY8, framerate=(fraction)30/1 ! '
           'nvvidconv ! '
           'video/x-raw(memory:NVMM), width=(int){}, height=(int){}, '
           'format=(string)BGRx ! '
           'videoconvert ! appsink').format(width, height)

I get new errors and can't open the pipeline:

[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (711) open OpenCV | GStreamer warning: Error opening bin: could not link v4l2src0 to nvvconv0, v4l2src0 can't handle caps video/x-raw(memory:NVMM), width=(int)1600, height=(int)1300, format=(string)GRAY8, framerate=(fraction)30/1
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created

Any ideas? I am lost…

Regards,
Magnus

videoconvert doesn't support NVMM memory. Furthermore, I think that for GRAY8 conversion you may have to use the I420 format, which can then be converted to BGRx. You could try:

gst_str = ('v4l2src ! video/x-raw, width=1600, height=1300, framerate=30/1, format=GRAY8 ! nvvidconv ! video/x-raw(memory:NVMM), format=I420 ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink')
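For reference, here is a minimal sketch of how the suggested pipeline string could be built and opened from Python with OpenCV's GStreamer backend. The helper function and its default width/height/fps values are just illustrative, not part of any official API; it requires an OpenCV build with GStreamer support, as on the Jetson.

```python
def build_pipeline(width=1600, height=1300, fps=30):
    """Build the v4l2src -> nvvidconv (I420 in NVMM) -> BGRx ->
    videoconvert -> BGR -> appsink pipeline string suggested above."""
    return (
        'v4l2src ! '
        'video/x-raw, width={w}, height={h}, framerate={f}/1, format=GRAY8 ! '
        'nvvidconv ! video/x-raw(memory:NVMM), format=I420 ! '
        'nvvidconv ! video/x-raw, format=BGRx ! '
        'videoconvert ! video/x-raw, format=BGR ! appsink'
    ).format(w=width, h=height, f=fps)

# Usage (requires OpenCV built with GStreamer support):
# import cv2
# cap = cv2.VideoCapture(build_pipeline(), cv2.CAP_GSTREAMER)
# ok, frame = cap.read()   # frame is an HxWx3 BGR numpy array when ok is True
```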

Thanks. That worked.

But does it make sense? My understanding of the pipe is that it first converts into I420 using nvvidconv, then to raw BGRx, then finally back to raw BGR using videoconvert. How can I utilize NVMM all the way to the appsink? Is that possible?

I still have a warning:

[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1

Can't get rid of it…

br,
Magnus

OpenCV videoio doesn't support NVMM memory and doesn't support 4-channel formats. It supports BGR, I420 and GRAY8 from standard memory.

The warning is harmless, a live stream has no duration so current position cannot be computed.

Thank you.
I figured it had to do with the initial frame; annoying but acceptable.

Thanks for the NVMM info :D

If the warning is annoying, you may just comment it out in cap_gstreamer.cpp and rebuild/reinstall OpenCV without it.

Furthermore, all the above only makes sense if you want to use OpenCV algorithms expecting BGR format.
However, since in your case all the information from the camera is luminance only, you may also just read GRAY8 from standard memory from v4l2src (or directly use the V4L API of VideoCapture) and process one-channel frames instead.


Thanks. I will try that. I have extremely bad fps and I have no idea what my shutter time is. I get blurry images, and few of them.
I will try direct from GRAY8. I thought there was some good idea to have the image in NVMM, and that CUDA or TensorFlow could process the image faster from that memory?

Running:

gst-launch-1.0 -v videotestsrc ! video/x-raw, width=1600, height=1300, framerate=30/1, format=GRAY8 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! fpsdisplaysink video-sink=fakesink text-overlay=false

gives a solid 30 fps with my NX.

I’d guess the problem is with your camera capture. Try adding option io-mode=2 to v4l2src:

gst_str = ('v4l2src io-mode=2 ! video/x-raw, width=1600, height=1300, framerate=30/1, format=GRAY8 ! nvvidconv ! video/x-raw(memory:NVMM), format=I420 ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink')

Also note if you are using imshow for display that it may not be fast on jetson for high resolution*fps.
An alternative could be using a VideoWriter with a GStreamer pipeline to another display sink such as this one.
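A minimal sketch of that VideoWriter alternative might look like this; the choice of nvoverlaysink and the frame size/fps are assumptions for illustration, not taken from the thread, and the helper name is hypothetical.

```python
def build_writer_pipeline():
    """appsrc -> videoconvert -> nvvidconv into NVMM -> a Jetson display
    sink (nvoverlaysink assumed here), fed BGR frames by cv2.VideoWriter."""
    return (
        'appsrc ! video/x-raw, format=BGR ! '
        'videoconvert ! video/x-raw, format=BGRx ! '
        'nvvidconv ! video/x-raw(memory:NVMM) ! '
        'nvoverlaysink'
    )

# import cv2
# writer = cv2.VideoWriter(build_writer_pipeline(), cv2.CAP_GSTREAMER,
#                          0, 30.0, (1600, 1300))
# writer.write(frame)   # frame: BGR numpy array matching the given size
```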


Hi, I got 60 fps with the Arducam, and the frame dump was also exceptionally good with the resolution at 1600x1300.
What I can't get working, though, is the fpsdisplaysink. But I have that question in another thread.