Problems while testing GStreamer loopback

We are using the Jetson TX2 SDK. We have run some tests with the OmniVision camera that comes with the kit and with another camera, the LI-JETSON-KIT-IMX274CS.

I’ve been doing some tests with GStreamer following the “Accelerated GStreamer User Guide”, revision 28.2. In the “Camera capture with GStreamer-1.0” section I found three ways to implement a video loopback with GStreamer:

1.- nvgstcapture: the loopback works.

2.- gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=640,height=480,format=(string)I420" ! xvimagesink -e: the loopback doesn't work. I've tried different resolutions and formats that are supposed to be supported by the camera (see the format check after this list).

3.- gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=(string)NV12, framerate=(fraction)30/1' ! nvoverlaysink -e: I don't have the "nvarguscamerasrc" plugin on my system, so the loopback doesn't work. I also tried different resolutions and formats (see the plugin check after this list).
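
For reference, the pixel formats the V4L2 node actually exposes can be listed with (assuming the v4l-utils package is installed):

v4l2-ctl -d /dev/video0 --list-formats-ext

and the camera source plugins available on the system can be listed with:

gst-inspect-1.0 | grep camerasrc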

Finally, I made it work with the following GStreamer pipeline: gst-launch-1.0 nvcamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=3840, height=2160, framerate=20/1, format=(string)I420' ! nvegltransform ! nveglglessink -e
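
Presumably pipeline 3 from the guide would also work if nvarguscamerasrc is replaced with nvcamerasrc, i.e. something like the following (not verified; the resolution and framerate must match one of the sensor modes):

gst-launch-1.0 nvcamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=(string)NV12, framerate=(fraction)30/1' ! nvoverlaysink -e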

Is there any way to make gst-launch-1.0 work with v4l2src?

The nvgstcapture application and the nvcamerasrc plugin are distributed as binaries, so I can't change anything in them. How can I add a new sensor (one that is not already supported) without losing ISP support on the Jetson TX2?

Thanks.

Hi @mcalvo

The difference between nvcamerasrc and v4l2src is that the former goes through the ISP, while the latter doesn’t.

In your case, the sensors you are using output raw Bayer data (at least the imx274 does), so you need the ISP to convert it to RGB or a YUV format. That’s why v4l2src is not working for you.
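
If you want to confirm the capture path independently of the ISP, you can still grab a raw frame directly from the V4L2 node, for example (the resolution and pixelformat below are just assumptions for the imx274; take the real values from v4l2-ctl --list-formats-ext):

v4l2-ctl -d /dev/video0 --set-fmt-video=width=3840,height=2160,pixelformat=RG10 --stream-mmap --stream-count=1 --stream-to=imx274.raw

The resulting file is raw Bayer, so it won’t be directly viewable, but it verifies that the driver and sensor are streaming.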

Thanks for your reply. Now I understand the problem with v4l2src.