Streaming FLIR Boson video using GStreamer 1.0 on Jetson TX2

Hi. I have installed Ubuntu 16.04 using JetPack 3.3 on my Jetson TX2. Since GStreamer comes along with the package, I tested a USB camera (Logitech) with gst-launch-1.0 and it works fine. I also use the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=xxx.xxx.x.xx port=5600" for sending the stream, which also works fine. However, I have a FLIR Boson camera which gives me output if I use the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink", but it does not work if I use the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=512 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=xxx.xxx.x.xx port=5600".
I always get the error below:
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2950): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.036807627
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

Can you help me fix the above pipeline?

The resolution you're requesting may not be supported by your sensor.
You can check the available modes with:

sudo apt-get update
sudo apt-get install v4l-utils

v4l2-ctl -d0 --list-formats-ext
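
You can also check which format and frame rate the driver currently has configured (assuming your camera is still /dev/video0):

v4l2-ctl -d /dev/video0 --get-fmt-video
v4l2-ctl -d /dev/video0 --get-parm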

If you want an unsupported resolution for your encoded stream, you may try videoscale:

gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! 'video/x-raw, width=640, height=512' ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600

You may also use HW scaling and encoding:

gst-launch-1.0 v4l2src device=/dev/video0 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420, width=640, height=512' ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600

Hi… I have tried the pipeline you suggested using videoscale, but now it gives me a different error. I also tried to find out the formats supported by the camera. This was the output:
Index       : 0
Type        : Video Capture
Pixel Format: 'YU12'
Name        : Planar YUV 4:2:0
        Size: Discrete 640x512
                Interval: Discrete 0.111s (9.000 fps)
                Interval: Discrete 0.133s (7.500 fps)

Index       : 1
Type        : Video Capture
Pixel Format: 'Y16 '
Name        : 16-bit Greyscale
        Size: Discrete 640x512
                Interval: Discrete 0.111s (9.000 fps)
                Interval: Discrete 0.133s (7.500 fps)

Index       : 2
Type        : Video Capture
Pixel Format: 'NV12'
Name        : Y/CbCr 4:2:0
        Size: Discrete 640x512
                Interval: Discrete 0.111s (9.000 fps)
                Interval: Discrete 0.133s (7.500 fps)

Index       : 3
Type        : Video Capture
Pixel Format: ''
Name        : 3132564e-0000-0010-8000-00aa003
        Size: Discrete 640x512
                Interval: Discrete 0.111s (9.000 fps)
                Interval: Discrete 0.133s (7.500 fps)

So based on this output I set the pipeline you suggested:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoscale ! 'video/x-raw, width=640, height=512' ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600

But it throws the error below:
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
libv4l2: error set_fmt gave us a different result then try_fmt!
libv4l2: warning dest fmt changed after adjusting src fmt for fps change, restoring original src fmt
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:00.659794175
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

I do not understand: if I run the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=xxx.xxx.x.xx port=5600" for the same FLIR Boson camera on my HP laptop running Ubuntu 16.04 LTS, it streams and sends the video to the destination without any error and I can receive the video, whereas if I run the same command on my Jetson TX2, with the same OS and the same camera connected, it throws the above error.

Please help and suggest a fix for this.

It seems your camera provides the expected resolution, so you don't need scaling.
It also seems there is an unknown format from your camera driver (index 3). This might be the problem.
You may try:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! videoconvert ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600
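
If NV12 turns out to be the problematic format, a similar pipeline forcing I420 (the 'YU12' entry at index 0 in your list) could also be worth a try — an untested variant of the same idea:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, format=I420, width=640, height=512 ! videoconvert ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600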

Hi. I have set the pipeline as you said. Now the pipeline starts with the following warnings:

Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:02.376941968
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
nvbuf_utils: dmabuf_fd -1 mapped entry NOT found

Also, I am able to receive the output stream on my client PC, but it shows only a green screen. My receiving pipeline is "gst-launch-1.0 udpsrc port=5600 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink".

videoconvert may not be mandatory if the format is NV12, as jpegenc supports it as input, so you may try removing it.
I suspect, however, that something is wrong with your camera driver.
You may add -v to gst-launch so that we can see what caps are used.
Also, does it work with videotestsrc?

gst-launch-1.0 -v videotestsrc ! video/x-raw, format=NV12, width=640, height=512 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600

Your receiver pipeline looks fine to me.
I'd suggest removing the JPEG/RTP encoding for now and just trying to get the camera to display:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! videoconvert ! xvimagesink

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! nvvidconv ! nvoverlaysink

Hi… With videotestsrc the pipeline works fine. I am able to get the sample video output on the receiver side.

With "gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600" I am getting the error below:
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:00.691234530
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

With the pipeline "gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! videoconvert ! xvimagesink":
I am getting a green output window with the error below:
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = “video/x-raw,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1,\ format=(string)YV12”
/GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = “video/x-raw,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1,\ format=(string)YV12”
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
WARNING: from element /GstPipeline:pipeline0/GstVideoConvert:videoconvert0: Internal GStreamer error: code not implemented. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstvideofilter.c(292): gst_video_filter_transform (): /GstPipeline:pipeline0/GstVideoConvert:videoconvert0:
invalid video buffer received
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:01.579245607
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

With the pipeline "gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512 ! nvvidconv ! nvoverlaysink" I am getting the error below:
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = “video/x-raw(memory:NVMM),\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1,\ format=(string)NV12”
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = “video/x-raw(memory:NVMM),\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1,\ format=(string)NV12”
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = “video/x-raw,\ format=(string)NV12,\ width=(int)640,\ height=(int)512,\ pixel-aspect-ratio=(fraction)1/1,\ interlace-mode=(string)progressive,\ framerate=(fraction)9/1”
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:00.660947227
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

If I test with the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink", it works fine and shows me the output video from the camera.

Also, I have tested with a Raspberry Pi 3 using the pipeline "gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! nvjpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5600", and I am getting the expected output on the client PC. The only problem is that it is not working with the Jetson TX2. I am also thinking that somewhere on the Jetson the camera driver has been messed up.

I'd suggest you use -v in the working case to get the working caps, and then specify the same caps in the other pipelines.
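
For example, running your working display pipeline with -v (assuming your camera is still /dev/video0):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! autovideosink

The caps printed for v4l2src0.GstPad:src show the format, resolution and framerate that actually get negotiated; you can then copy those exact values into the capsfilter of your streaming pipeline.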

I notice that it tries 9 fps, while your sensor/driver can also do 7.5 fps. You may also try framerate=15/2.
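
For example, a sketch based on the caps above with the lower frame rate forced (15/2 = 7.5 fps):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, format=NV12, width=640, height=512, framerate=15/2 ! nvvidconv ! nvoverlaysink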