Using an IP camera over RTSP with a Jetson Nano.

You would just edit the patched gstCamera.cpp: in buildLaunchStr(), find the extra user-pipeline section and add sync=false to the appsink, like this:

	else // GST_SOURCE_USERPIPELINE
	{
		ss << mCameraStr;
		ss << " ! videoconvert ! video/x-raw, format=RGB, width=(int)" << mWidth << ", height=(int)" << mHeight << " ! ";
		ss << "appsink name=mysink sync=false";
		mSource = GST_SOURCE_USERPIPELINE;
	}

I haven’t tried it myself, but let us know how it goes.

PS: It may also help to specify the caps video/x-raw(memory:NVMM), format=BGRx between nvvidconv and videoconvert. I420 or NV12 could also be used here, but they would make videoconvert much slower, whereas with BGRx it only has to drop the extra fourth byte.
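For what it’s worth, here is a rough sketch (in Python, untested; the RTSP URL and credentials are placeholders, not a real camera) of how such a user pipeline string would be assembled, with the caps hint placed after nvvidconv and before the part the patched buildLaunchStr() appends:

```python
# Sketch only: a user pipeline string with a caps hint after the converter,
# followed by what the patched buildLaunchStr() appends after mCameraStr.
# URL/credentials below are placeholders.
width, height = 1280, 720

camera_str = (
    "rtspsrc location=rtsp://user:password@192.168.1.10:554/stream"
    " ! queue ! decodebin ! nvvidconv"
    " ! video/x-raw(memory:NVMM), format=BGRx"  # suggested caps hint
)

# Mirrors the GST_SOURCE_USERPIPELINE branch from the patch above:
launch_str = (
    camera_str
    + " ! videoconvert ! video/x-raw, format=RGB"
    + ", width=(int)%d, height=(int)%d" % (width, height)
    + " ! appsink name=mysink sync=false"
)
print(launch_str)
```

The resulting string is what would be handed to gst_parse_launch() internally; the important parts are that the caps hint sits before videoconvert and that the appsink keeps sync=false.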

OMG. I have literally been trying to get this hack to work for a year and finally got it working. Thank you. My issue is that I am way out of practice with C programming and have never applied a patch in my life. I have been using darknet to recognize an RTSP stream of our chickens with a custom YOLOv3 model at 20 fps, and now I can use detectnet-camera at 100 fps. Yay.


Glad it worked out for you.
Hope I shouldn’t feel sorry for the chickens ;-)

Only for eggs. We love our 6 chickens. You can check out my chicken-naming robot here: GitHub - DennisFaucher/ChickenDetection: Create a custom machine learning model to recognize our six chickens by name


Hi, thanks for the quick reply. The last format=BGRx did the trick there! Very much appreciate it!

Accessing .mp4 from gstCamera works like a charm. I am playing around with different syntaxes for my RTSP camera. The gst-launch-1.0 CLI that I have gotten to work with my RTSP camera is "gst-launch-1.0 rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! videoconvert ! xvimagesink". What would this look like in "gstCamera* camera = gstCamera::Create(" form in gstCamera.cpp? TIA

You would just cut after nvvidconv and add output caps video/x-raw, format=BGRx:

rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! video/x-raw, format=BGRx

It works @ 100 fps. Thank you so much.

gstCamera* camera = gstCamera::Create(640, 480, "rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! video/x-raw, format=BGRx");

Hello,

I have a problem with both methods.

For the first one, I start a new process:

css-jetson-dev@cssjetsondev-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1

That’s what I get:

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

And this is what’s in my code:

camera = jetson.utils.gstCamera(1920,1080,"/dev/video1")

And that’s the error I get:

(python3:10191): GStreamer-CRITICAL **: 08:58:37.917: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
[gstreamer] gstCamera failed to create pipeline
[gstreamer]    (no source element for URI "/dev/video1")
[gstreamer] failed to init gstCamera (GST_SOURCE_NVCAMERA, camera /dev/video1)
[gstreamer] gstCamera attempting to initialize with GST_SOURCE_V4L2, camera /dev/video1
[gstreamer] gstCamera pipeline string:
v4l2src device=/dev/video1 ! video/x-raw, width=(int)1920, height=(int)1080, format=YUY2 ! videoconvert ! video/x-raw, format=RGB ! videoconvert !appsink name=mysink
[gstreamer] gstCamera successfully initialized with GST_SOURCE_V4L2, camera /dev/video1
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstCamera failed to set pipeline state to PLAYING (error 0)
[gstreamer] gstCamera failed to capture frame
Traceback (most recent call last):
  File "detectnet-camera.py", line 65, in <module>
    img, width, height = camera.CaptureRGBA()
Exception: jetson.utils -- gstCamera failed to CaptureRGBA()
PyTensorNet_Dealloc()
jetson.utils -- PyCamera_Dealloc()
jetson.utils -- PyDisplay_Dealloc()

I’m able to see the live stream using:

gst-launch-1.0 -v playbin uri=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ uridecodebin0::source::latency=100

The second one:

rtsp_src="rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720"

camera = jetson.utils.gstCamera(1280,720,rtsp_src)

I get:

[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera::Create('rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  ') as user pipeline, may fail...
[gstreamer] gstCamera attempting to initialize with GST_SOURCE_NVARGUS, camera rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  
[gstreamer] gstCamera pipeline string:
rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720   ! videoconvert ! video/x-raw, format=BGR, width=(int)1280, height=(int)720 ! appsink name=mysink
[gstreamer] gstCamera successfully initialized with GST_SOURCE_USERPIPELINE, camera rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
Opening in BLOCKING MODE 
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> videoconvert0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer msg new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> videoconvert0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer msg progress ==> rtspsrc0
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

This results in glDisplay creating a window, ‘NVIDIA Jetson’ I presume, but it doesn’t seem to have anything apart from the title bar. I have to force close it.

It seems the pipeline launched correctly, so it might be another issue.
I have seen a case where nvv4l2decoder stalled when h264parse was used, so I’d suggest replacing it with omxh264dec, or removing h264parse, as a test.
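As a sketch (untested; the credentials and IP are the placeholders already used in the thread), the two suggested variants amount to small edits of the pipeline string:

```python
# The original pipeline string from the post above (placeholder credentials/IP).
pipeline = (
    "rtspsrc location=rtsp://user:pass@172.16.1.3:554/Streaming/Channels/101/ latency=0"
    " ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv"
    " ! video/x-raw,format=BGRx"
)

# Variant 1: swap the decoder element for omxh264dec.
variant1 = pipeline.replace("nvv4l2decoder", "omxh264dec")

# Variant 2: drop h264parse and keep the rest unchanged.
variant2 = pipeline.replace(" ! h264parse", "")

print(variant1)
print(variant2)
```

Either string can then be tried both with gst-launch-1.0 on the command line and as the user-pipeline argument to gstCamera.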

Also note that @dusty_nv has recently added support for other video sources such as RTSP, so my dirty patch is obsolete.
Better to check out the new version.

Thanks for the notice, I had missed that branch completely.
I ran into problems again with those solutions (link to github issue),
but I think I almost managed to make it work with a v4l2 loopback.
My camera uses the H.264 codec.

My code:

import jetson.utils
import argparse
import sys


parser = argparse.ArgumentParser()
parser.add_argument("output_URI", type=str, default="", nargs='?', help="URI of the output stream")
opt = parser.parse_known_args()[0]

camera = jetson.utils.gstCamera(1280,720,"/dev/video1")
image, width, height = camera.CaptureRGBA()
jetson.utils.saveImageRGBA("test.jpg", image, width, height)

using:
gst-launch-1.0 rtspsrc location=rtsp://login:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1

css-jetson-dev@cssjetsondev-desktop:~/Documents/Bitbucket/cssai$ python3 ./samples/person-detection/testing-rtsp.py 
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video1
[gstreamer] gstCamera -- found v4l2 device: Dummy video device (0x0000)
[gstreamer] v4l2-proplist, device.path=(string)/dev/video1, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)"v4l2\ loopback", v4l2.device.card=(string)"Dummy\ video\ device\ \(0x0000\)", v4l2.device.bus_info=(string)platform:v4l2loopback-000, v4l2.device.version=(uint)264588, v4l2.device.capabilities=(uint)2233499650, v4l2.device.device_caps=(uint)86016002;
[gstreamer] gstCamera -- found 38 caps for v4l2 device /dev/video1
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [1] video/x-raw, format=(string)UYVY, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [2] video/x-raw, format=(string)Y42B, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [3] video/x-raw, format=(string)I420, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [4] video/x-raw, format=(string)YV12, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [5] video/x-raw, format=(string)Y41B, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [6] video/x-raw, format=(string)YVU9, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [7] video/x-raw, format=(string)YUV9, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [8] video/mpegts, systemstream=(boolean)true;
[gstreamer] [9] image/jpeg, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [10] image/jpeg, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [11] video/x-dv, systemstream=(boolean)true, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 };
[gstreamer] [12] video/x-raw, format=(string)xRGB, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [13] video/x-raw, format=(string)BGRx, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [14] video/x-raw, format=(string)RGB, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [15] video/x-raw, format=(string)BGR, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [16] video/x-raw, format=(string)NV12, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [17] video/x-raw, format=(string)BGR15, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [18] video/x-raw, format=(string)RGB16, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [19] video/x-raw, format=(string)RGB15, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ sRGB };
[gstreamer] [20] video/x-bayer, format=(string)bggr, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [21] video/x-bayer, format=(string)gbrg, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [22] video/x-bayer, format=(string)grbg, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [23] video/x-bayer, format=(string)rggb, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [24] video/x-raw, format=(string)GRAY8, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [25] video/x-vp9, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 };
[gstreamer] [26] video/x-vp8, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 };
[gstreamer] [27] video/x-wmv, wmvversion=(int)3, format=(string)WVC1, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [28] video/x-wmv, wmvversion=(int)3, format=(string)WVC1, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [29] video/mpeg, mpegversion=(int)4, systemstream=(boolean)false, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [30] video/mpeg, mpegversion=(int)4, systemstream=(boolean)false, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [31] video/mpeg, mpegversion=(int)2, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [32] video/mpeg, mpegversion=(int)2, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [33] video/x-h263, variant=(string)itu, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [34] video/x-h264, stream-format=(string)avc, alignment=(string)au, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [35] video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], colorimetry=(string){ 2:4:7:1 }, parsed=(boolean)true;
[gstreamer] [36] video/x-raw, format=(string)GRAY16_LE, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] [37] video/x-raw, format=(string)YVYU, width=(int)[ 1, 8192 ], height=(int)[ 1, 8192 ], interlace-mode=(string){ progressive, interleaved }, colorimetry=(string){ 2:4:7:1 };
[gstreamer] unrecognized codec - video/mpegts
[gstreamer] unrecognized codec - video/x-dv
[gstreamer] unrecognized codec - video/x-wmv
[gstreamer] unrecognized codec - video/x-wmv
[gstreamer] unrecognized codec - video/x-h263
[gstreamer] gstCamera -- couldn't find a compatible codec/format for v4l2 device /dev/video1
[gstreamer] gstCamera -- device discovery failed, but /dev/video1 exists
[gstreamer]              support for compressed formats is disabled
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video1 ! appsink name=mysink
[gstreamer] gstCamera successfully created device v4l2:///dev/video1
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstCamera failed to set pipeline state to PLAYING (error 0)
Traceback (most recent call last):
  File "./samples/person-detection/testing-rtsp.py", line 22, in <module>
    image, width, height = camera.CaptureRGBA()
Exception: jetson.utils -- gstCamera failed to CaptureRGBA()

removing h264parse gives me:
WARNING: erroneous pipeline: could not link rtph264depay0 to nvvconv0

replacing nvv4l2decoder with omxh264dec gives me:

css-jetson-dev@cssjetsondev-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://user:pass@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! omxh264dec ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://user:pass@172.16.1.3:554/Streaming/Channels/101/
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_structure_get_string: assertion 'structure != NULL' failed

(gst-launch-1.0:9353): GStreamer-CRITICAL **: 13:32:17.257: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 1920x1088 (x 11), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 1920, nFrameHeight = 1080

and when I try to run the program it gives me some error about a wrong format, but I can’t copy it since it crashes.

Hi guys,

I am facing the same problem. So far I have installed a fresh Jetson environment with the dusty-nv repository from GitHub.

I am able to get images from a USB camera using the jetson.inference and jetson.utils libraries.

I have an IP camera (accessible from Windows IE) and I have been trying, without any success yet, to get an image from it. I tried options 1 and 2 that @Honey_Patouceul suggested, also using @markopo’s files and compiling.

I installed v4l2loopback:

jetson@jetson-desktop:~$ sudo su
[sudo] password for jetson: 
root@jetson-desktop:/home/jetson# cd /usr/src/
root@jetson-desktop:/usr/src# mkdir v4l2loopback
root@jetson-desktop:/usr/src# git clone https://github.com/umlaeute/v4l2loopback.git v4l2loopback
Cloning into 'v4l2loopback'...
remote: Enumerating objects: 107, done.
remote: Counting objects: 100% (107/107), done.
remote: Compressing objects: 100% (53/53), done.
remote: Total 2117 (delta 63), reused 95 (delta 53), pack-reused 2010
Receiving objects: 100% (2117/2117), 871.54 KiB | 805.00 KiB/s, done.
Resolving deltas: 100% (1214/1214), done.
root@jetson-desktop:/usr/src# cd v4l2loopback/
root@jetson-desktop:/usr/src/v4l2loopback# make
Building v4l2-loopback driver...
make -C /lib/modules/`uname -r`/build M=/usr/src/v4l2loopback modules
make[1]: Entering directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
  CC [M]  /usr/src/v4l2loopback/v4l2loopback.o
  Building modules, stage 2.
  MODPOST 1 modules
  CC      /usr/src/v4l2loopback/v4l2loopback.mod.o
  LD [M]  /usr/src/v4l2loopback/v4l2loopback.ko
make[1]: Leaving directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
root@jetson-desktop:/usr/src/v4l2loopback# make install
make -C /lib/modules/`uname -r`/build M=/usr/src/v4l2loopback modules_install
make[1]: Entering directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
  INSTALL /usr/src/v4l2loopback/v4l2loopback.ko
  DEPMOD  4.9.140-tegra
make[1]: Leaving directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'

SUCCESS (if you got 'SSL errors' above, you can safely ignore them)

I want to at least see an image from the terminal; so far I am not sure if I can access the camera. I configured it from Windows IE to accept RTSP connections on port 554.

Can someone point me in the right direction for the gst-launch command?

I have tried this:

jetson@jetson-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://192.168.1.163:554/video/h264 latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device ‘/dev/video1’.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL …
Freeing pipeline …

jetson@jetson-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://192.168.1.163:554/video/h264 latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video0
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device ‘/dev/video0’.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL …
Freeing pipeline …

jetson@jetson-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://192.168.1.163:554/video/h264 latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=0
Setting pipeline to PAUSED …
ERROR: Pipeline doesn’t want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Cannot identify device ‘0’.
Additional debug info:
v4l2_calls.c(609): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
system error: No such file or directory
Setting pipeline to NULL …
Freeing pipeline …

Here is the ping answer:

jetson@jetson-desktop:~$ ping 192.168.1.163
PING 192.168.1.163 (192.168.1.163) 56(84) bytes of data.
64 bytes from 192.168.1.163: icmp_seq=1 ttl=63 time=4.08 ms
64 bytes from 192.168.1.163: icmp_seq=2 ttl=63 time=3.68 ms
64 bytes from 192.168.1.163: icmp_seq=3 ttl=63 time=3.12 ms

Thanks

It seems that the attempt to sink to /dev/video1 doesn’t find any /dev/video1.
Is it present on the system?
There is also no device /dev/video0.
If not, it can be created, e.g. with the steps from the excerpt below.
What are you trying to do?

sudo su
cd /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9
mkdir v4l2loopback
git clone https://github.com/umlaeute/v4l2loopback.git v4l2loopback
cd v4l2loopback && git checkout -b v0.10.0
make
make install
apt-get install -y v4l2loopback-dkms v4l2loopback-utils
modprobe v4l2loopback devices=1 video_nr=2 exclusive_caps=1
echo options v4l2loopback devices=1 video_nr=2 exclusive_caps=1 > /etc/modprobe.d/v4l2loopback.conf
echo v4l2loopback > /etc/modules
update-initramfs -u
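As a quick follow-up check (a hypothetical helper, not part of the original script): with video_nr=2 in the modprobe line above, the loopback node should appear as /dev/video2, which can be verified e.g. from Python:

```python
import glob

def loopback_present(video_nr=2, nodes=None):
    """Return True if /dev/video<video_nr> exists among the given device nodes.

    When nodes is None, the actual /dev/video* entries are scanned; a list can
    be injected for testing.
    """
    if nodes is None:
        nodes = glob.glob("/dev/video*")
    return "/dev/video%d" % video_nr in nodes

# On the Jetson, after the modprobe line above has succeeded,
# this should report True:
print(loopback_present(2))
```

If it reports False, check `dmesg` and the modprobe output before going further with the gst-launch sink pipeline.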

Hi @_av,

Thanks for your quick answer.

It fails:

[sudo] password for jetson: 
root@jetson-desktop:/home/jetson# clear

root@jetson-desktop:/home/jetson# cd /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9# mkdir v4l2loopback
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9# git clone https://github.com/umlaeute/v4l2loopback.git v4l2loopback
Cloning into 'v4l2loopback'...
remote: Enumerating objects: 107, done.
remote: Counting objects: 100% (107/107), done.
remote: Compressing objects: 100% (53/53), done.
remote: Total 2117 (delta 63), reused 95 (delta 53), pack-reused 2010
Receiving objects: 100% (2117/2117), 871.54 KiB | 662.00 KiB/s, done.
Resolving deltas: 100% (1214/1214), done.
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9# cd v4l2loopback && git checkout -b v0.10.0
Switched to a new branch 'v0.10.0'
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# make
Building v4l2-loopback driver...
make -C /lib/modules/`uname -r`/build M=/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback modules
make[1]: Entering directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
  CC [M]  /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback/v4l2loopback.o
  Building modules, stage 2.
  MODPOST 1 modules
  CC      /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback/v4l2loopback.mod.o
  LD [M]  /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback/v4l2loopback.ko
make[1]: Leaving directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# make install
make -C /lib/modules/`uname -r`/build M=/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback modules_install
make[1]: Entering directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'
  INSTALL /usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback/v4l2loopback.ko
  DEPMOD  4.9.140-tegra
make[1]: Leaving directory '/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9'

SUCCESS (if you got 'SSL errors' above, you can safely ignore them)

root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# apt-get install -y v4l2loopback-dkms v4l2loopback-utils
Reading package lists... Done
Building dependency tree       
Reading state information... Done
E: Unable to locate package v4l2loopback-dkms
E: Unable to locate package v4l2loopback-utils

Did I miss something?

sudo apt update
then repeat the apt step:
apt-get install -y v4l2loopback-dkms v4l2loopback-utils
What are you trying to do? Could you expand on your use case, please?
You are also missing the rest of the steps after the apt command, in case you are trying to execute the steps from my post above.
Otherwise, the log provided above is probably incomplete?

I want to get an image from an IP camera, starting from terminal gst-launch commands, because I read that I need to modify the gstCamera files, which I did, but I am still getting errors.

root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# sudo apt update
Reading package lists... Done
Building dependency tree       
Reading state information... Done
All packages are up to date.
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# apt-get install -y v4l2loopback-dkms v4l2loopback-utils
Reading package lists... Done
Building dependency tree       
Reading state information... Done
E: Unable to locate package v4l2loopback-dkms
E: Unable to locate package v4l2loopback-utils
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# apt-get install -y v4l2loopback-dkms v4l2loopback-utils
Reading package lists... Done
Building dependency tree       
Reading state information... Done
E: Unable to locate package v4l2loopback-dkms
E: Unable to locate package v4l2loopback-utils
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback#

root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# modprobe v4l2loopback devices=1 video_nr=2 exclusive_caps=1
modprobe: FATAL: Module v4l2loopback not found in directory /lib/modules/4.9.140-tegra
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# echo options v4l2loopback devices=1 video_nr=2 exclusive_caps=1 > /etc/modprobe.d/v4l2loopback.conf
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# echo v4l2loopback > /etc/modules
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback# update-initramfs -u
update-initramfs: Generating /boot/initrd.img-4.9.140-tegra
Warning: couldn't identify filesystem type for fsck hook, ignoring.
I: The initramfs will attempt to resume from /dev/zram3
I: (UUID=1609c8b2-16c0-4970-904f-b8ac731a0e3f)
I: Set the RESUME variable to override this.
/sbin/ldconfig.real: Warning: ignoring configuration file that cannot be opened: /etc/ld.so.conf.d/aarch64-linux-gnu_EGL.conf: No such file or directory
/sbin/ldconfig.real: Warning: ignoring configuration file that cannot be opened: /etc/ld.so.conf.d/aarch64-linux-gnu_GL.conf: No such file or directory
root@jetson-desktop:/usr/src/linux-headers-4.9.140-tegra-ubuntu18.04_aarch64/kernel-4.9/v4l2loopback#

I will start from scratch just in case I did something wrong.

All I can see from your post is that you are not able to install
v4l2loopback-dkms v4l2loopback-utils.
Could you expand on how exactly you proceeded?

You are still missing all the steps from my script after the apt install line. Did you ever try executing any of the further lines after the apt install line?
You probably want to sink the IP camera’s RTSP stream to a local video loopback device? Or do you want it recorded to a file? A movie file? AVI? MP4? Raw format? JPG image files?
Regarding apt, you may need to have the multiverse/universe repositories enabled.
Regarding the virtual camera, you may need to execute the other lines from the quoted script to bring up the virtual interfaces.
Regarding other unknowns, they will need to be defined in detail.
What is the output of the command below?
ls /dev/video*

Following this topic. I saw from @markopo’s video that he can stream video from the terminal using the following command, even before starting to code in Python.

gst-launch-1.0 rtspsrc location=rtsp://192.168.1.163:554/video/h264 latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1

I modified it for my local connection and port.

Not sure if this info is relevant, but here is the ports configuration of the IP camera (screenshot not shown).

My end goal is to take JPG images from an IP camera to do text recognition.
The camera that I am using is a Samsung SNV-7084.
(I already have the script working using a USB camera)

I used a spare SD card with a fresh dusty-nv installation, and the v4l2loopback build was successful.

jetson@jetson-desktop:~$ ls /dev/video*
/dev/video2