Using an IP camera over RTSP with a Jetson Nano.

You would just edit the patched gstCamera.cpp: in buildLaunchStr(), find the extra user-pipeline section and add sync=false to the appsink, such as:

	else // GST_SOURCE_USERPIPELINE
	{
		ss << mCameraStr;
		ss << " ! videoconvert ! video/x-raw, format=RGB, width=(int)" << mWidth << ", height=(int)" << mHeight << " ! ";
		ss << "appsink name=mysink sync=false";
		mSource = GST_SOURCE_USERPIPELINE;
	}

I haven't tried it myself, but let us know how it goes.

PS: It may also help to specify the caps video/x-raw(memory:NVMM), format=BGRx between nvvidconv and videoconvert. I420 or NV12 could be used here instead, but they would make videoconvert much slower, whereas with BGRx it only has to strip the extra fourth byte.
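For reference, here is a small sketch (plain Python string-building, not gstCamera's actual code; the function name is illustrative) of what the final launch string from the patched user-pipeline branch could look like, using the non-NVMM BGRx caps that match the working examples later in this thread:

```python
# Sketch only: mirrors the patched GST_SOURCE_USERPIPELINE branch as a
# plain string builder, so the resulting pipeline can be inspected
# without rebuilding the library.
def build_user_pipeline(camera_str, width, height):
    """camera_str is the user-supplied head of the pipeline (e.g. ending in nvvidconv)."""
    # Suggested intermediate caps between nvvidconv and videoconvert.
    caps = "video/x-raw, format=BGRx"
    return (camera_str
            + " ! " + caps
            + " ! videoconvert ! video/x-raw, format=RGB"
            + ", width=(int)%d, height=(int)%d" % (width, height)
            + " ! appsink name=mysink sync=false")

print(build_user_pipeline(
    "rtspsrc location=rtsp://user:pass@cam/stream ! queue ! decodebin ! nvvidconv",
    1280, 720))
```

This only illustrates the string that buildLaunchStr() would hand to gst_parse_launch(); the actual element construction still happens inside gstCamera.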

OMG. I have literally been trying to get this hack to work for a year and finally got it working. Thank you. My issue is that I am way out of practice with C programming and have never applied a patch in my life. I have been using darknet to recognize RTSP streams of our chickens with a custom YOLOv3 model at 20 fps, and now I can use detectnet-camera at 100 fps. Yay.


Glad it worked out for you.
I hope I needn't feel sorry for the chickens ;-)

Only for eggs. We love our 6 chickens. You can check out my chicken naming robot here: https://github.com/DennisFaucher/ChickenDetection


Hi, thanks for the quick reply. The last format=BGRx did the trick there! Very much appreciate it!

Accessing .mp4 files from gstCamera works like a charm. I am playing around with different syntaxes for my RTSP camera. The gst-launch-1.0 command line that I have gotten to work with my RTSP camera is:

gst-launch-1.0 rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! videoconvert ! xvimagesink

What would this look like in "gstCamera* camera = gstCamera::Create(" form in gstCamera.cpp? TIA

You would just cut the pipeline after nvvidconv and add the output caps video/x-raw, format=BGRx:

rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! video/x-raw, format=BGRx

It works @ 100 fps. Thank you so much.

gstCamera* camera = gstCamera::Create(640, 480, "rtspsrc location=rtsp://dennis:password@192.168.86.42:88/videoMain ! queue ! decodebin ! nvvidconv ! video/x-raw, format=BGRx");

Hello,

I have a problem with both methods.

For the first one, I start a new process:

css-jetson-dev@cssjetsondev-desktop:~$ gst-launch-1.0 rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx ! identity drop-allocation=true ! v4l2sink device=/dev/video1

That’s what I get:

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

And that's what's in my code:

camera = jetson.utils.gstCamera(1920,1080,"/dev/video1")

And that’s the error I get:

(python3:10191): GStreamer-CRITICAL **: 08:58:37.917: gst_element_make_from_uri: assertion 'gst_uri_is_valid (uri)' failed
[gstreamer] gstCamera failed to create pipeline
[gstreamer]    (no source element for URI "/dev/video1")
[gstreamer] failed to init gstCamera (GST_SOURCE_NVCAMERA, camera /dev/video1)
[gstreamer] gstCamera attempting to initialize with GST_SOURCE_V4L2, camera /dev/video1
[gstreamer] gstCamera pipeline string:
v4l2src device=/dev/video1 ! video/x-raw, width=(int)1920, height=(int)1080, format=YUY2 ! videoconvert ! video/x-raw, format=RGB ! videoconvert !appsink name=mysink
[gstreamer] gstCamera successfully initialized with GST_SOURCE_V4L2, camera /dev/video1
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstCamera failed to set pipeline state to PLAYING (error 0)
[gstreamer] gstCamera failed to capture frame
Traceback (most recent call last):
  File "detectnet-camera.py", line 65, in <module>
    img, width, height = camera.CaptureRGBA()
Exception: jetson.utils -- gstCamera failed to CaptureRGBA()
PyTensorNet_Dealloc()
jetson.utils -- PyCamera_Dealloc()
jetson.utils -- PyDisplay_Dealloc()

I'm able to see the livestream using:

gst-launch-1.0 -v playbin uri=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ uridecodebin0::source::latency=100

The second one:

rtsp_src="rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx, width=1280, height=720"

camera = jetson.utils.gstCamera(1280,720,rtsp_src)

I get :

[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera::Create('rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  ') as user pipeline, may fail...
[gstreamer] gstCamera attempting to initialize with GST_SOURCE_NVARGUS, camera rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  
[gstreamer] gstCamera pipeline string:
rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720   ! videoconvert ! video/x-raw, format=BGR, width=(int)1280, height=(int)720 ! appsink name=mysink
[gstreamer] gstCamera successfully initialized with GST_SOURCE_USERPIPELINE, camera rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ latency=0 ! rtph264depay! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=BGRx, width=1280,height=720  
jetson.utils -- PyDisplay_New()
jetson.utils -- PyDisplay_Init()
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
Opening in BLOCKING MODE 
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> videoconvert0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer msg new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> videoconvert0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvv4l2decoder0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer msg progress ==> rtspsrc0
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

Which results in a window created from glDisplay, 'NVIDIA Jetson' I presume, but it seems it doesn't have anything apart from the title bar. I have to force-close it.

It seems the pipeline launched correctly, so it might be another issue.
I have seen a case where nvv4l2decoder stalled when h264parse was used, so I'd suggest replacing it with omxh264dec, or removing h264parse, for a try.
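The two variants suggested above can be sketched as plain string edits on the original pipeline (credentials and host below are the placeholders from earlier in the thread):

```python
# Sketch only: derive the two debugging variants suggested above from the
# original pipeline string by simple substitution.
pipeline = ("rtspsrc location=rtsp://username:password@172.16.1.3:554/Streaming/Channels/101/ "
            "latency=100 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! "
            "video/x-raw, format=BGRx, width=1280, height=720")

# Variant 1: swap nvv4l2decoder for omxh264dec.
variant_omx = pipeline.replace("nvv4l2decoder", "omxh264dec")

# Variant 2: keep nvv4l2decoder but drop h264parse.
variant_no_parse = pipeline.replace("h264parse ! ", "")
```

Either string can then be passed to jetson.utils.gstCamera(1280, 720, ...) as before to see which variant unblocks the decoder.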

Also note that @dusty_nv has recently added support for other video sources such as RTSP, so my dirty patch is now obsolete.
Better to check out the new version.