How to run RTP Camera in deepstream on Nano

Thanks a lot, DaneLLL and vdsx, for the help. I am now able to get the tracker to work, although the tiler does not work very well. The FPS drops when I enable the tiler compared to disabling it.

I also got deepstream-test3-app to work, thanks to the help. However, this app does not have a tracker, and I wish to use a tracker along with the detector. Could you guide me on how to add a tracker to this app? I prefer this app since it is easy to follow and will help me build the entire system I am planning.

DaneLLL-

The Trendnet TV-IP314PI is now working with your new and improved .so libs.

Thanks!

Hi neophyte1,
For clarity, please start a new post. We would like to keep this thread for failures in running certain IP cameras.

Hi,
We have root-caused the issues and will release an updated package in the next ~2 weeks.
Before the release, please use the quick fix in #35.

Hi DaneLLL,

Could you please shed some light on the color format “NvBufferColorFormat_NV12_709_ER”? The documentation describes it as “BT.709 colorspace - Y/CbCr ER 4:2:0 multi-planar”, but I am unable to understand what “ER” stands for.

How is this color format different from “NvBufferColorFormat_NV12”?

I wish to convert this frame buffer to RGB / RGBA color format for saving the images. Kindly let me know how I should go about it.

Please help me out.

Thanks.

Hello,

The quick fix solved the problem.

Thank you.

Hi,

NvBufferColorFormat_NV12 is YUV420 BT.601 with Y in [16,235].
NvBufferColorFormat_NV12_709 is YUV420 BT.709 with Y in [16,235].
NvBufferColorFormat_NV12_709_ER is YUV420 BT.709 with Y in [0,255] (ER = extended/full range).

nvvideoconvert uses the hardware engine and supports conversion to RGBA. However, RGB is not supported, so for RGB you may try the software converter videoconvert.
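
For illustration only, here is a minimal C sketch of chaining the two converters; it is an assumption/sketch rather than an SDK sample, videotestsrc merely stands in for the decoded camera stream, and the file name test_convert.c is hypothetical:

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *err = NULL;
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* nvvideoconvert (hardware) converts to RGBA; videoconvert (software)
   * then produces plain RGB on the CPU, since the hardware path has no RGB. */
  pipeline = gst_parse_launch (
      "videotestsrc num-buffers=100 ! "
      "nvvideoconvert ! video/x-raw,format=RGBA ! "
      "videoconvert ! video/x-raw,format=RGB ! fakesink",
      &err);
  if (!pipeline) {
    g_printerr ("Failed to create pipeline: %s\n",
        err ? err->message : "unknown");
    g_clear_error (&err);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until EOS or error, then clean up. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

It should build with something like: gcc test_convert.c -o test_convert $(pkg-config --cflags --libs gstreamer-1.0).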

Hi DaneLLL,

Thanks for the information. I am trying to access the frame in the tracking probe function in deepstream-app. However, I am unable to understand the dataSize, pitch, width, and height values obtained from NvBufSurfaceParams:

/**
 * Buffer probe function after tracker.
 */
static GstPadProbeReturn
tracking_done_buf_prob (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  NvDsInstanceBin *bin = (NvDsInstanceBin *) u_data;
  guint index = bin->index;
  AppCtx *appCtx = bin->appCtx;
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta) {
    NVGSTDS_WARN_MSG_V ("Batch meta not found for buffer %p", buf);
    return GST_PAD_PROBE_OK;
  }

  GstMapInfo in_map_info;
  NvBufSurface *surface = NULL;

  memset (&in_map_info, 0, sizeof (in_map_info));
  if (!gst_buffer_map (buf, &in_map_info, GST_MAP_READ)) {
    g_print ("Error: Failed to map gst buffer\n");
    return GST_PAD_PROBE_OK;
  }
  surface = (NvBufSurface *) in_map_info.data;

  int batch_size = surface->batchSize;

  for (int i = 0; i < batch_size; ++i)
  {
    uint32_t data_size = surface->surfaceList[i].dataSize;
    uint32_t pitch = surface->surfaceList[i].pitch;
    uint32_t width = surface->surfaceList[i].width;
    uint32_t height = surface->surfaceList[i].height;

    /* Raw pointer to the surface memory (not dereferenced here). */
    void *dataPtr = surface->surfaceList[i].dataPtr;

    printf ("Size of the frame buffer : %u\n", data_size);
    printf ("Pitch of the frame buffer : %u\n", pitch);
    printf ("width of the frame buffer : %u\n", width);
    printf ("height of the frame buffer : %u\n", height);

    NvBufSurfaceColorFormat color_format = surface->surfaceList[i].colorFormat;

    if (color_format == NVBUF_COLOR_FORMAT_NV12)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_ER)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_ER\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_709)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_709\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_709_ER)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_709_ER\n");
  }

  /* Release the mapping before returning. */
  gst_buffer_unmap (buf, &in_map_info);

  /*
   * Output KITTI labels with tracking ID if configured to do so.
   */
  write_kitti_track_output (appCtx, batch_meta);

  if (appCtx->primary_bbox_generated_cb)
    appCtx->primary_bbox_generated_cb (appCtx, buf, batch_meta, index);
  return GST_PAD_PROBE_OK;
}

Following is the output :

Size of the frame buffer : 1572864
Pitch of the frame buffer : 1280
width of the frame buffer : 1280
height of the frame buffer : 720

I am unable to comprehend what is going on. Kindly let me know where I am going wrong, or whether there is any information I am missing.

Thanks.

Hi neophyte1,
For clarity, please create a new post.

Hi DaneLLL,

I have created a new post. Kindly find it at: https://devtalk.nvidia.com/default/topic/1061205/deepstream-sdk/rtsp-camera-access-frame-issue/

Thanks.

Hi @DaneLLL,

I had an issue with running RTSP, so I used the hot fix you provided, and now I have this issue:

**PERF: FPS 0 (Avg)	
**PERF: 0.00 (0.00)	
** ERROR: <cb_newpad3:396>: Failed to link depay loader to rtsp src
** ERROR: <cb_newpad3:396>: Failed to link depay loader to rtsp src
ERROR from udpsrc3: Internal data stream error.
Debug info: gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0/GstUDPSrc:udpsrc3:
streaming stopped, reason not-linked (-1)
Quitting
App run failed

can you help me solve this please?

Hi ibondokji,
Please share the model ID of your IP camera. Also please check if you can run the test app in #29

Hi DaneLLL,

Camera model: Uniview IPC2324EBR-DPZ28

I have run the test app and a black screen pops up. Terminal output with display=1:

Using launch string: rtspsrc location=rtsp://admin:*pass*@192.168.1.** ! rtph264depay ! h264parse ! nvv4l2decoder ! appsink name=mysink sync=false 
[INFO] (NvEglRenderer.cpp:110) <renderer0> Setting Screen width 1920 height 1080
Opening in BLOCKING MODE

Terminal output with display=0:

Invalid MIT-MAGIC-COOKIE-1 keynvbuf_utils: Could not get EGL display connection
Using launch string: rtspsrc location=rtsp://admin:*pass*@192.168.1.** ! rtph264depay ! h264parse ! nvv4l2decoder ! appsink name=mysink sync=false 
Invalid MIT-MAGIC-COOKIE-1 key[ERROR] (NvEglRenderer.cpp:98) <renderer0> Error in opening display
[ERROR] (NvEglRenderer.cpp:154) <renderer0> Got ERROR closing display
Segmentation fault (core dumped)

Thanks.

Hi ibondokji,
Does it work if you execute ‘export DISPLAY=:1’?

Hi DaneLLL, no, just a black screen pops up.

Hi ibondokji,
Your issue is similar to
Jetson Nano Camera with remote Desktop on pipeline IP camera RTSP - Jetson Nano - NVIDIA Developer Forums

I suggest you re-flash the whole package through sdkmanager and try again.

Hi DaneLLL,

I still have issues running DeepStream with an RTSP feed on Xavier. I am running a Basler BIP2-1600-25c camera and followed the recommendations in this post. I am getting an error and a black screen with PERF 0.00 results when I run the command deepstream-app -c myconfig.txt. The config file used is shown below.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
#uri=file://../../samples/streams/sample_1080p_h264.mp4
uri=rtsp://192.168.22/2
#uri=rtsp://192.168.21/0
num-sources=1
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=3
container=1
codec=1
sync=0
bitrate=2000000
output-file=out.mp4
source-id=0

[sink2]
enable=0
type=4
codec=1
sync=0
bitrate=4000000
rtsp-port=8554
udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
#width=1920
#height=1080
width=1280
height=720
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=model_own_int8.engine
labelfile-path=labels8.txt
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
#interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV3_own.txt

[tracker]
enable=0
tracker-width=640
tracker-height=368
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
gpu-id=0
enable-batch-process=1

[tests]
file-loop=0

Error message as follows:

** INFO: <bus_callback:163>: Pipeline ready

ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)
Reset source pipeline reset_source_pipeline 0x7f5b6a6080
,ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)

**PERF: FPS 0 (Avg)	
**PERF: 0.00 (0.00)

Any idea what I am missing? I tested the feed and I can connect to it using the built-in videos app on Xavier. I have also applied the prebuilt libs in case that could help, but no luck.

Hi ralvarezl8d7l,
Are you able to run this sample successfully?

Hi DaneLLL,

Here is the result: bufferformat: NvBufferColorFormat_NV12. It looks like it's running normally.

Hi ralvarezl8d7l,
Then please check if you can run this single-rtsp-source config file.