Thanks a lot DaneLLL and vdsx for the help. I am now able to get the tracker to work, although the tiler does not work very well: the perf FPS drops when I enable the tiler compared to disabling it.
I also got deepstream-test3-app to work, thanks to the help. However, this app does not have a tracker, and I wish to use a tracker along with the detector. Could you guide me on how to add a tracker to this app? I prefer this app since it is easy to follow and will help me build the entire system that I am planning.
Could you please shed some light on the color format "NvBufferColorFormat_NV12_709_ER"? The documentation describes it as "BT.709 colorspace - Y/CbCr ER 4:2:0 multi-planar", but I am unable to understand what "ER" stands for.
How is this color format different from "NvBufferColorFormat_NV12"?
I wish to convert this frame buffer to RGB/RGBA for saving the images. Kindly let me know how I should go about it.
NvBufferColorFormat_NV12 is YUV420 BT.601 with Y in [16,235].
NvBufferColorFormat_NV12_709 is YUV420 BT.709 with Y in [16,235].
NvBufferColorFormat_NV12_709_ER is YUV420 BT.709 with Y in [0,255]; the "ER" suffix stands for extended (full) range.
nvvideoconvert uses the hardware converter engine and supports conversion to RGBA. However, RGB output is not supported in hardware, so for RGB you may try the software converter videoconvert.
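For example, a hypothetical standalone pipeline (the input path and caps below are illustrative, not from this thread) that converts the decoded NV12 frames to RGBA with the hardware nvvideoconvert, copies them out of NVMM memory, and then uses the software videoconvert to produce RGB for saving:

```shell
# nvvideoconvert: hardware NV12 -> RGBA; videoconvert: software RGBA -> RGB
gst-launch-1.0 uridecodebin uri=file:///path/to/input.mp4 ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=RGBA' ! \
  nvvideoconvert ! 'video/x-raw,format=RGBA' ! \
  videoconvert ! 'video/x-raw,format=RGB' ! \
  jpegenc ! multifilesink location=frame_%05d.jpg
```

The second nvvideoconvert instance copies the frame from NVMM (device) memory into system memory, which videoconvert requires.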
Thanks for the information. I am trying to access the frame in the tracking probe function in deepstream-app. However, I am unable to understand the dataSize, pitch, width and height values obtained from NvBufSurfaceParams.
/**
 * Buffer probe function after tracker.
 */
static GstPadProbeReturn
tracking_done_buf_prob (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  NvDsInstanceBin *bin = (NvDsInstanceBin *) u_data;
  guint index = bin->index;
  AppCtx *appCtx = bin->appCtx;
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta) {
    NVGSTDS_WARN_MSG_V ("Batch meta not found for buffer %p", buf);
    return GST_PAD_PROBE_OK;
  }

  GstMapInfo in_map_info;
  NvBufSurface *surface = NULL;
  memset (&in_map_info, 0, sizeof (in_map_info));
  if (!gst_buffer_map (buf, &in_map_info, GST_MAP_READ)) {
    g_print ("Error: Failed to map gst buffer\n");
    return GST_PAD_PROBE_OK;
  }
  surface = (NvBufSurface *) in_map_info.data;

  /* Iterate only over the surfaces actually filled in this batch. */
  guint batch_size = surface->numFilled;
  for (guint i = 0; i < batch_size; ++i) {
    uint32_t data_size = surface->surfaceList[i].dataSize;
    uint32_t pitch = surface->surfaceList[i].pitch;
    uint32_t width = surface->surfaceList[i].width;
    uint32_t height = surface->surfaceList[i].height;
    void *data_ptr = surface->surfaceList[i].dataPtr;
    (void) data_ptr;
    printf ("Size of the frame buffer : %u\n", data_size);
    printf ("Pitch of the frame buffer : %u\n", pitch);
    printf ("width of the frame buffer : %u\n", width);
    printf ("height of the frame buffer : %u\n", height);

    NvBufSurfaceColorFormat color_format = surface->surfaceList[i].colorFormat;
    if (color_format == NVBUF_COLOR_FORMAT_NV12)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_ER)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_ER\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_709)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_709\n");
    else if (color_format == NVBUF_COLOR_FORMAT_NV12_709_ER)
      printf ("color_format: NVBUF_COLOR_FORMAT_NV12_709_ER\n");
  }
  /* Unmap the buffer before the probe returns. */
  gst_buffer_unmap (buf, &in_map_info);

  /*
   * Output KITTI labels with tracking ID if configured to do so.
   */
  write_kitti_track_output (appCtx, batch_meta);

  if (appCtx->primary_bbox_generated_cb)
    appCtx->primary_bbox_generated_cb (appCtx, buf, batch_meta, index);
  return GST_PAD_PROBE_OK;
}
Following is the output:
Size of the frame buffer : 1572864
Pitch of the frame buffer : 1280
width of the frame buffer : 1280
height of the frame buffer : 720
I am unable to comprehend what is going on. Kindly let me know where I am going wrong, or whether there is any information I am missing.
I still have issues running DeepStream with an RTSP feed on the Xavier. I am running a Basler BIP2-1600-25c camera and followed the recommendations of this post. I get an error and a black screen with **PERF: 0.00 results when I run the command deepstream-app -c myconfig.txt. The config file used is shown below.
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl
[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory, applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory, applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory, applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0
[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
#uri=file://../../samples/streams/sample_1080p_h264.mp4
uri=rtsp://192.168.22/2
#uri=rtsp://192.168.21/0
num-sources=1
gpu-id=0
# (0): memtype_device - Memory type Device
# (1): memtype_pinned - Memory type Host Pinned
# (2): memtype_unified - Memory type Unified
cudadec-memtype=0
[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0
[sink1]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=3
container=1
codec=1
sync=0
bitrate=2000000
output-file=out.mp4
source-id=0
[sink2]
enable=0
type=4
codec=1
sync=0
bitrate=4000000
rtsp-port=8554
udp-port=5400
[osd]
enable=1
gpu-id=0
border-width=2
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0
[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
#width=1920
#height=1080
width=1280
height=720
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=model_own_int8.engine
labelfile-path=labels8.txt
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
#interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_yoloV3_own.txt
[tracker]
enable=0
tracker-width=640
tracker-height=368
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
gpu-id=0
enable-batch-process=1
[tests]
file-loop=0
Error message as follows:
** INFO: <bus_callback:163>: Pipeline ready
ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)
Reset source pipeline reset_source_pipeline 0x7f5b6a6080
,ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
Any idea what I am missing? I tested the feed and I can connect to it using the built-in videos app on the Xavier. I have also applied the prebuilt libs in case that could help, but no luck.
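One way to narrow this down is to try the same RTSP URI with a bare GStreamer pipeline before involving deepstream-app (this is a diagnostic sketch; the URI is copied from the [source0] group above, and the pipeline assumes an H.264 stream on a Jetson with DeepStream 4.0 installed):

```shell
# Minimal decode test for the same RTSP URI used in [source0].
# If this also reports "Failed to connect", the problem is the
# camera/URL or network path, not the DeepStream config.
gst-launch-1.0 rtspsrc location=rtsp://192.168.22/2 latency=200 ! \
  rtph264depay ! h264parse ! nvv4l2decoder ! \
  nvegltransform ! nveglglessink
```

Note that "Failed to connect. (Generic error)" comes from rtspsrc itself while retrieving the SDP, i.e. before any decoding starts, so the model and sink settings in the config are not yet involved at that point.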