Using DeepStream installed from the .deb file on Jetson Nano and Jetson NX, the deepstream-image-meta example captures a green image. Running the same example on dGPU machines (1080 Ti and 2060), the image is captured correctly. Is there a different memory buffer to use on the Jetson, or a setting that I am missing?
I am using DeepStream 5.0. I was looking at the memory configuration of the demo deepstream-app, which has a parameter for cudadec-memtype:
(0): memtype_device - Memory type Device
(1): memtype_pinned - Memory type Host Pinned
(2): memtype_unified - Memory type Unified
I was thinking that the difference between the two systems (dGPU and Jetson) might come down to physical memory and where each system decides to store the decoded image data. Some further testing shows that if I run the same code with the sample_720p.h264 file, capturing the vehicle object works on the Jetson NX, but when I use my live RTSP stream, or an MP4 file recorded from that RTSP stream, I just get a green box captured. This also led me to believe that the memory locations might differ because of a difference in the decodebin pipeline configuration. The OSD and display look fine in all scenarios.
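For reference, this is roughly where that knob lives in the deepstream-app config file (the group/key names below follow the deepstream-app source-group format; the RTSP URI is a placeholder, and as I understand it cudadec-memtype only takes effect on dGPU, since the Jetson decoder outputs NVMM surfaces regardless):

```
[source0]
enable=1
# type 4 = RTSP source in deepstream-app
type=4
# placeholder URI for illustration
uri=rtsp://camera-address/stream
# 0 = device, 1 = pinned, 2 = unified (decoder output memory, dGPU)
cudadec-memtype=0
```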
If only the bounding box color is green, you can change the nvdsosd element to CPU mode (see gst-inspect for the element's properties), and that will fix the problem. If the whole image is green, I haven't experienced that.
The whole image is green, which is strange, since the decoded H.264 file and the decoded MP4 file should exit the decode stage into the same memory, and nothing should be different from that point in the pipeline onward.
The code used to capture the image, after parsing the nvdsanalytics metadata and determining that an interesting event occurred, is below; obj_ctx_handle is created with the nvds_obj_enc_create_context() function.
GstMapInfo inmap = GST_MAP_INFO_INIT;
if (!gst_buffer_map (buf, &inmap, GST_MAP_READ)) {
  GST_ERROR ("input buffer mapinfo failed");
} else {
  NvBufSurface *ip_surf = (NvBufSurface *) inmap.data;
  gst_buffer_unmap (buf, &inmap);

  NvDsObjEncUsrArgs userData = { 0 };
  /* To be set by user */
  userData.saveImg = TRUE;
  userData.attachUsrMeta = TRUE;
  /* Bounded formatting into the fixed-size fileNameImg array */
  snprintf (userData.fileNameImg, sizeof (userData.fileNameImg),
      "./images/%d_%lx_car.jpg", frame_meta->frame_num, obj_meta->object_id);
  /* Preset */
  userData.objNum = 1;
  /* Main function call */
  nvds_obj_enc_process (obj_ctx_handle, &userData, ip_surf, obj_meta, frame_meta);
  /* Note: as in the deepstream-image-meta-test sample, nvds_obj_enc_finish
   * (obj_ctx_handle) should be called once after all objects in the buffer
   * have been queued, so that the encode jobs actually complete. */
}
Enter the following commands:
$ sudo apt update
$ sudo apt install --reinstall nvidia-l4t-gstreamer
If apt prompts you to choose a configuration file, reply Y for yes
(to use the NVIDIA updated version of the file).
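To confirm what is installed before and after the reinstall, one way (dpkg-query is standard on the Jetson's Ubuntu base; the fallback message is just for hosts without the package) is:

```shell
# Show the installed version of the package named in the commands above;
# prints a fallback line if it is not installed (e.g. on a non-Jetson host).
dpkg-query -W nvidia-l4t-gstreamer 2>/dev/null || echo "nvidia-l4t-gstreamer: not installed"
```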
Yes, running with an MP4 file that I captured from an RTSP stream, I get an all-green image on the Jetson, while it works correctly on dGPU. Using the sample_1080p_h264.mp4 stream, it works on both devices. I have the video available in Dropbox (the link now shows "Dropbox - File Deleted").
The issue is that I have the same problem with the live RTSP stream. For some reason the Jetson doesn't capture the image properly with code identical to what runs on the dGPU. Also note that the Jetson does play the MP4 fine.
I haven't yet; I've been doing most of my development on dGPU, where it does work. Eventually I will need to move this to a Jetson platform, where this will become more of an issue for me.