Deepstream Back-to-Back-Detector

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): dGPU (AWS T4)
• DeepStream Version: 5.0 devel
• JetPack Version (valid for Jetson only):
• TensorRT Version: 7
• NVIDIA GPU Driver Version (valid for GPU only): 440.82

When I run https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps/tree/master/back-to-back-detectors
the app keeps displaying "Running" but never shows any output.
What changes should I make?
root@bfa7d0cd1ac2:/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/back-to-back-detectors# ./back-to-back-detectors ../../../../samples/streams/sample_720p.h264
Warn: 'threshold' parameter has been deprecated. Use 'pre-cluster-threshold' instead.
Warn: 'threshold' parameter has been deprecated. Use 'pre-cluster-threshold' instead.
Now playing: ../../../../samples/streams/sample_720p.h264
Running...

Hi, I'm not too familiar with AWS T4 instances, but I assume "root@bfa7d0cd1ac2" is your SSH login. Do you actually have X11 forwarded over SSH, or is this a virtual desktop connection like VNC? Do other examples run without problems?


Hi,
I see you are running from the cloud. We can set up a virtual display for a local server with a Tesla GPU by following https://elinux.org/Deepstream/FAQ, but I'm not sure about cloud instances.
Another option is to change the sink to fakesink:
//sink = gst_element_factory_make ("nveglglessink", "nvvideo-renderer");
sink = gst_element_factory_make ("fakesink", "nvvideo-renderer");
Or you could run deepstream-app first with a config using sink type File or RTSPStreaming; with these sink types you can also view the output. Then do some customization of the back-to-back detector app; you can refer to sources/apps/apps-common/src/deepstream_sink_bin.c::create_encode_file_bin or ::create_udpsink_bin


Thanks!