How to accept multiple RTSP streams with DeepStream 7.0

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Orin NX
• DeepStream Version
7.0
• JetPack Version (valid for Jetson only)
JetPack 6.0
• TensorRT Version
TensorRT 8.6.2.3
• NVIDIA GPU Driver Version (valid for GPU only)
NVIDIA UNIX Open Kernel Module for aarch64 540.3.0
• Issue Type( questions, new requirements, bugs)
I want to detect objects from an RTSP source and produce an RTSP output, but it always fails with “Stream format not found, dropping the frame”, followed by the log “streaming stopped, reason not-negotiated (-4)”. The pipeline is adapted from the official DeepStream example source30_1080p_dec_infer-resnet_tiled_display_int8.

Could you attach the format of your rtsp source? It is possible that the decoder does not support this particular format.

rtsp://192.168.50.45:554/live

and the settings of our RTSP stream: H.264, bitrate 3000 kbps, 60 fps, 1920x1080
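For reference, a minimal sketch of how RTSP inputs like this map to deepstream-app source groups (the second camera's URI is hypothetical; `type=4` selects an RTSP source in the deepstream-app config format):

```
[source0]
enable=1
# type 4 = RTSP source in deepstream-app configs
type=4
uri=rtsp://192.168.50.45:554/live
num-sources=1
gpu-id=0

[source1]
enable=1
type=4
# hypothetical second camera; replace with your own URI
uri=rtsp://192.168.50.46:554/live
num-sources=1
gpu-id=0
```

When adding sources, the `[streammux]` group's `batch-size` should also be raised to match the total number of sources.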

Could you try the command below and attach the log?

$ apt-get install ffmpeg
$ ffprobe rtsp://192.168.50.45:554/live

You can also put GST_DEBUG=3 in front of your deepstream-app command and attach the log again. Thanks
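A sketch of capturing that debug log, assuming the stock sample config file name from the thread (substitute your own config path):

```shell
# GST_DEBUG=3 enables GStreamer warning/FIXME-level logging.
# The config file name below is the stock sample; adjust to your setup.
CFG=source30_1080p_dec_infer-resnet_tiled_display_int8.txt

if command -v deepstream-app >/dev/null 2>&1; then
    # GStreamer debug output goes to stderr, so redirect it to a file.
    GST_DEBUG=3 deepstream-app -c "$CFG" 2> gst_debug.log
else
    echo "deepstream-app not found on this machine"
fi
```

The `VAR=value command` form sets the variable only for that one process, so it does not pollute the rest of the shell session.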

We solved that problem, but ran into another one when trying to use our own model for detection. We swapped in our own .engine file, but it turned out

By the way, we are using YOLOv8, and we had already converted the ONNX model into a .engine file.
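For context, one common way to do that conversion is with TensorRT's trtexec tool (file names below are placeholders, not the poster's actual paths; on Jetson, trtexec typically lives under /usr/src/tensorrt/bin/):

```shell
# Build a TensorRT engine from an exported YOLOv8 ONNX model.
# File names are placeholders; adjust them to your own paths.
if command -v trtexec >/dev/null 2>&1; then
    trtexec --onnx=yolov8n.onnx \
            --saveEngine=yolov8n.engine \
            --fp16   # optional: build a half-precision engine
else
    echo "trtexec not found; on Jetson try /usr/src/tensorrt/bin/trtexec"
fi
```

Note that an engine built this way is specific to the GPU and TensorRT version it was built on, so it should be generated on the target Jetson itself.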

You can check whether the layer names of your own model match the names in the configuration file.
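Concretely, the output tensor name in the nvinfer `[property]` group has to match the ONNX graph's actual output name. A rough sketch, assuming a typical Ultralytics YOLOv8 export (whose output tensor is usually named `output0`; the parser function and library names below are placeholders for whatever custom YOLO parser is in use):

```
[property]
onnx-file=yolov8n.onnx
model-engine-file=yolov8n.engine
# Must match the ONNX graph's real output tensor name(s);
# "output0" is the usual name for Ultralytics YOLOv8 exports.
output-blob-names=output0
# YOLOv8 output needs a custom bounding-box parser; these two
# values are placeholders for your own parser implementation.
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=libnvdsinfer_custom_impl_yolo.so
```

If the names disagree, nvinfer cannot find the output blob and inference fails even though the engine itself loads.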

Thanks, we solved the problem

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.