• **Hardware Platform:** Jetson
• **DeepStream Version:** 6.0.1
• **JetPack Version (valid for Jetson only):** 4.6.4
• **TensorRT Version:** 8.2
The RTSP stream that DeepStream outputs through udpsink is very slow to start playing in client tools on the LAN. I set buffer-size to 1048576 (1\*1024\*1024) and bitrate to 2000000, but the stream still takes more than 10 seconds to appear, about 6 seconds at best. How can I reduce this to under 1 second, like other RTSP services?
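For reference, a minimal standalone pipeline with the two properties in question (encoder bitrate, udpsink buffer-size) might look like the sketch below. This is not the DeepStream app itself; the test-pattern source is a stand-in, and the multicast host/port follow the defaults used by the deepstream-rtsp-in-rtsp-out.py sample.

```shell
# Sketch only: same encoder/udpsink properties as in the question,
# fed by a test pattern instead of the real DeepStream pipeline.
gst-launch-1.0 videotestsrc is-live=true \
  ! nvvideoconvert ! 'video/x-raw(memory:NVMM),format=NV12' \
  ! nvv4l2h264enc bitrate=2000000 \
  ! rtph264pay ! udpsink host=224.224.255.255 port=5400 buffer-size=1048576 sync=0
```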
What do you mean by "it takes more than 10 seconds"? Do you mean the client needs 10 seconds before the first video picture comes out? Did you try playing on the server side, i.e. the machine running the RTSP server?
The RTSP service is running the whole time. I mean it takes that long to connect to the RTSP service for the first time; once connected, the picture stays in sync with the camera. The delay occurs again every time the application switches screens and reconnects to the DeepStream RTSP output.
How do you play the RTSP stream? Can you try some other players on the server side running the DeepStream app? Please rule out network and player problems first.
If you are using VLC, you can lower its RTSP receive buffering; some players only start decoding after buffering enough data.
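As a concrete example of lowering the receive buffering, VLC exposes a `--network-caching` option (in milliseconds). The URL below is a placeholder for the actual DeepStream RTSP endpoint.

```shell
# Lower VLC's network cache from the default (1000 ms) to 300 ms.
# rtsp://<jetson-ip>:8554/ds-test is a hypothetical address; substitute yours.
vlc --network-caching=300 rtsp://<jetson-ip>:8554/ds-test
```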
@fanzh I have tested and verified on multiple platforms and players: Android, iOS, VLC on Windows 11, macOS, the Ubuntu media player, etc. Player and OS causes are ruled out. You can reproduce the long connection time yourself with the Python sample <deepstream-rtsp-in-rtsp-out.py>.
@fanzh
Yes. I run the DeepStream program to output an RTSP video stream, then connect to that stream over the LAN and test it with different applications and system platforms.
You can lower the encoder's idrinterval to improve startup time. You can get the property's explanation with:
gst-inspect-1.0 nvv4l2h264enc
encoder.set_property('idrinterval', 60)
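The reason idrinterval matters for startup latency: a client can only begin decoding from an IDR frame, so in the worst case it waits roughly idrinterval / framerate seconds after connecting. A quick back-of-the-envelope check (plain Python, not DeepStream API; the default idrinterval of 256 is what gst-inspect-1.0 reports for nvv4l2h264enc on recent releases):

```python
def worst_case_startup_s(idrinterval: int, fps: float) -> float:
    """Worst-case seconds a client waits for the next IDR frame."""
    return idrinterval / fps

# With the default idrinterval of 256 at 30 fps, the worst case is ~8.5 s,
# consistent with the 6-10 s delays reported above.
print(worst_case_startup_s(256, 30))
# idrinterval=60 brings the worst case down to 2 s; 30 gives 1 s.
print(worst_case_startup_s(60, 30))
print(worst_case_startup_s(30, 30))
```

The trade-off is bitrate: more frequent IDR frames cost more bits at the same quality, so pick the smallest interval your startup-latency target actually needs.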