DeepStream RTSP Augmented Output Stream

Hello All,

I just downloaded and ran the example DeepStream application on my own camera feed. I was able to view the results in the on-screen display as well as write an MP4 file to disk. Is it possible with the current pre-release to stream the augmented video feed back out over RTSP? It appears that this functionality might not be implemented yet.

Thanks in advance!

Hi dtbrown,
This function is not supported in our app, but you can modify the pipeline to support it: add rtph264pay + udpsink after the encoder (see the sketch below).

Which app are you using, nvgstiva or nvgstiva-ui?

Expect a few seconds of delay.
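A minimal sketch of that branch, built with the GStreamer C API from C++. Here videotestsrc and x264enc stand in for the DeepStream augmented feed and the hardware encoder already present in the app (both are placeholders, not the actual nvgstiva elements); the relevant part is the rtph264pay + udpsink tail after the encoder.

```cpp
// Sketch: send an encoded H.264 stream out over RTP/UDP.
// videotestsrc + x264enc are stand-ins for the app's augmented feed and encoder.
#include <gst/gst.h>

int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  GError *err = nullptr;
  GstElement *pipeline = gst_parse_launch(
      "videotestsrc is-live=true ! video/x-raw,width=1280,height=720 ! "
      "x264enc tune=zerolatency ! h264parse ! "
      "rtph264pay config-interval=1 pt=96 ! "
      "udpsink host=127.0.0.1 port=5000 sync=false",
      &err);
  if (!pipeline) {
    g_printerr("Failed to build pipeline: %s\n", err->message);
    g_clear_error(&err);
    return 1;
  }

  gst_element_set_state(pipeline, GST_STATE_PLAYING);

  // Run until interrupted; the DeepStream app would reuse its own main loop instead.
  GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
  g_main_loop_run(loop);

  gst_element_set_state(pipeline, GST_STATE_NULL);
  gst_object_unref(pipeline);
  g_main_loop_unref(loop);
  return 0;
}
```

On the receiving side, a matching udpsrc ! rtph264depay ! avdec_h264 ! autovideosink pipeline (with caps application/x-rtp,encoding-name=H264,payload=96 on udpsrc) can play the stream.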

How to sink YUV420 frames (in device memory) into RTSP?

On the server side, we have YUV420 images ready in device memory (produced with NPP). We want to stream them to clients over RTSP (H.264) in real time. Copying the frames from device to host and then using GStreamer or FFmpeg to serve RTSP is not an acceptable approach (the device-to-host copy takes too much time). I know DeepStream can do this entirely on the device, but I could not find the right example code (C++ preferred). Please give some suggestions.

Thanks!
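Not an official answer, but a common pattern for the serving side is gst-rtsp-server. The sketch below is a minimal example: videotestsrc is a placeholder for the YUV420 frames and omxh264enc is assumed to be the hardware H.264 encoder available on the platform (both are assumptions). It does not show the device-memory (NVMM) hand-off needed to avoid the host copy; that part is platform-specific and not covered here.

```cpp
// Minimal RTSP server sketch using gst-rtsp-server.
// videotestsrc stands in for the application's YUV420 frames; omxh264enc is an
// assumed hardware encoder. The zero-copy path from CUDA device memory is not shown.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  GstRTSPServer *server = gst_rtsp_server_new();
  gst_rtsp_server_set_service(server, "8554");  // serves rtsp://<host>:8554/stream

  GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();
  gst_rtsp_media_factory_set_launch(
      factory,
      "( videotestsrc is-live=true ! video/x-raw,format=I420,width=1280,height=720 ! "
      "omxh264enc ! h264parse ! rtph264pay name=pay0 pt=96 )");
  gst_rtsp_media_factory_set_shared(factory, TRUE);

  GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
  gst_rtsp_mount_points_add_factory(mounts, "/stream", factory);
  g_object_unref(mounts);

  gst_rtsp_server_attach(server, nullptr);

  g_print("RTSP stream ready at rtsp://127.0.0.1:8554/stream\n");
  GMainLoop *loop = g_main_loop_new(nullptr, FALSE);
  g_main_loop_run(loop);
  return 0;
}
```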