I just downloaded and ran the example DeepStream application on my own camera feed. I was able to view the results in the on-screen display as well as write MP4 output to file. Is it possible with the current pre-release to stream the augmented video feed back out over RTSP? It appears that this functionality might not be implemented yet.
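For reference, in DeepStream releases that do support RTSP output, the reference `deepstream-app` exposes it through a sink group in the config file rather than through code changes. A sketch of what that group looks like (exact keys and defaults may differ in a pre-release; `rtsp-port` and `udp-port` values here are illustrative):

```ini
# Hypothetical sink group for deepstream-app enabling RTSP streaming output.
# type=4 selects RTSPStreaming in releases that support it.
[sink1]
enable=1
type=4
# codec: 1 = H.264
codec=1
bitrate=4000000
# Port the built-in RTSP server listens on (stream at rtsp://<host>:8554/ds-test)
rtsp-port=8554
# Internal UDP port used between the encoder branch and the RTSP server
udp-port=5400
```

If the pre-release build rejects `type=4`, that would confirm the RTSP sink is not yet implemented in that version.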
On the server side, we have YUV420 images ready in device memory (produced with NPP). We want to stream them to clients over RTSP (H.264-encoded) in real time. Copying the frames from device to host and then using GStreamer or FFmpeg to serve RTSP is not an acceptable approach (the device-to-host copy takes too much time). I understand DeepStream can do this entirely on the device, but I did not find the right example code (C++ preferred). Please give some suggestions.
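The shape of the pipeline we are after can be sketched with the gst-rtsp-server `test-launch` example and NVIDIA's hardware encoder element. This is only a sketch under assumptions: `nvv4l2h264enc` and `nvvideoconvert` are the NVIDIA-accelerated elements shipped with DeepStream-era GStreamer installs, `test-launch` is the sample binary built from gst-rtsp-server's examples directory, and `videotestsrc` stands in for the real source. For true zero-copy from CUDA, the source would instead be an `appsrc` pushing NVMM (device-memory) buffers into the same encode branch.

```shell
# Sketch: serve an H.264 RTSP stream with GPU-side encoding.
# test-launch is the gst-rtsp-server example app; stream URL becomes
# rtsp://<host>:8554/test
#
# videotestsrc is a placeholder; a real implementation would use appsrc
# with caps "video/x-raw(memory:NVMM), format=I420" and push NvBufSurface-
# backed buffers so the frames never leave the device.
./test-launch "( videotestsrc ! \
    nvvideoconvert ! 'video/x-raw(memory:NVMM), format=I420' ! \
    nvv4l2h264enc bitrate=4000000 ! h264parse ! \
    rtph264pay name=pay0 pt=96 )"
```

The key point is that `nvvideoconvert` and `nvv4l2h264enc` negotiate NVMM caps, so YUV420 frames already resident in device memory are encoded without a device-to-host copy; only the compressed H.264 bitstream crosses to the host for RTP packetization.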