I have modified the above example to use uridecodebin as the source so that it can play .mp4 files or other RTSP sources, but it doesn't work straight away. I understand that I should only demux the H.264 stream out after the uridecodebin and then it might work, but I don't know how to do it. Can you please help?
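For reference, uridecodebin exposes its (already decoded) streams on dynamic pads, so it cannot be linked statically when the pipeline is built; it has to be connected in a "pad-added" callback. Below is a minimal C sketch of that pattern, loosely following the DeepStream sample style; the streammux element, the "sink_0" pad name, and the file URI are assumptions to adapt to your own pipeline.

```c
#include <gst/gst.h>

/* Minimal sketch: uridecodebin exposes its decoded pads dynamically,
 * so link them in a "pad-added" callback instead of at construction time.
 * The "streammux" element and the "sink_0" pad name follow the DeepStream
 * sample convention; adjust them to match your actual pipeline. */
static void
cb_newpad (GstElement *decodebin, GstPad *decoder_src_pad, gpointer data)
{
  GstElement *streammux = (GstElement *) data;
  GstCaps *caps = gst_pad_get_current_caps (decoder_src_pad);
  if (!caps)
    caps = gst_pad_query_caps (decoder_src_pad, NULL);
  const GstStructure *str = gst_caps_get_structure (caps, 0);

  /* Only link the video branch; ignore audio pads. */
  if (g_str_has_prefix (gst_structure_get_name (str), "video")) {
    GstPad *sinkpad = gst_element_get_request_pad (streammux, "sink_0");
    if (gst_pad_link (decoder_src_pad, sinkpad) != GST_PAD_LINK_OK)
      g_printerr ("Failed to link uridecodebin to streammux\n");
    gst_object_unref (sinkpad);
  }
  gst_caps_unref (caps);
}

/* Usage sketch (file URI is just an example):
 *   GstElement *source = gst_element_factory_make ("uridecodebin", "uri-src");
 *   g_object_set (G_OBJECT (source), "uri", "file:///path/to/video.mp4", NULL);
 *   g_signal_connect (source, "pad-added", G_CALLBACK (cb_newpad), streammux);
 */
```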
Regards,
Kai
Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
From the log, it seems the uridecodebin has successfully created the following elements, which are almost identical to the test1-rtsp-out example, i.e. filesrc -> h264parse -> capsfilter -> nvv4l2decoder. The example works for the .h264 file, but when I change the source to uridecodebin, it just hangs at frame 3.
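One way to see exactly what uridecodebin built and where the data flow stalls is to dump the pipeline graph while it is hanging. A minimal sketch using GStreamer's standard dot-file dump facility; the dump name and output directory are arbitrary.

```c
/* Minimal debugging sketch: dump the running pipeline as a Graphviz .dot
 * file to inspect which elements uridecodebin actually created and how
 * they are linked. Requires GST_DEBUG_DUMP_DOT_DIR to be set in the
 * environment, e.g.  GST_DEBUG_DUMP_DOT_DIR=/tmp ./deepstream-app ...
 * "pipeline" is the top-level GstPipeline from the sample. */
GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline),
    GST_DEBUG_GRAPH_SHOW_ALL, "uridecodebin-hang");
/* Convert with: dot -Tpng /tmp/uridecodebin-hang.dot -o pipeline.png */
```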
I will experiment with it some more to see what the problem is.
However, the overlaysink is never linked in the pipeline; instead, a udpsink is linked at the end of the pipeline. Once I removed the overlaysink from the pipeline, it started to work.
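That behaviour is consistent with how GStreamer prerolls: a sink that was added to the bin but never linked never receives a buffer, so the pipeline cannot finish its state change and appears to hang. A minimal sketch of dropping the unused sink before playing, assuming the variable is called overlay_sink (hypothetical name):

```c
/* Minimal sketch: an element that was gst_bin_add()ed but never linked
 * still participates in pipeline state changes and can block preroll.
 * If the overlay sink is not needed (the rtsp-out sample already ends
 * in a udpsink), drop it before setting the pipeline to PLAYING.
 * "overlay_sink" is a hypothetical variable name for the unused sink. */
if (overlay_sink != NULL) {
  gst_element_set_state (overlay_sink, GST_STATE_NULL);
  gst_bin_remove (GST_BIN (pipeline), overlay_sink); /* releases the bin's ref */
}
```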
You never link the sink element into the pipeline, and there is already a udpsink in your pipeline. You must make sure your pipeline is correct before you start coding.
What pipeline do you actually want?