Hi, I am able to do camera preview and recording with the multimedia APIs by following the provided examples. The next step would be streaming, but I don't see any examples about video streaming. Could you please point me to related examples? Thanks
The default samples demonstrate the hardware capability of encoding YUV into an H264/H265 stream. There is no additional code for multiplexing into MP4/MKV files or for streaming out. You would need to refer to public code (such as ffmpeg or live555) and do the integration yourself.
For a quick solution, you may consider using GStreamer. It has an existing implementation for video streaming.
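As a sketch of what "existing implementation" means in practice (not from this thread): a minimal GStreamer pipeline that encodes video to H264 and streams it over RTP/UDP. Here `videotestsrc` stands in for the camera, and the host IP and port are placeholders you would replace with your own.

```shell
# Hypothetical sender: encode a test pattern to H264 and stream over RTP/UDP.
# Swap videotestsrc for your camera source; 192.168.1.100:5000 is a placeholder.
gst-launch-1.0 videotestsrc ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.100 port=5000

# Hypothetical receiver, on the other machine:
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! avdec_h264 ! autovideosink
```

This is only a starting point; on Jetson you would typically substitute the hardware-accelerated encoder element for `x264enc`.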
Thank you for your response! In our application, we have to change the ISP configuration for better image quality, which is based on the multimedia APIs. Is it possible to integrate a GStreamer command for streaming into the multimedia API? Namely, the API will grab frames and encode them to H264, then GStreamer will stream them out.
Please take a look at:
How do I pass gstreamer a NvBuffer as a source for the omxh265enc - #3 by DaneLLL
This is a possible solution for hooking jetson_multimedia_api + GStreamer. Or you may use the nvarguscamerasrc plugin directly.
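To make the hook-up concrete, here is a minimal sketch (my assumption of the approach, not code from the linked post): the jetson_multimedia_api side produces encoded H264 buffers, and each one is pushed into a GStreamer `appsrc`, whose pipeline then streams it over RTP/UDP. The caps, element names after `appsrc`, and the host/port are assumptions to adapt to your setup.

```c
/* Hypothetical sketch: feed H264 buffers from the jetson_multimedia_api
 * encoder into a GStreamer appsrc for RTP/UDP streaming.
 * Requires gstreamer-1.0 and gstreamer-app-1.0 development packages. */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static GstElement *pipeline, *appsrc;

/* Call once at startup, before the encoder loop runs. */
static void setup_streaming(void) {
    gst_init(NULL, NULL);
    pipeline = gst_parse_launch(
        "appsrc name=src is-live=true format=time "
        "caps=video/x-h264,stream-format=byte-stream,alignment=au "
        "! h264parse ! rtph264pay "
        "! udpsink host=192.168.1.100 port=5000",  /* placeholder address */
        NULL);
    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}

/* Call from the encoder's capture-plane callback for each encoded frame. */
static void push_encoded_frame(const void *data, gsize size, GstClockTime pts) {
    GstBuffer *buf = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buf, 0, data, size);   /* copy the encoded bitstream */
    GST_BUFFER_PTS(buf) = pts;
    /* push_buffer takes ownership of buf */
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
}
```

This copies each encoded frame into a new `GstBuffer`, which is simple but not zero-copy; the linked post discusses passing NvBuffers more directly.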
Awesome! Thank you so much!