Hello people, how are you doing? Fine I hope.
So, I'm kind of new to this universe of video analytics and AI, so please be patient with me :).
I'm working on a project using a Jetson Nano (later on we will move to a Jetson Xavier). The project consists of reading a camera and a LiDAR sensor and using both inputs to detect a specific condition in an environment. After detecting the condition, I will perform some actions through the GPIOs, such as turning a relay on or off, activating/deactivating a buzzer, blinking an LED, and so on.
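To make the GPIO part concrete, here is a rough sketch of what I have in mind. The pin numbers and the condition-to-action mapping are just placeholders, not my real wiring; and since `Jetson.GPIO` only exists on the board, the sketch falls back to a dry run (it records the actions instead of driving pins) when the library is missing:

```python
# Sketch of the GPIO side of the project -- pin numbers and the
# condition-to-action mapping are placeholders, not the real values.
# Jetson.GPIO is only available on the board, so off-device the
# actions are just recorded (dry run) instead of driving real pins.

try:
    import Jetson.GPIO as GPIO  # real library on the Jetson
    ON_DEVICE = True
except ImportError:
    ON_DEVICE = False

RELAY_PIN = 18   # placeholder BOARD-mode pin numbers
BUZZER_PIN = 12
LED_PIN = 16

def apply_condition(condition_detected, log):
    """Map a detection result to GPIO actions; append each action to log."""
    actions = [
        (RELAY_PIN, condition_detected),    # relay on while condition holds
        (BUZZER_PIN, condition_detected),   # buzzer follows the relay
        (LED_PIN, not condition_detected),  # LED indicates "all clear"
    ]
    for pin, state in actions:
        if ON_DEVICE:
            GPIO.output(pin, GPIO.HIGH if state else GPIO.LOW)
        log.append((pin, state))
    return log

if ON_DEVICE:
    GPIO.setmode(GPIO.BOARD)
    for pin in (RELAY_PIN, BUZZER_PIN, LED_PIN):
        GPIO.setup(pin, GPIO.OUT)

events = apply_condition(True, [])
print(events)
```

The point of keeping the mapping in one function is that the same action log can later be fed into whatever streaming channel carries the GPIO state out to the tablet.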
For the video processing, I'm using this repository as a baseline: GitHub - dusty-nv/jetson-inference (Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson), which is helping a lot with the work.
At some point, I need to stream all the data from the Jetson board to a computer or a tablet. In short, I need to stream the raw video from the camera, the processed video from jetson-inference, the LiDAR data, and the GPIO actions. The challenge is to maintain some kind of stream server inside the Jetson itself, because I do not have an external network to stream all this data out through.
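For the non-video part (LiDAR readings, GPIO actions), here is a minimal sketch of the kind of "stream server on the Jetson" I'm picturing, using only the Python standard library. The port, topic names, and message fields are made up for illustration; each event goes out as one JSON line, so any client on the laptop/tablet side can parse the stream:

```python
# Minimal sketch of a stream server for the non-video data (LiDAR
# readings, GPIO events). Message fields are made up for illustration.
# Each event is sent as one newline-delimited JSON record.

import json
import socket
import socketserver
import threading

def encode_event(source, payload):
    """Serialize one event as a newline-delimited JSON record."""
    return (json.dumps({"source": source, "data": payload}) + "\n").encode()

class EventHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Push a couple of sample events to whoever connects.
        self.wfile.write(encode_event("lidar", {"distance_m": 1.27}))
        self.wfile.write(encode_event("gpio", {"relay": True}))

# Port 0 lets the OS pick a free port (handy for this self-test).
server = socketserver.TCPServer(("127.0.0.1", 0), EventHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: connect and read the event stream.
with socket.create_connection(("127.0.0.1", port)) as conn:
    lines = conn.makefile().readlines()
server.shutdown()

events = [json.loads(line) for line in lines]
print(events)
```

This obviously doesn't touch the video problem; it's only to show what "streaming the LiDAR data and GPIO actions" means on my side.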
I started looking for existing solutions to this and came up with two approaches:
nginx: It can act as an RTMP server, but the jetson-inference libs support RTP streaming only. Is it possible to feed RTP into RTMP? Or is some kind of protocol conversion necessary?
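For context, this is how I currently consume the RTP output from jetson-inference on the client machine. I'm assuming H.264 on port 1234 here (the defaults suggested in the jetson-inference docs); adjust the port and caps to match your output URI:

```shell
# Receiving side (laptop/tablet) for a jetson-inference RTP stream
# sent to rtp://<client-ip>:1234 -- assumes H.264 encoding.
gst-launch-1.0 -v udpsrc port=1234 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \
  ! rtph264depay ! decodebin ! videoconvert ! autovideosink
```

So the question is really whether something like nginx-rtmp can ingest this kind of RTP/UDP stream directly, or whether it needs to be remuxed first.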
GStreamer: Could solve the problem, but does GStreamer support arbitrary data types? Or is it exclusive to multimedia?
Has anyone solved a similar problem? I'm looking for a direction, but examples are always welcome :).
P.S.: There is no problem in using more than one tool to stream all the data, like GStreamer for the videos and a message broker for the rest. I'm just trying to find out whether a single solution could do all the streaming.
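In the two-tool scenario, the broker side would look something like this. The broker below is a tiny in-process stand-in just to show the topic layout I'm imagining (in a real setup it would be e.g. mosquitto on the Jetson with paho-mqtt clients); the topic names are made up:

```python
# Sketch of the "message broker for the rest" idea: video stays on
# GStreamer/RTP, while LiDAR data and GPIO actions are published to
# topics. MiniBroker is an in-process stand-in for a real broker
# (e.g. mosquitto + paho-mqtt). Topic names are illustrative only.

from collections import defaultdict

class MiniBroker:
    """In-process stand-in for an MQTT-style publish/subscribe broker."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(topic, message)

broker = MiniBroker()
received = []

# The computer/tablet side subscribes to the sensor topics.
broker.subscribe("jetson/lidar", lambda t, m: received.append((t, m)))
broker.subscribe("jetson/gpio", lambda t, m: received.append((t, m)))

# The Jetson side publishes as readings arrive / actions happen.
broker.publish("jetson/lidar", {"distance_m": 0.83})
broker.publish("jetson/gpio", {"buzzer": "on"})
print(received)
```

If a single tool really can carry both the video and this kind of event traffic, I'd happily drop the broker; this is just the fallback I have in mind.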