I haven’t fully worked out how the hardware encoding/decoding module operates. (1) If I handle an MJPEG stream from an IP camera, does the Jetson hardware codec process it automatically, or do I need to enable it explicitly in code?
I’m using Python to handle two MJPEG streams from IP cameras in two separate threads. By “handle” I mean: receive each frame with OpenCV, resize it, undistort it, and send it to the main thread. The main thread concatenates the two frames and displays them in a window.
(2) Do I need to switch to another language (C/C++) to take full advantage of the Jetson’s hardware encoding/decoding capabilities for this task?
(3) Should I use libargus or another JetPack library for MJPEG encoding/decoding?
(4) Does GStreamer provide an API to encode/decode MJPEG on the Jetson hardware?
If I’m not mistaken, libargus is a low-level API for cameras connected directly to the Jetson. Is there any way to handle an IP camera stream with libargus?
For example, using GStreamer to receive the IP camera’s MJPEG stream and then passing it to libargus.
Or maybe I’m wrong, and these operations can be done more simply with GStreamer alone?
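To make question (4) concrete: is something like the pipeline below what hardware decoding through GStreamer would look like? This is only my guess at the pipeline; I’m not certain that `nvv4l2decoder` is the right element for MJPEG, that `mjpeg=1` is the correct property, or that the stream URL format matches my cameras.

```python
def mjpeg_pipeline(url: str) -> str:
    """Build a GStreamer pipeline string for OpenCV's VideoCapture.

    Assumes an MJPEG-over-RTSP stream and that nvv4l2decoder (the
    JetPack hardware decoder element) accepts mjpeg=1; the element
    names here are my best guess from the Jetson documentation.
    """
    return (
        f"rtspsrc location={url} ! rtpjpegdepay "
        "! nvv4l2decoder mjpeg=1 "                # hardware JPEG decode?
        "! nvvidconv ! video/x-raw,format=BGRx "  # NVMM buffer -> CPU memory
        "! videoconvert ! video/x-raw,format=BGR "
        "! appsink drop=1"
    )

# Intended usage (requires OpenCV built with GStreamer support):
# cap = cv2.VideoCapture(mjpeg_pipeline("rtsp://cam1/stream"),
#                        cv2.CAP_GSTREAMER)
```

If this is roughly right, then presumably I wouldn’t need libargus or C/C++ at all, just the right pipeline string per camera.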