Multimedia API video encoding/decoding on Jetson TX2

Hello, Jetson team and developers.

I don’t clearly understand the encoding/decoding hardware module. (1) If I handle an MJPEG stream from an IP camera, does the Jetson hardware module process it automatically, or do I need to handle it explicitly in code?

I’m using Python to handle two MJPEG streams from IP cameras in two different threads. By “handle” I mean: receive each frame with OpenCV, resize it, undistort it, and send it to the main thread. The main thread concatenates the two frames and displays them in a window.
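For reference, the thread layout described above can be sketched like this. The real OpenCV calls (`cap.read()`, `cv2.resize`, `cv2.undistort`, `cv2.hconcat`, `cv2.imshow`) are replaced by a stub `grab_frame` callable so the sketch runs without cameras; this is only an illustration of the structure, not my actual code:

```python
import queue
import threading

def camera_worker(grab_frame, out_q, n_frames):
    """Per-camera thread: grab a frame, preprocess it, hand it to the main thread.

    grab_frame stands in for cv2.VideoCapture(...).read() followed by
    cv2.resize() and cv2.undistort(); those calls are omitted so the
    sketch stays runnable without a camera.
    """
    for _ in range(n_frames):
        frame = grab_frame()      # real code: ok, frame = cap.read(); resize; undistort
        out_q.put(frame)          # bounded queue applies backpressure to the grabber

def main_loop(q_left, q_right, n_frames):
    """Main thread: pair up frames from both cameras and 'display' them."""
    shown = []
    for _ in range(n_frames):
        left = q_left.get()
        right = q_right.get()
        shown.append((left, right))  # real code: cv2.imshow(cv2.hconcat([left, right]))
    return shown

# Wire two workers to the main loop.
q1, q2 = queue.Queue(maxsize=2), queue.Queue(maxsize=2)
t1 = threading.Thread(target=camera_worker, args=(lambda: "L", q1, 3))
t2 = threading.Thread(target=camera_worker, args=(lambda: "R", q2, 3))
t1.start(); t2.start()
frames = main_loop(q1, q2, 3)
t1.join(); t2.join()
```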

(2) Do I need to use another language (C/C++) to solve this task with all of the Jetson’s encoding/decoding capabilities?
(3) Should I use libargus or another library in JetPack for MJPEG encoding/decoding?
(4) Does GStreamer provide an API to encode/decode MJPEG at the Jetson hardware level?

Thanks for your attention!

Please try nvjpegdec https://devtalk.nvidia.com/default/topic/1012417/
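One way to wire `nvjpegdec` into a Python/OpenCV workflow is to build a GStreamer pipeline string and pass it to `cv2.VideoCapture` with the GStreamer backend (this requires an OpenCV build with GStreamer support). A minimal sketch, assuming an HTTP multipart MJPEG camera (`souphttpsrc`/`multipartdemux`); the exact caps `nvjpegdec` emits can vary by JetPack release, so treat this as a starting point rather than a guaranteed pipeline:

```python
def mjpeg_pipeline(url):
    """Build a GStreamer pipeline string for hardware MJPEG decode on Jetson.

    Assumes the camera serves HTTP multipart MJPEG; nvjpegdec does the
    JPEG decode on the hardware engine instead of the CPU.
    """
    return (
        f"souphttpsrc location={url} "
        "! multipartdemux ! image/jpeg "
        "! nvjpegdec "
        "! videoconvert "
        "! appsink"
    )

# Usage (needs OpenCV built with GStreamer support; URL is hypothetical):
#   import cv2
#   cap = cv2.VideoCapture(mjpeg_pipeline("http://192.168.0.10/video"),
#                          cv2.CAP_GSTREAMER)
#   ok, frame = cap.read()
```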

If I’m not mistaken, libargus is a low-level API for cameras connected directly to the Jetson? Is there any way to handle an IP camera stream with libargus?

For example, using GStreamer to receive the IP camera’s MJPEG stream and then passing it to libargus.

Or maybe I’m wrong, and these operations could be done more simply with GStreamer alone?

Hi rostislav,
An IP camera is a kind of rtspsrc. It is an RTSP streaming case like:
https://devtalk.nvidia.com/default/topic/1014789/jetson-tx1/-the-cpu-usage-cannot-down-use-cuda-decode-/post/5188538/#5188538

GStreamer has a full implementation. If you don’t use it, you have to implement all the elements yourself, such as rtspsrc, rtph264depay, and h264parse.
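Those elements chain together into the usual RTSP receive/decode pipeline. A sketch in the same Python style as above, assuming an H.264 IP camera and the JetPack-era hardware elements on TX2 (`omxh264dec`, `nvvidconv`); element availability depends on the L4T release, and the URL is hypothetical:

```python
def rtsp_h264_pipeline(url):
    """Build a GStreamer pipeline string for RTSP H.264 hardware decode on Jetson TX2.

    rtspsrc pulls the stream, rtph264depay/h264parse unpack it, and
    omxh264dec decodes on the hardware engine; nvvidconv converts the
    decoded frames into a CPU-readable format for appsink.
    """
    return (
        f"rtspsrc location={url} "
        "! rtph264depay ! h264parse "
        "! omxh264dec "
        "! nvvidconv ! video/x-raw,format=BGRx "
        "! videoconvert "
        "! appsink"
    )

# Usage (needs OpenCV built with GStreamer support):
#   import cv2
#   cap = cv2.VideoCapture(
#       rtsp_h264_pipeline("rtsp://192.168.0.10:554/stream"),
#       cv2.CAP_GSTREAMER)
```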

Argus is for cameras directly connected to the Tegra CSI ports.