[QUESTION] GStreamer, FFmpeg, or Video Codec SDK: which one should I use?

I’m developing a C++ video processing program that runs on both Jetson and a dGPU (RTX 4080), and it needs NVENC and NVDEC.
After some research, I found that there are three ways to use NVENC/NVDEC: GStreamer, FFmpeg, and the Video Codec SDK.

I have a few questions now:

  • I tried to use the Video Codec SDK on the Jetson AGX Orin platform, but without success; the likely cause is that the linked library is missing some symbols. I couldn’t find any documentation stating whether the Video Codec SDK can be used on the Jetson platform at all. I hope you can answer that question (not the missing-symbol issue itself).
  • I have a working version of the program that uses GStreamer and can access NVENC/NVDEC (a minimal sketch of the kind of pipeline I’m using is shown after this list), but I haven’t tried FFmpeg yet. My questions are:
    • Is there any difference in encoding/decoding efficiency between the two?
    • If I only encode/decode images and video files, without any RTSP/RTP/RTCP or other network handling, is FFmpeg the more appropriate choice?
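For reference, here is a minimal sketch of the kind of pipeline I have working on the Orin. The element names are the JetPack 5.x defaults from the Accelerated GStreamer guide, and the resolution/bitrate values are just placeholders, not my exact code:

```cpp
// Minimal GStreamer + NVENC sketch (Jetson, JetPack 5.x element names).
// Encodes 300 test frames to H.264 with the hardware encoder nvv4l2h264enc.
#include <gst/gst.h>

int main(int argc, char* argv[]) {
    gst_init(&argc, &argv);

    GError* err = nullptr;
    GstElement* pipeline = gst_parse_launch(
        "videotestsrc num-buffers=300 ! "
        "video/x-raw,width=1920,height=1080,framerate=30/1 ! "
        "nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! "
        "nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! "
        "filesink location=out.mp4", &err);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", err->message);
        g_clear_error(&err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until the stream finishes (EOS) or an error is reported.
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```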

I forgot to mention: I’m running JetPack 5.1.2 (arm64) on the Jetson and Ubuntu 20.04 (x64) on the dGPU machine.

Is there anyone who can answer this question?

Hello,

This topic will get better visibility if posted in the Jetson forums. I will move it over for you.

Best,
Tom

Hi,
For now, the software stacks are not fully compatible between Jetson platforms and dGPU. On Jetson platforms, we have jetson_multimedia_api. For dGPU, we have the Video Codec SDK.

For GStreamer, certain plugins work on both Jetson platforms and dGPU. You can install the DeepStream SDK and use those plugins on both platforms. Please check the DeepStream SDK documentation:
NVIDIA Metropolis Documentation
GStreamer Plugin Overview (DeepStream 6.4 documentation)
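As a rough illustration (not an official DeepStream sample): NVIDIA codec elements such as nvv4l2decoder and nvv4l2h264enc are provided on both Jetson (through JetPack) and dGPU (through DeepStream), so a transcode pipeline like the sketch below can usually be launched with the same description string on either platform. The element names and the direct decoder-to-encoder link are assumptions based on the default plugin set; please verify them against the plugin manual for your release.

```cpp
// Sketch: verify that the NVIDIA codec plugins are present, then run a
// file-to-file H.264 transcode that uses NVDEC + NVENC. The same element
// names are assumed to exist on Jetson (JetPack) and dGPU (DeepStream);
// some setups may need a converter (nvvidconv / nvvideoconvert) between
// decoder and encoder.
#include <gst/gst.h>

static bool have_element(const char* name) {
    GstElementFactory* factory = gst_element_factory_find(name);
    if (factory) { gst_object_unref(factory); return true; }
    return false;
}

int main(int argc, char* argv[]) {
    gst_init(&argc, &argv);

    if (!have_element("nvv4l2decoder") || !have_element("nvv4l2h264enc")) {
        g_printerr("NVIDIA codec plugins not found on this platform\n");
        return 1;
    }

    GError* err = nullptr;
    GstElement* pipeline = gst_parse_launch(
        "filesrc location=in.mp4 ! qtdemux ! h264parse ! "
        "nvv4l2decoder ! nvv4l2h264enc ! h264parse ! "
        "qtmux ! filesink location=out.mp4", &err);
    if (!pipeline) {
        g_printerr("Pipeline error: %s\n", err->message);
        g_clear_error(&err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Wait for end-of-stream or an error before tearing down.
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```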


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.