How to run video decoding on dGPU (A10)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
A10 on X86 platform
• DeepStream Version
6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version
8.4.3.1-1+cuda11.6
• NVIDIA GPU Driver Version (valid for GPU only)
Driver Version: 515.65.01
• Issue Type (questions, new requirements, bugs)
Question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — for which plugin or for which sample application — and the function description.)

Does the DeepStream sample application run video decoding on the A10?

I have a setup with an x86 Linux server and an A10 card, and I am running a DeepStream application. Is there a way to identify where the decoding is performed, i.e., on the CPU or on the dGPU?

NVIDIA GPUs contain one or more hardware-based decoders and encoders (separate from the CUDA cores), which provide fully accelerated hardware-based video decoding and encoding for several popular codecs. With decoding/encoding offloaded, the graphics engine and the CPU are free for other operations.
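If you want a quick sanity check outside of your own application, a minimal GStreamer pipeline like the sketch below should exercise the A10's hardware decoder (the input file path is just a placeholder; nvv4l2decoder is the hardware-accelerated decoder element that DeepStream uses on dGPU):

$ gst-launch-1.0 filesrc location=/path/to/sample.h264 ! h264parse ! nvv4l2decoder ! fakesink

While this pipeline runs, the "dec" column in the nvidia-smi output described below should go non-zero.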

You can run the “nvidia-smi” command as shown below. If the “dec” percentage is non-zero, GPU hardware decoding is working.

$ nvidia-smi dmon -i 0
# gpu   pwr gtemp mtemp    sm   mem   enc   dec  mclk  pclk
# Idx     W     C     C     %     %     %     %   MHz   MHz
    0    27    22    24     0     0     0     0  1215   930
    0    27    22    23     0     0     0     0  1215   930
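If you would rather check this from a script than by watching the output, one option is to filter the “dec” column. A rough sketch, assuming the default dmon column layout shown above (column positions can differ between driver versions), which samples ten times and prints a line whenever the hardware decoder is busy:

$ nvidia-smi dmon -i 0 -c 10 | awk '!/^#/ && $8 > 0 { print "NVDEC active, dec =", $8 "%" }'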

Thank you, this is exactly what I was looking for.
I was able to get my data with this command. Thanks!
