How to do media decoding and inferencing on x86 (Linux) with an A10 GPU

Please provide complete information as applicable to your setup.

• Hardware Platform (X86 Server + Nvidia A10 GPU)
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions)

Hello,
I would like to use the DeepStream SDK to do video analytics on an x86 server with an NVIDIA A10 card. I am currently running a DS app (on Xavier) which includes decoding (using NV HW acceleration), inferencing, and a fakesink. I would like to migrate this to an x86 host so that I can do the decoding on the host x86 CPU and the inferencing on the A10 card. Is there an NVIDIA DS plugin that does decoding on the x86 CPU? Any pointer to DS sample code that achieves a similar objective, i.e. does video analytics using the host CPU for decoding and the NVIDIA A10 for inferencing, would be very much appreciated.

GStreamer should find a software video decoder for the decoding step. Please make sure the GStreamer software video decoder plugins are installed.
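For example, a quick way to verify that the FFmpeg/libav software decoders are visible to GStreamer is the sketch below (an assumption-laden example, not an official tool: it assumes PyGObject is installed, and that on Debian/Ubuntu the decoders come from the gstreamer1.0-libav package):

```python
# Quick check that the FFmpeg/libav software decoders are visible to GStreamer.
# Assumes PyGObject is installed; on Debian/Ubuntu the decoders ship in the
# gstreamer1.0-libav package.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
for name in ("avdec_h264", "avdec_h265"):
    factory = Gst.ElementFactory.find(name)
    status = "available" if factory else "MISSING (install gstreamer1.0-libav)"
    print(f"{name}: {status}")
```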

Thanks for your reply. If I use DeepStream, don’t I need an NVIDIA plug-in that supports decoding on an x86 CPU (AMD or Intel)?

NVIDIA does not provide a software video decoder. GStreamer has software video decoders based on FFmpeg (libav).
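For reference, here is a minimal sketch (not an official NVIDIA sample) of such a pipeline built from Python via PyGObject: avdec_h264 (the GStreamer libav plugin) decodes on the host CPU, while nvinfer runs the TensorRT engine on the A10. The stream and config paths shown are the DeepStream 5.1 default install locations and are assumptions; adjust them to your setup.

```python
#!/usr/bin/env python3
# Sketch: CPU (software) decode + GPU inference with DeepStream 5.1 on x86.
# Assumes the GStreamer libav plugins (avdec_h264) are installed and that the
# DeepStream sample stream/config paths below match your install; adjust them.

import sys
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

INPUT_H264 = "/opt/nvidia/deepstream/deepstream-5.1/samples/streams/sample_720p.h264"
PGIE_CONFIG = "/opt/nvidia/deepstream/deepstream-5.1/samples/configs/deepstream-app/config_infer_primary.txt"

PIPELINE_DESC = (
    # filesrc + h264parse feed the elementary stream to the FFmpeg/libav
    # software decoder (avdec_h264), which runs entirely on the host CPU.
    f"filesrc location={INPUT_H264} ! h264parse ! avdec_h264 ! "
    # nvvideoconvert copies the decoded frames from system memory into
    # NVMM (GPU) memory, which nvstreammux requires.
    "nvvideoconvert ! video/x-raw(memory:NVMM),format=NV12 ! "
    "mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 ! "
    # nvinfer runs the TensorRT engine on the A10; fakesink discards output.
    f"nvinfer config-file-path={PGIE_CONFIG} ! fakesink sync=false"
)

def main():
    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE_DESC)
    loop = GLib.MainLoop()

    def on_message(bus, msg):
        # Stop on end-of-stream or error.
        if msg.type == Gst.MessageType.ERROR:
            err, dbg = msg.parse_error()
            print(f"ERROR: {err} ({dbg})", file=sys.stderr)
            loop.quit()
        elif msg.type == Gst.MessageType.EOS:
            loop.quit()

    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", on_message)

    pipeline.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)

if __name__ == "__main__":
    main()
```

The same pipeline description can be tried with gst-launch-1.0 for a quick test; the key point is the nvvideoconvert step, which moves the CPU-decoded frames into NVMM (GPU) memory before nvstreammux batches them for nvinfer.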
