• Hardware Platform ( GPU): A30X
• DeepStream Version: 6.1.1
• TensorRT Version: 8.4.1.5
• NVIDIA GPU Driver Version: 515.65.01
• Issue Type: questions
I would like to confirm two points about the A30 bug I wrote about in a previous topic.
Title: Nvv4l2decoder got stuck in DeepStream 6.1.1
• Hardware Platform ( GPU): A30X
• DeepStream Version: 6.1.1
• TensorRT Version: 8.4.1.5
• NVIDIA GPU Driver Version: 515.65.01
• Issue Type: questions
I started the DeepStream docker container (nvcr.io/nvidia/deepstream:6.1.1-devel) and ran the following pipeline:
```
gst-launch-1.0 -e filesrc location=./xxx.mp4 ! \
  qtdemux ! \
  h264parse ! \
  nvv4l2decoder ! \
  fakesink dump=True
```
However, nvv4l2decoder got stuck in DeepStream 6.1.1.
Do you know why nvv4l2decoder gets stuck?
Is there a pro…
Regarding the A30 card bug, have the fixed drivers been published?
Where can I find the details of the bug and its scope of impact?
Driver 520.61.05 can work.
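Since the fix hinges on the driver version, a quick way to check whether a host already runs the fixed driver is to compare version strings with `sort -V`. A minimal sketch follows; the `nvidia-smi` query is shown as a comment, and the installed version is hardcoded from this thread's header as an example:

```shell
# Sketch: check whether the installed NVIDIA driver is at least the
# version reported to work in this thread (520.61.05).
# On a real GPU host you would query the actual version:
#   installed=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader | head -n1)
# Here we hardcode the version from the original post as an example.
installed="515.65.01"
fixed="520.61.05"

# sort -V orders version strings numerically; if the installed version
# sorts first and differs from the fixed one, the driver is older.
oldest=$(printf '%s\n%s\n' "$installed" "$fixed" | sort -V | head -n1)
if [ "$oldest" = "$installed" ] && [ "$installed" != "$fixed" ]; then
  status="older-than-fix"
else
  status="ok"
fi
echo "$status"
```

With the 515.65.01 driver from the original post this prints `older-than-fix`, matching the thread's conclusion that an upgrade is needed.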
Thank you for your quick reply.
I have a question about Driver 520.61.05.
When I try to download it from the public site, it is not listed as a driver supported by the A30X. Is it still possible to use it with the A30X?
There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks
Driver 520.61.05 works with the A30 (A30 Tensor Core GPU for AI Inference | NVIDIA).
system closed this topic on January 9, 2023, 6:31am.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.