VSS Inference Failure with nvila-8b-video Model
- Hardware Platform: 8x NVIDIA H20 GPUs
- Ubuntu Version: 22.04
- NVIDIA GPU Driver Version: 550.127.08
- Issue Type: bugs
When running inference with the nvila-8b-video model in VSS on our server with 8x H20 GPUs, the server reports an error and inference fails.

The nvila-15b-lite-highres-lita model runs in VSS without any problem; the error only occurs after switching to the nvila-8b-video model.
We appreciate any help or suggestions to resolve this issue.
