VSS Inference Failure with nvila-8b-video Model

  • Hardware Platform: 8x NVIDIA H20 GPUs

  • Ubuntu Version: 22.04

  • NVIDIA GPU Driver Version: 550.127.08

  • Issue Type: bugs

  • When running inference with the nvila-8b-video model in VSS on our server with 8x H20 GPUs, the server reports an error and inference fails (a small sketch for capturing the GPU and driver state on this host is shown after this list).

  • There is no problem when running the nvila-15b-lite-highres-lita model in VSS, but the error occurs after switching to the nvila-8b-video model.

  • We appreciate any help or suggestions to resolve this issue.
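Since the post does not include the actual error output, a minimal sketch is shown below for recording the per-GPU driver and memory state on the 8x H20 host around the time of the failure. It only calls plain `nvidia-smi` from Python with standard query fields; nothing here is VSS-specific, and it is meant purely as an illustration of the kind of diagnostic information worth attaching.

```python
import subprocess

# Query the driver version and per-GPU memory so the failing environment can be
# described precisely in the report. These are standard nvidia-smi query fields,
# not VSS-specific options.
result = subprocess.run(
    [
        "nvidia-smi",
        "--query-gpu=index,name,driver_version,memory.total,memory.used",
        "--format=csv,noheader",
    ],
    capture_output=True,
    text=True,
    check=True,
)

# Print one line per GPU: index, name, driver version, total and used memory.
for line in result.stdout.strip().splitlines():
    print(line)
```

Attaching this output together with the VSS container logs from the moment the error appears would make the failure easier to reproduce and diagnose.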

Thanks. We will investigate the problem ASAP and see if we can support this model in the future.