CUDAExecutionProvider is not in available provider names

Hi Nvidia community,

I am trying to run a deep learning model on my NVIDIA Jetson Xavier NX board, but when I run my ONNX Runtime code I get the error ‘CUDAExecutionProvider is not in available provider names’.

Is anyone else experiencing this issue on the Jetson Xavier NX board, or does anyone have any suggestions for how I can resolve this? I would really appreciate any help or guidance that you can provide. Thanks in advance!

Hi,

Could you share how you installed ONNX Runtime?
Based on the error, it seems the package was not built with CUDA support.
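
As a quick check, a small snippet along these lines should show whether the installed wheel was built with CUDA support:

import onnxruntime

# Expected to print "GPU" for a CUDA-enabled build and "CPU" for a CPU-only build
print(onnxruntime.get_device())

# A CUDA-enabled build should list "CUDAExecutionProvider" here
print(onnxruntime.get_available_providers())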

You can find our prebuilt package on the page below.
It is built with CUDA support, so you can use it instead.

https://elinux.org/Jetson_Zoo#ONNX_Runtime

Thanks.

Hi @AastaLLL,

I installed it from the link you provided (Jetson Zoo - eLinux.org).

My onnx packages:

onnx                1.13.0
onnxruntime         1.12.0
onnxruntime-gpu     1.12.1

My code:

import time
import soundfile as sf
import onnxruntime
import yaml

from ttstokenizer import TTSTokenizer

with open("/home/jetson/.cache/espnet_onnx/kan-bayashi/ljspeech_vits/config.yaml", "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)
# Create tokenizer
tokenizer = TTSTokenizer(config["token"]["list"])

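# Create the inference session, requesting the CUDA provider first with a CPU fallback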
model = onnxruntime.InferenceSession(
    "/home/jetson/.cache/espnet_onnx/kan-bayashi/ljspeech_vits/full/vits.onnx",
    providers=[
        ("CUDAExecutionProvider", {"cudnn_conv_algo_search": "DEFAULT"}),
        "CPUExecutionProvider",
    ],
)

# Tokenize inputs
inputs = tokenizer("Hello, How are you?")
start = time.time()
# Generate speech
outputs = model.run(None, {"text": inputs})
speech = outputs[0]
print("Time:", time.time() - start)
# Write to file
sf.write("out.wav", speech, 22050)
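
If it helps with the diagnosis, the providers the session actually selected can be printed with something like this right after creating the session:

# Reports the execution providers the session ended up with;
# a CPU-only build will show only "CPUExecutionProvider"
print(model.get_providers())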


There has been no update from you for a while, so we assume this is no longer an issue.
Hence, we are closing this topic. If you need further support, please open a new one.
Thanks

Hi,

Which JetPack version do you use?
Could you share the output of the following code?

>>> import onnxruntime as ort
>>> ort.get_available_providers()
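
For reference, a CUDA-enabled build should list 'CUDAExecutionProvider' (the Jetson Zoo wheels typically also expose 'TensorrtExecutionProvider') in addition to 'CPUExecutionProvider'; a CPU-only build reports only 'CPUExecutionProvider'.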

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.