CUDA 12 on Windows Server

Hello,
I installed CUDA 12.0 and cuDNN 8.8.1.3 on Windows Server 2019. When I try to run inference with my ONNX model, I get this error:

2023-04-04 14:03:26.6473808 [E:onnxruntime:Default, provider_bridge_ort.cc:1304 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1106 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\tosandm\AppData\Local\Programs\Python\Python39\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

Traceback (most recent call last):
  File "", line 1, in
  File "C:\Users\tosandm\AppData\Local\Programs\Python\Python39\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 360, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "C:\Users\tosandm\AppData\Local\Programs\Python\Python39\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 408, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:537 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page, make sure they're in the PATH, and that your GPU is supported.

My onnxruntime version is 1.14.1.
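For reference, LoadLibrary error 126 generally means a DLL that onnxruntime_providers_cuda.dll depends on could not be found. Here is a small stdlib-only snippet I used to sanity-check my environment (check_cuda_paths is just a helper name I made up); it only verifies that CUDA_PATH is set and that its bin directory, where the CUDA runtime DLLs live, is on PATH:

```python
import os

def check_cuda_paths():
    # Report whether CUDA_PATH is set and whether its bin directory
    # (which holds cudart64_*.dll and the other CUDA runtime DLLs)
    # appears on PATH, since onnxruntime loads those DLLs at runtime.
    cuda_path = os.environ.get("CUDA_PATH")
    if not cuda_path:
        return "CUDA_PATH is not set"
    bin_dir = os.path.normcase(os.path.join(cuda_path, "bin")).rstrip("\\/")
    on_path = any(
        os.path.normcase(p).rstrip("\\/") == bin_dir
        for p in os.environ.get("PATH", "").split(os.pathsep)
    )
    return "CUDA_PATH=%s; bin on PATH: %s" % (cuda_path, on_path)

print(check_cuda_paths())
```

Both checks pass for me, so the missing dependency seems to be something else (maybe the cuDNN DLLs, or a CUDA version onnxruntime doesn't expect).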
