deserialize_cuda_engine returns None

When running deserialize_cuda_engine, the returned engine is None.
I used a Keras model that I converted to an ONNX file and then converted again to a TRT engine file.

I used this GitHub page as a reference: keras_imagenet/README_tensorrt.md at master · jkjung-avt/keras_imagenet · GitHub

and used these models:
eyemodeleasier.h5 (12.7 MB)
eyemodeleasier.onnx (4.2 MB)
eyemodeleasier1.trt (4.3 MB)

thanks in advance

Hi,
Please share the ONNX model and the script, if not shared already, so that we can assist you better.
Meanwhile, you can try a few things:

  1. Validate your model with the snippet below.

check_model.py

import onnx

filename = "yourONNXmodel"  # replace with the path to your ONNX file
model = onnx.load(filename)
onnx.checker.check_model(model)  # raises an exception if the model is invalid
  2. Try running your model with the trtexec command.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
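As an illustration of step 2, a typical trtexec invocation might look like the following (a sketch only, assuming the ONNX file name from the post above; trtexec writes its log to stdout, so redirecting it captures the verbose output):

```shell
# Build a TensorRT engine from the ONNX model with verbose logging,
# redirecting stdout and stderr into a log file for sharing
trtexec --onnx=eyemodeleasier.onnx \
        --saveEngine=eyemodeleasier1.trt \
        --verbose > trtexec_verbose.log 2>&1
```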
Thanks!

Hello,

I have already uploaded all the files in the post above.

I ran check_model.py with print(onnx.checker.check_model(model)) and it prints None.

For trtexec --verbose: where is the log file located after running the ONNX model with trtexec --verbose? Thanks.

Hi,

This looks like a CUDA context related issue while using the model for inference. We recommend that you share a repro script and the complete error logs so that we can try it on our end for better debugging.

Also, please refer to the following samples and make sure your inference script is correct.
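For reference, here is a minimal deserialization sketch in Python (an assumption-laden example, not your script: it uses the TensorRT Python API, pycuda to create a CUDA context, and the engine file name from the post above). deserialize_cuda_engine commonly returns None when no CUDA context is active, when plugins the engine needs were not initialized, or when the TensorRT version at runtime differs from the one that built the engine; the verbose logger output usually says which.

```python
import tensorrt as trt
import pycuda.autoinit  # noqa: F401 -- importing this creates and activates a CUDA context

# A verbose logger so deserialization failures are explained on stderr
TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

# Initialize any TensorRT plugins the engine may rely on
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

# Deserialize the engine built earlier (file name assumed from the post above)
with open("eyemodeleasier1.trt", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

if engine is None:
    # Check the logger output above: a version mismatch or missing plugin
    # is the usual cause when deserialization returns None
    raise RuntimeError("Engine failed to deserialize")
```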

Thank you.