Hi. I trained an MNIST model with my own architecture using TensorFlow 1.15.
I saved the model and built a frozen-graph .pb file.
After that I converted the .pb file to a .uff file with the convert_to_uff.py script that comes with JetPack.
I am using JetPack 4.5 and TensorFlow 1.15.
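For reference, the conversion step looked roughly like this (the script location and the output-node name are placeholders, not copied from my setup):

```shell
# Hypothetical invocation of JetPack's UFF converter; adjust the script
# path and the real output-node name of the frozen graph.
python3 /usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py \
    my_model.pb \
    -o my_model.uff \
    -O output_node_name
```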
When building the TensorRT engine I get this error:
[TensorRT] ERROR: UffParser: Could not open /home/plate/project/tensorrt_demos/my_mnist/models/my_model.uff
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
Traceback (most recent call last):
File "/home/plate/project/tensorrt_demos/my_mnist/test.py", line 64, in
File "/home/plate/project/tensorrt_demos/my_mnist/test.py", line 50, in main
with build_engine(model_file) as engine:
Please help me solve this problem.