Description
I would like to ask how to use TensorRT models in a Streamlit app. Loading a TF-TRT SavedModel from a Streamlit script fails; I suspect the problem is related to threading, but I don't understand the TensorRT and Streamlit internals well enough to debug it. A simplified version of the loading code is sketched below.
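For context, the loading code boils down to the following (paths and variable names are simplified placeholders; this is a sketch, not the full app):

```python
# main.py (simplified): the TF-TRT SavedModel is loaded at module level, so
# Streamlit re-executes this on every rerun, each time in a fresh
# ScriptRunner thread.
import os
import tensorflow as tf
import streamlit as st

TRT_MODELS_PATH = 'trt_models'  # placeholder for the real path

# support.py:114 effectively does this when trt_enabled is True:
model_grav = tf.saved_model.load(os.path.join(TRT_MODELS_PATH, 'grav'))

st.title('Engraving demo')  # placeholder UI; inference happens further down
```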
Environment
TensorRT Version: 8.2.0.5 (reported in logs as "TensorRT v8205")
GPU Type: NVIDIA GeForce RTX 2080 Ti
Nvidia Driver Version: 510.73.05
CUDA Version: 11.6
CUDNN Version:
Operating System + Version: Ubuntu 22.04 LTS
Python Version (if applicable):
TensorFlow Version (if applicable): 2.9.1
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag): Nvidia Docker Container nvcr.io/nvidia/tensorflow:22.06-tf2-py3
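For completeness, the model under TRT_MODELS_PATH/'grav' is a TF-TRT converted SavedModel; the conversion looked roughly like this (the directories and precision mode here are illustrative, not my exact values):

```python
# Sketch of the TF-TRT conversion that produced the model being loaded.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir='saved_models/grav',  # placeholder input path
    precision_mode=trt.TrtPrecisionMode.FP16,   # assumption; may differ
)
converter.convert()
converter.save('trt_models/grav')  # loaded later via tf.saved_model.load
```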
This is the exception message:
2022-07-21 16:48:42.472 Uncaught app exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/streamlit/scriptrunner/script_runner.py", line 557, in _run_script
    exec(code, module.__dict__)
  File "/workspace/medin-gravirovani-gui/main.py", line 37, in <module>
    model_grav = supp.get_model_grav(VARS['resolution'], optimized=False, trt_enabled=VARS['trt_enabled'])
  File "/workspace/medin-gravirovani-gui/support.py", line 114, in get_model_grav
    if trt_enabled: return tf.saved_model.load(os.path.join(TRT_MODELS_PATH, 'grav'))
  File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/saved_model/load.py", line 782, in load
    result = load_partial(export_dir, None, tags, options)["root"]
  File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/saved_model/load.py", line 912, in load_partial
    loader = Loader(object_graph_proto, saved_model_proto, export_dir,
  File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/saved_model/load.py", line 192, in __init__
    init_op = node._initialize() # pylint: disable=protected-access
  File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/eager/execute.py", line 54, in quick_execute
    tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
tensorflow.python.framework.errors_impl.InternalError: Graph execution error:

Detected at node 'InitializeTRTResource' defined at (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 890, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.8/dist-packages/streamlit/scriptrunner/script_runner.py", line 296, in _run_script_thread
    self._run_script(request.rerun_data)
  File "/usr/local/lib/python3.8/dist-packages/streamlit/scriptrunner/script_runner.py", line 557, in _run_script
    exec(code, module.__dict__)
  File "/workspace/medin-gravirovani-gui/main.py", line 37, in <module>
    model_grav = supp.get_model_grav(VARS['resolution'], optimized=False, trt_enabled=VARS['trt_enabled'])
  File "/workspace/medin-gravirovani-gui/support.py", line 114, in get_model_grav
    if trt_enabled: return tf.saved_model.load(os.path.join(TRT_MODELS_PATH, 'grav'))
Node: 'InitializeTRTResource'
Expect engine cache to be empty, but got 1 entries.
	 [[{{node InitializeTRTResource}}]] [Op:__inference_restored_function_body_165408]
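My guess from the "Expect engine cache to be empty, but got 1 entries." message is that tf.saved_model.load() runs again (in a new ScriptRunner thread on a rerun) while the TRT engines from the first load are still registered. My current workaround idea is to memoize the load so every rerun reuses one model object; a minimal sketch, assuming Streamlit's experimental singleton cache is safe to use here (get_model_grav_cached is a name I made up):

```python
# support.py (sketch): load the TF-TRT SavedModel once per server process
# instead of once per Streamlit script rerun.
import os
import tensorflow as tf
import streamlit as st

TRT_MODELS_PATH = 'trt_models'  # placeholder

@st.experimental_singleton  # on Streamlit >= 1.18 this is st.cache_resource
def get_model_grav_cached():
    # Deserializing the model registers its TRT engines in a process-wide
    # cache; a second load of the same model appears to trip the
    # "engine cache to be empty" check in InitializeTRTResource.
    return tf.saved_model.load(os.path.join(TRT_MODELS_PATH, 'grav'))
```

Is caching the loaded model like this the intended way to use a TF-TRT SavedModel from a multi-threaded server such as Streamlit, or is there a supported pattern I am missing?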