Hello. After I use trt.utils.write_engine_to_file to write a serialized engine to a file, I use trt.utils.load_engine to deserialize the engine from that binary file, but the console produces a lot of log output:
I set the logger severity to trt.Logger.ERROR, but it does not work and the console still produces these logs.
How can I turn off these logs? Can you give me some advice?
To help us debug, can you provide a repro package containing the source that writes the serialized engine and then deserializes it, demonstrating the symptoms you are seeing?
Also, can you provide details on the platforms you are using?
Linux distro and version
GPU type
NVIDIA driver version
CUDA version
cuDNN version
Python version [if using Python]
TensorFlow version
TensorRT version
root@fe42e86bbeb2:/mnt/tensorrt-infer# python serializeAndDeserializeEngine.py
/usr/lib/python2.7/dist-packages/tensorrt/legacy/infer/__init__.py:5: DeprecationWarning: The infer submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The infer submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/parsers/__init__.py:4: DeprecationWarning: The parsers submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The parsers submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/utils/__init__.py:59: DeprecationWarning: The utils submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The utils submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/lite/__init__.py:59: DeprecationWarning: The lite submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The lite submodule will been removed in a future version of the TensorRT Python API")
[TensorRT] ERROR: UFFParser: Invalid UFF file, cannot be opened
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
File "serializeAndDeserializeEngine.py", line 62, in <module>
main()
File "serializeAndDeserializeEngine.py", line 57, in main
trt.legacy.utils.write_engine_to_file("./engines/1376_800.engine", engine.serialize())
AttributeError: 'NoneType' object has no attribute 'serialize'
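The AttributeError above happens because the UFF parse failed, so the build step returned None instead of an engine, and .serialize() was then called on None. A minimal guard might look like this sketch (build_engine_from_uff is a hypothetical stand-in for the real parser/builder call, not a TensorRT API):

```python
def build_engine_from_uff(uff_path):
    """Stand-in for the real parse-and-build call, which returns None
    when the UFF file cannot be opened or no output is marked."""
    return None  # simulating the failure shown in the log above

engine = build_engine_from_uff("./model.uff")
if engine is None:
    # Fail early with a clear message instead of hitting
    # AttributeError: 'NoneType' object has no attribute 'serialize'
    msg = "engine build failed: check the UFF file and marked outputs"
else:
    msg = "engine built"
print(msg)
```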
Maybe you uploaded a bad UFF file? Anyway, I think the trick is to set the verbosity BEFORE any other imports; that way, when you import TF and TRT, they pick up the appropriate level.
So do this FIRST:
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '0'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'
… and don’t import os again.
Play around with the level to get your desired setting.
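Putting the pieces above together, a minimal sketch of the suppression setup might look like this. Note TF_CPP_MIN_LOG_LEVEL filters more as the value increases (0 shows everything, 3 shows errors only), and TRT_SUPPRESS_DEPRECATION_WARNINGS is the variable named in the deprecation warnings pasted above; the exact behavior depends on your TF/TRT versions:

```python
# Set log-level env vars BEFORE importing tensorflow or tensorrt,
# so the libraries read them at import time.
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'   # 0 = all logs ... 3 = errors only
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'  # verbose logging off
os.environ['TRT_SUPPRESS_DEPRECATION_WARNINGS'] = '1'  # silence the banners above

# Only now import the heavy libraries:
# import tensorflow as tf
# import tensorrt as trt
print(os.environ['TF_CPP_MIN_LOG_LEVEL'])
```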
I also hit the same problem:
"TypeError: deserialize_cuda_engine(): incompatible function arguments."
When using write_engine_to_file to create the engine file, no error occurs.
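That TypeError usually means deserialize_cuda_engine received something other than the serialized buffer, for example the file path string instead of the file's bytes. A minimal round-trip sketch (the engine bytes here are a placeholder; the real buffer comes from engine.serialize(), and the TensorRT calls are shown only as comments since they depend on your API version):

```python
import os
import tempfile

# Stand-in bytes for a serialized engine.
engine_bytes = b"\x00serialized-engine-placeholder"

path = os.path.join(tempfile.mkdtemp(), "sample.engine")
with open(path, "wb") as f:   # write in binary mode
    f.write(engine_bytes)

with open(path, "rb") as f:   # read back as bytes, not as a str path
    buf = f.read()

# With TensorRT, this buffer is what you would hand to the runtime, e.g.:
#   runtime = trt.Runtime(logger)
#   engine = runtime.deserialize_cuda_engine(buf)
# Passing the *path string* instead of the bytes is one way to trigger
# the "incompatible function arguments" TypeError.
print(buf == engine_bytes)
```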
Replying to #10, I'm still getting the following error.
root@ed63b37475ff:/mnt/tensorrt-infer# python serializeAndDeserializeEngine.py
/usr/lib/python3.5/dist-packages/tensorrt/legacy/infer/__init__.py:5: DeprecationWarning: The infer submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The infer submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/parsers/__init__.py:4: DeprecationWarning: The parsers submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The parsers submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/utils/__init__.py:59: DeprecationWarning: The utils submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The utils submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/lite/__init__.py:59: DeprecationWarning: The lite submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
warn_deprecated("The lite submodule will been removed in a future version of the TensorRT Python API")
[TensorRT] ERROR: UFFParser: Invalid UFF file, cannot be opened
[TensorRT] ERROR: Network must have at least one output
Aside from the UFF problem, can you try the following?
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '0'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'
import sys, cv2
import common
I changed some lines in the main function of sample.py in the official Python sample 'network_api_pytorch_mnist' to learn about engine serialization, while model.py remained unchanged.
All three tests in the main function failed. sample.py.zip (1.58 KB)