Using `trt.utils.load_engine` to deserialize from a binary engine file produces too many logs

Hello. After I use trt.utils.write_engine_to_file to write a serialized engine to a file, I use trt.utils.load_engine to deserialize the engine from this binary file, but the console produces a huge amount of log output:

a6=\xb3K\t>\x8e\xb5$\xbe\xa5\xae\x1d\xbd:")>\xe4\x17\x04=@\x9a\xc8\xbd\x82I\x0c=\xb2L\xf5\xbdP4/\xbeO\xb0\xd0\xbdXN\x1f\xbb}\xb7\r\xbe ...
[several thousand more bytes of raw serialized-engine data omitted]

I set the logger level to trt.Logger.ERROR, but it does not help and the console still produces these logs.
How can I turn off these logs? Can you give me some advice?
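If raising the TensorRT logger severity does not help, part of this output may be written directly to the process streams rather than through the logger. One blunt workaround is to silence stdout/stderr around the offending call. This is a generic Python sketch, not TensorRT-specific; the commented `load_engine` usage at the end is a hypothetical illustration of where you would apply it.

```python
import os
import sys
from contextlib import contextmanager

@contextmanager
def suppress_output():
    """Temporarily redirect sys.stdout and sys.stderr to /dev/null.

    Note: this only silences Python-level writes. Output produced by
    C/C++ code writing straight to file descriptors 1 and 2 would need
    an os.dup2-based redirection instead.
    """
    devnull = open(os.devnull, "w")
    old_out, old_err = sys.stdout, sys.stderr
    sys.stdout, sys.stderr = devnull, devnull
    try:
        yield
    finally:
        # Always restore the original streams, even if the body raises.
        sys.stdout, sys.stderr = old_out, old_err
        devnull.close()

# Hypothetical usage: wrap the noisy call.
# with suppress_output():
#     engine = trt.utils.load_engine(G_LOGGER, "engine.bin")
```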

Hello,

Looks like you have TensorFlow verbosity set >= 2.

Try

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'

which should give minimal verbosity ('0' is the most verbose setting for TF_CPP_MIN_LOG_LEVEL; '3' suppresses INFO, WARNING, and ERROR output from TensorFlow's C++ backend).

@NVES thanks for your reply, but it does not work. How can I fix it?

Hello,

To help us debug, can you provide a repro package containing the source that writes the serialized engine and deserializes it, and that demonstrates the symptoms you are seeing?

Also, can you provide details on the platforms you are using?

Linux distro and version
GPU type
nvidia driver version
CUDA version
CUDNN version
Python version [if using python]
Tensorflow version
TensorRT version

@NVES thanks for your prompt reply.

my code is https://github.com/IvyGongoogle/tensorrt-infer

Linux distro and version:

LSB Version:	:core-4.1-amd64:core-4.1-noarch
Distributor ID:	CentOS
Description:	CentOS Linux release 7.2.1511 (Core)
Release:	7.2.1511
Codename:	Core

GPU type: Tesla P40
nvidia driver version: NVIDIA-SMI 384.66
CUDA version: 9.0
CUDNN version: 7.2.1
Python version [if using python]: python2.7
Tensorflow version: 1.4.1
TensorRT version: 5.0.0.10

@NVES can you give some advice?

Hello,

I’m getting

root@fe42e86bbeb2:/mnt/tensorrt-infer# python serializeAndDeserializeEngine.py
/usr/lib/python2.7/dist-packages/tensorrt/legacy/infer/__init__.py:5: DeprecationWarning: The infer submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The infer submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/parsers/__init__.py:4: DeprecationWarning: The parsers submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The parsers submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/utils/__init__.py:59: DeprecationWarning: The utils submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The utils submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python2.7/dist-packages/tensorrt/legacy/lite/__init__.py:59: DeprecationWarning: The lite submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The lite submodule will been removed in a future version of the TensorRT Python API")
[TensorRT] ERROR: UFFParser: Invalid UFF file, cannot be opened
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
  File "serializeAndDeserializeEngine.py", line 62, in <module>
    main()
  File "serializeAndDeserializeEngine.py", line 57, in main
    trt.legacy.utils.write_engine_to_file("./engines/1376_800.engine", engine.serialize())
AttributeError: 'NoneType' object has no attribute 'serialize'

Maybe you uploaded a bad UFF file? Anyway, I think the trick is to set the verbosity BEFORE any other imports; that way, when you import TF and TRT, they pick up the appropriate level.

So do this FIRST

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'

and don't set these variables again after the other imports.
Play around with the level to get your desired setting ('0' is the most verbose, '3' the quietest).
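The order matters because native libraries typically read these environment variables once, when they are first loaded. Here is a pure-Python sketch of the same effect, using a hypothetical stand-in module (`noisy_lib` is illustrative, not a real library) that caches the variable at import time, the way TensorFlow's C++ backend reads TF_CPP_MIN_LOG_LEVEL on load:

```python
import os
import types

# Hypothetical stand-in for a library that reads an env var at import time.
_noisy_lib_src = """
import os
# Cached once, at import time; later changes to os.environ are ignored.
LOG_LEVEL = os.environ.get('TF_CPP_MIN_LOG_LEVEL', '0')
"""

def import_noisy_lib():
    # Simulate a fresh import of the module.
    mod = types.ModuleType("noisy_lib")
    exec(_noisy_lib_src, mod.__dict__)
    return mod

# Wrong order: import first, set the variable afterwards -> no effect.
os.environ.pop('TF_CPP_MIN_LOG_LEVEL', None)
lib = import_noisy_lib()
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
print(lib.LOG_LEVEL)   # still '0', the default it saw at import time

# Right order: the variable was set BEFORE this import.
lib = import_noisy_lib()
print(lib.LOG_LEVEL)   # '3'
```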

@NVES, I tried setting the verbosity as you said, BEFORE any other imports, but it still does not work.

Can you verify that your repro runs without error as described in #7? Then I can make progress.

@NVES sorry, please see the latest repro https://github.com/IvyGongoogle/tensorrt-infer

Ubuntu 16.04 LTS
GPU type:1050Ti
nvidia driver version:390.87
CUDA version:9.0
CUDNN version:7.13
Python version:3.5
TensorRT version: 5.0

I am also hitting the same problem:

TypeError: deserialize_cuda_engine(): incompatible function arguments.

When I use write_engine_to_file to create the engine file, no error occurs.
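For what it's worth, that "incompatible function arguments" TypeError is commonly caused by handing the deserializer the wrong type, e.g. a file path or a str read in text mode instead of the raw engine bytes. A minimal sketch of reading the serialized engine back as bytes; the commented TensorRT usage at the end is a hypothetical illustration, not verified against this thread's TensorRT 5 setup:

```python
def read_engine_bytes(path):
    """Read a serialized engine file in binary mode so we get bytes, not text."""
    with open(path, "rb") as f:
        return f.read()

# Hypothetical usage with the TensorRT runtime (needs a GPU and tensorrt installed):
# import tensorrt as trt
# logger = trt.Logger(trt.Logger.ERROR)
# runtime = trt.Runtime(logger)
# engine = runtime.deserialize_cuda_engine(read_engine_bytes("1376_800.engine"))
```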

Hi AndrewGong,

replying to #10, I’m still getting the following error.

root@ed63b37475ff:/mnt/tensorrt-infer# python serializeAndDeserializeEngine.py
/usr/lib/python3.5/dist-packages/tensorrt/legacy/infer/__init__.py:5: DeprecationWarning: The infer submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The infer submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/parsers/__init__.py:4: DeprecationWarning: The parsers submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The parsers submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/utils/__init__.py:59: DeprecationWarning: The utils submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The utils submodule will been removed in a future version of the TensorRT Python API")
/usr/lib/python3.5/dist-packages/tensorrt/legacy/lite/__init__.py:59: DeprecationWarning: The lite submodule will been removed in a future version of the TensorRT Python API
You can suppress these warnings by setting `tensorrt.legacy._deprecated_helpers.SUPPRESS_DEPRECATION_WARNINGS=True` after importing, or setting the `TRT_SUPPRESS_DEPRECATION_WARNINGS` environment variable to 1
  warn_deprecated("The lite submodule will been removed in a future version of the TensorRT Python API")
[TensorRT] ERROR: UFFParser: Invalid UFF file, cannot be opened
[TensorRT] ERROR: Network must have at least one output

Aside from the UFF problem, can you try the following?

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'
import sys, cv2
import common

instead of

import sys, os, cv2
import common

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '0'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'

@NVES

Ubuntu 16.04 LTS
GPU type:1050Ti
nvidia driver version:390.87
CUDA version:9.0
CUDNN version:7.13
Python version:3.5
TensorRT version: 5.0

I changed some lines in the main function of sample.py in the official Python sample 'network_api_pytorch_mnist' to learn about engine serialization, while model.py stays unchanged.
All three tests in the main function failed.
sample.py.zip (1.58 KB)

@NVES thanks for your reply. I followed your suggestions and used this code:

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
os.environ['TF_CPP_MIN_VLOG_LEVEL'] = '0'
import sys, cv2
import common

but it still does not work. This is driving me mad.

@NVES when I use `with build_engine_uff(ModelData.MODEL_UFF_FILE) as engine:`, the problem mentioned above no longer appears.
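For readers landing here later: the `with ... as engine:` form works because the builder function is a context manager that scopes the engine's lifetime and releases it on exit, even on error. A pure-Python sketch of that pattern, with a dummy resource standing in for the TensorRT engine (all names here are illustrative, not the TensorRT API):

```python
from contextlib import contextmanager

class DummyEngine:
    """Stand-in for a TensorRT engine; tracks whether it was released."""
    def __init__(self):
        self.released = False
    def serialize(self):
        return b"serialized-engine"
    def destroy(self):
        self.released = True

@contextmanager
def build_engine():
    # In the real sample this would parse the UFF model and build the engine.
    engine = DummyEngine()
    try:
        yield engine
    finally:
        # Guaranteed cleanup, even if the body raises.
        engine.destroy()

with build_engine() as engine:
    data = engine.serialize()

print(len(data))  # the engine was usable inside the block
```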