Description
runtime.deserialize_cuda_engine(serialized_engine) returns None.
I have searched the forum, but there is no suitable solution.
Environment
TensorRT Version :
GPU Type : GTX1660
Nvidia Driver Version : 465.89
CUDA Version : 11.3
CUDNN Version : 8.4.0
Operating System + Version : win10
Python Version (if applicable) : 3.8
TensorFlow Version (if applicable) :
PyTorch Version (if applicable) : 1.11.0+cu113
Baremetal or Container (if container which image + tag) :
Code below:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_file_path):
    explicit_batch_flag = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(logger) as builder, \
         builder.create_network(explicit_batch_flag) as network, \
         builder.create_builder_config() as config:
        parser = trt.OnnxParser(network, logger)
        success = parser.parse_from_file(onnx_file_path)
        for idx in range(parser.num_errors):
            print(parser.get_error(idx))
        if not success:
            pass
        with open(onnx_file_path, 'rb') as model:
            print('Beginning ONNX file parsing')
            parser.parse(model.read())
        print("Complete parsing of ONNX file")
        # config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 20)
        config.max_workspace_size = 1 << 30
        builder.max_batch_size = 1
        if builder.platform_has_fast_fp16:
            config.set_flag(trt.BuilderFlag.FP16)
        print('Building an engine...')
        # last_layer = network.get_layer(network.num_layers - 1)
        # network.mark_output(last_layer.get_output(0))
        engine = builder.build_serialized_network(network, config)
        print("Completed create Engine")
        return engine

engine = build_engine("7_class_cuda.onnx")
with open("sample.engine", "wb") as f:
    f.write(engine)

with open("sample.engine", "rb") as f:
    serialized_engine = f.read()

runtime = trt.Runtime(logger)
engine_ = runtime.deserialize_cuda_engine(serialized_engine)
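One thing worth guarding in the script above: build_serialized_network returns None when the build fails, so the subsequent f.write(engine) raises a TypeError instead of reporting the real problem. A minimal sketch of such a guard (write_engine is a hypothetical helper, and the bytes below stand in for a real serialized engine):

```python
import os
import tempfile

def write_engine(serialized, path):
    """Persist a serialized engine, refusing to write a failed build."""
    if serialized is None:
        raise RuntimeError("build_serialized_network failed; check the builder log")
    with open(path, "wb") as f:
        f.write(serialized)
    return os.path.getsize(path)

# Stand-in bytes instead of a real IHostMemory buffer:
path = os.path.join(tempfile.mkdtemp(), "sample.engine")
print(write_engine(b"engine-bytes", path))  # size of the written file
```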
NVES
June 16, 2022, 10:37am
#2
Hi,
Request you to share the ONNX model and the script if not shared already so that we can assist you better.
Alongside, you can try a few things:
1) Validate your model with the below snippet:

check_model.py

import onnx

filename = "yourONNXmodel"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)

2) Try running your model with the trtexec command.
In case you are still facing the issue, request you to share the trtexec "--verbose" log for further debugging.
Thanks!
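The trtexec step above can be scripted; a minimal sketch (run_trtexec is a hypothetical helper and the paths are placeholders) that builds the suggested command line and only runs it when the trtexec binary is actually on PATH:

```python
import shutil
import subprocess

def run_trtexec(onnx_path, engine_path):
    """Build the trtexec command with a verbose log, as suggested above."""
    cmd = [
        "trtexec",
        f"--onnx={onnx_path}",
        f"--saveEngine={engine_path}",
        "--verbose",
    ]
    if shutil.which("trtexec") is None:
        # trtexec ships with TensorRT; bail out gracefully if it is missing.
        return cmd, None
    result = subprocess.run(cmd, capture_output=True, text=True)
    return cmd, result.returncode

cmd, rc = run_trtexec("7_class_cuda.onnx", "sample.engine")
print(" ".join(cmd))
```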
Hi,
It seems my ONNX model is too large to upload, as it is 119 MB. Do you have any idea how I can upload my ONNX model?
However, it works fine if I don't load the file from "sample.engine" (you can find this part in my code) and instead directly use the engine returned by my build_engine function.
Hi,
Which version of TensorRT are you using? Please use the latest TensorRT version.
Also, please refer to the following sample and make sure your script is correct.
#
# SPDX-FileCopyrightText: Copyright (c) 1993-2022 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0
#
#
import os
# This sample uses an ONNX ResNet50 Model to create a TensorRT Inference Engine
This file has been truncated.
Thank you.
My TensorRT version is 8.4.1.5. Actually, my problem can be simplified to this: I can't read the engine back from a binary data file.
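A quick way to narrow this down is to check whether the bytes survive the file round trip at all; if they do, the failure is inside deserialize_cuda_engine rather than in the file I/O. A minimal sketch (save_and_reload is a hypothetical helper; the bytes here are a stand-in for the real serialized engine):

```python
import os
import tempfile

def save_and_reload(serialized_engine: bytes, path: str) -> bytes:
    # Write the serialized engine exactly as the original script does ...
    with open(path, "wb") as f:
        f.write(serialized_engine)
    # ... then read it back and confirm the bytes are unchanged.
    with open(path, "rb") as f:
        data = f.read()
    assert len(data) == os.path.getsize(path)
    return data

blob = b"\x00fake-engine-bytes"  # stand-in for build_serialized_network output
path = os.path.join(tempfile.mkdtemp(), "sample.engine")
assert save_and_reload(blob, path) == blob
```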
Hi,
Have you tried the given sample code?
Sorry for replying to you so late. Yes, I have tried that code, but unfortunately, when I retried it, it didn't work: same code and same process, yet a different result. @spolisetty
Could you please share with us the latest script along with the model and error logs for better debugging?
code:
onnx2trt.ipynb (9.8 KB)
Model:
https://drive.google.com/file/d/1K-pgNR67id0j8vSQOIiRdFcRRREoLRrM/view?usp=sharing
Error log:
[IPKernelApp] WARNING | No such comm: 1f3bd409-eddd-4eb6-8f97-c8341ad22ae8
[I 09:24:02.003 NotebookApp] Saving file at /onnx2trt.ipynb
[07/05/2022-09:24:16] [TRT] [E] 1: [pluginV2Runner.cpp::nvinfer1::rt::load::293] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[07/05/2022-09:24:16] [TRT] [E] 4: [runtime.cpp::nvinfer1::Runtime::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
@spolisetty
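The first error in the log says the engine references a plugin whose IPluginCreator is not present in the plugin registry at deserialization time. A common remedy is to register the standard TensorRT plugins before calling deserialize_cuda_engine; a minimal sketch (load_engine is a hypothetical helper, assuming the TensorRT 8.x Python API, and skipped gracefully if tensorrt is not installed):

```python
def load_engine(engine_path):
    """Deserialize a TensorRT engine after registering the built-in plugins."""
    try:
        import tensorrt as trt
    except ImportError:
        return None  # TensorRT not available in this environment

    logger = trt.Logger(trt.Logger.WARNING)
    # Register the standard TensorRT plugins so their IPluginCreators are
    # in the registry when the engine is deserialized.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f:
        blob = f.read()
    runtime = trt.Runtime(logger)
    engine = runtime.deserialize_cuda_engine(blob)
    if engine is None:
        raise RuntimeError("deserialize_cuda_engine returned None; check the TRT log")
    return engine
```

If the engine was built with a custom plugin rather than a built-in one, that plugin's library must also be loaded (and its creator registered) in the deserializing process, with the same TensorRT version used for building.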