TensorRT 8.2.0.6 Python parser reports an error - Failed to add input to the network

Description

I’m using both the TensorRT 8.2.0.6 Python and C++ APIs to run inference on my ONNX model.
When I invoke the ONNX parse step, the C++ version completes successfully with no error reported, while the Python version raises an error:
In node -1 (importInput): UNSUPPORTED_NODE: Assertion failed: (tensor = ctx->network()->addInput(input.name().c_str(), trtDtype, trt_dims)) && "Failed to add input to the network."

Environment

TensorRT Version: 8.2.0.6
GPU Type: GeForce GTX 1080
Nvidia Driver Version: 460.32.03
CUDA Version: 11.2
CUDNN Version: 8.2.1
Operating System + Version: Linux Ubuntu 18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): NA
PyTorch Version (if applicable): NA
Baremetal or Container (if container which image + tag): Baremetal

Relevant Files

Onnx model

Steps To Reproduce

        
        self.logger = trt.Logger(trt.Logger.INTERNAL_ERROR)

        self.builder = trt.Builder(self.logger)
        self.builder.max_batch_size = maxBatchSize

        self.config = self.builder.create_builder_config()

        # The ONNX parser requires an explicit-batch network
        self.EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
        self.network = self.builder.create_network(self.EXPLICIT_BATCH)
        self.parser = trt.OnnxParser(self.network, self.logger)
        self.runtime = trt.Runtime(self.logger)
        self.model = model
        self.isParsed = True
        try:
            # This is the call that raises the error in the Python API
            parseResult = self.parser.parse_from_file(self.modelPath)
        except BaseException as e:
            self.isParsed = False
            print(ca.bcolors.FAIL + 'TRT parse model ERROR - ', e)

        if self.isParsed:
            print(ca.bcolors.HEADER + 'Completed Onnx file parsing')
            print(ca.bcolors.HEADER + "TensorRT model parse report:")
            if not parseResult:
                for error in range(self.parser.num_errors):
                    print(ca.bcolors.FAIL + str(self.parser.get_error(error)))
            else:
                print(ca.bcolors.OKGREEN + "Model parsing OK!")

        

Hi,

Please refer to the Python samples here and make sure your script is correct.
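For reference, the ONNX parsing flow in the samples looks roughly like this (a minimal sketch, not your exact script; model_path is a placeholder for your ONNX file):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_network(model_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    # The samples read the file and pass the bytes to parse()
    with open(model_path, 'rb') as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None
    return network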

If you still face this issue, please share a complete repro script with a sample input so we can try it on our end, along with verbose error logs, for better debugging.

Thank you.

Hello,
Sorry for the late response.
After checking the onnx_resnet50.py sample, I found that it doesn’t use the same parse API that I described above.
The sample uses the parse API, which takes the model as an argument after it has been read as binary.
When I use the same API (parse) in my code, everything works fine and I get no error.
But if I go back to the parse_from_file API, I always get the error described above.
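To be concrete, these are the two calls I’m comparing (a rough sketch, assuming the same builder/network/parser setup as in my code above; model_path stands for my model file):

# Variant 1: works for me - read the file and pass the bytes to parse()
with open(model_path, 'rb') as f:
    ok = parser.parse(f.read())

# Variant 2: always fails for me with the importInput error
ok = parser.parse_from_file(model_path)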

I also made the following change in onnx_resnet50.py, in the function build_engine_onnx.
I added these lines:

result = parser.parse_from_file(model_file)
for error in range(parser.num_errors):
    print(parser.get_error(error))

No errors were raised… but then, when I continued and ran the function’s original lines of code:

with open(model_file, 'rb') as model:
    if not parser.parse(model.read()):
        print('ERROR: Failed to parse the ONNX file.')
        for error in range(parser.num_errors):
            print(parser.get_error(error))
        return None

Now I got the same error:

In node -1 (importInput): UNSUPPORTED_NODE: Assertion failed: (tensor = ctx->network()->addInput(input.name().c_str(), trtDtype, trt_dims)) && "Failed to add input to the network."

Can you please explain why these two APIs behave differently on the same model?
Can you please explain why these two APIs affect each other?
Thanks,