Conversion to TensorRT error: [graphShapeAnalyzer.cpp::throwIfError::1306] Error Code 9

@NVES
Thanks for the answer.

Later, I found the following thread, which mentions a similar error.

I now understand that Reshape with a wildcard (-1) dimension is not handled well by TensorRT.
I then used the following script to replace the wildcard with an explicit shape in the Reshape node.
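For reference, the explicit dimension that replaces the -1 wildcard can be computed from the known input volume. A minimal sketch (the shapes are from my model; the helper name is my own):

```python
import numpy as np

def resolve_wildcard(input_shape, target_shape):
    """Replace a single -1 in target_shape with the dimension implied by input_shape."""
    volume = int(np.prod(input_shape))                       # total element count
    known = int(np.prod([d for d in target_shape if d != -1]))
    return [volume // known if d == -1 else d for d in target_shape]

# My Reshape: input is 1x2034x4, target shape was (-1, 4)
print(resolve_wildcard([1, 2034, 4], [-1, 4]))  # -> [2034, 4]
```

This is how I arrived at the fixed [2034, 4] shape used in the script below.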

Graph edit code:

import graphsurgeon as gs
import numpy as np
import tensorflow as tf
.
.
.
graph = gs.DynamicGraph(input_path)

# Replace the shape input of Postprocessor/Reshape with an explicit Const
old_node = graph.find_nodes_by_path("Postprocessor/Reshape/shape")
new_node = gs.create_plugin_node(name='Postprocessor/Reshape/shape', op='Const', dtype=tf.int32, value=np.array([2034, 4], dtype=np.int32))
new_node.attr['value'].tensor.dtype = 3  # 3 = DT_INT32
graph.collapse_namespaces({"Postprocessor/Reshape/shape": new_node})

graph.write(output_path)

I confirmed the change in Netron; the screenshots are below.

Original Graph

Edited Graph

I also confirmed that inference with the edited model still produces correct results.

Then I tried the conversion again and got a different error:


 [E] Error[9]: [graphShapeAnalyzer.cpp::throwIfError::1306] Error Code 9: Internal Error (Postprocessor/Reshape: reshape changes volume)

The error message has changed, but it occurs on the same node (Postprocessor/Reshape).
What does this error mean?

As far as I can tell, the Reshape node does not change the total size: 1x2034x4 (input) = 8136 elements = 2034x4 (reshaped).
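A quick sanity check with NumPy, using the same shapes as my model, confirms the volumes match:

```python
import numpy as np

a = np.zeros((1, 2034, 4), dtype=np.float32)  # input tensor shape
b = a.reshape(2034, 4)                        # target shape from the edited Const node

# Both hold 1*2034*4 == 2034*4 == 8136 elements, so the reshape is volume-preserving
assert a.size == b.size == 8136
```

So the reshape itself should be legal; the question is why TensorRT's shape analyzer disagrees.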

Any advice would be appreciated.