Export Model to ONNX

I accidentally posted this question in Containers: TensorRT, when I meant to post it here:

Is it possible to export a model that was created using the INetworkDefinition functions to an ONNX file so we can visualize the network?

I’ve seen the examples for importing from the ONNX format, but not the other way around.


I don’t think it’s currently possible to export an optimized TRT model back to ONNX.


It doesn’t even have to be an ONNX file. Maybe just a graph that shows the architecture.

You could at least print the layer and tensor names, which lets you read off how the layers are connected:

for (int layer_idx = 0; layer_idx < networkDefinition.getNbLayers(); ++layer_idx)
{
   const nvinfer1::ILayer* layer = networkDefinition.getLayer(layer_idx);
   std::cout << "Layer: " << layer->getName() << "\n";

   // print the names of the input tensors (getInput can return nullptr for optional inputs)
   for (int input_idx = 0; input_idx < layer->getNbInputs(); ++input_idx)
      if (const nvinfer1::ITensor* in = layer->getInput(input_idx))
         std::cout << "  input:  " << in->getName() << "\n";

   // do the same for the output tensors
   for (int output_idx = 0; output_idx < layer->getNbOutputs(); ++output_idx)
      std::cout << "  output: " << layer->getOutput(output_idx)->getName() << "\n";
}

You could parse this output and create a graph yourself.