Expose additional ONNX protobuf model properties via nvonnxparser::IParser

When parsing an ONNX model to build a TRT network, it would be useful to expose the additional parameters from the ONNX protobuf. Currently only the graph is exposed.

  auto onnx_parser = std::unique_ptr<nvonnxparser::IParser>(nvonnxparser::createParser(*network, getLogger()));
  onnx_parser->parseFromFile(onnx_path.c_str(), static_cast<int>(nvinfer1::ILogger::Severity::kINFO));

These additional properties include:

  • doc_string
  • domain
  • ir_version
  • metadata_props
  • model_version
  • opset_import
  • producer_name
  • producer_version

The metadata_props are particularly important to me, as the metadata contains details on how to execute/interpret the model, for example the class_names and preprocessing steps.

In Python, reading these parameters from the ONNX model is straightforward, but via C++ it’s much more involved, requiring compiling/including the ONNX repo and working around protobuf size limits.

As the ONNX model is already being parsed by TRT, it would be straightforward to expose these properties somewhere on this interface.
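To make the request concrete, the exposed surface could look something like the sketch below. This is purely illustrative: none of these methods exist on the real nvonnxparser::IParser, and the toy implementation only stands in for data the parser would copy out of the ModelProto while parsing.

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Hypothetical accessor interface -- method names are illustrative only,
// not part of the real TensorRT / onnx-tensorrt API.
struct IModelMetadata {
    virtual ~IModelMetadata() = default;
    virtual std::int64_t irVersion() const = 0;
    virtual const char* producerName() const = 0;
    virtual std::int64_t nbMetadataProps() const = 0;
    virtual const char* metadataKey(std::int64_t i) const = 0;
    virtual const char* metadataValue(std::int64_t i) const = 0;
};

// Toy in-memory implementation, standing in for values the parser would
// populate from the ONNX ModelProto (metadata_props is a repeated
// key/value message in the ONNX schema).
class ToyMetadata : public IModelMetadata {
    std::vector<std::pair<std::string, std::string>> props_{
        {"class_names", "cat,dog"}};

public:
    std::int64_t irVersion() const override { return 8; }
    const char* producerName() const override { return "demo-exporter"; }
    std::int64_t nbMetadataProps() const override {
        return static_cast<std::int64_t>(props_.size());
    }
    const char* metadataKey(std::int64_t i) const override {
        return props_[static_cast<std::size_t>(i)].first.c_str();
    }
    const char* metadataValue(std::int64_t i) const override {
        return props_[static_cast<std::size_t>(i)].second.c_str();
    }
};
```

A caller could then iterate nbMetadataProps() after parseFromFile() and look up keys like class_names without ever touching the ONNX protobuf directly.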

I would create a PR for this myself, but I believe the code is closed source. Let me know if it’s not.



Please reach out on Issues · NVIDIA/TensorRT · GitHub to create a feature request.

Thank you.

FYI, I raised an issue here: [Feature Request] Expose additional ONNX protobuf model properties via nvonnxparser::IParser · Issue #894 · onnx/onnx-tensorrt · GitHub

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.