TensorRT ONNX parser

Hi,

I am trying to import the Faster R-CNN model from the ONNX model zoo into TensorRT.

https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/faster-rcnn

My system specifications are:

  • Windows 10
  • CUDA 10.1
  • TensorRT 6.0.1.5
  • ONNX model version: 1.5 Opset version: 10

I ran into several issues while parsing the model that I would like to discuss.
First, my sample code:

// Create the inference builder
IBuilder* builder = createInferBuilder(gLogger);

// Create the network definition
INetworkDefinition* network = builder->createNetwork();

// Create the ONNX parser attached to the network
nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger);

// Verbosity is a logging severity level for the parser
int verbosity = static_cast<int>(nvinfer1::ILogger::Severity::kWARNING);

// parseFromFile reads and parses the model file in one call;
// (IParser::parse expects an in-memory buffer, not a file name)
if (!parser->parseFromFile(DEPLOY_FILE, verbosity))
{
    std::cout << "Failed to parse onnx file." << std::endl;
    return nullptr;
}

With this code I receive the following error:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
ERROR: Parameter check failed at: Network.cpp::nvinfer1::Network::addInput::671, condition: isValidDims(dims, hasImplicitBatchDimension())
ERROR: ModelImporter.cpp:80 In function importInput:
[8] Assertion failed: *tensor = importer_ctx->network()->addInput( input.name().c_str(), trt_dtype, trt_dims)
Inference returned: 1

Continuing this thread (TensorRT onnx parser): when reading the documentation for TensorRT 6 and TensorRT 7, I feel like they are mixed up, specifically in section 2.2.5, Importing An ONNX Model Using The C++ Parser API.
Is there a mix-up between the functions?

nvonnxparser::IONNXParser* parser = nvonnxparser::createONNXParser(network, gLogger); // TensorRT 6 docs
nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger); // TensorRT 7 docs
https://docs.nvidia.com/deeplearning/sdk/pdf/TensorRT-Developer-Guide.pdf
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-601/pdf/TensorRT-Developer-Guide.pdf

What are my options? Should I upgrade to TensorRT 7? I see a lot of comments about compatibility errors with ONNX. Any suggestion is more than welcome, but I believe the documentation should be more thorough about version compatibility.

Thank you for your time

I would recommend trying the latest TRT version.
Also, please use ONNX opset 11 to generate your model.

Thanks



nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger); is correct. I believe the former nvonnxparser::createONNXParser you found dates from TensorRT <= 5.0; it was outdated in the TRT 6 docs and fixed in the TRT 7 docs.
