ONNX to TensorRT: error in documentation?

Dear Sir/Madam,

Could you check that the example in the TensorRT Developer Guide,
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#python_topics
section 3.2.5 "Importing From ONNX Using Python",
step 2 "Create the build, network, and parser:",
is correct?

I got it working with the following (well, at least it no longer crashes immediately as before):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
    with open(model_path, 'rb') as model:
        parser.parse(model.read())  # returns False on parse failure; errors go to the logger

It would also be nice to have documentation on how to do prediction with TensorRT once one has done this preparation…

Best regards, Markus

Hi Markus,

Please see this ONNX ResNet50 example that ships with the TensorRT release for a reference on both parsing and running inference:

/usr/src/tensorrt/samples/python/introductory_parser_samples/onnx_resnet50.py
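For future readers, here is a rough sketch of the inference step along the lines of that sample, using the implicit-batch TensorRT Python API of that era. This is not verbatim from the sample: the use of pycuda, the binding indices (0 = input, 1 = output), and the `softmax` post-processing are assumptions for illustration.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the network's output logits
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def infer(engine, input_data):
    # Heavy imports kept local so the module loads without a GPU present
    import pycuda.autoinit  # noqa: F401 -- creates a CUDA context
    import pycuda.driver as cuda
    import tensorrt as trt

    with engine.create_execution_context() as context:
        # Host-side buffers; binding 0 assumed input, binding 1 assumed output
        h_input = np.ascontiguousarray(input_data, dtype=np.float32)
        h_output = np.empty(trt.volume(engine.get_binding_shape(1)),
                            dtype=np.float32)
        # Device-side buffers
        d_input = cuda.mem_alloc(h_input.nbytes)
        d_output = cuda.mem_alloc(h_output.nbytes)
        # Copy input to the GPU, run the engine, copy the result back
        cuda.memcpy_htod(d_input, h_input)
        context.execute(batch_size=1, bindings=[int(d_input), int(d_output)])
        cuda.memcpy_dtoh(h_output, d_output)
    return softmax(h_output)
```

The sample itself also shows the asynchronous variant (streams plus `execute_async`), which is worth preferring for throughput; the synchronous calls above are just the simplest thing that works.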