TensorRT 5 ONNX parser API issue

I found the TensorRT 5 documentation (the downloaded PDF SWE-SWDOCTRT-001-DEVG_v5.0.2) inconsistent with the actual API.

The MNIST sample in section 9.1.4.1 differs from the actual sample code in sampleOnnxMNIST.cpp.

On pp. 68-69, the doc shows this sample:

nvonnxparser::IONNXParser* parser = nvonnxparser::createONNXParser(*config);

However, there is no createONNXParser in the header file NvOnnxParser.h, nor any API that accepts a config object (which createONNXParser should, according to the doc).
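
The actual sample code takes a different route from the PDF. A minimal sketch of what sampleOnnxMNIST.cpp does instead, assuming you already have an nvinfer1::ILogger named gLogger:

#include "NvInfer.h"
#include "NvOnnxParser.h"

nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);
nvinfer1::INetworkDefinition* network = builder->createNetwork();

// NvOnnxParser.h declares createParser(network, logger), not createONNXParser(config)
nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger);

int verbosity = static_cast<int>(nvinfer1::ILogger::Severity::kWARNING);
if (!parser->parseFromFile("mnist.onnx", verbosity))
{
    // on failure, inspect parser->getError(i) for i in [0, parser->getNbErrors())
}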

Previously, I could specify a datatype in ONNXParser::parse; I noticed this has changed. Could you also explain how to specify a datatype (INT8) in the ONNX parser now?
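
From the new headers, it looks to me like precision moved from the parser to the builder, applied after parsing. A sketch of what I tried, where myCalibrator is a placeholder for an nvinfer1::IInt8Calibrator implementation I would still have to write:

// Request INT8 on the builder, not the parser; INT8 additionally needs a calibrator
if (builder->platformHasFastInt8())
{
    builder->setInt8Mode(true);
    builder->setInt8Calibrator(&myCalibrator); // hypothetical calibrator object
}
nvinfer1::ICudaEngine* engine = builder->buildCudaEngine(*network);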

Same issue here.
With TensorRT 4.0, I can set FP16 mode and get roughly a 180% speedup.
But with TensorRT 5.0, I cannot find a parameter to set the parser datatype; the only option I see is builder->setFp16Mode(true), and with it I get no speedup.
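
For completeness, this is roughly what I am doing now (a sketch; builder is my nvinfer1::IBuilder):

// setFp16Mode(true) only selects FP16 kernels on GPUs with fast FP16 support;
// on other hardware the engine silently stays in FP32, which could explain no speedup
if (builder->platformHasFastFp16())
{
    builder->setFp16Mode(true);
}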

I had the same problem and solved it by updating all my code to match the new samples in JetPack 4.0, including the Make.config file.

Hi jongchan,

I had the same problem. Could you share how you changed the code?

Thanks.