Benefits of the TensorRT API

What are the benefits of the TensorRT API compared with using the ONNX parser or Caffe parser?
Are there any efficiency or performance benefits to using the TensorRT API?

I mean implementing the DNN with the TensorRT API versus converting the model with trtexec or the ONNX/Caffe parser, since implementing a DNN with the TensorRT API is much more work.

@jay_rodge @NVES @NVESJ @NVES_R @NVES_K @spolisetty

Hi,
Please check the below link, as it might answer your concerns.

Thanks!

I can't get any answers from the link; it just tells what the API is for.
I want to know whether it's better to implement the network with the API or to just use trtexec/the ONNX parser/the Caffe parser, and the reason why.

Hi @mrsunqichang,

The UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so we request you to try the ONNX parser.
Please check the below link for the same.

Alternatively, you can also use trtexec.
https://github.com/NVIDIA/TensorRT/blob/master/samples/opensource/trtexec/README.md
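For reference, here is a minimal sketch of the ONNX-parser route with the TensorRT Python API. This assumes TensorRT 7.x bindings; the file name `model.onnx` and the workspace size are placeholders, and newer releases replace `build_engine` with `build_serialized_network`.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Create a builder and an explicit-batch network, then attach the ONNX parser to it.
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

# "model.onnx" is a placeholder path for your exported model.
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

# Build the engine; a 1 GiB workspace is an arbitrary example value.
config = builder.create_builder_config()
config.max_workspace_size = 1 << 30
engine = builder.build_engine(network, config)
```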

Thank you.

Hi @mrsunqichang,

Both trtexec and the Python API help you build a TRT engine.
If you're using a custom plugin, you may prefer the Python API, whereas trtexec is a command-line tool for building a TRT engine; a rough sketch of the API route follows the link below.
For more information, please refer to:

https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
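By way of comparison, here is a rough sketch of building a toy network layer by layer with the network-definition API. Again this assumes TensorRT 7.x Python bindings; the input shape, layer sizes, and random weights are purely illustrative, and you would load your trained weights yourself.

```python
import numpy as np
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

# Declare the network input (the shape is an arbitrary example).
input_tensor = network.add_input(name="input", dtype=trt.float32, shape=(1, 3, 224, 224))

# Random weights stand in for trained values you would otherwise load from disk.
w = np.random.rand(16, 3, 3, 3).astype(np.float32)
b = np.zeros(16, dtype=np.float32)
conv = network.add_convolution_nd(input_tensor, num_output_maps=16,
                                  kernel_shape=(3, 3), kernel=trt.Weights(w),
                                  bias=trt.Weights(b))
conv.stride_nd = (1, 1)

relu = network.add_activation(conv.get_output(0), trt.ActivationType.RELU)
network.mark_output(relu.get_output(0))

# Build the engine exactly as in the parser-based flow.
config = builder.create_builder_config()
config.max_workspace_size = 1 << 30
engine = builder.build_engine(network, config)
```

This is the extra work the question refers to: every input, layer, and weight has to be declared by hand, whereas the ONNX parser or trtexec derives all of that from the model file.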

Thank you.