It may be a little weird to ask this question, but I am confused.
I used the ldd command to check libnvinfer.so.8, and it has no link to cuDNN, but libnvinfer_plugin.so.8 does. Inference seems to run fine linking only against libnvinfer.so.
I also checked libnvinfer.so.7 in TensorRT-7.0.0.11, and it does need cuDNN.
So:
What are the differences between libnvinfer.so.8 and libnvinfer_plugin.so.8?
Is it OK to run inference linking only libnvinfer.so.8, with no cuDNN provided?
Hi,
Can you try running your model with the trtexec command and share the --verbose log in case the issue persists?
You can refer to the link below for the full list of supported operators. If an operator is not supported, you need to create a custom plugin to support that operation.
Also, please share your model and script if you have not already, so that we can help you better.
Meanwhile, for some common errors and queries, please refer to the link below:
Those plugins are only required to implement certain special ONNX operations and are not necessary for every TRT engine.
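As a quick sanity check, you can compare the dynamic dependencies of the two libraries yourself. This is only a sketch: the library paths below are assumptions and depend on where TensorRT is installed on your system.

```shell
# Paths are examples; adjust to your TensorRT installation.
# On TensorRT 8, the core runtime typically shows no direct cuDNN
# dependency, while the plugin library does.
ldd /usr/lib/x86_64-linux-gnu/libnvinfer.so.8 | grep -i cudnn
ldd /usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so.8 | grep -i cudnn
```

If the first command prints nothing, libnvinfer.so.8 itself does not link against cuDNN on your system.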
It is possible to disable cuDNN tactics (e.g. using this flag of trtexec):
--tacticSources=tactics   Specify the tactics to be used by adding (+) or removing (-) tactics from the default
                          tactic sources (default = all available tactics).
                          Note: Currently only cuDNN, cuBLAS and cuBLAS-LT are listed as optional tactics.
                          Tactic Sources: tactics ::= [","tactic]
                                          tactic  ::= (+|-)lib
                                          lib     ::= "CUBLAS"|"CUBLAS_LT"|"CUDNN"
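For example, to build an engine with cuDNN removed from the default tactic sources (the ONNX filename here is just a placeholder for your own model):

```shell
# Remove cuDNN from the tactic sources while building the engine;
# "model.onnx" is a placeholder for your own network.
trtexec --onnx=model.onnx --tacticSources=-CUDNN
```

If the engine builds and runs with this flag, none of its layers actually require cuDNN tactics.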