Hi:
I tried to run BackgroundMattingV2's ONNX model on the TensorRT platform, but the function nvonnxparser::IParser::parse() returns failure, and TensorRT reports the errors below:
Hi,
Request you to share the ONNX model and the script if not shared already so that we can assist you better.
Meanwhile, you can try a few things:
1) Validate your model with the snippet below:
check_model.py
import sys
import onnx
filename = yourONNXmodel  # replace with the path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2) Try running your model with the trtexec command: https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
If you are still facing the issue, please share the trtexec "--verbose" log (e.g. trtexec --onnx=model.onnx --verbose) for further debugging.
Thanks!
Hi,
I exported their original .pth model to ONNX format with PyTorch v1.9.0 using several different configs, but regardless of which exported model I use, the parse() function always reports the errors below:
TensorRT_WARNING: onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
TensorRT_WARNING: onnx2trt_utils.cpp:246: One or more weights outside the range of INT32 was clamped
TensorRT_ERROR: INVALID_ARGUMENT: getPluginCreator could not find plugin ScatterND version 1
ERROR: builtin_op_importers.cpp:3773 In function importFallbackPluginImporter:
[8] Assertion failed: creator && “Plugin not found, are the plugin name, version, and namespace correct?”
Assertion failed: false, file G:\VC15\QuickBroadCast\test-app\testBackgroundMattingV2\testBackgroundMattingV2.cpp, line 115
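The "getPluginCreator could not find plugin ScatterND version 1" error means the exported graph contains a ScatterND op (often produced by in-place indexed assignment in PyTorch) for which this TensorRT version has neither a native layer nor a registered plugin. A quick way to confirm is to list the op types in the exported ONNX graph and check whether ScatterND appears. Below is a minimal sketch; the flagged-op set is an assumption for illustration (ScatterND is simply the op the parser complained about), and the commented onnx.load usage shows where the op-type list would normally come from:

```python
from collections import Counter

def count_op_types(op_types):
    """Count occurrences of each op type in an ONNX graph.

    `op_types` would normally come from a loaded model, e.g.:
        import onnx
        model = onnx.load("model.onnx")
        op_types = [node.op_type for node in model.graph.node]
    """
    return Counter(op_types)

def find_flagged(op_types, flagged=frozenset({"ScatterND"})):
    """Return which of the flagged ops actually appear in the graph.

    The default flagged set is an assumption: ScatterND is the op the
    parser rejected here; add any other ops your TensorRT version lacks.
    """
    return sorted(set(op_types) & set(flagged))
```

For example, find_flagged(["Conv", "Relu", "ScatterND"]) returns ["ScatterND"], confirming the problematic op is present; you could then try a different opset or rewrite the PyTorch code that produces it before re-exporting.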