When I try to do inference I get this error: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed. )
The Xavier is running:
JetPack 4.6.1
TensorRT 8.2.1.8
Hi @bruker58, can you try running this on a single image? I wonder if a CUDA error is actually occurring during the post-processing, which messes up TensorRT on the next frame.
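For the single-image test, a minimal sketch with the Python bindings could look like the following (the image path and model/label paths are placeholders; construct detectNet with whatever arguments you are already using for your model):

```python
import jetson.inference
import jetson.utils

# construct detectNet the same way you already do for your ONNX model
# (the model/label paths here are only placeholders)
net = jetson.inference.detectNet(argv=['--model=your_model.onnx',
                                       '--labels=your_labels.txt'])

# run inference on one still image instead of a camera/video stream
img = jetson.utils.loadImage('test.jpg')
detections = net.Detect(img)

print('detected {:d} objects'.format(len(detections)))
for detection in detections:
    print(detection)
```

If a single frame runs cleanly, that points toward something going wrong between frames (e.g. in post-processing) rather than in the engine itself.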
Anyhow, the ONNX pre/post-processing in jetson.inference.detectNet is configured for the models trained with train_ssd.py from the Hello AI World tutorial with PyTorch. It probably needs adjusting to support your different model (such as a different interpretation of the output tensors, and possibly different coefficients for the mean-pixel subtraction during pre-processing, etc.).
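For reference, the train_ssd.py exports are loaded with layer names like the ones below; a differently-exported ONNX model will likely have other input/output layer names and normalization, so treat this only as a sketch of the defaults detectNet expects:

```python
import jetson.inference

# defaults used for Hello AI World train_ssd.py ONNX exports:
# input tensor 'input_0', confidence tensor 'scores', bounding-box tensor 'boxes'.
# A model exported differently will likely need other layer names and
# different pre-processing than detectNet's ONNX path assumes.
net = jetson.inference.detectNet(argv=[
    '--model=ssd-mobilenet.onnx',
    '--labels=labels.txt',
    '--input-blob=input_0',
    '--output-cvg=scores',
    '--output-bbox=boxes'])
```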