Please provide complete information as applicable to your setup.
**• Hardware Platform (Jetson / GPU):** Orin 64GB developer kit
**• DeepStream Version:** 7.1
**• JetPack Version (valid for Jetson only):** 6.1+b123
**• TensorRT Version:** 10.3.0.30-1+cuda12.5
Following the instructions here, I ran the build command `./build_engine.sh` inside the folder `deepstream_parallel_inference_app/tritonserver` and got the following errors.
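For reference, these are the exact steps I ran (the parent path is just wherever I cloned the app):

```bash
# Run the engine build script from the tritonserver folder of the app
cd deepstream_parallel_inference_app/tritonserver
./build_engine.sh
```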
```
[01/07/2025-14:32:24] [E] Error[3]: IExecutionContext::executeV2: Error Code 3: API Usage Error (Parameter check failed, condition: nullPtrAllowed. Tensor "input_1:0" is bound to nullptr, which is allowed only for an empty input tensor, shape tensor, or an output tensor associated with an IOuputAllocator.)
[01/07/2025-14:32:24] [E] Error[2]: [calibrator.cpp::calibrateEngine::1236] Error Code 2: Internal Error (Assertion context->executeV2(bindings.data()) failed. )
[01/07/2025-14:32:24] [E] Engine could not be created from network
[01/07/2025-14:32:24] [E] Building engine failed
[01/07/2025-14:32:24] [E] Failed to create engine from model or file.
[01/07/2025-14:32:24] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v100300] # trtexec --onnx=./models/peoplenet/1/resnet34_peoplenet_int8.onnx --int8 --calib=./models/peoplenet/1/resnet34_peoplenet_int8.txt --saveEngine=./models/peoplenet/1/resnet34_peoplenet_int8.onnx_b8_gpu0_int8.engine --minShapes=input_1:0:1x3x544x960 --optShapes=input_1:0:8x3x544x960 --maxShapes=input_1:0:8x3x544x960
Building Model Secondary_CarMake...
./tao-converter: error while loading shared libraries: libnvparsers.so.8: cannot open shared object file: No such file or directory
Building Model Secondary_VehicleTypes...
./tao-converter: error while loading shared libraries: libnvparsers.so.8: cannot open shared object file: No such file or directory
```
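If I understand correctly, `libnvparsers.so.8` belongs to TensorRT 8.x and is not shipped with TensorRT 10.x, so the bundled `tao-converter` may simply not be usable on this setup. A quick check that the library is indeed missing (library path assumed for an aarch64 JetPack install):

```bash
# Look for the TensorRT 8.x parser library that tao-converter asks for
ls /usr/lib/aarch64-linux-gnu/libnvparsers.so*
# List any shared libraries tao-converter cannot resolve
ldd ./tao-converter | grep "not found"
```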
How can I solve these errors?