When I use the prebuilt nvinferserver library (libnvds_infer_server.so) shipped with DeepStream 6.2, everything works well in my setup, which uses the extraInputProcess feature. However, when I compile the library myself with the provided Makefile, it does not work: sometimes the pipeline crashes without any error message while initializing the model, and sometimes it throws the CUDA errors shown below. It does work, however, when the library is built with optimization level 0 (-O0).
I am wondering whether the wrong source files were shipped in the docker image, or whether the CUDA compilation has a driver or other dependency requirement I am missing. The failures are completely random.
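For context, the custom processing library we load implements the IInferCustomProcessor interface from nvdsinferserver/infer_custom_process.h. The sketch below is only illustrative (the class name, the BBOX handling, and the placeholder values are made up, not our actual code), but it shows the shape of what runs through extraInputProcess:

```cpp
// Minimal sketch of an IInferCustomProcessor with extraInputProcess.
// Illustrative only: class name, BBOX handling and placeholder values are
// invented; the interface comes from nvdsinferserver/infer_custom_process.h.
#include <cstring>
#include <string>
#include <vector>

#include "infer_custom_process.h"
#include "infer_datatypes.h"

using namespace nvdsinferserver;

class ExampleExtraInputProcessor : public IInferCustomProcessor {
public:
    // Request CPU-accessible extra-input buffers from nvinferserver.
    void supportInputMemType(InferMemType& type) override { type = InferMemType::kCpu; }

    // Fill each extra model input (e.g. the 4-element BBOX tensor from the
    // log) for every unit in the batch.
    NvDsInferStatus extraInputProcess(const std::vector<IBatchBuffer*>& primaryInputs,
        std::vector<IBatchBuffer*>& extraInputs, const IOptions* options) override
    {
        (void)primaryInputs;
        (void)options;
        for (IBatchBuffer* buf : extraInputs) {
            if (buf->getBufDesc().name != "BBOX")
                continue;
            for (uint32_t i = 0; i < buf->getBatchSize(); ++i) {
                const float bbox[4] = {0.f, 0.f, 1.f, 1.f};  // placeholder values
                std::memcpy(buf->getBufPtr(i), bbox, sizeof(bbox));
            }
        }
        return NVDSINFER_SUCCESS;
    }

    NvDsInferStatus inferenceDone(const IBatchArray* outputs,
        const IOptions* inOptions) override
    {
        (void)outputs;
        (void)inOptions;
        return NVDSINFER_SUCCESS;
    }

    void notifyError(NvDsInferStatus status) override { (void)status; }
};

// Factory symbol; the name must match the one given in the nvinferserver
// config (CreateInferServerCustomProcess is what the SDK samples use).
extern "C" IInferCustomProcessor* CreateInferServerCustomProcess(
    const char* config, uint32_t configLen)
{
    (void)config;
    (void)configLen;
    return new ExampleExtraInputProcessor;
}
```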
Error 1:
Cuda failure: status=2
ERROR: Error: Could not allocate surface buffer
ERROR: Failed to add surface buffer into pool
ERROR: Failed to creat nvbufsurface
ERROR: cudaMalloc failed, cuda err_no:2, err_str:cudaErrorMemoryAllocation
ERROR: create cuda tensor buf failed, dt:kFp32, dims:3x700x500, name:
ERROR: Failed to create cuda tensor buffers
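Error 1 fails while allocating NvBufSurface pool buffers. A standalone check along these lines (a sketch against the public nvbufsurface API; the 500x700 RGBA surface only approximates the 3x700x500 network input from the log) can help rule out a general surface-allocation problem in the container:

```cpp
// Standalone NvBufSurface allocation check (a sketch).
// Build with: g++ nvbufsurf_test.cpp -lnvbufsurface
#include <cstdio>
#include <cstring>

#include "nvbufsurface.h"

int main()
{
    NvBufSurfaceCreateParams params;
    std::memset(&params, 0, sizeof(params));
    params.gpuId = 0;
    params.width = 500;               // approximates the 3x700x500 input
    params.height = 700;
    params.colorFormat = NVBUF_COLOR_FORMAT_RGBA;
    params.layout = NVBUF_LAYOUT_PITCH;
    params.memType = NVBUF_MEM_CUDA_DEVICE;

    NvBufSurface* surf = nullptr;
    if (NvBufSurfaceCreate(&surf, 1, &params) != 0) {
        std::fprintf(stderr, "NvBufSurfaceCreate failed\n");
        return 1;
    }
    std::printf("allocated %ux%u surface OK\n", params.width, params.height);
    NvBufSurfaceDestroy(surf);
    return 0;
}
```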
Error 2:
ERROR: cudaMalloc failed, cuda err_no:2, err_str:cudaErrorMemoryAllocation
ERROR: create cuda tensor buf failed, dt:kFp32, dims:4, name:BBOX
ERROR: create tensor: BBOX failed for map pool: extra_input_gpu_tensors
ERROR: create extra_input_gpu_tensors pool failed.
0:00:07.361871048 2144239 0x3e79ad0 ERROR nvinferserver gstnvinferserver.cpp:407:gst_nvinfer_server_logger:<secondary_inference_inference> nvinferserver[UID 10]: Error in loadExtraProcessor() <infer_cuda_context.cpp:194> [UID = 10]: extra processor allocating inputs failed., nvinfer error:NVDSINFER_RESOURCE_ERROR
0:00:07.361901753 2144239 0x3e79ad0 ERROR nvinferserver gstnvinferserver.cpp:407:gst_nvinfer_server_logger:<secondary_inference_inference> nvinferserver[UID 10]: Error in fixateInferenceInfo() <infer_cuda_context.cpp:173> [UID = 10]: Load extra processing functions failed., nvinfer error:NVDSINFER_RESOURCE_ERROR
0:00:07.361913367 2144239 0x3e79ad0 ERROR nvinferserver gstnvinferserver.cpp:407:gst_nvinfer_server_logger:<secondary_inference_inference> nvinferserver[UID 10]: Error in initialize() <infer_base_context.cpp:84> [UID = 10]: Infer context faied to initialize inference information, nvinfer error:NVDSINFER_RESOURCE_ERROR
0:00:07.361922991 2144239 0x3e79ad0 WARN nvinferserver gstnvinferserver_impl.cpp:588:start:<secondary_inference_inference> error: Failed to initialize InferTrtIsContext
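Both errors report cudaErrorMemoryAllocation even for the 4-float BBOX tensor, so genuine GPU memory exhaustion seems unlikely. A raw allocation check like this sketch, repeating the sizes from the logs, can verify that:

```cpp
// Raw CUDA allocation check (a sketch) repeating the sizes from the logs:
// the 3x700x500 FP32 input (~4.2 MB) and the 4-float BBOX tensor (16 B).
// Build with: nvcc cuda_alloc_test.cu   (or g++ ... -lcudart)
#include <cstdio>
#include <cuda_runtime_api.h>

static void tryAlloc(const char* what, size_t bytes)
{
    void* p = nullptr;
    cudaError_t err = cudaMalloc(&p, bytes);
    std::printf("cudaMalloc(%s, %zu bytes): %s\n", what, bytes,
        cudaGetErrorString(err));
    if (err == cudaSuccess)
        cudaFree(p);
}

int main()
{
    size_t freeB = 0, totalB = 0;
    cudaError_t err = cudaMemGetInfo(&freeB, &totalB);
    if (err != cudaSuccess) {
        std::fprintf(stderr, "cudaMemGetInfo: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("GPU memory: %zu MiB free of %zu MiB\n", freeB >> 20, totalB >> 20);

    tryAlloc("3x700x500 fp32 input", 3UL * 700 * 500 * sizeof(float));
    tryAlloc("4 fp32 BBOX", 4 * sizeof(float));
    return 0;
}
```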
• Hardware Platform (Jetson / GPU)
RTX A4000
• DeepStream Version
6.2 (Using docker image)
• JetPack Version (valid for Jetson only)
N/A (dGPU setup)
• TensorRT Version
Triton Server 23.03
• NVIDIA GPU Driver Version (valid for GPU only)
Latest
• Issue Type (questions, new requirements, bugs)
Bug
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
Inside the DeepStream 6.2 docker image, rebuild libnvds_infer_server.so from the provided sources using the stock Makefile (default optimization level), replace the prebuilt library, and run a pipeline whose nvinferserver instance uses extraInputProcess. Model initialization then fails randomly with the errors above; rebuilding with -O0 avoids the problem.
• Requirement details (This is for new requirements. Include the module name - for which plugin or for which sample application - and the function description.)
N/A (this is a bug report)