Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 5.0
• TensorRT Version: 7.1
```cpp
/* TLT model. Use NvDsInferCudaEngineGetFromTltModel function
 * provided by nvdsinferutils. */
cudaEngineGetFcn = NvDsInferCudaEngineGetFromTltModel;
modelPath = safeStr(initParams.tltEncodedModelFilePath);
dsInferError("modelPath===》");
dsInferError(modelPath.c_str());
```
I printed the absolute model path and verified that everything is correct, but this error is still raised. I suspect that NvDsInferCudaEngineGetFromTltModel is not exported by nvdsinferutils.
What could be the reasons for this?
PS: When I installed DeepStream on another PC and built the same source code, this error no longer appeared.