How to read the input tensor in a C++ BLS backend when getting memory type 2 (GPU)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) RTX 4000
• DeepStream Version DS 6.2
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.5.0
• NVIDIA GPU Driver Version (valid for GPU only) 525.x.x.x
• Issue Type( questions, new requirements, bugs) question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hello,

I have built a C++ BLS backend, and BLS works fine when I use a Python application with a gRPC call for inference, but when I use a DeepStream app to do the same, it crashes.

I see that the input tensor memory type differs between the two scenarios: with the Python app I get memory type 0 for the input tensor, which indicates CPU memory, but with the DeepStream app I get memory type 2, which indicates GPU memory.
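For reference, I read the memory type in the backend roughly like this (a minimal sketch, not my exact code: the tensor name "INPUT0" is a placeholder and error handling is omitted):

```cpp
#include <cstdint>
#include <cstdio>

#include "triton/core/tritonbackend.h"
#include "triton/core/tritonserver.h"

void InspectInputMemory(TRITONBACKEND_Request* request)
{
  TRITONBACKEND_Input* input = nullptr;
  // "INPUT0" is a placeholder; use the tensor name from your model config.
  TRITONBACKEND_RequestInput(request, "INPUT0", &input);

  uint32_t buffer_count = 0;
  TRITONBACKEND_InputProperties(
      input, nullptr /* name */, nullptr /* datatype */, nullptr /* shape */,
      nullptr /* dims_count */, nullptr /* byte_size */, &buffer_count);

  for (uint32_t i = 0; i < buffer_count; ++i) {
    const void* buffer = nullptr;
    uint64_t buffer_byte_size = 0;
    // On input these hold the *preferred* memory type; on return they hold
    // the *actual* memory type of the buffer Triton hands you.
    TRITONSERVER_MemoryType memory_type = TRITONSERVER_MEMORY_CPU;
    int64_t memory_type_id = 0;
    TRITONBACKEND_InputBuffer(
        input, i, &buffer, &buffer_byte_size, &memory_type, &memory_type_id);

    // 0 = TRITONSERVER_MEMORY_CPU, 1 = CPU_PINNED, 2 = TRITONSERVER_MEMORY_GPU
    printf("buffer %u: memory_type=%d, id=%lld, bytes=%llu\n",
           i, static_cast<int>(memory_type),
           static_cast<long long>(memory_type_id),
           static_cast<unsigned long long>(buffer_byte_size));
  }
}
```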

How can I access the input image (tensor) in GPU memory from the BLS model? Below is my pipeline structure in Triton Server.

ensemble_model → DALI → TensorRT model → BLS
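Roughly, the ensemble wiring looks like this (a hypothetical sketch only; every model and tensor name below is a placeholder, not my real config.pbtxt):

```
name: "ensemble_model"
platform: "ensemble"
max_batch_size: 1
input [ { name: "RAW_INPUT", data_type: TYPE_UINT8, dims: [ -1 ] } ]
output [ { name: "BLS_OUTPUT", data_type: TYPE_FP32, dims: [ -1 ] } ]
ensemble_scheduling {
  step [
    {
      model_name: "dali_preprocess"   # placeholder name
      model_version: -1
      input_map { key: "DALI_INPUT" value: "RAW_INPUT" }
      output_map { key: "DALI_OUTPUT" value: "preprocessed" }
    },
    {
      model_name: "trt_model"         # placeholder name
      model_version: -1
      input_map { key: "TRT_INPUT" value: "preprocessed" }
      output_map { key: "TRT_OUTPUT" value: "features" }
    },
    {
      model_name: "bls_model"         # the custom C++ BLS backend
      model_version: -1
      input_map { key: "BLS_INPUT" value: "features" }
      output_map { key: "BLS_OUTPUT" value: "BLS_OUTPUT" }
    }
  ]
}
```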

The DeepStream app can only work as a Triton client. How did you get the memory type, and where?

Hi @Fiona.Chen,

I have created a C/C++ BLS model and am trying to use it as part of an ensemble model from DeepStream, as described above.

The BLS model expects its input tensor in CPU memory, but I am getting it in GPU memory.

I have figured out a way to copy from GPU to CPU memory, and now it is working fine. I used a CUDA stream to copy the data from GPU memory to CPU memory.
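In case it helps someone, the copy is essentially this (a simplified sketch; `gpu_buffer` and `byte_size` are assumed to come from TRITONBACKEND_InputBuffer as above, and error checks are omitted):

```cpp
#include <cuda_runtime_api.h>
#include <vector>

// Copies a GPU input buffer (memory type 2 = TRITONSERVER_MEMORY_GPU) into
// host memory so the BLS logic can read it.
std::vector<char> CopyInputToHost(const void* gpu_buffer, size_t byte_size,
                                  cudaStream_t stream)
{
  std::vector<char> host_buffer(byte_size);
  // Asynchronous device-to-host copy queued on the stream...
  cudaMemcpyAsync(host_buffer.data(), gpu_buffer, byte_size,
                  cudaMemcpyDeviceToHost, stream);
  // ...then wait for it to finish before the host touches the data.
  cudaStreamSynchronize(stream);
  return host_buffer;
}
```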

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

What is your nvinferserver configuration?

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.