Does the Python backend support KIND_GPU in tritonserver?

My device is a Jetson Orin Nano 16GB and I want to run tritonserver on it. But tritonserver fails to start with the error: "GPU instances not supported". All other models load fine except two models that use the Python backend.


When I change the instance_group in the config.pbtxt of those two models from KIND_GPU to KIND_CPU, it works. Can anyone help me solve this problem? What do I need to do to load all models on the GPU successfully? Thank you very much.
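For reference, this is the stanza I am changing in config.pbtxt (a sketch; the count value here is illustrative):

```
instance_group [
  {
    count: 1
    kind: KIND_CPU   # with KIND_GPU, loading fails with "GPU instances not supported"
  }
]
```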

Hi,

On Jetson, the Python backend does not support GPU tensors or async BLS.

Are GPU tensors used in your Python models?
If not, could you share the models so we can reproduce the issue?
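One way to check is from inside the model's execute() method: each input tensor in the standard triton_python_backend_utils API exposes an is_cpu() flag. A minimal sketch (illustrative only, not your full model.py; it assumes the standard pb_utils API that Triton provides at runtime):

```python
# Sketch: log whether each input tensor arrives in CPU or GPU memory.
# Assumes the standard Triton Python backend API (pb_utils); the
# empty InferenceResponse is a placeholder for the real outputs.

class TritonPythonModel:
    def execute(self, requests):
        # Imported lazily because pb_utils only exists inside Triton.
        import triton_python_backend_utils as pb_utils

        responses = []
        for request in requests:
            for tensor in request.inputs():
                # is_cpu() is False when the tensor lives in GPU memory.
                print(f"{tensor.name()}: on_cpu={tensor.is_cpu()}")
            responses.append(pb_utils.InferenceResponse(output_tensors=[]))
        return responses
```

If any input logs on_cpu=False, the model is receiving GPU tensors.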

Thanks.

Can you tell me how to check whether GPU tensors are used in my Python models? Thanks.

Hi,

Please check this configuration option:

Could you try setting FORCE_CPU_ONLY_INPUT_TENSORS to yes in the model configuration?
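In config.pbtxt that would look like this (a sketch of the parameters block):

```
parameters: {
  key: "FORCE_CPU_ONLY_INPUT_TENSORS"
  value: { string_value: "yes" }
}
```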
Thanks.

@AastaLLL I have tried setting FORCE_CPU_ONLY_INPUT_TENSORS to both yes and no in the model configuration of the two models, align and preprocessing_without_landmark, with instance_group set to KIND_GPU.


But it still gives the same error.

This is an image of the execute function in the model.py file of the align model.

Can you help me solve this? Thanks.

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks.

Hi,

We want to reproduce this issue internally to investigate the root cause further.
Could you share the model and your detailed reproduction steps with us?

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.