My device is a Jetson Orin Nano 16GB, and I want to run Triton Inference Server (tritonserver) on it. However, I cannot start tritonserver because of the error: GPU instances not supported. All other models are ready except two models that use the Python backend.
When I change the instance_group in the config.pbtxt of the two failing models from KIND_GPU to KIND_CPU, they load successfully. Can anyone help me solve this problem? What do I need to do to load all models on the GPU? Thank you very much.
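For reference, this is a minimal sketch of the instance_group block I am switching between; the instance count and GPU index here are placeholders, not my exact values:

```
# config.pbtxt (sketch): the models load with KIND_CPU but fail with KIND_GPU
instance_group [
  {
    count: 1        # assumed instance count
    kind: KIND_GPU  # changing this to KIND_CPU lets the model load
    gpus: [ 0 ]     # assumed GPU index
  }
]
```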
@AastaLLL I have tried setting FORCE_CPU_ONLY_INPUT_TENSORS to both yes and no in the model configuration of the two models, align and preprocessing_without_landmark, while keeping instance_group as KIND_GPU.
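For clarity, this is roughly how the parameter is set in config.pbtxt (shown here with "no"; I tried "yes" as well):

```
# config.pbtxt (sketch): Python backend parameter, tried with both "yes" and "no"
parameters: {
  key: "FORCE_CPU_ONLY_INPUT_TENSORS"
  value: { string_value: "no" }
}
```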
There has been no update from you for a while, so we are assuming this is no longer an issue.
Hence, we are closing this topic. If you need further support, please open a new one.
Thanks
Hi,
We would like to reproduce this issue internally to investigate the root cause further.
Could you share the models and your detailed steps with us?