Inference only with CPU

Is it possible to perform model inference for object detection using only the CPU as the device?

Thanks!

Hi @forflafor, if you are using a framework like PyTorch or TensorFlow, you can choose not to use GPU acceleration and run inference on the CPU only. However, TensorRT only supports GPU (and DLA on Xavier).
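
For example, here is a minimal sketch of CPU-only object detection inference in PyTorch. It assumes torchvision ≥ 0.13 and its pretrained Faster R-CNN as a stand-in for your own detection model; the key point is simply keeping the model and inputs on the `cpu` device:

```python
# Minimal sketch: CPU-only object detection inference with PyTorch/torchvision.
# The pretrained Faster R-CNN here is just an example model; substitute your own.
import torch
import torchvision

device = torch.device("cpu")  # explicitly select the CPU, ignoring any GPUs

# Load a pretrained detection model and move it to the CPU
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.to(device)
model.eval()

# Dummy input: one 3-channel image tensor with values in [0, 1]
image = torch.rand(3, 480, 640, device=device)

with torch.no_grad():          # inference only, no gradients needed
    outputs = model([image])   # list of dicts with 'boxes', 'labels', 'scores'

print(outputs[0]["boxes"].shape, outputs[0]["scores"][:5])
```

If you want to be sure no GPU is picked up at all, you can also hide the GPUs from the process (for example by setting `CUDA_VISIBLE_DEVICES=""` before launching) so the framework falls back to CPU.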
