Wanted to know the location of inference (CPU or GPU)

I executed one of the sample Python programs (the script is sample.py) from the samples directory.
I have a question about the statement below. I wanted to check whether do_inference() is executed on the CPU or the GPU. If it runs on the CPU, how do I change the code to perform inference on the GPU?
[output] = common.do_inference(context, bindings=bindings, inputs=inputs, outputs=outputs, stream=stream)


TensorRT runs inference on the GPU, so the do_inference() call in your snippet is already executing on the GPU. No code change is needed.
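For context, here is a rough sketch of what do_inference() in the samples' common.py typically does (paraphrased from the TensorRT Python samples; check the common.py shipped with your TensorRT version, as details vary between releases). The host-to-device and device-to-host copies and the execute call all go through a CUDA stream, which is why the work runs on the GPU:

```python
# Sketch of the do_inference() helper from TensorRT's sample common.py.
# Assumes pycuda and a built TensorRT engine; inputs/outputs are pairs of
# pinned host buffers and device allocations, as set up by the samples.
import pycuda.driver as cuda

def do_inference(context, bindings, inputs, outputs, stream):
    # Copy input data from host (CPU) memory to device (GPU) memory.
    for inp in inputs:
        cuda.memcpy_htod_async(inp.device, inp.host, stream)
    # Launch inference on the GPU via the execution context.
    context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
    # Copy the results back from device memory to host memory.
    for out in outputs:
        cuda.memcpy_dtoh_async(out.host, out.device, stream)
    # Wait for all queued GPU work on this stream to finish.
    stream.synchronize()
    # Return the host-side output buffers.
    return [out.host for out in outputs]
```

Because the execution context was built from a TensorRT engine, the execute call itself can only run on the GPU; the CPU's role is limited to staging the input/output buffers.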

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.