TAO PointPillars for CPU inference

I have trained a PointPillars model for my application, and I need to run it on a computer with only an Intel CPU and no GPU. ONNX Runtime looked like the best solution for this. I have tried to import the model into ONNX Runtime, but the plugins do not populate. Looking at the TensorRT repo, the plugins all seem to pull in nvinfer and CUDA, which to my understanding I cannot use on this machine. Is my best option to rewrite the plugins to not use CUDA and nvinfer, or is there a better option I am missing for running inference on a CPU-only device?
Thanks

Similar to Using pointpillar .onnx model.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.