TensorRT inference on a Windows 7 system

If I have a trained Caffe model and a C++ application, can I create a TensorRT inference engine for that application running on the Windows operating system?

Hello,

yes, TensorRT does support the Windows platform. Specifically, TensorRT supports Windows 10. For the full support matrix, please refer to: https://developer.download.nvidia.com/compute/machine-learning/tensorrt/docs/5.0/GA_5.0.4.3/TensorRT-Support-Matrix-Guide.pdf
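
To give a rough idea of the C++ workflow, below is a minimal sketch of building and serializing an engine from a Caffe model with the TensorRT 5.x C++ API and the Caffe parser. The file names (deploy.prototxt, model.caffemodel, model.engine), the output blob name ("prob"), and the batch/workspace sizes are placeholders for illustration, not values from this thread.

    #include <fstream>
    #include <iostream>
    #include "NvInfer.h"
    #include "NvCaffeParser.h"

    using namespace nvinfer1;
    using namespace nvcaffeparser1;

    // Minimal logger required by the TensorRT builder.
    class Logger : public ILogger
    {
        void log(Severity severity, const char* msg) override
        {
            if (severity <= Severity::kWARNING)
                std::cout << msg << std::endl;
        }
    } gLogger;

    int main()
    {
        IBuilder* builder = createInferBuilder(gLogger);
        INetworkDefinition* network = builder->createNetwork();
        ICaffeParser* parser = createCaffeParser();

        // Parse the Caffe prototxt/caffemodel pair (placeholder file names).
        const IBlobNameToTensor* blobs = parser->parse(
            "deploy.prototxt", "model.caffemodel", *network, DataType::kFLOAT);

        // Mark the network output by its Caffe blob name (placeholder "prob").
        network->markOutput(*blobs->find("prob"));

        builder->setMaxBatchSize(1);
        builder->setMaxWorkspaceSize(1 << 28); // 256 MB build workspace

        ICudaEngine* engine = builder->buildCudaEngine(*network);

        // Serialize the engine so it can be reloaded for inference later.
        IHostMemory* serialized = engine->serialize();
        std::ofstream out("model.engine", std::ios::binary);
        out.write(static_cast<const char*>(serialized->data()), serialized->size());

        serialized->destroy();
        engine->destroy();
        parser->destroy();
        network->destroy();
        builder->destroy();
        return 0;
    }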

My question is: can we build a TensorRT engine on a Windows 7 system? From the above link it is clear that only Windows 10 is supported. Is there a mechanism to use TensorRT on Windows 7?

Hello,

Microsoft has recently announced end of life for Windows 7 at the start of 2020. TensorRT supports Windows 10. https://www.microsoft.com/en-us/windowsforbusiness/end-of-windows-7-support

I’ve also tried to run TensorRT on Windows 7 but it seemed impossible.

Hi,
Can you try running your model with the trtexec command, and share the "--verbose" log if the issue persists?
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
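
For reference, a typical trtexec invocation for a Caffe model looks like the following; the prototxt/caffemodel paths and the output blob name are placeholders, not values from this thread:

    trtexec --deploy=deploy.prototxt --model=model.caffemodel --output=prob --verbose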

You can refer to the supported operators list in the TensorRT documentation; if any operator is not supported, you will need to create a custom plugin to support that operation.

Also, please share your model and script, if not shared already, so that we can help you better.

Thanks!