Problem building TensorRT on Windows

Description

When I build TensorRT on Windows, I get this error from CMake (version 3.13.3):

Checking for one of the modules 'zlib'
CMake Error at C:/Program Files (x86)/CMake/share/cmake-3.13/Modules/FindPkgConfig.cmake:679 (message):
None of the required 'zlib' found
Call Stack (most recent call first):
third_party/zlib.cmake:18 (pkg_search_module)
CMakeLists.txt:98 (include)

GPU_ARCHS is not defined. Generating CUDA code for default SMs: 35;53;61;70;75;80
Protobuf proto/trtcaffe.proto -> proto/trtcaffe.pb.cc proto/trtcaffe.pb.h
D:/Projects/TensorRT/TensorRT/VS2017_X64/parsers/caffe
CMake Error at parsers/CMakeLists.txt:34 (add_subdirectory):
The source directory

D:/Projects/TensorRT/TensorRT/parsers/onnx

does not contain a CMakeLists.txt file.

What can I do about the 'zlib' and 'onnx' errors on Windows 10?

Thanks.

Environment

TensorRT Version: 7.2.1.6
CUDA Version: 11.1
CUDNN Version: 8.0
Operating System: Windows 10

Hi @45696281,
Could you please check the installation steps?

Thanks!
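For reference, the missing parsers/onnx/CMakeLists.txt usually means the TensorRT OSS git submodules were never fetched, and the zlib error means CMake cannot find a zlib installation on the system. A hedged sketch of the usual fix (assuming a git checkout of the OSS repo; vcpkg and its path here are one possible way to supply zlib, not the only one):

```shell
# From the TensorRT OSS checkout: fetch the onnx-tensorrt and other
# submodules that populate parsers/onnx.
git submodule update --init --recursive

# zlib is not bundled; one option on Windows is vcpkg
# (the install location is an example, adjust to your setup).
vcpkg install zlib:x64-windows

# Point CMake at the vcpkg toolchain so find_package/pkg-config
# style lookups can locate zlib (path is an assumption).
cmake .. -DCMAKE_TOOLCHAIN_FILE=C:/vcpkg/scripts/buildsystems/vcpkg.cmake
```

Cloning with `git clone --recursive` in the first place avoids the submodule step entirely.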

Thank you, I see. I have a trained TensorFlow model (a model.ckpt checkpoint) that I froze as frozen_inference_graph.pb. How can I convert it to a TensorRT model? Is there an example for Windows 10?

Thank you, I found convert_to_uff.py in my anaconda3 install path.

Hi @45696281,
The UFF parser has been deprecated from TRT 7 onwards.

You can either use the TF-TRT conversion method,
or you can go with the .pb -> ONNX -> TRT engine approach.

Thanks!
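The second approach can be sketched as follows, assuming tf2onnx is installed and using trtexec from the TensorRT bin directory. The input/output tensor names below are placeholders; replace them with the actual tensor names from your frozen graph:

```shell
# 1. Convert the frozen TensorFlow graph to ONNX with tf2onnx.
#    "image_tensor:0" and "detection_boxes:0" are placeholder names.
python -m tf2onnx.convert --graphdef frozen_inference_graph.pb ^
    --inputs image_tensor:0 --outputs detection_boxes:0 ^
    --output model.onnx --opset 11

# 2. Build and serialize a TensorRT engine from the ONNX model.
trtexec --onnx=model.onnx --saveEngine=model.engine
```

You can inspect the graph (e.g. with Netron or TensorFlow's summarize_graph tool) to find the correct input/output tensor names before running the conversion.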

Perfect! I got it.

Thank you!