ERROR: tlt/tlt_decode.cpp:274 failed to build network since parsing model errors

I created the deepstream-test5 Docker container, and running it gives the following errors. I am using the PeopleNet model, so why is it giving errors related to TAO/TLT?
I have followed the proper installation steps for DeepStream 5 and am able to run the deepstream-test1, test2, and test3 applications.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version - 5
• TensorRT Version - 7.1
• NVIDIA GPU Driver Version (valid for GPU only) 470

Error->

0:00:00.226281164 1 0x563790252090 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 4]: Trying to create engine from model files
ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: UffParser: Could not read buffer.
parseModel: Failed to parse UFF model
ERROR: tlt/tlt_decode.cpp:274 failed to build network since parsing model errors.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:797 Failed to create network using custom network creation function
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:862 Failed to get cuda engine from custom library API
0:00:00.277570984 1 0x563790252090 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<secondary_gie_0> NvDsInferContext[UID 4]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 4]: build engine file failed

Please follow the README.md section "Build TRT OSS Plugin" to build the TRT OSS library, or you can directly use the prebuilt libraries under TRT-OSS/x86/TRT**/**; make sure you have the compatible TRT version.
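For reference, the README's build procedure boils down to a few commands. This is only a sketch: the checkout tag, `GPU_ARCHS` value, and the `7.x.y` library suffix below are illustrative placeholders, and you must substitute the ones matching your own DeepStream/TRT install (see the compatibility table later in this thread).

```shell
# Sketch of the TRT OSS plugin build from the deepstream_tao_apps README.
# Tag, GPU_ARCHS, and the 7.x.y suffix are placeholders - adjust to your setup.
git clone -b release/7.0 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git submodule update --init --recursive

mkdir -p build && cd build
cmake .. -DGPU_ARCHS=75 \
         -DTRT_LIB_DIR=/usr/lib/x86_64-linux-gnu/ \
         -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc
make nvinfer_plugin -j"$(nproc)"

# Back up the stock plugin and replace it with the freshly built one.
sudo cp /usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so.7.x.y \
        /usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so.7.x.y.bak
sudo cp libnvinfer_plugin.so.7.x.y /usr/lib/x86_64-linux-gnu/
sudo ldconfig
```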

Hi @amycao
I am following this: deepstream_tao_apps/README.md at master · NVIDIA-AI-IOT/deepstream_tao_apps · GitHub
I used TRT_OSS_CHECKOUT_TAG = release/7.0 because I have DeepStream SDK 5.
After running the cmake command I get the following error output:

Building for TensorRT version: 7.0.0.1, library version: 7.0.0
-- Targeting TRT Platform: x86_64
-- GPU_ARCHS defined as xy. Generating CUDA code for SM xy
-- CUDA version set to 10.2
-- cuDNN version set to 7.6
-- Protobuf version set to 3.0.0
-- Using libprotobuf /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/build/third_party.protobuf/lib/libprotobuf.a
-- ========================= Importing and creating target nvinfer ==========================
-- Looking for library nvinfer
-- Library that was found /usr/lib/x86_64-linux-gnu/libnvinfer.so
-- ==========================================================================================
-- ========================= Importing and creating target nvuffparser ==========================
-- Looking for library nvparsers
-- Library that was found /usr/lib/x86_64-linux-gnu/libnvparsers.so
-- ==========================================================================================
-- Protobuf proto/trtcaffe.proto -> proto/trtcaffe.pb.cc proto/trtcaffe.pb.h
-- /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/build/parsers/caffe
Generated: /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.proto
Generated: /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx-operators_onnx2trt_onnx-ml.proto
-- 
-- ******** Summary ********
--   CMake version         : 3.19.4
--   CMake command         : /home/admin1/install/bin/cmake
--   System                : Linux
--   C++ compiler          : /usr/bin/g++
--   C++ compiler version  : 7.5.0
--   CXX flags             : -Wno-deprecated-declarations  -DBUILD_SYSTEM=cmake_oss -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
--   Build type            : Release
--   Compile definitions   : _PROTOBUF_INSTALL_DIR=/home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/build;ONNX_NAMESPACE=onnx2trt_onnx
--   CMAKE_PREFIX_PATH     : 
--   CMAKE_INSTALL_PREFIX  : /usr/lib/x86_64-linux-gnu/..
--   CMAKE_MODULE_PATH     : 
-- 
--   ONNX version          : 1.6.0
--   ONNX NAMESPACE        : onnx2trt_onnx
--   ONNX_BUILD_TESTS      : OFF
--   ONNX_BUILD_BENCHMARKS : OFF
--   ONNX_USE_LITE_PROTO   : OFF
--   ONNXIFI_DUMMY_BACKEND : OFF
--   ONNXIFI_ENABLE_EXT    : OFF
-- 
--   Protobuf compiler     : 
--   Protobuf includes     : 
--   Protobuf libraries    : 
--   BUILD_ONNX_PYTHON     : OFF
-- Found TensorRT headers at /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/include
-- Find TensorRT libs at /usr/lib/x86_64-linux-gnu/libnvinfer.so;/usr/lib/x86_64-linux-gnu/libnvinfer_plugin.so;TENSORRT_LIBRARY_MYELIN-NOTFOUND
-- Could NOT find TENSORRT (missing: TENSORRT_LIBRARY) 
ERROR: Cannot find TensorRT library.
-- Adding new sample: sample_char_rnn
--     - Parsers Used: none
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_dynamic_reshape
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_fasterRCNN
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_googlenet
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_int8
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_int8_api
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mlp
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mnist
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mnist_api
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_movielens
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_movielens_mps
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_nmt
--     - Parsers Used: none
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_onnx_mnist
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_plugin
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_reformat_free_io
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_ssd
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_fasterRCNN
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_maskRCNN
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_mnist
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_uff_plugin_v2_ext
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_uff_ssd
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: trtexec
--     - Parsers Used: caffe;uff;onnx
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Configuring done
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
TENSORRT_LIBRARY_MYELIN
    linked by target "nvonnxparser_static" in directory /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/parsers/onnx
    linked by target "nvonnxparser" in directory /home/admin1/Downloads/setup/trt-oss/cmake-3.19.4/TensorRT/parsers/onnx

CMake Warning (dev) in plugin/CMakeLists.txt:
  Policy CMP0104 is not set: CMAKE_CUDA_ARCHITECTURES now detected for NVCC,
  empty CUDA_ARCHITECTURES not allowed.  Run "cmake --help-policy CMP0104"
  for policy details.  Use the cmake_policy command to set the policy and
  suppress this warning.

  CUDA_ARCHITECTURES is empty for target "nvinfer_plugin".
This warning is for project developers.  Use -Wno-dev to suppress it.

CMake Warning (dev) in plugin/CMakeLists.txt:
  Policy CMP0104 is not set: CMAKE_CUDA_ARCHITECTURES now detected for NVCC,
  empty CUDA_ARCHITECTURES not allowed.  Run "cmake --help-policy CMP0104"
  for policy details.  Use the cmake_policy command to set the policy and
  suppress this warning.

  CUDA_ARCHITECTURES is empty for target "nvinfer_plugin_static".
This warning is for project developers.  Use -Wno-dev to suppress it.

-- Generating done
CMake Generate step failed.  Build files cannot be regenerated correctly.

-- Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)
ERROR: Cannot find TensorRT library.

Did you specify the TRT library path when running cmake?
-DTRT_LIB_DIR=

Hi @amycao ,

yes, I specified `-DTRT_LIB_DIR=/usr/lib/x86_64-linux-gnu/`

Can you paste the output of `dpkg -l | grep nvinfer`?

Hi @amycao ,
here is output

ii  libnvinfer-bin                                              7.1.3-1+cuda10.2                                amd64        TensorRT binaries
ii  libnvinfer-doc                                              7.1.3-1+cuda10.2                                all          TensorRT documentation
ii  libnvinfer-plugin7                                          7.1.3-1+cuda10.2                                amd64        TensorRT plugin libraries
ii  libnvinfer7                                                 7.1.3-1+cuda10.2                                amd64        TensorRT runtime libraries
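As a side note, the installed TRT version can be read mechanically from this kind of `dpkg` output. A small helper (the function name is mine, and the sample line is copied from the output above):

```python
import re

def trt_version_from_dpkg(dpkg_output: str) -> str:
    """Extract the TensorRT version from `dpkg -l | grep nvinfer` output.

    Matches the version column of the libnvinfer<N> runtime package,
    e.g. "7.1.3-1+cuda10.2" -> "7.1.3". Packages like libnvinfer-bin or
    libnvinfer-plugin7 are skipped because of the digit right after
    "libnvinfer" in the pattern.
    """
    m = re.search(r"libnvinfer\d+\s+(\d+\.\d+\.\d+)", dpkg_output)
    if m is None:
        raise ValueError("no libnvinfer<N> package found in dpkg output")
    return m.group(1)

sample = "ii  libnvinfer7    7.1.3-1+cuda10.2    amd64    TensorRT runtime libraries"
print(trt_version_from_dpkg(sample))  # 7.1.3
```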


2. Build TensorRT OSS Plugin

| DeepStream Release | TRT Version | TRT_OSS_CHECKOUT_TAG |
|---|---|---|
| 5.0 | TRT 7.0.0 | release/7.0 |
| 5.0.1 | TRT 7.0.0 | release/7.0 |
| 5.1 | TRT 7.2.X | 21.03 |
| 6.0 EA | TRT 7.2.2 | 21.03 |
| 6.0 GA | TRT 8.0.1 | release/8.0 |
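The table can be encoded as a simple lookup, transcribed directly from the rows above (the dict and function names are mine, for illustration only):

```python
# Transcription of the DeepStream -> TRT_OSS_CHECKOUT_TAG table above.
TRT_OSS_TAG = {
    "5.0":    "release/7.0",
    "5.0.1":  "release/7.0",
    "5.1":    "21.03",
    "6.0 EA": "21.03",
    "6.0 GA": "release/8.0",
}

def checkout_tag(deepstream_release: str) -> str:
    """Return the TRT_OSS_CHECKOUT_TAG matching a DeepStream release."""
    try:
        return TRT_OSS_TAG[deepstream_release]
    except KeyError:
        raise ValueError(f"no known TRT OSS tag for DeepStream {deepstream_release}")

print(checkout_tag("5.1"))  # 21.03
```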

You used the wrong TRT version; please use the compatible one. You checked out release/7.0, but your dpkg output shows TRT 7.1.3 installed.

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks