Hi, I am following the steps from [Deploying to DeepStream for YOLOv4-tiny - NVIDIA Docs] to install TensorRT OSS.
In the step

/usr/local/bin/cmake .. -DGPU_ARCHS=53 -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ -DCMAKE_C_COMPILER=/usr/bin/gcc -DTRT_BIN_DIR=`pwd`/out
I got this error:
~/TensorRT/build$ /usr/local/bin/cmake .. -DGPU_ARCHS=53 -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ -DCMAKE_C_COMPILER=/usr/bin/gcc -DTRT_BIN_DIR=`pwd`/out
Building for TensorRT version: 7.2.2, library version: 7
-- Targeting TRT Platform: x86_64
-- CUDA version set to 10.2
-- cuDNN version set to 7.6.5
-- Protobuf version set to 3.0.0
-- Setting up another Protobuf build for cross compilation targeting aarch64-Linux
-- Using libprotobuf /home/ubuntu/TensorRT/build/third_party.protobuf_aarch64/lib/libprotobuf.a
-- ========================= Importing and creating target nvinfer ==========================
-- Looking for library nvinfer
-- Library that was found /usr/lib/aarch64-linux-gnu/libnvinfer.so
-- ==========================================================================================
-- ========================= Importing and creating target nvuffparser ==========================
-- Looking for library nvparsers
-- Library that was found /usr/lib/aarch64-linux-gnu/libnvparsers.so
-- ==========================================================================================
-- GPU_ARCHS defined as 53. Generating CUDA code for SM 53
-- Protobuf proto/trtcaffe.proto -> proto/trtcaffe.pb.cc proto/trtcaffe.pb.h
-- /home/ubuntu/TensorRT/build/parsers/caffe
Generated: /home/ubuntu/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.proto
Generated: /home/ubuntu/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx-operators_onnx2trt_onnx-ml.proto
--
-- ******** Summary ********
-- CMake version : 3.13.5
-- CMake command : /usr/local/bin/cmake
-- System : Linux
-- C++ compiler : /usr/bin/g++
-- C++ compiler version : 7.5.0
-- CXX flags : -Wno-deprecated-declarations -DBUILD_SYSTEM=cmake_oss -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
-- Build type : Release
-- Compile definitions : _PROTOBUF_INSTALL_DIR=/home/ubuntu/TensorRT/build/third_party.protobuf;ONNX_NAMESPACE=onnx2trt_onnx
-- CMAKE_PREFIX_PATH :
-- CMAKE_INSTALL_PREFIX : /usr/lib/aarch64-linux-gnu/..
-- CMAKE_MODULE_PATH :
--
-- ONNX version : 1.6.0
-- ONNX NAMESPACE : onnx2trt_onnx
-- ONNX_BUILD_TESTS : OFF
-- ONNX_BUILD_BENCHMARKS : OFF
-- ONNX_USE_LITE_PROTO : OFF
-- ONNXIFI_DUMMY_BACKEND : OFF
-- ONNXIFI_ENABLE_EXT : OFF
--
-- Protobuf compiler :
-- Protobuf includes :
-- Protobuf libraries :
-- BUILD_ONNX_PYTHON : OFF
-- Found TensorRT headers at /home/ubuntu/TensorRT/include
-- Find TensorRT libs at /usr/lib/aarch64-linux-gnu/libnvinfer.so;/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so;TENSORRT_LIBRARY_MYELIN-NOTFOUND
-- Could NOT find TENSORRT (missing: TENSORRT_LIBRARY)
ERRORCannot find TensorRT library.
-- Adding new sample: sample_algorithm_selector
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_char_rnn
-- - Parsers Used: uff;caffe;onnx
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_dynamic_reshape
-- - Parsers Used: onnx
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_fasterRCNN
-- - Parsers Used: caffe
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_googlenet
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_int8
-- - Parsers Used: caffe
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_int8_api
-- - Parsers Used: onnx
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_mlp
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_mnist
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_mnist_api
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_movielens
-- - Parsers Used: uff
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_movielens_mps
-- - Parsers Used: uff
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_nmt
-- - Parsers Used: none
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_onnx_mnist
-- - Parsers Used: onnx
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_plugin
-- - Parsers Used: caffe
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_reformat_free_io
-- - Parsers Used: caffe
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_ssd
-- - Parsers Used: caffe
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_uff_fasterRCNN
-- - Parsers Used: uff
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_uff_maskRCNN
-- - Parsers Used: uff
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_uff_mnist
-- - Parsers Used: uff
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_uff_plugin_v2_ext
-- - Parsers Used: uff
-- - InferPlugin Used: OFF
-- - Licensing: opensource
-- Adding new sample: sample_uff_ssd
-- - Parsers Used: uff
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: sample_onnx_mnist_coord_conv_ac
-- - Parsers Used: onnx
-- - InferPlugin Used: ON
-- - Licensing: opensource
-- Adding new sample: trtexec
-- - Parsers Used: caffe;uff;onnx
-- - InferPlugin Used: OFF
-- - Licensing: opensource
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
TENSORRT_LIBRARY_MYELIN
linked by target "nvonnxparser_static" in directory /home/ubuntu/TensorRT/parsers/onnx
linked by target "nvonnxparser" in directory /home/ubuntu/TensorRT/parsers/onnx
-- Configuring incomplete, errors occurred!
See also "/home/ubuntu/TensorRT/build/CMakeFiles/CMakeOutput.log".
See also "/home/ubuntu/TensorRT/build/CMakeFiles/CMakeError.log".
I searched for this issue on GitHub and found similar (and even identical) issues, but their solutions did not fix my problem; I don't know why. Is there any package you could recommend for converting ONNX to a TensorRT engine, or for using ONNX in DeepStream directly?
Note: I tried building directly on my Jetson Nano, without any container.
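For context, what I ultimately need is something equivalent to the trtexec call below (a minimal sketch, assuming the usual JetPack location /usr/src/tensorrt/bin; model.onnx and model.engine are placeholder file names):

# Build a serialized TensorRT engine from an ONNX model using trtexec
/usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.engine --fp16 --workspace=1024

If there is a cleaner way to get the same result, or to point DeepStream at the ONNX file directly, that would work for me too.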