TensorRT OSS build

Hi, I followed the steps from this website, [Deploying to DeepStream for YOLOv4-tiny - NVIDIA Docs], to install TensorRT OSS.
At the step /usr/local/bin/cmake .. -DGPU_ARCHS=53 -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ -DCMAKE_C_COMPILER=/usr/bin/gcc -DTRT_BIN_DIR=`pwd`/out, I got this error:

~/TensorRT/build$ /usr/local/bin/cmake .. -DGPU_ARCHS=53  -DTRT_LIB_DIR=/usr/lib/aarch64-linux-gnu/ -DCMAKE_C_COMPILER=/usr/bin/gcc -DTRT_BIN_DIR=`pwd`/out
Building for TensorRT version: 7.2.2, library version: 7
-- Targeting TRT Platform: x86_64
-- CUDA version set to 10.2
-- cuDNN version set to 7.6.5
-- Protobuf version set to 3.0.0
-- Setting up another Protobuf build for cross compilation targeting aarch64-Linux
-- Using libprotobuf /home/ubuntu/TensorRT/build/third_party.protobuf_aarch64/lib/libprotobuf.a
-- ========================= Importing and creating target nvinfer ==========================
-- Looking for library nvinfer
-- Library that was found /usr/lib/aarch64-linux-gnu/libnvinfer.so
-- ==========================================================================================
-- ========================= Importing and creating target nvuffparser ==========================
-- Looking for library nvparsers
-- Library that was found /usr/lib/aarch64-linux-gnu/libnvparsers.so
-- ==========================================================================================
-- GPU_ARCHS defined as 53. Generating CUDA code for SM 53
-- Protobuf proto/trtcaffe.proto -> proto/trtcaffe.pb.cc proto/trtcaffe.pb.h
-- /home/ubuntu/TensorRT/build/parsers/caffe
Generated: /home/ubuntu/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx_onnx2trt_onnx-ml.proto
Generated: /home/ubuntu/TensorRT/build/parsers/onnx/third_party/onnx/onnx/onnx-operators_onnx2trt_onnx-ml.proto
-- 
-- ******** Summary ********
--   CMake version         : 3.13.5
--   CMake command         : /usr/local/bin/cmake
--   System                : Linux
--   C++ compiler          : /usr/bin/g++
--   C++ compiler version  : 7.5.0
--   CXX flags             : -Wno-deprecated-declarations  -DBUILD_SYSTEM=cmake_oss -Wall -Wno-deprecated-declarations -Wno-unused-function -Wnon-virtual-dtor
--   Build type            : Release
--   Compile definitions   : _PROTOBUF_INSTALL_DIR=/home/ubuntu/TensorRT/build/third_party.protobuf;ONNX_NAMESPACE=onnx2trt_onnx
--   CMAKE_PREFIX_PATH     : 
--   CMAKE_INSTALL_PREFIX  : /usr/lib/aarch64-linux-gnu/..
--   CMAKE_MODULE_PATH     : 
-- 
--   ONNX version          : 1.6.0
--   ONNX NAMESPACE        : onnx2trt_onnx
--   ONNX_BUILD_TESTS      : OFF
--   ONNX_BUILD_BENCHMARKS : OFF
--   ONNX_USE_LITE_PROTO   : OFF
--   ONNXIFI_DUMMY_BACKEND : OFF
--   ONNXIFI_ENABLE_EXT    : OFF
-- 
--   Protobuf compiler     : 
--   Protobuf includes     : 
--   Protobuf libraries    : 
--   BUILD_ONNX_PYTHON     : OFF
-- Found TensorRT headers at /home/ubuntu/TensorRT/include
-- Find TensorRT libs at /usr/lib/aarch64-linux-gnu/libnvinfer.so;/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so;TENSORRT_LIBRARY_MYELIN-NOTFOUND
-- Could NOT find TENSORRT (missing: TENSORRT_LIBRARY) 
ERRORCannot find TensorRT library.
-- Adding new sample: sample_algorithm_selector
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_char_rnn
--     - Parsers Used: uff;caffe;onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_dynamic_reshape
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_fasterRCNN
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_googlenet
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_int8
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_int8_api
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mlp
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mnist
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_mnist_api
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_movielens
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_movielens_mps
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_nmt
--     - Parsers Used: none
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_onnx_mnist
--     - Parsers Used: onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_plugin
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_reformat_free_io
--     - Parsers Used: caffe
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_ssd
--     - Parsers Used: caffe
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_fasterRCNN
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_maskRCNN
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_uff_mnist
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_uff_plugin_v2_ext
--     - Parsers Used: uff
--     - InferPlugin Used: OFF
--     - Licensing: opensource
-- Adding new sample: sample_uff_ssd
--     - Parsers Used: uff
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: sample_onnx_mnist_coord_conv_ac
--     - Parsers Used: onnx
--     - InferPlugin Used: ON
--     - Licensing: opensource
-- Adding new sample: trtexec
--     - Parsers Used: caffe;uff;onnx
--     - InferPlugin Used: OFF
--     - Licensing: opensource
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
TENSORRT_LIBRARY_MYELIN
    linked by target "nvonnxparser_static" in directory /home/ubuntu/TensorRT/parsers/onnx
    linked by target "nvonnxparser" in directory /home/ubuntu/TensorRT/parsers/onnx

-- Configuring incomplete, errors occurred!
See also "/home/ubuntu/TensorRT/build/CMakeFiles/CMakeOutput.log".
See also "/home/ubuntu/TensorRT/build/CMakeFiles/CMakeError.log".

I searched for this issue on GitHub, and there are similar and even identical issues there, but their solutions didn't solve my problem; I don't know why. Is there any package you could advise for converting ONNX to an engine, or for using ONNX in DeepStream directly?
Note: I have tried building directly on my Jetson Nano, without any container.

Also, I tried to run trtexec from /usr/src/tensorrt/bin/ and got this error:

/usr/src/tensorrt/bin$ ./trtexec --onnx=/home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx  --saveEngine=deneme.trt --verbose
&&&& RUNNING TensorRT.trtexec [TensorRT v8201] # ./trtexec --onnx=/home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx --saveEngine=deneme.trt --verbose
[08/22/2023-18:22:23] [I] === Model Options ===
[08/22/2023-18:22:23] [I] Format: ONNX
[08/22/2023-18:22:23] [I] Model: /home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx
[08/22/2023-18:22:23] [I] Output:
[08/22/2023-18:22:23] [I] === Build Options ===
[08/22/2023-18:22:23] [I] Max batch: explicit batch
[08/22/2023-18:22:23] [I] Workspace: 16 MiB
[08/22/2023-18:22:23] [I] minTiming: 1
[08/22/2023-18:22:23] [I] avgTiming: 8
[08/22/2023-18:22:23] [I] Precision: FP32
[08/22/2023-18:22:23] [I] Calibration: 
[08/22/2023-18:22:23] [I] Refit: Disabled
[08/22/2023-18:22:23] [I] Sparsity: Disabled
[08/22/2023-18:22:23] [I] Safe mode: Disabled
[08/22/2023-18:22:23] [I] DirectIO mode: Disabled
[08/22/2023-18:22:23] [I] Restricted mode: Disabled
[08/22/2023-18:22:23] [I] Save engine: deneme.trt
[08/22/2023-18:22:23] [I] Load engine: 
[08/22/2023-18:22:23] [I] Profiling verbosity: 0
[08/22/2023-18:22:23] [I] Tactic sources: Using default tactic sources
[08/22/2023-18:22:23] [I] timingCacheMode: local
[08/22/2023-18:22:23] [I] timingCacheFile: 
[08/22/2023-18:22:23] [I] Input(s)s format: fp32:CHW
[08/22/2023-18:22:23] [I] Output(s)s format: fp32:CHW
[08/22/2023-18:22:23] [I] Input build shapes: model
[08/22/2023-18:22:23] [I] Input calibration shapes: model
[08/22/2023-18:22:23] [I] === System Options ===
[08/22/2023-18:22:23] [I] Device: 0
[08/22/2023-18:22:23] [I] DLACore: 
[08/22/2023-18:22:23] [I] Plugins:
[08/22/2023-18:22:23] [I] === Inference Options ===
[08/22/2023-18:22:23] [I] Batch: Explicit
[08/22/2023-18:22:23] [I] Input inference shapes: model
[08/22/2023-18:22:23] [I] Iterations: 10
[08/22/2023-18:22:23] [I] Duration: 3s (+ 200ms warm up)
[08/22/2023-18:22:23] [I] Sleep time: 0ms
[08/22/2023-18:22:23] [I] Idle time: 0ms
[08/22/2023-18:22:23] [I] Streams: 1
[08/22/2023-18:22:23] [I] ExposeDMA: Disabled
[08/22/2023-18:22:23] [I] Data transfers: Enabled
[08/22/2023-18:22:23] [I] Spin-wait: Disabled
[08/22/2023-18:22:23] [I] Multithreading: Disabled
[08/22/2023-18:22:23] [I] CUDA Graph: Disabled
[08/22/2023-18:22:23] [I] Separate profiling: Disabled
[08/22/2023-18:22:23] [I] Time Deserialize: Disabled
[08/22/2023-18:22:23] [I] Time Refit: Disabled
[08/22/2023-18:22:23] [I] Skip inference: Disabled
[08/22/2023-18:22:23] [I] Inputs:
[08/22/2023-18:22:23] [I] === Reporting Options ===
[08/22/2023-18:22:23] [I] Verbose: Enabled
[08/22/2023-18:22:23] [I] Averages: 10 inferences
[08/22/2023-18:22:23] [I] Percentile: 99
[08/22/2023-18:22:23] [I] Dump refittable layers:Disabled
[08/22/2023-18:22:23] [I] Dump output: Disabled
[08/22/2023-18:22:23] [I] Profile: Disabled
[08/22/2023-18:22:23] [I] Export timing to JSON file: 
[08/22/2023-18:22:23] [I] Export output to JSON file: 
[08/22/2023-18:22:23] [I] Export profile to JSON file: 
[08/22/2023-18:22:23] [I] 
[08/22/2023-18:22:23] [I] === Device Information ===
[08/22/2023-18:22:23] [I] Selected Device: NVIDIA Tegra X1
[08/22/2023-18:22:23] [I] Compute Capability: 5.3
[08/22/2023-18:22:23] [I] SMs: 1
[08/22/2023-18:22:23] [I] Compute Clock Rate: 0.9216 GHz
[08/22/2023-18:22:23] [I] Device Global Memory: 3963 MiB
[08/22/2023-18:22:23] [I] Shared Memory per SM: 64 KiB
[08/22/2023-18:22:23] [I] Memory Bus Width: 64 bits (ECC disabled)
[08/22/2023-18:22:23] [I] Memory Clock Rate: 0.01275 GHz
[08/22/2023-18:22:23] [I] 
[08/22/2023-18:22:23] [I] TensorRT version: 8.2.1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::GridAnchor_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::GridAnchorRect_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::NMS_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Reorg_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Region_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Clip_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::LReLU_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::PriorBox_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Normalize_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::ScatterND version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::RPROI_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::BatchedNMS_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::FlattenConcat_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::CropAndResize version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::DetectionLayer_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::EfficientNMS_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::EfficientNMS_ONNX_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::EfficientNMS_TFTRT_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Proposal version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::ProposalLayer_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::ResizeNearest_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::Split version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::SpecialSlice_TRT version 1
[08/22/2023-18:22:23] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1
[08/22/2023-18:22:25] [I] [TRT] [MemUsageChange] Init CUDA: CPU +229, GPU +0, now: CPU 248, GPU 3784 (MiB)
[08/22/2023-18:22:25] [I] [TRT] [MemUsageSnapshot] Begin constructing builder kernel library: CPU 248 MiB, GPU 3747 MiB
[08/22/2023-18:22:25] [I] [TRT] [MemUsageSnapshot] End constructing builder kernel library: CPU 277 MiB, GPU 3777 MiB
[08/22/2023-18:22:25] [I] Start parsing network model
[08/22/2023-18:22:25] [I] [TRT] ----------------------------------------------------------------
[08/22/2023-18:22:25] [I] [TRT] Input filename:   /home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx
[08/22/2023-18:22:25] [I] [TRT] ONNX IR version:  0.0.0
[08/22/2023-18:22:25] [I] [TRT] Opset version:    0
[08/22/2023-18:22:25] [I] [TRT] Producer name:    
[08/22/2023-18:22:25] [I] [TRT] Producer version: 
[08/22/2023-18:22:25] [I] [TRT] Domain:           
[08/22/2023-18:22:25] [I] [TRT] Model version:    0
[08/22/2023-18:22:25] [I] [TRT] Doc string:       
[08/22/2023-18:22:25] [I] [TRT] ----------------------------------------------------------------
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::GridAnchor_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::GridAnchorRect_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::NMS_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Reorg_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Region_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Clip_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::LReLU_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::PriorBox_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Normalize_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::ScatterND version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::RPROI_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::BatchedNMS_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::BatchedNMSDynamic_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::FlattenConcat_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::CropAndResize version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::DetectionLayer_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::EfficientNMS_ONNX_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::EfficientNMS_TFTRT_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Proposal version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::ProposalLayer_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::PyramidROIAlign_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::ResizeNearest_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::Split version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::SpecialSlice_TRT version 1
[08/22/2023-18:22:25] [V] [TRT] Plugin creator already registered - ::InstanceNormalization_TRT version 1
[08/22/2023-18:22:25] [I] Finish parsing network model
[08/22/2023-18:22:25] [E] Error[4]: [network.cpp::validate::2633] Error Code 4: Internal Error (Network must have at least one output)
[08/22/2023-18:22:25] [E] Error[2]: [builder.cpp::buildSerializedNetwork::609] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed. )
[08/22/2023-18:22:25] [E] Engine could not be created from network
[08/22/2023-18:22:25] [E] Building engine failed
[08/22/2023-18:22:25] [E] Failed to create engine from model.
[08/22/2023-18:22:25] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8201] # ./trtexec --onnx=/home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx --saveEngine=deneme.trt --verbose
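The header block in the log above is suspicious: "ONNX IR version: 0.0.0" and "Opset version: 0" usually mean the parser could not read real ONNX metadata, which is consistent with the later "Network must have at least one output" failure. Before handing a file to trtexec, one can sanity-check it; a minimal sketch, assuming Polygraphy is on PATH (the helper name is mine, and the model path is the one from the log):

```shell
# Sanity-check an ONNX file before handing it to trtexec.
# `polygraphy inspect model` prints the graph's opset, inputs, and outputs;
# an empty output list matches trtexec's "must have at least one output" error.
inspect_onnx() {
  local model="$1"
  polygraphy inspect model "$model" \
    || echo "polygraphy could not read $model - it may not be a plain ONNX file (e.g. a TAO .etlt)"
}
# Run manually:
# inspect_onnx /home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx
```

If the inspection fails outright, the file likely needs to be exported/converted to genuine ONNX first rather than sanitized.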

Hi,

Could you please try sanitizing the model with the Polygraphy tool:

 polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx

If you still face the same issue, please share the ONNX model that reproduces it, so we can try it on our end for better debugging.

Thank you.

In my folder path, only the trtexec tool is runnable; there is no tools folder in this package. Why?

Could you answer the first question? I cannot build TensorRT on my Jetson Nano, and that's why I cannot create any .trt engine with this package. EDIT: I could build the TensorRT package just by switching to another version of it (8.2), but I cannot create the trtexec binary; the make and cmake commands do not work in the …samples/trtexec folder.

I got this error:

[!] Module: 'onnx_graphsurgeon' is required but could not be imported.
    You can try setting POLYGRAPHY_AUTOINSTALL_DEPS=1 in your environment variables to allow Polygraphy to automatically install missing modules.
    Note that this may cause existing modules to be overwritten - hence, it may be desirable to use a Python virtual environment or container.

Please run export POLYGRAPHY_AUTOINSTALL_DEPS=1 (i.e., add the environment variable) and try again.
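For reference, the suggested workaround is just an environment variable that lets Polygraphy pip-install missing Python modules on demand; a minimal sketch:

```shell
# Allow Polygraphy to auto-install missing Python modules (here: onnx_graphsurgeon).
export POLYGRAPHY_AUTOINSTALL_DEPS=1
# Then re-run the earlier command, e.g.:
# polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx
```

Note that, as the warning says, auto-install may overwrite existing modules, so a virtual environment or container is safer.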

I cannot build TensorRT:

~/TensorRT/build$ make nvinfer_plugin
[  0%] Building CUDA object plugin/CMakeFiles/nvinfer_plugin.dir/efficientNMSPlugin/efficientNMSInference.cu.o
/home/ubuntu/TensorRT/plugin/efficientNMSPlugin/efficientNMSInference.cu:18:10: fatal error: cub/cub.cuh: No such file or directory
 #include "cub/cub.cuh"
          ^~~~~~~~~~~~~
compilation terminated.
plugin/CMakeFiles/nvinfer_plugin.dir/build.make:569: recipe for target 'plugin/CMakeFiles/nvinfer_plugin.dir/efficientNMSPlugin/efficientNMSInference.cu.o' failed
make[3]: *** [plugin/CMakeFiles/nvinfer_plugin.dir/efficientNMSPlugin/efficientNMSInference.cu.o] Error 1
CMakeFiles/Makefile2:326: recipe for target 'plugin/CMakeFiles/nvinfer_plugin.dir/all' failed
make[2]: *** [plugin/CMakeFiles/nvinfer_plugin.dir/all] Error 2
CMakeFiles/Makefile2:338: recipe for target 'plugin/CMakeFiles/nvinfer_plugin.dir/rule' failed
make[1]: *** [plugin/CMakeFiles/nvinfer_plugin.dir/rule] Error 2
Makefile:238: recipe for target 'nvinfer_plugin' failed
make: *** [nvinfer_plugin] Error 2
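The cub/cub.cuh error above typically means the TensorRT OSS checkout is missing its third-party sources: on CUDA 10.2 (JetPack 4.x) cub is not bundled with the toolkit, and the OSS repo pulls it in as a git submodule. A hedged sketch of the usual fix, assuming the repo lives at ~/TensorRT (re-run cmake and make afterwards):

```shell
# Fetch the third-party submodules (cub, protobuf, onnx, ...) that a plain
# `git clone` without --recurse-submodules leaves out of the TensorRT OSS tree.
fetch_trt_submodules() {
  cd "$HOME/TensorRT" || return 1
  git submodule update --init --recursive
}
# Run manually: fetch_trt_submodules
```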

That’s why I cannot use it:

~/TensorRT/tools/Polygraphy/bin$ ./polygraphy surgeon sanitize /home/ubuntu/Downloads/yolov4_cspdarknet_tiny_epoch_500.onnx --fold-constants --output model_folded.onnx
[I] Module: 'onnx_graphsurgeon' is required, but not installed. Attempting to install now.
[I] Running installation command: /usr/bin/python3 -m pip install onnx_graphsurgeon --extra-index-url=https://pypi.ngc.nvidia.com
Defaulting to user installation because normal site-packages is not writeable
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com, https://pypi.ngc.nvidia.com
Collecting onnx_graphsurgeon
  Downloading https://developer.download.nvidia.com/compute/redist/onnx-graphsurgeon/onnx_graphsurgeon-0.3.27-py2.py3-none-any.whl (42 kB)
     |████████████████████████████████| 42 kB 4.3 MB/s            
Requirement already satisfied: numpy in /home/ubuntu/.local/lib/python3.6/site-packages (from onnx_graphsurgeon) (1.19.5)
Collecting onnx
  Downloading onnx-1.14.0.tar.gz (11.3 MB)
     |████████████████████████████████| 11.3 MB 2.8 MB/s            
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Collecting typing-extensions>=3.6.2.1
  Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Requirement already satisfied: protobuf>=3.20.2 in /home/ubuntu/.local/lib/python3.6/site-packages (from onnx->onnx_graphsurgeon) (4.21.0)
Building wheels for collected packages: onnx
  Building wheel for onnx (pyproject.toml) ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 /home/ubuntu/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmpyw9j9ch4
       cwd: /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164
  Complete output (179 lines):
  fatal: not a git repository (or any of the parent directories): .git
  running bdist_wheel
  running build
  running build_py
  running create_version
  running cmake_build
  Using cmake args: ['/usr/local/bin/cmake', '-DPYTHON_INCLUDE_DIR=/usr/include/python3.6m', '-DPYTHON_EXECUTABLE=/usr/bin/python3', '-DBUILD_ONNX_PYTHON=ON', '-DCMAKE_EXPORT_COMPILE_COMMANDS=ON', '-DONNX_NAMESPACE=onnx', '-DPY_EXT_SUFFIX=.cpython-36m-aarch64-linux-gnu.so', '-DCMAKE_BUILD_TYPE=Release', '-DONNX_ML=1', '/tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164']
  -- The C compiler identification is GNU 7.5.0
  -- The CXX compiler identification is GNU 7.5.0
  -- Check for working C compiler: /usr/bin/cc
  -- Check for working C compiler: /usr/bin/cc -- works
  -- Detecting C compiler ABI info
  -- Detecting C compiler ABI info - done
  -- Detecting C compile features
  -- Detecting C compile features - done
  -- Check for working CXX compiler: /usr/bin/c++
  -- Check for working CXX compiler: /usr/bin/c++ -- works
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  -- Found PythonInterp: /usr/bin/python3 (found version "3.6.9")
  -- Found PythonLibs: /usr/lib/aarch64-linux-gnu/libpython3.6m.so (found version "3.6.9")
  -- Found Protobuf: /usr/lib/aarch64-linux-gnu/libprotobuf.a;-lpthread (found version "3.0.0")
  Generated: /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-ml.proto
  Generated: /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-operators-ml.proto
  Generated: /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-data.proto
  -- Could NOT find pybind11 (missing: pybind11_DIR)
  -- pybind11 v2.10.3
  -- Found PythonInterp: /usr/bin/python3 (found suitable version "3.6.9", minimum required is "3.6")
  -- Found PythonLibs: /usr/lib/aarch64-linux-gnu/libpython3.6m.so
  -- Performing Test HAS_FLTO
  -- Performing Test HAS_FLTO - Success
  --
  -- ******** Summary ********
  --   CMake version             : 3.13.5
  --   CMake command             : /usr/local/bin/cmake
  --   System                    : Linux
  --   C++ compiler              : /usr/bin/c++
  --   C++ compiler version      : 7.5.0
  --   CXX flags                 :  -Wnon-virtual-dtor
  --   Build type                : Release
  --   Compile definitions       : __STDC_FORMAT_MACROS
  --   CMAKE_PREFIX_PATH         :
  --   CMAKE_INSTALL_PREFIX      : /usr/local
  --   CMAKE_MODULE_PATH         :
  --
  --   ONNX version              : 1.14.0
  --   ONNX NAMESPACE            : onnx
  --   ONNX_USE_LITE_PROTO       : OFF
  --   USE_PROTOBUF_SHARED_LIBS  : OFF
  --   Protobuf_USE_STATIC_LIBS  : ON
  --   ONNX_DISABLE_EXCEPTIONS   : OFF
  --   ONNX_WERROR               : OFF
  --   ONNX_BUILD_TESTS          : OFF
  --   ONNX_BUILD_BENCHMARKS     : OFF
  --
  --   Protobuf compiler         : /usr/bin/protoc
  --   Protobuf includes         : /usr/include
  --   Protobuf libraries        : /usr/lib/aarch64-linux-gnu/libprotobuf.a;-lpthread
  --   BUILD_ONNX_PYTHON         : ON
  --     Python version        :
  --     Python executable     : /usr/bin/python3
  --     Python includes       : /usr/include/python3.6m
  -- Configuring done
  -- Generating done
  -- Build files have been written to: /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build
  Scanning dependencies of target gen_onnx_proto
  [  1%] Running gen_proto.py on onnx/onnx.in.proto
  Processing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/onnx.in.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-ml.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-ml.proto3
  generating /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx_pb.py
  [  2%] Running C++ protocol buffer compiler on /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-ml.proto
  Failed to generate mypy stubs: No module named 'google'
  [  2%] Built target gen_onnx_proto
  Scanning dependencies of target gen_onnx_operators_proto
  Scanning dependencies of target gen_onnx_data_proto
  [  4%] Running gen_proto.py on onnx/onnx-data.in.proto
  [  5%] Running gen_proto.py on onnx/onnx-operators.in.proto
  Processing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/onnx-data.in.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-data.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-data.proto3
  generating /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx_data_pb.py
  Processing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/onnx-operators.in.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-operators-ml.proto
  Writing /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-operators-ml.proto3
  generating /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx_operators_pb.py
  [  7%] Running C++ protocol buffer compiler on /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-data.proto
  [  8%] Running C++ protocol buffer compiler on /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/.setuptools-cmake-build/onnx/onnx-operators-ml.proto
  Failed to generate mypy stubs: No module named 'google'
  Failed to generate mypy stubs: No module named 'google'
  [  8%] Built target gen_onnx_data_proto
  [  8%] Built target gen_onnx_operators_proto
  Scanning dependencies of target onnx_proto
  [ 13%] Building CXX object CMakeFiles/onnx_proto.dir/onnx/onnx-data.pb.cc.o
  [ 13%] Building CXX object CMakeFiles/onnx_proto.dir/onnx/onnx-ml.pb.cc.o
  [ 13%] Building CXX object CMakeFiles/onnx_proto.dir/onnx/onnx-operators-ml.pb.cc.o
  [ 14%] Linking CXX static library libonnx_proto.a
  [ 23%] Built target onnx_proto
  Scanning dependencies of target onnx
  [ 26%] Building CXX object CMakeFiles/onnx.dir/onnx/common/assertions.cc.o
  [ 27%] Building CXX object CMakeFiles/onnx.dir/onnx/common/interned_strings.cc.o
  [ 26%] Building CXX object CMakeFiles/onnx.dir/onnx/checker.cc.o
  [ 29%] Building CXX object CMakeFiles/onnx.dir/onnx/common/ir_pb_converter.cc.o
  [ 30%] Building CXX object CMakeFiles/onnx.dir/onnx/common/model_helpers.cc.o
  In file included from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/schema.h:25:0,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/function.h:16,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/checker.h:10,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/checker.cc:5:
  /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/shape_inference.h: In function ‘void onnx::adjustNegativeAxes(Axes&, int)’:
  /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/shape_inference.h:828:8: error: ‘transform’ is not a member of ‘std’
     std::transform(
          ^~~~~~~~~
  In file included from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/schema.h:25:0,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/function.h:16,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/checker.h:10,
                   from /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/common/model_helpers.cc:6:
  /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/shape_inference.h: In function ‘void onnx::adjustNegativeAxes(Axes&, int)’:
  /tmp/pip-install-yhmlysz3/onnx_7a9ae4bcdd254996bae08f56d4fbe164/onnx/defs/shape_inference.h:828:8: error: ‘transform’ is not a member of ‘std’
     std::transform(
          ^~~~~~~~~
  CMakeFiles/onnx.dir/build.make:114: recipe for target 'CMakeFiles/onnx.dir/onnx/common/model_helpers.cc.o' failed
  make[2]: *** [CMakeFiles/onnx.dir/onnx/common/model_helpers.cc.o] Error 1
  make[2]: *** Waiting for unfinished jobs....
  CMakeFiles/onnx.dir/build.make:62: recipe for target 'CMakeFiles/onnx.dir/onnx/checker.cc.o' failed
  make[2]: *** [CMakeFiles/onnx.dir/onnx/checker.cc.o] Error 1
  CMakeFiles/Makefile2:207: recipe for target 'CMakeFiles/onnx.dir/all' failed
  make[1]: *** [CMakeFiles/onnx.dir/all] Error 2
  Makefile:129: recipe for target 'all' failed
  make: *** [all] Error 2
  Traceback (most recent call last):
    File "/home/ubuntu/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
      main()
    File "/home/ubuntu/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/home/ubuntu/.local/lib/python3.6/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 262, in build_wheel
      metadata_directory)
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 231, in build_wheel
      wheel_directory, config_settings)
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 215, in _build_with_temp_dir
      self.run_setup()
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 268, in run_setup
      self).run_setup(setup_script=setup_script)
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 158, in run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 365, in <module>
      "backend-test-tools = onnx.backend.test.cmd_tools:main",
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/setuptools/__init__.py", line 153, in setup
      return distutils.core.setup(**attrs)
    File "/usr/lib/python3.6/distutils/core.py", line 148, in setup
      dist.run_commands()
    File "/usr/lib/python3.6/distutils/dist.py", line 955, in run_commands
      self.run_command(cmd)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/tmp/pip-build-env-5izomkcg/overlay/lib/python3.6/site-packages/wheel/bdist_wheel.py", line 299, in run
      self.run_command('build')
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "/usr/lib/python3.6/distutils/command/build.py", line 135, in run
      self.run_command(cmd_name)
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "setup.py", line 236, in run
      self.run_command("cmake_build")
    File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
      self.distribution.run_command(command)
    File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
      cmd_obj.run()
    File "setup.py", line 230, in run
      subprocess.check_call(build_args)
    File "/usr/lib/python3.6/subprocess.py", line 311, in check_call
      raise CalledProcessError(retcode, cmd)
  subprocess.CalledProcessError: Command '['/usr/local/bin/cmake', '--build', '.', '--', '-j', '4']' returned non-zero exit status 2.
  ----------------------------------------
  ERROR: Failed building wheel for onnx
Failed to build onnx
ERROR: Could not build wheels for onnx, which is required to install pyproject.toml-based projects
[!] Could not automatically install required module: onnx_graphsurgeon. Please install it manually.
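The wheel build above fails inside onnx 1.14's C++ sources ("'transform' is not a member of 'std'", i.e. a missing <algorithm> include that other toolchains happen to pull in transitively but GCC 7.5 does not). One workaround worth trying is pinning an older onnx release, so pip does not attempt to compile 1.14 from source on the Nano; the exact version bound below is an assumption — check which releases still support Python 3.6:

```shell
# Hypothetical workaround: install an older onnx before onnx_graphsurgeon so
# Polygraphy's auto-install does not build onnx 1.14 with GCC 7.5 / Python 3.6.
install_pinned_onnx() {
  python3 -m pip install --user "onnx<1.12" \
    && python3 -m pip install --user onnx_graphsurgeon \
         --extra-index-url https://pypi.ngc.nvidia.com
}
# Run manually: install_pinned_onnx
```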

Hi,

We recommend that you reach out to Issues · NVIDIA/TensorRT · GitHub to get better help with TensorRT OSS build errors.

Thank you.