Unable to build ONNX (make install -j12 fails)

Hello,

I am following this article: https://devblogs.nvidia.com/speed-up-inference-tensorrt/

luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction$ git clone --recursive https://github.com/onnx/onnx.git
Cloning into ‘onnx’…
remote: Enumerating objects: 51, done.
remote: Counting objects: 100% (51/51), done.
remote: Compressing objects: 100% (51/51), done.
remote: Total 19069 (delta 17), reused 6 (delta 0), pack-reused 19018
Receiving objects: 100% (19069/19069), 10.09 MiB | 3.17 MiB/s, done.
Resolving deltas: 100% (10367/10367), done.
Submodule ‘third_party/benchmark’ (https://github.com/google/benchmark.git) registered for path ‘third_party/benchmark’
Submodule ‘third_party/pybind11’ (https://github.com/pybind/pybind11.git) registered for path ‘third_party/pybind11’
Cloning into ‘/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/third_party/benchmark’…
remote: Enumerating objects: 3, done.
remote: Counting objects: 100% (3/3), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 5269 (delta 0), reused 1 (delta 0), pack-reused 5266
Receiving objects: 100% (5269/5269), 1.68 MiB | 1.82 MiB/s, done.
Resolving deltas: 100% (3469/3469), done.
Cloning into ‘/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/third_party/pybind11’…
remote: Enumerating objects: 11001, done.
remote: Total 11001 (delta 0), reused 0 (delta 0), pack-reused 11001
Receiving objects: 100% (11001/11001), 4.03 MiB | 2.08 MiB/s, done.
Resolving deltas: 100% (7436/7436), done.
Submodule path ‘third_party/benchmark’: checked out ‘e776aa0275e293707b6a0901e0e8d8a8a3679508’
Submodule path ‘third_party/pybind11’: checked out ‘a1041190c8b8ff0cd9e2f0752248ad5e3789ea0c’
Submodule ‘tools/clang’ (https://github.com/wjakob/clang-cindex-python3) registered for path ‘third_party/pybind11/tools/clang’
Cloning into ‘/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/third_party/pybind11/tools/clang’…
remote: Enumerating objects: 353, done.
remote: Total 353 (delta 0), reused 0 (delta 0), pack-reused 353
Receiving objects: 100% (353/353), 119.74 KiB | 752.00 KiB/s, done.
Resolving deltas: 100% (149/149), done.
Submodule path ‘third_party/pybind11/tools/clang’: checked out ‘6a00cbc4a9b8e68b71caf7f774b3f9c753ae84d5’
luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction$ cd onnx
luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction/onnx$ ls
appveyor.yml CMakeLists.txt community docs MANIFEST.in pyproject.toml RELEASE-MANAGEMENT.md setup.py third_party VERSION_NUMBER
cmake CODEOWNERS conda LICENSE onnx README.md setup.cfg stubs tools
luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction/onnx$ cmake .
– Build type not set - defaulting to Release
– The C compiler identification is GNU 7.4.0
– The CXX compiler identification is GNU 7.4.0
– Check for working C compiler: /usr/bin/cc
– Check for working C compiler: /usr/bin/cc – works
– Detecting C compiler ABI info
– Detecting C compiler ABI info - done
– Detecting C compile features
– Detecting C compile features - done
– Check for working CXX compiler: /usr/bin/c++
– Check for working CXX compiler: /usr/bin/c++ – works
– Detecting CXX compiler ABI info
– Detecting CXX compiler ABI info - done
– Detecting CXX compile features
– Detecting CXX compile features - done
– Looking for pthread.h
– Looking for pthread.h - found
– Looking for pthread_create
– Looking for pthread_create - not found
– Looking for pthread_create in pthreads
– Looking for pthread_create in pthreads - not found
– Looking for pthread_create in pthread
– Looking for pthread_create in pthread - found
– Found Threads: TRUE
– Found Protobuf: /usr/lib/aarch64-linux-gnu/libprotobuf.so;-lpthread (found version “3.0.0”)

– ******** Summary ********
– CMake version : 3.10.2
– CMake command : /usr/bin/cmake
– System : Linux
– C++ compiler : /usr/bin/c++
– C++ compiler version : 7.4.0
– CXX flags : -Wnon-virtual-dtor
– Build type : Release
– Compile definitions :
– CMAKE_PREFIX_PATH :
– CMAKE_INSTALL_PREFIX : /usr/local
– CMAKE_MODULE_PATH :

– ONNX version : 1.5.0
– ONNX NAMESPACE : onnx
– ONNX_BUILD_TESTS : OFF
– ONNX_BUILD_BENCHMARKS : OFF
– ONNX_USE_LITE_PROTO : OFF
– ONNXIFI_DUMMY_BACKEND : OFF
– ONNXIFI_ENABLE_EXT : OFF

– Protobuf compiler : /usr/bin/protoc
– Protobuf includes : /usr/include
– Protobuf libraries : /usr/lib/aarch64-linux-gnu/libprotobuf.so;-lpthread
– BUILD_ONNX_PYTHON : OFF
– Configuring done
– Generating done
– Build files have been written to: /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx
luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction/onnx$ ls
appveyor.yml CMakeCache.txt cmake_install.cmake CODEOWNERS conda LICENSE MANIFEST.in ONNXConfig.cmake pyproject.toml RELEASE-MANAGEMENT.md setup.py third_party VERSION_NUMBER
cmake CMakeFiles CMakeLists.txt community docs Makefile onnx ONNXConfigVersion.cmake README.md setup.cfg stubs tools
luminosity@jetson-nano:~/Desktop/code-samples/posts/TensorRT-introduction/onnx$ make install -j12
Scanning dependencies of target onnxifi_loader
Scanning dependencies of target gen_onnx_proto
Scanning dependencies of target onnxifi_dummy
[ 1%] Building C object CMakeFiles/onnxifi_loader.dir/onnx/onnxifi_loader.c.o
[ 3%] Running C++ protocol buffer compiler on /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/onnx-ml.proto
[ 5%] Building C object CMakeFiles/onnxifi_dummy.dir/onnx/onnxifi_dummy.c.o
[ 6%] Built target gen_onnx_proto
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/onnxifi_dummy.c: In function ‘onnxGetExtensionFunctionAddress’:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/onnxifi_dummy.c:173:21: warning: assignment from incompatible pointer type [-Wincompatible-pointer-types]
*function = &onnxGetExtensionFunctionAddress;
^
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/onnxifi_dummy.c:176:21: warning: assignment from incompatible pointer type [-Wincompatible-pointer-types]
*function = &onnxSetIOAndRunGraph;
^
[ 8%] Running C++ protocol buffer compiler on /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/onnx-operators-ml.proto
[ 10%] Linking C static library libonnxifi_loader.a
Scanning dependencies of target onnx_proto
[ 12%] Linking C shared library libonnxifi_dummy.so
[ 13%] Building CXX object CMakeFiles/onnx_proto.dir/onnx/onnx-operators-ml.pb.cc.o
[ 13%] Built target onnxifi_loader
[ 15%] Building CXX object CMakeFiles/onnx_proto.dir/onnx/onnx-ml.pb.cc.o
Scanning dependencies of target onnxifi_wrapper
[ 15%] Built target onnxifi_dummy
[ 17%] Building C object CMakeFiles/onnxifi_wrapper.dir/onnx/onnxifi_wrapper.c.o
[ 18%] Linking C shared module libonnxifi.so
[ 18%] Built target onnxifi_wrapper
[ 20%] Linking CXX static library libonnx_proto.a
[ 25%] Built target onnx_proto
Scanning dependencies of target onnx
[ 27%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/generator/defs.cc.o
[ 29%] Building CXX object CMakeFiles/onnx.dir/onnx/common/interned_strings.cc.o
[ 31%] Building CXX object CMakeFiles/onnx.dir/onnx/common/status.cc.o
[ 32%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/attr_proto_util.cc.o
[ 34%] Building CXX object CMakeFiles/onnx.dir/onnx/checker.cc.o
[ 36%] Building CXX object CMakeFiles/onnx.dir/onnx/common/model_helpers.cc.o
[ 37%] Building CXX object CMakeFiles/onnx.dir/onnx/common/ir_pb_converter.cc.o
[ 39%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/function.cc.o
[ 41%] Building CXX object CMakeFiles/onnx.dir/onnx/common/assertions.cc.o
[ 43%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/controlflow/defs.cc.o
[ 44%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/controlflow/old.cc.o
[ 46%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/data_type_utils.cc.o
[ 48%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/generator/old.cc.o
[ 50%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/logical/defs.cc.o
[ 51%] Building CXX object CMakeFiles/onnx.dir/onnx/defs/logical/old.cc.o
In file included from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:1:0:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc: In function ‘void onnx::checker::check_graph(const onnx::GraphProto&, const onnx::checker::CheckerContext&, const onnx::checker::LexicalScopeContext&)’:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:625:30: error: invalid initialization of reference of type ‘const google::protobuf::Message&’ from expression of type ‘const onnx::NodeProto’
ProtoDebugString(node),
^
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.h:34:34: note: in definition of macro ‘fail_check’
ONNX_NAMESPACE::MakeString(__VA_ARGS__));
^~~~~~~~~~~
In file included from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/shape_inference.h:4:0,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/schema.h:24,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.h:7,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:1:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/proto_utils.h:23:20: note: in passing argument 1 of ‘std::__cxx11::string onnx::ProtoDebugString(const google::protobuf::Message&)’
inline std::string ProtoDebugString(const Message& proto) {
^~~~~~~~~~~~~~~~
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:637:61: error: invalid initialization of reference of type ‘const google::protobuf::Message&’ from expression of type ‘const onnx::NodeProto’
ex.AppendContext("Bad node spec: " + ProtoDebugString(node));
^~~~
In file included from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/shape_inference.h:4:0,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/schema.h:24,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.h:7,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:1:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/proto_utils.h:23:20: note: in passing argument 1 of ‘std::__cxx11::string onnx::ProtoDebugString(const google::protobuf::Message&)’
inline std::string ProtoDebugString(const Message& proto) {
^~~~~~~~~~~~~~~~
In file included from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:1:0:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc: In function ‘void onnx::checker::check_function(const onnx::FunctionProto&, const onnx::checker::CheckerContext&, const onnx::checker::LexicalScopeContext&)’:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:712:30: error: invalid initialization of reference of type ‘const google::protobuf::Message&’ from expression of type ‘const onnx::NodeProto’
ProtoDebugString(node),
^
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.h:34:34: note: in definition of macro ‘fail_check’
ONNX_NAMESPACE::MakeString(__VA_ARGS__));
^~~~~~~~~~~
In file included from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/shape_inference.h:4:0,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/defs/schema.h:24,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.h:7,
from /home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/checker.cc:1:
/home/luminosity/Desktop/code-samples/posts/TensorRT-introduction/onnx/onnx/proto_utils.h:23:20: note: in passing argument 1 of ‘std::__cxx11::string onnx::ProtoDebugString(const google::protobuf::Message&)’
inline std::string ProtoDebugString(const Message& proto) {
^~~~~~~~~~~~~~~~
CMakeFiles/onnx.dir/build.make:62: recipe for target ‘CMakeFiles/onnx.dir/onnx/checker.cc.o’ failed
make[2]: *** [CMakeFiles/onnx.dir/onnx/checker.cc.o] Error 1
make[2]: *** Waiting for unfinished jobs…

CMakeFiles/Makefile2:210: recipe for target ‘CMakeFiles/onnx.dir/all’ failed
make[1]: *** [CMakeFiles/onnx.dir/all] Error 2
Makefile:129: recipe for target ‘all’ failed
make: *** [all] Error 2

I think the instructions given on https://devblogs.nvidia.com/speed-up-inference-tensorrt/ are outdated.

So I followed the instructions in ONNX's README.md instead, which say to install not with cmake and make install, but with:

python setup.py install

That installed ONNX successfully.
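In case it helps anyone else hitting the same checker.cc errors: the CMake summary above shows the build picking up protobuf 3.0.0 from the system, and my guess (an assumption on my part, not something I found confirmed in the ONNX docs) is that this runtime is simply too old for ONNX 1.5.0. A quick sketch of the version comparison I mean — the 3.6.1 minimum below is a placeholder for illustration, not a value from the ONNX documentation:

```python
# Compare the protobuf version CMake reported against an assumed minimum.
# "3.0.0" comes from the "Found Protobuf ... (found version 3.0.0)" line
# in the log above; "3.6.1" is a hypothetical minimum for illustration.
def version_tuple(v):
    return tuple(int(part) for part in v.split("."))

found_protobuf = "3.0.0"   # what CMake found on my Jetson Nano
assumed_minimum = "3.6.1"  # placeholder, not from the ONNX docs

if version_tuple(found_protobuf) < version_tuple(assumed_minimum):
    print("system protobuf looks too old for this ONNX release")
```

You can check the actual system version with `protoc --version` before running cmake.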