Issue with running setup.py after cloning NVIDIA-AI-IOT/trt_pose

Hello, I am trying to get trt_pose_hand installed in the jetson-inference container. I followed the steps on the website, and all of these commands seem to work:
git clone https://github.com/NVIDIA-AI-IOT/torch2trt
pip3 install pillow==8.4.0
cd torch2trt
python3 setup.py install --plugins
cd ..
pip3 install tqdm cython pycocotools
git clone https://github.com/NVIDIA-AI-IOT/trt_pose
cd trt_pose

However, when I ran
sudo python3 setup.py install
my Jetson Nano froze for an extended period of time at
/jetson-inference/trt_pose/trt_pose/parse/munkres.cpp:135:31: warning: ‘min’ may be used uninitialized in this function [-Wmaybe-uninitialized]
cost_graph[i * M + j] += min;
with RAM at ~99% usage.

Then it threw this error:
FAILED: /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/plugins.o
c++ -MMD -MF /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/plugins.o.d -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/local/lib/python3.6/dist-packages/torch/include -I/usr/local/lib/python3.6/dist-packages/torch/include/torch/csrc/api/include -I/usr/local/lib/python3.6/dist-packages/torch/include/TH -I/usr/local/lib/python3.6/dist-packages/torch/include/THC -I/usr/include/python3.6m -c -c /jetson-inference/trt_pose/trt_pose/plugins.cpp -o /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/plugins.o -DTORCH_API_INCLUDE_EXTENSION_H '-DPYBIND11_COMPILER_TYPE="_gcc"' '-DPYBIND11_STDLIB="_libstdcpp"' '-DPYBIND11_BUILD_ABI="_cxxabi1011"' -DTORCH_EXTENSION_NAME=plugins -D_GLIBCXX_USE_CXX11_ABI=1 -std=c++14
/jetson-inference/trt_pose/trt_pose/plugins.cpp: In function ‘std::vector<at::Tensor> find_peaks_torch(at::Tensor, float, int, int)’:
/jetson-inference/trt_pose/trt_pose/plugins.cpp:45:13: warning: unused variable ‘H’ [-Wunused-variable]
const int H = input.size(2);
^
/jetson-inference/trt_pose/trt_pose/plugins.cpp:46:13: warning: unused variable ‘W’ [-Wunused-variable]
const int W = input.size(3);
^
/jetson-inference/trt_pose/trt_pose/plugins.cpp: In function ‘std::vector<at::Tensor> connect_parts_torch(at::Tensor, at::Tensor, at::Tensor, int)’:
/jetson-inference/trt_pose/trt_pose/plugins.cpp:186:9: warning: unused variable ‘K’ [-Wunused-variable]
int K = topology.size(0);
^
/jetson-inference/trt_pose/trt_pose/plugins.cpp:188:9: warning: unused variable ‘M’ [-Wunused-variable]
int M = connections.size(3);
^
c++: internal compiler error: Killed (program cc1plus)
Please submit a full bug report,
with preprocessed source if appropriate.
See <file:///usr/share/doc/gcc-7/README.Bugs> for instructions.
[7/8] c++ -MMD -MF /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/train/generate_cmap.o.d -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/local/lib/python3.6/dist-packages/torch/include -I/usr/local/lib/python3.6/dist-packages/torch/include/torch/csrc/api/include -I/usr/local/lib/python3.6/dist-packages/torch/include/TH -I/usr/local/lib/python3.6/dist-packages/torch/include/THC -I/usr/include/python3.6m -c -c /jetson-inference/trt_pose/trt_pose/train/generate_cmap.cpp -o /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/train/generate_cmap.o -DTORCH_API_INCLUDE_EXTENSION_H '-DPYBIND11_COMPILER_TYPE="_gcc"' '-DPYBIND11_STDLIB="_libstdcpp"' '-DPYBIND11_BUILD_ABI="_cxxabi1011"' -DTORCH_EXTENSION_NAME=plugins -D_GLIBCXX_USE_CXX11_ABI=1 -std=c++14
/jetson-inference/trt_pose/trt_pose/train/generate_cmap.cpp: In function ‘at::Tensor trt_pose::train::generate_cmap(at::Tensor, at::Tensor, int, int, float, int)’:
/jetson-inference/trt_pose/trt_pose/train/generate_cmap.cpp:16:9: warning: unused variable ‘M’ [-Wunused-variable]
int M = peaks.size(2);
^
[8/8] c++ -MMD -MF /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/train/generate_paf.o.d -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/local/lib/python3.6/dist-packages/torch/include -I/usr/local/lib/python3.6/dist-packages/torch/include/torch/csrc/api/include -I/usr/local/lib/python3.6/dist-packages/torch/include/TH -I/usr/local/lib/python3.6/dist-packages/torch/include/THC -I/usr/include/python3.6m -c -c /jetson-inference/trt_pose/trt_pose/train/generate_paf.cpp -o /jetson-inference/trt_pose/build/temp.linux-aarch64-3.6/trt_pose/train/generate_paf.o -DTORCH_API_INCLUDE_EXTENSION_H '-DPYBIND11_COMPILER_TYPE="_gcc"' '-DPYBIND11_STDLIB="_libstdcpp"' '-DPYBIND11_BUILD_ABI="_cxxabi1011"' -DTORCH_EXTENSION_NAME=plugins -D_GLIBCXX_USE_CXX11_ABI=1 -std=c++14
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/cpp_extension.py", line 1723, in _run_ninja_build
    env=env)
  File "/usr/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "setup.py", line 28, in <module>
    install_requires=[
  File "/usr/local/lib/python3.6/dist-packages/setuptools/__init__.py", line 153, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/install.py", line 74, in run
    self.do_egg_install()
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/install.py", line 116, in do_egg_install
    self.run_command('bdist_egg')
  File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/bdist_egg.py", line 164, in run
    cmd = self.call_command('install_lib', warn_dir=0)
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/bdist_egg.py", line 150, in call_command
    self.run_command(cmdname)
  File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/install_lib.py", line 11, in run
    self.build()
  File "/usr/lib/python3.6/distutils/command/install_lib.py", line 109, in build
    self.run_command('build_ext')
  File "/usr/lib/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/build_ext.py", line 79, in run
    _build_ext.run(self)
  File "/usr/local/lib/python3.6/dist-packages/Cython/Distutils/old_build_ext.py", line 186, in run
    _build_ext.build_ext.run(self)
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 339, in run
    self.build_extensions()
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/cpp_extension.py", line 735, in build_extensions
    build_ext.build_extensions(self)
  File "/usr/local/lib/python3.6/dist-packages/Cython/Distutils/old_build_ext.py", line 195, in build_extensions
    _build_ext.build_ext.build_extensions(self)
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 448, in build_extensions
    self._build_extensions_serial()
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 473, in _build_extensions_serial
    self.build_extension(ext)
  File "/usr/local/lib/python3.6/dist-packages/setuptools/command/build_ext.py", line 202, in build_extension
    _build_ext.build_extension(self, ext)
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 533, in build_extension
    depends=ext.depends)
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/cpp_extension.py", line 565, in unix_wrap_ninja_compile
    with_cuda=with_cuda)
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/cpp_extension.py", line 1404, in _write_ninja_file_and_compile_objects
    error_prefix='Error compiling objects for extension')
  File "/usr/local/lib/python3.6/dist-packages/torch/utils/cpp_extension.py", line 1733, in _run_ninja_build
    raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension

Hi,

c++: internal compiler error: Killed (program cc1plus)

"Killed" usually indicates that the system ran out of memory during compilation.
Could you add some swap and try the install again?
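For example, a swap file can be created like this (the size and path are only suggestions; adjust them to your free disk space):

```shell
# Create a 4 GB swap file (fallocate is fast; dd also works on filesystems
# where fallocate is unsupported)
sudo fallocate -l 4G /mnt/4GB.swap
sudo chmod 600 /mnt/4GB.swap
sudo mkswap /mnt/4GB.swap
sudo swapon /mnt/4GB.swap

# Confirm the new swap is active
free -h
```

If the build still exhausts memory, it can also help to limit the number of parallel compile jobs, since PyTorch's extension builder honors the MAX_JOBS environment variable, e.g. `sudo MAX_JOBS=1 python3 setup.py install`. Note that a swap file created this way does not persist across reboots unless you add it to /etc/fstab.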

Thanks.
