I would like to install PyTorch with Python 3.8 on JetPack 4.4.1. Unfortunately, I ran into the following problem while building PyTorch from source:
[ 76%] Building NVCC (Device) object caffe2/CMakeFiles/torch_cuda.dir/__/aten/src/ATen/native/cuda/torch_cuda_generated_DistributionCauchyKernel.cu.o
/home/xavier/pytorch/c10/cuda/CUDAMathCompat.h: In static member function ‘static scalar_t at::native::copysign_kernel_cuda(at::TensorIterator&)::<lambda()>::<lambda()>::<lambda(scalar_t, scalar_t)>::_FUN(scalar_t, scalar_t)’:
/home/xavier/pytorch/c10/cuda/CUDAMathCompat.h:46:24: internal compiler error: Segmentation fault
return ::copysignf(x, y);
^
Please submit a full bug report,
with preprocessed source if appropriate.
See <file:///usr/share/doc/gcc-7/README.Bugs> for instructions.
CMake Error at torch_cuda_generated_CopysignKernel.cu.o.Release.cmake:281 (message):
Error generating file
/home/xavier/pytorch/build/caffe2/CMakeFiles/torch_cuda.dir/__/aten/src/ATen/native/cuda/./torch_cuda_generated_CopysignKernel.cu.o
caffe2/CMakeFiles/torch_cuda.dir/build.make:89106: recipe for target 'caffe2/CMakeFiles/torch_cuda.dir/__/aten/src/ATen/native/cuda/torch_cuda_generated_CopysignKernel.cu.o' failed
make[2]: *** [caffe2/CMakeFiles/torch_cuda.dir/__/aten/src/ATen/native/cuda/torch_cuda_generated_CopysignKernel.cu.o] Error 1
make[2]: *** Waiting for unfinished jobs....
These are the steps I followed:
git clone https://github.com/pytorch/pytorch.git
cd pytorch
git submodule update --init --recursive
export USE_NCCL=0
export USE_DISTRIBUTED=0
export USE_QNNPACK=0
export USE_PYTORCH_QNNPACK=0
export TORCH_CUDA_ARCH_LIST="5.3;6.2;7.2"
export PYTORCH_BUILD_VERSION=1.7.0
export PYTORCH_BUILD_NUMBER=1
python3.8 -m pip install -r requirements.txt
python3.8 setup.py build
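Since an internal compiler error (segmentation fault) in gcc can be caused by the compiler process running out of memory, I also considered capping the build parallelism. This is only a sketch: it assumes PyTorch's setup.py honors the MAX_JOBS environment variable (it does in recent releases), and the one-job-per-2-GB heuristic is my own guess for the Jetson's limited RAM:

```shell
#!/bin/sh
# Cap parallel compile jobs based on available memory, since gcc
# segfaults during large CUDA builds are often out-of-memory kills.
mem_mb=$(awk '/MemTotal/ {print int($2/1024)}' /proc/meminfo)

# Heuristic (assumption): roughly one job per 2 GB of RAM, at least 1.
jobs=$(( mem_mb / 2048 ))
[ "$jobs" -lt 1 ] && jobs=1

# PyTorch's build reads MAX_JOBS to limit parallelism.
export MAX_JOBS=$jobs
echo "MAX_JOBS=$MAX_JOBS"
```

I would then run `python3.8 setup.py build` in the same shell so the variable is picked up.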
I tried two different versions of CMake, namely 3.10.2 and 3.18.5. I also applied the patch that was referred to in this post for the PyTorch installation with Python 3.6 (PyTorch for Jetson - version 1.7.0 now available), but none of those measures helped.
Any advice would be appreciated.