How to build jetson-inference on a host PC?

Thank you AastaLLL, I have successfully compiled jetson-inference and run my code on a host PC (Ubuntu 14.04, x86_64) by following the steps below:

  1. I commented out lines 24-25 in CMakePrebuild.sh, because they are specific to the aarch64 platform:
# sudo rm /usr/lib/aarch64-linux-gnu/libGL.so
# sudo ln -s /usr/lib/aarch64-linux-gnu/tegra/libGL.so /usr/lib/aarch64-linux-gnu/libGL.so
  2. My GPU is a GeForce GTX 1050, so I appended one line to the -gencode list in CMakeLists.txt (a sketch of selecting these flags per platform follows the list):
set(
	CUDA_NVCC_FLAGS
	${CUDA_NVCC_FLAGS};
	-O3
	-gencode arch=compute_53,code=sm_53
	-gencode arch=compute_62,code=sm_62
	-gencode arch=compute_61,code=sm_61   # the line I added (GTX 1050 is compute capability 6.1)
)
  3. I added a line in CMakeLists.txt so that the CUDA code is compiled with C++11:
set(CUDA_NVCC_FLAGS ${CUDA_NVCC_FLAGS};--disable-warnings;--ptxas-options=-v;-use_fast_math;-lineinfo;-std=c++11)
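
For reference, here is a minimal sketch of how the -gencode flags could be chosen per platform instead of hard-coded, so one CMakeLists.txt builds on both the Jetson and the host; the CMAKE_SYSTEM_PROCESSOR check is my own suggestion, not part of the original project file.

# Sketch: pick -gencode flags by platform (my assumption, not the upstream CMakeLists.txt)
# CMAKE_SYSTEM_PROCESSOR is "aarch64" on the Jetson and "x86_64" on the host PC
if(CMAKE_SYSTEM_PROCESSOR MATCHES "aarch64")
	# Jetson TX1 (sm_53) and Jetson TX2 (sm_62)
	set(CUDA_NVCC_FLAGS ${CUDA_NVCC_FLAGS}; -O3
		-gencode arch=compute_53,code=sm_53
		-gencode arch=compute_62,code=sm_62)
else()
	# host GPU: GeForce GTX 1050 is compute capability 6.1 (Pascal)
	set(CUDA_NVCC_FLAGS ${CUDA_NVCC_FLAGS}; -O3
		-gencode arch=compute_61,code=sm_61)
endif()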

After that, I compiled jetson-inference with make && make install, then compiled and ran my own code. It works.
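
If you want to build your own code against the installed library, the CMakeLists.txt below is a minimal sketch assuming the default make install prefix; the include path /usr/local/include/jetson-inference, the lib directories, and the library name jetson-inference are assumptions, so adjust them to wherever make install actually placed the files on your system.

cmake_minimum_required(VERSION 2.8)
project(my-inference-app)

find_package(CUDA REQUIRED)

# Assumption: `make install` put headers and libraries under /usr/local
include_directories(/usr/local/include/jetson-inference /usr/local/cuda/include)
link_directories(/usr/local/lib /usr/local/cuda/lib64)

# your sources go here; cuda_add_executable also handles .cu files
cuda_add_executable(my-inference-app main.cpp)

# Assumption: the installed library is named libjetson-inference.so
target_link_libraries(my-inference-app jetson-inference)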