Problems with the installation of trt_pose and torch2trt in the Docker container

Hello. I couldn't install trt_pose and torch2trt in the Docker container and I need help.
Sequence of my actions:

Attempt 1:

1. sudo docker run -it 98aa2f10ae96
2. git clone https://github.com/NVIDIA-AI-IOT/torch2trt
3. cd torch2trt
4. python3 setup.py install

Error:
Traceback (most recent call last):
  File "setup.py", line 3, in <module>
    import torch
  File "/usr/local/lib/python3.6/dist-packages/torch/__init__.py", line 196, in <module>
    _load_global_deps()
  File "/usr/local/lib/python3.6/dist-packages/torch/__init__.py", line 149, in _load_global_deps
    ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
  File "/usr/lib/python3.6/ctypes/__init__.py", line 348, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: libcurand.so.10: cannot open shared object file: No such file or directory
Attempt1.txt (3.4 KB)
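For reference, a quick way to check whether the CUDA libraries are actually visible inside that container (the paths below assume the standard JetPack layout; adjust if your install differs):

# run inside the container started in step 1
ldconfig -p | grep libcurand
ls /usr/local/cuda/lib64/ | grep curand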

Attempt 2:

1. cd /home/atlas/jetson-inference
2. docker/run.sh -c face_recognition_only
3. git clone https://github.com/NVIDIA-AI-IOT/torch2trt
4. cd torch2trt
5. python3 setup.py install
6. exit
7. exit
8. docker/run.sh -c face_recognition_only
9. git clone https://github.com/NVIDIA-AI-IOT/trt_pose
10. cd trt_pose
11. python3 setup.py install
12. exit
13. exit
14. sudo docker ps -a
15. sudo docker images
16. sudo docker commit 185bb86ae284 pose_detection_and_face_detection
17. docker/run.sh -c pose_detection_and_face_detection --volume /home/atlas/Desktop/Jupiter1:/home/atlas/jetson-inference
18. cd /home/atlas/jetson-inference
19. python3 pose_need.py

Error:

ModuleNotFoundError: No module named 'trt_pose'
ModuleNotFoundError: No module named 'torch2trt'

Attempt2.txt (60.7 KB)
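Before committing the container in step 16, a quick sanity check (a hypothetical verification, run inside the container right after steps 5 and 11) would be to confirm the packages actually import:

# inside the container, after the two installs
python3 -c "import torch2trt; print(torch2trt.__file__)"
python3 -c "import trt_pose; print(trt_pose.__file__)"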

The full log is here:
log (2).txt (64.1 KB)

What did I do wrong, and how do I install torch2trt and trt_pose in the Docker container?

Hi @A98, can you try starting the container with --runtime nvidia? That is needed on JetPack 4 so that the CUDA/cuDNN/TensorRT libraries automatically get mounted into the container from the device. You can also make your own Dockerfile with your steps so that the image is built in an automated way.
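A minimal sketch of both suggestions, assuming the image ID from Attempt 1 (98aa2f10ae96) and HTTPS clones of the two repositories; the base image tag in the Dockerfile is an assumption, so substitute the l4t image that matches your JetPack release:

# start the existing image with the NVIDIA container runtime so CUDA/cuDNN/TensorRT get mounted in
sudo docker run -it --runtime nvidia 98aa2f10ae96

# or bake the installs into an image with a Dockerfile
cat > Dockerfile <<'EOF'
# assumed base image - pick the tag matching your JetPack version
FROM nvcr.io/nvidia/l4t-pytorch:r32.7.1-pth1.10-py3
RUN apt-get update && apt-get install -y git
RUN git clone https://github.com/NVIDIA-AI-IOT/torch2trt /opt/torch2trt && \
    cd /opt/torch2trt && python3 setup.py install
RUN git clone https://github.com/NVIDIA-AI-IOT/trt_pose /opt/trt_pose && \
    cd /opt/trt_pose && python3 setup.py install
EOF

# note: docker build has no --runtime flag, so on JetPack 4 set "default-runtime": "nvidia"
# in /etc/docker/daemon.json (and restart docker) so CUDA is available during the build
sudo docker build -t pose_detection_and_face_detection .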

Thanks for your reply. --runtime nvidia solved the problem with the torch2trt installation.
