Can't use torch2trt in the volume of the container

Hello. I installed torch2trt in the container with these commands:
1. git clone https://github.com/NVIDIA-AI-IOT/torch2trt
2. cd torch2trt
3. python3 setup.py install

After that I created a new container to save the changes with the command:

sudo docker commit

I have a problem. I can import torch2trt outside of the volume (I can import it in the container after it starts), but I can't import it in the volume (if I use the command cd home/atlas/jetson-inference in the Docker container). As a result, I can't run my program, because it is located in the volume and can't import torch2trt.

I don't have the same problem with the face_recognition package, which was installed via pip.

How can I solve this problem?

Is it possible to solve this problem without installing torch2trt directly in the volume folder? (I have multiple programs that need this package, and installing torch2trt in each folder would be very tedious.)

Hi @A98, when you installed torch2trt, it got installed under /usr/local/lib/python3.6/dist-packages, so you don't need the original sources anymore. Rename them to something other than torch2trt, put them in another folder, or delete them. When you import xyz in Python, it first searches your current directory (and when that's your mounted folder, it finds the torch2trt source tree, which isn't meant to be imported like that).

Hello. I am sorry, but I didn't understand what exactly I should do.

After installing torch2trt in the container, I have torch2trt.py in the torch2trt folder inside another torch2trt folder (in the jetson-inference folder). It seems all the problems are associated with this.


I need to:

import torch2trt (I most likely need the torch2trt folder that is located inside the other torch2trt folder)
from torch2trt import TRTModule

I tried to:

1. Extract the torch2trt folder (which is located inside the other torch2trt folder) and import it. (I couldn't import TRTModule this way.)
2. Extract torch2trt.py and import TRTModule from it. (I got errors.)
3. Use another command with a full path to the file torch2trt.py instead of "from torch2trt import TRTModule". (I got errors.)

Could you please give me instructions on how to:

import torch2trt
from torch2trt import TRTModule

Also, I have the same problem with trt_pose. I can't run:
import trt_pose.coco
import trt_pose.models

You don't need the torch2trt folder anymore after you setup.py install it. After that you can import torch2trt from any Python script without needing the original torch2trt folder. torch2trt has additional build steps that occur during setup.py, so you can't just import the source directly.

If you exit the container, you will need to reinstall torch2trt the next time (or use docker commit or a Dockerfile)
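As an alternative to docker commit, the same install steps can be captured in a Dockerfile; a minimal sketch, assuming the dustynv/jetson-inference:r32.7.1 image as the base:

```dockerfile
# Sketch only: bakes torch2trt into a derived image so it survives
# container restarts. The base image tag is an assumption.
FROM dustynv/jetson-inference:r32.7.1

RUN cd /tmp && \
    git clone https://github.com/NVIDIA-AI-IOT/torch2trt && \
    cd torch2trt && \
    pip3 install --verbose . && \
    rm -rf /tmp/torch2trt
```

Note that on Jetson the build needs access to CUDA, which typically means setting "default-runtime": "nvidia" in /etc/docker/daemon.json before running docker build.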

“so you can’t try to just import the source directly.”
“you don’t need the torch2trt folder anymore after you setup.py install it”

From my point of view, these two statements directly contradict each other.
You said that I don't need the torch2trt folder which is located inside the other torch2trt folder, right?
If so, I need only its contents:

So I need to extract its contents (into the main torch2trt folder) and then delete the torch2trt folder (which is located inside the other torch2trt folder), but this action is almost the same as just importing the source directly.

Maybe you meant that I don't need the main torch2trt folder, rather than the torch2trt folder which is located inside the other torch2trt folder? Which of these two folders do I need, and which one don't I need?

Do I understand the meaning of the phrase "don't need the torch2trt folder" correctly? Does it mean that I need to extract its contents and then delete the folder? Or does the phrase mean something else?

Can I find full instructions for installing torch2trt in the main system somewhere?

You only need the original torch2trt sources when you run setup.py install. Try this in a fresh instance of the container:

# run these commands inside container
cd /tmp
git clone https://github.com/NVIDIA-AI-IOT/torch2trt
cd torch2trt
pip3 install --verbose .

# test torch2trt in container
cd /
pip3 show torch2trt
python3 -c 'import torch2trt'

If you mean outside the container, those are the same instructions as on the torch2trt GitHub page - you would need to install PyTorch first (from PyTorch for Jetson)

You can also just use my torch2trt or l4t-pytorch container images, which already have torch2trt installed.

I tried to install torch2trt in a fresh container (without torch2trt) using these commands:

cd /tmp
git clone https://github.com/NVIDIA-AI-IOT/torch2trt
cd torch2trt
pip3 install --verbose .

As a result, I got an error:

After that I tried
python3 setup.py
(but it didn't work either)

Full text log is here:
log11.txt (4.1 KB)

Could you please explain to me what I did wrong?

Try starting the container with the --runtime nvidia flag (sudo docker run -it --runtime nvidia dustynv/jetson-inference:r32.7.1), otherwise the CUDA components won't be available to use.

It works. Thanks for the help.
log11 (1).txt (48.2 KB)

Can I use the same steps for the trt_pose installation?

# run these commands inside container
cd /tmp
git clone https://github.com/NVIDIA-AI-IOT/trt_pose
cd trt_pose
pip3 install --verbose .

OK great, glad that you got it working!

Basically, yes - follow the steps on the trt_pose GitHub page. It looks like trt_pose prefers python3 setup.py install. Also, it wants you to have installed torch2trt with its plugins enabled.
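Put together, the steps above could look roughly like this inside the container; a sketch that assumes torch2trt's setup.py --plugins option (as described in its README) and the trt_pose repository layout, so check both READMEs for the current steps:

```shell
# run these commands inside the container
# reinstall torch2trt with its plugins enabled (trt_pose needs them)
cd /tmp
git clone https://github.com/NVIDIA-AI-IOT/torch2trt
cd torch2trt
python3 setup.py install --plugins

# then install trt_pose itself
cd /tmp
git clone https://github.com/NVIDIA-AI-IOT/trt_pose
cd trt_pose
python3 setup.py install

# quick check that both imports resolve
python3 -c 'import trt_pose.coco, trt_pose.models'
```

As with torch2trt, remember to start the container with --runtime nvidia so the CUDA components are available during the build.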
