I’ve been trying to let all of this sink in for days, but I’m still unsure how everything would pan out.
I can almost see how to do the first step with Docker: navigate to a reasonable path, e.g. a new folder like ~/ros2_dev, and run:
docker pull dustynv/ros:foxy-pytorch-l4t-r34.1.1
Edit#1: I ran jetsonUtilities and found out that my L4T version is 32.7.1 [JetPack UNKNOWN]. Will the tag above still work, or do I need to pull the r32.7.1 container instead? (Foxy)
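For reference, this is how I plan to check the version on the Nano and pull a matching image; the r32.7.1 tag is just my guess at what should correspond to my L4T, assuming such a tag exists:

```bash
# On the Nano itself (host terminal, not inside Docker): confirm the L4T release
cat /etc/nv_tegra_release
# e.g. "# R32 (release), REVISION: 7.1, ..." would confirm L4T 32.7.1

# Then pull the tag matching that release (assuming it exists for foxy-pytorch)
sudo docker pull dustynv/ros:foxy-pytorch-l4t-r32.7.1
```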
I guess the image is then “present” on the Jetson Nano, but not “opened”? After that, would something like this be correct?
docker run -it --name master dustynv/ros:foxy-pytorch-l4t-r34.1.1
Check the list of all (running and stopped?) containers and get the container ID:
docker ps -a
Open a new terminal and do:
docker exec -it <container ID> bash
That terminal is then inside the container, I suppose? A terminal with “access” to ROS 2?
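Written out, this is roughly the sequence I have in mind; the --runtime nvidia and --network host flags, and reusing the name instead of the container ID, are just my guesses at what is needed/possible on the Jetson:

```bash
# Terminal 1 (on the Nano): start the container and keep it in the foreground
sudo docker run -it --runtime nvidia --network host \
    --name master dustynv/ros:foxy-pytorch-l4t-r32.7.1

# Terminal 2 (on the Nano): attach a second shell to the same container, by name
sudo docker exec -it master bash

# Later, after exiting, I assume I can get back into the same container with
sudo docker start -ai master
```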
The next step, installing the ros_deep_learning package “on top”, is however also something I am unsure about. The installation section calls for installing jetson-inference first, but where does that get installed? I assume in a terminal that is inside the Docker container, so that ROS 2 can actually access it, or is it in a “regular” terminal on the Nano? In other words, in which terminal is this supposed to be executed:
```
$ cd ~
$ sudo apt-get install git cmake
$ git clone --recursive https://github.com/dusty-nv/jetson-inference
$ cd jetson-inference
$ mkdir build
$ cd build
$ cmake ../
$ make -j$(nproc)
$ sudo make install
$ sudo ldconfig
```
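(To convince myself which terminal is which before running the above, I was thinking of a check like this; not sure if there is a more idiomatic way:)

```bash
# /.dockerenv only exists inside a Docker container, so this should tell me
# whether the current terminal is in the container or on the Nano itself
test -f /.dockerenv && echo "inside the container" || echo "on the Nano host"
```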
Maybe I’ve gotten confused by Docker, but I view containers as completely independent and isolated environments, separate from the “regular” environment on the host computer.
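Which is also why I’m guessing that if I want anything from the build to be visible on the host as well, I’d have to add a bind mount when starting the container; both paths here are just my own choice:

```bash
# Same run command as before, but with ~/ros2_dev on the Nano mounted at
# /ros2_dev inside the container, so files written there also exist on the host
sudo docker run -it --runtime nvidia --network host \
    -v ~/ros2_dev:/ros2_dev \
    --name master dustynv/ros:foxy-pytorch-l4t-r32.7.1
```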
It also says that either ROS Melodic or ROS 2 Eloquent is needed, but I assume Foxy works as well, since that is what you recommended.
Since this:
```
$ cd ~/ros_workspace/src
$ git clone https://github.com/dusty-nv/ros_deep_learning
```
requires navigating to the ROS 2 workspace, I assume this has to be done in a terminal that is attached to (“inside”) the Docker container?
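So, concretely, something like this in the container terminal? The workspace path is just taken from the command above; I don’t know whether the image already has a workspace somewhere else:

```bash
# Inside the container (the terminal from `docker exec -it master bash`)
mkdir -p ~/ros_workspace/src
cd ~/ros_workspace/src
git clone https://github.com/dusty-nv/ros_deep_learning
```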
Thanks for your help so far!
Edit: See Edit#1
Edit#2:
The last step is this part:
```
RUN source ${ROS_ENVIRONMENT} && \
    cd ${WORKSPACE_ROOT}/src && \
    git clone https://github.com/dusty-nv/ros_deep_learning && \
    cd ../ && \
    colcon build --symlink-install --event-handlers console_direct+
```
I assume this can just be executed in a terminal (a normal one or the Docker one?), except without actually typing “RUN”, “${ROS_ENVIRONMENT}”, “${WORKSPACE_ROOT}”, etc.? In other words, cloning the repository into src and then running colcon build --symlink-install?
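If that is right, my guess at the manual equivalent, run inside the container, would be the following; the setup.bash path is an assumption on my part and may be different in the dustynv image:

```bash
# Inside the container, after cloning ros_deep_learning into ~/ros_workspace/src
source /opt/ros/foxy/install/setup.bash   # assumed location of the ROS 2 setup file
cd ~/ros_workspace
colcon build --symlink-install --event-handlers console_direct+
# then source the workspace overlay before running the nodes
source install/setup.bash
```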