Hello,
I'm trying to build jetson-inference from source, following jetson-inference/docs/building-repo-2.md at master · dusty-nv/jetson-inference · GitHub, to be able to use DetectNet outside the container. I have updated Python 3 to 3.7 on the Jetson Nano, and I was wondering if everything will still work OK on the jetson-inference side? I have to use Python 3.7.5 because of the FeatherWing H-bridge, to be able to control some motors.
@castej10 I don't believe I've personally built jetson-inference against Python 3.7 specifically (Python 3.6 and 3.8 I have), but if you have the libpython3.7-dev packages installed, it should detect it.
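For reference, a minimal sketch of what that could look like on the host (assuming the Ubuntu 18.04 apt repositories on the Nano provide the Python 3.7 dev packages; the exact package names here are my assumption):

# install the Python 3.7 development headers so CMake can detect them
sudo apt-get update
sudo apt-get install libpython3.7-dev python3.7-dev

# then re-run the normal build from the repo so the bindings get configured against 3.7
cd jetson-inference/build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig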
If I wanted to instead use the container and communicate with the host using a RabbitMQ server via the pika library for Python, can I change the Dockerfile from:
Sure you could do that - I would actually put your commands in their own RUN statement so that when you change them, it doesn’t have to re-run all those other pip3 installations that I have. It may take a bit of time to build the container at first, but once you do, you’ll be able to add your own commands to the end of the Dockerfile and it’ll quickly build just those commands using the build cache.
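For example, something roughly like this appended to the end of the Dockerfile (pika is the only package shown; add whatever else you need in the same layer):

# custom packages go in their own layer, so only this step re-runs when it changes
RUN pip3 install pika

Because Docker caches each layer, re-running docker/build.sh after editing only this RUN line should reuse the cached layers above it and finish quickly.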
Note that the typical approach with Docker would be to create your own container: make your own Dockerfile with FROM jetson-inference:rXX.X at the top, followed by the commands you added, and build it (look up tutorials on creating your own containers). That said, your approach may be easier to start with.
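A sketch of that approach (the image tag and file names here are placeholders; use whatever tag docker/build.sh actually produced on your machine):

# my-app/Dockerfile - derives from the jetson-inference image you already built
FROM jetson-inference:r32.7.1

# your own dependencies
RUN pip3 install pika

# your own application code
COPY my_detectnet_app.py /opt/my_app/

Then build it from that directory with something like sudo docker build -t my-app .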
Also, when I edit the Dockerfile and run docker/build.sh, after quite some time of building the image I get this error:
eppl@eppl-desktop:~/jetson-inference$ docker/run.sh
ARCH: aarch64
reading L4T version from /etc/nv_tegra_release
L4T BSP Version: L4T R32.7.1
[sudo] password for eppl:
localuser:root being added to access control list
CONTAINER_IMAGE: jetson-inference:r32.7.1
DATA_VOLUME: --volume /home/eppl/jetson-inference/data:/jetson-inference/data --volume /home/eppl/jetson-inference/python/training/classification/data:/jetson-inference/python/training/classification/data --volume /home/eppl/jetson-inference/python/training/classification/models:/jetson-inference/python/training/classification/models --volume /home/eppl/jetson-inference/python/training/detection/ssd/data:/jetson-inference/python/training/detection/ssd/data --volume /home/eppl/jetson-inference/python/training/detection/ssd/models:/jetson-inference/python/training/detection/ssd/models --volume /home/eppl/jetson-inference/python/www/recognizer/data:/jetson-inference/python/www/recognizer/data
V4L2_DEVICES: --device /dev/video0
DISPLAY_DEVICE: -e DISPLAY=:1 -v /tmp/.X11-unix/:/tmp/.X11-unix
ERROR: ld.so: object '/lib/aarch64-linux-gnu/libGLdispatch.so.0' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/lib/aarch64-linux-gnu/libGLdispatch.so.0' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/lib/aarch64-linux-gnu/libGLdispatch.so.0' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
Also, I get this error in the mounted directories:
root@eppl-desktop:/jetson-inference/python/training/detection/ssd# ls models
ERROR: ld.so: object '/lib/aarch64-linux-gnu/libGLdispatch.so.0' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
Mmm, good point - I forgot you were building this in the container. In that case, you would want to make sure those libpython3.7-dev packages are installed in the Dockerfile before jetson-inference gets built.
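Something roughly like this, placed before the step in the Dockerfile that configures and builds jetson-inference (the exact location depends on how the Dockerfile is laid out, so treat this as a sketch):

# make the Python 3.7 headers available before jetson-inference is configured/built
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3.7-dev libpython3.7-dev && \
    rm -rf /var/lib/apt/lists/*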