Install Jetbot on JetPack

Hi everyone,
I have downloaded the newest JetPack (JP 4.2.2, release date: 2019/08/26), and I want to build an image file that includes all the Jetson Nano stuff like JetBot, JetRacer, DonkeyCar, and the files needed for the NVIDIA original course.

Is there any method to build a mixed image file, so that I can use just one SD card to demo many examples?
P.S. My OS is Windows.
Thanks very much!

Hi johnson, see this page from the JetBot wiki for creating your own SD card image for JetBot:

https://github.com/NVIDIA-AI-IOT/jetbot/wiki/Create-SD-Card-Image-From-Scratch

Hi dusty_nv, thanks for your reply!
But it seems the link you posted only covers downloading the JetBot project.
If I want to install another project, do I need to change the link in step 9?

After you have completed the steps for installing JetBot, you can install other projects or software that you desire.
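
For example, on top of a finished JetBot card you could add JetRacer roughly like this (a sketch based on the upstream NVIDIA-AI-IOT/jetracer repo; check its README for the current prerequisites):

# clone and install JetRacer alongside JetBot on the same SD card
git clone https://github.com/NVIDIA-AI-IOT/jetracer
cd jetracer
sudo python3 setup.py install

Other projects generally follow the same pattern: clone the repo onto the card and follow that project's own install instructions.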

Hi dusty_nv, thanks for your reply again.
But when I follow the step-by-step tutorial, an error occurred at step 8. The message is:

sudo jupyter labextension install @jupyterlab/statusbar

An error occured.
ValueError: "@jupyterlab/statusbar" is not a valid extension:
No jupyterlab key
See the log file for details: /tmp/jupyterlab-debug-isfq_j1j.log

Is there any solution?

With the JupyterLab 1.0 update, the statusbar extension is now built-in, so you can skip this step. We are also currently validating the JetBot install against JupyterLab 1.0.
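
If you want to double-check which JupyterLab you are on before skipping it, something like this should do:

# print the JupyterLab version and list the installed lab extensions
jupyter lab --version
jupyter labextension list

On 1.0 or newer the statusbar ships with the core package, so the separate install step is unnecessary.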

Hello dusty_nv,

I want to run your DNN vision library on the JetBot. I was able to get it running on JetPack, thanks to the nicely structured installation guide here. However, when I follow the same steps, I get the following error log in the JetBot terminal:

.

Do you have any recommendation for using the Robot functionality of JetBot and jetson-inference together? Thank you in advance for your help.

Hi @dovuscuhoroz, if you want to build jetson-inference inside Docker, see how I build it in the container here:

https://github.com/dusty-nv/jetson-inference/blob/2fb798e3e4895b51ce7315826297cf321f4bd577/Dockerfile#L76

Namely, you need to run this line, which patches CMakeLists.txt to replace -lnvcaffe_parser:

sed -i 's/nvcaffe_parser/nvparsers/g' CMakeLists.txt
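
For context, the surrounding steps look roughly like this for a native build (a sketch; the Dockerfile linked above is the reference, and your paths or branch may differ):

# clone jetson-inference with its submodules
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
# patch the old TensorRT parser library name before configuring
sed -i 's/nvcaffe_parser/nvparsers/g' CMakeLists.txt
# configure, build, and install
mkdir build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig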

Hi @dusty_nv,

Thank you for your reply. I assume the JetBot Jupyter environment is also in a Docker container when installing JetBot from the SD card image. After making sure that /etc/docker/daemon.json is edited as shown here (it already was by default; I have pasted its contents after the log below), I ran docker/build.sh from the jetson-inference directory. As a result, I got the following error log:

root@nano-4gb-jp45:/workspace/jetson-inference# docker/build.sh
head: cannot open ‘/etc/nv_tegra_release’ for reading: No such file or directory
reading L4T version from “dpkg-query --show nvidia-l4t-core”
dpkg-query: no packages found matching nvidia-l4t-core
L4T BSP Version: L4T R.
cannot build jetson-inference docker container for L4T R.
please upgrade to the latest JetPack, or build jetson-inference natively
root@nano-4gb-jp45:/workspace/jetson-inference#
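
For reference, this is roughly what my /etc/docker/daemon.json contains (I believe it is the default shipped with the JetBot image, with the NVIDIA runtime already set as default):

cat /etc/docker/daemon.json
{
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}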

I tried manually replacing nvcaffe_parser with nvparser as you described here, then went for the regular build. It returned the following:

/usr/bin/ld: cannot find -lnvparser
collect2: error: ld returned 1 exit status
CMakeFiles/jetson-inference.dir/build.make:257: recipe for target ‘aarch64/lib/libjetson-inference.so’ failed
make[2]: *** [aarch64/lib/libjetson-inference.so] Error 1
CMakeFiles/Makefile2:67: recipe for target ‘CMakeFiles/jetson-inference.dir/all’ failed
make[1]: *** [CMakeFiles/jetson-inference.dir/all] Error 2
Makefile:129: recipe for target ‘all’ failed
make: *** [all] Error 2

I have the latest JetPack 4.5 JetBot image. Do you have any other suggestions?

Thank you in advance for your time.