Looking for CI/CD best practices for creating custom Jetson Nano images

Hey everyone,

I am currently looking into building a CI/CD pipeline for a custom ML computer-vision application on the Jetson Nano.

We would like to modify the device tree and deploy TensorFlow CUDA workloads, together with our custom code, into an image file.

Currently, the Yocto Project looks very promising, especially with this BSP layer:
https://github.com/madisongh/meta-tegra
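
For context, wiring that layer into a Yocto build might look roughly like this (a sketch only; the machine name, layer path, and package name are assumptions to be checked against the meta-tegra documentation for your L4T/Yocto release):

```conf
# conf/bblayers.conf (excerpt) -- register the BSP layer
BBLAYERS += "${TOPDIR}/../layers/meta-tegra"

# conf/local.conf (excerpt) -- target the Nano devkit
MACHINE = "jetson-nano-devkit"
IMAGE_INSTALL_append = " cuda-toolkit"
```

With that in place, `bitbake core-image-minimal` (or a custom image recipe) produces a flashable image as a pipeline artifact.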

Does anyone here have experience building CI/CD pipelines for the Jetson platform?
Which frameworks are you using?
What are the best practices?

Best regards
Frederic

Hi,

Sorry, I’m not that familiar with building CI/CD pipelines.
Here is a related post that may give you some information:
https://dev.to/azure/getting-started-with-devops-ci-cd-pipelines-on-nvidia-arm64-devices-4668

Thanks.

That seems to be a good way to compile the code as a first step in the pipeline.
However, I am looking more for a way to automatically package everything into an .img file that can then be distributed to new devices.

Do you have any best practices or suggestions regarding this?

Maybe you have some feedback on my ideas:

  1. Cross-compile the code for the different platforms.
  2. Take a base image and modify it (device-tree modifications, install dependencies, …).
  3. Place the code into the modified image and save it to some kind of storage registry.
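
The three steps above could be sketched as a Jenkins declarative pipeline (hypothetical; the script names and arguments are placeholders for whatever tooling each stage actually uses):

```groovy
pipeline {
    agent any
    stages {
        stage('Cross-compile') {
            // Step 1: build the application for the target architecture
            steps { sh './scripts/cross-compile.sh aarch64' }
        }
        stage('Modify base image') {
            // Step 2: apply device-tree changes, install dependencies
            steps { sh './scripts/customize-image.sh base.img' }
        }
        stage('Package and publish') {
            // Step 3: inject the build artifacts and push the .img to storage
            steps { sh './scripts/publish-image.sh custom.img' }
        }
    }
}
```

Any CI system with similar stage/artifact support (GitLab CI, Azure Pipelines, …) would map onto the same three steps.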

At D3 we have had good luck cross-compiling into .deb packages. We have a relatively powerful build server running Jenkins. We package our kernel and applications as debs. If you use Autotools, it’s very easy to cross-compile. You could then place the debs in a repository and use apt to pull them down. We also bundle our DTB files into our kernel image. It’s not difficult to detect which platform (Nano, TX2, Xavier) you are running on and install the correct DTB. You can install signed kernels and DTB files using dd from the target.
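
The platform-detection step described above could look like the following sketch (the DTB file names are illustrative, not the exact L4T names, and the partition label is an assumption about your flash layout):

```shell
#!/bin/sh
# Hypothetical helper: map the device-tree model string to a DTB filename.
select_dtb() {
  case "$1" in
    *"Jetson Nano"*) echo "tegra210-p3448-0000.dtb" ;;
    *"TX2"*)         echo "tegra186-p3310-1000.dtb" ;;
    *"Xavier"*)      echo "tegra194-p2888-0001.dtb" ;;
    *)               echo "unknown platform: $1" >&2; return 1 ;;
  esac
}

# On the target, the model string comes from the device tree, e.g.:
#   model=$(tr -d '\0' < /proc/device-tree/model)
# and the signed DTB can then be written to its partition with dd:
#   sudo dd if=/boot/$(select_dtb "$model") of=/dev/disk/by-partlabel/kernel-dtb
```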

The above works well for users who want a mostly ‘standard’ Nano/TX2/Xavier image. Most D3 customers fall into that category.

I’m personally very interested in seeing a more embedded deployment style that you seem to be pursuing (a minimal single image with a small bootloader). Please share your results!

You can see how we bundle the kernel and dtb into a .deb at the link below. In a future release we’ll be moving away from make-kpkg and using the .deb facilities that are already present in the Linux kernel build system.

https://github.com/D3Engineering/d3-jetson-bsp/tree/d3/2.0.0
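
For reference, the in-kernel .deb facility mentioned above boils down to a single make target (the cross-toolchain prefix is an assumption about your build environment):

```shell
# Build the kernel and wrap it, its modules, and headers into .deb files
# using the kernel tree's own scripts/package support.
make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- -j"$(nproc)" bindeb-pkg
```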

Hi @Frederic-Tausch-
We have an open-source project for edge AI and ML. We are hosting a series of webinars, and our first session is “Edge DevOps”. We will cover CI and how it can improve edge AI application development. Please click the link below to register for the webinar.

Link to register for our Webinar: https://us02web.zoom.us/meeting/register/tZckduysqz4rEt1UxuHwzMQAZhe8R2F0G2fG

Open-source Edge AI project: Neuralet.com
Our Github Repo: https://github.com/neuralet