How to run the Hello AI World container on a Kubernetes cluster

I have followed the Hello AI World steps and run the pre-built Docker container on one Jetson Nano, where it works fine. Now I want to run the container on a Kubernetes cluster of three Jetson Nanos, but I am not sure how to write an appropriate YAML file for it. Please advise me on how to proceed.

This is the link that I used:

Hi @freewalker96, I haven't personally used Kubernetes or run the container on it before, but perhaps someone from the community can share their experiences.

I don't think the jetson-inference container should be much different from running any other container with Kubernetes. You will want to use --runtime nvidia, along with mounting the camera devices and these data dirs. You can see how these are set up in the docker/ script.
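Building on those pointers, here is a minimal, untested sketch of a Pod manifest. Everything in it is an assumption to adapt to your cluster: the `nvidia` RuntimeClass name (requires the NVIDIA Container Runtime to be registered on the nodes, or set as the default runtime in /etc/docker/daemon.json), the image tag (pick the one matching your JetPack/L4T version), and the host paths for the camera device and data directory.

```yaml
# Hypothetical Pod spec for jetson-inference on a Jetson Nano node.
# All names, tags, and paths below are illustrative assumptions.
apiVersion: v1
kind: Pod
metadata:
  name: jetson-inference
spec:
  runtimeClassName: nvidia        # assumes an "nvidia" RuntimeClass exists;
                                  # alternatively make nvidia the default Docker runtime
  containers:
  - name: jetson-inference
    image: dustynv/jetson-inference:r32.7.1   # use the tag matching your L4T version
    stdin: true
    tty: true
    securityContext:
      privileged: true            # simplest way to expose /dev/video* inside the container
    volumeMounts:
    - name: camera                # camera device passed through from the host
      mountPath: /dev/video0
    - name: data                  # persists downloaded models/images across restarts
      mountPath: /jetson-inference/data
  volumes:
  - name: camera
    hostPath:
      path: /dev/video0
  - name: data
    hostPath:
      path: /home/user/jetson-inference/data   # adjust to where the repo lives on the node
```

You could then create the Pod with `kubectl apply -f pod.yaml` and attach to it with `kubectl attach -it jetson-inference`. To run across all three Nanos, the same container spec could be wrapped in a DaemonSet instead of a bare Pod.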

OK, thank you @dusty_nv. I shall follow the steps you suggested.