I tried running gemma3n e2b and e4b, both through Ollama on a Jetson. Since I couldn't run the stock Ollama build directly on the ARM platform, I used the ARM-compatible container images from jetson-containers instead:
git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh
jetson-containers run $(autotag ollama)
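As a sanity check after the install script, it can help to confirm that the two helper commands actually landed on the PATH before using them (a defensive sketch; the command names come from the steps above):

```shell
# Confirm the install script put the wrapper commands on PATH;
# a new shell (or `source ~/.bashrc`) may be needed first
for cmd in jetson-containers autotag; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found at $(command -v "$cmd")"
  else
    echo "$cmd: not on PATH (try opening a new shell)"
  fi
done
```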
# in another terminal, run the ollama client
user@hostname:~$ jetson-containers run $(autotag ollama) ollama run gemma3n:e4b
Namespace(packages=['ollama'], prefer=['local', 'registry', 'build'], disable=[''], user='dustynv', output='/tmp/autotag', quiet=False, verbose=False)
-- L4T_VERSION=36.4.4 JETPACK_VERSION=6.2.1 CUDA_VERSION=12.6
-- Finding compatible container image for ['ollama']
dustynv/ollama:0.6.8-r36.4-cu126-22.04
V4L2_DEVICES: --device /dev/video0 --device /dev/video1
### ARM64 architecture detected
### Jetson Detected
SYSTEM_ARCH=tegra-aarch64
+ docker run --runtime nvidia --env NVIDIA_DRIVER_CAPABILITIES=compute,utility,graphics -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/students/jetson-containers/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb --device /dev/video0 --device /dev/video1 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 -v /run/jtop.sock:/run/jtop.sock --name jetson_container_20250821_222743 dustynv/ollama:0.6.8-r36.4-cu126-22.04 ollama run gemma3n:e4b
Error: llama runner process has terminated: signal: killed
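A `signal: killed` here usually means the kernel's OOM killer terminated the llama runner, which would fit the larger e4b variant not fitting in the Jetson's unified memory (this is an assumption worth confirming, since the Jetson shares RAM between CPU and GPU). A quick way to check:

```shell
# Show total/available memory on the board
free -h

# Look for OOM-killer entries around the time of the crash
# (dmesg may require sudo on some systems)
dmesg 2>/dev/null | grep -iE "out of memory|killed process" | tail -n 5
```

If it was an OOM kill, the usual workarounds are the smaller e2b variant, a more aggressive quantization, or adding swap/zram.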
user@hostname:~$ jetson-containers run $(autotag ollama) ollama run gemma3n:e2b
Namespace(packages=['ollama'], prefer=['local', 'registry', 'build'], disable=[''], user='dustynv', output='/tmp/autotag', quiet=False, verbose=False)
-- L4T_VERSION=36.4.4 JETPACK_VERSION=6.2.1 CUDA_VERSION=12.6
-- Finding compatible container image for ['ollama']
dustynv/ollama:0.6.8-r36.4-cu126-22.04
V4L2_DEVICES: --device /dev/video0 --device /dev/video1
### ARM64 architecture detected
### Jetson Detected
SYSTEM_ARCH=tegra-aarch64
+ docker run --runtime nvidia --env NVIDIA_DRIVER_CAPABILITIES=compute,utility,graphics -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /home/students/jetson-containers/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb --device /dev/video0 --device /dev/video1 --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-7 --device /dev/i2c-9 -v /run/jtop.sock:/run/jtop.sock --name jetson_container_20250821_225026 dustynv/ollama:0.6.8-r36.4-cu126-22.04 ollama run gemma3n:e2b
>>> hyy! how are you
Error: Post "http://0.0.0.0:11434/api/chat": EOF
Then I checked the logs:
user@hostname:~$ docker logs jetson_container_20250821_215028
Starting ollama server
OLLAMA_HOST 0.0.0.0
OLLAMA_LOGS /data/logs/ollama.log
OLLAMA_MODELS /data/models/ollama/models
ollama server is now started, and you can run commands here like 'ollama run gemma3'
root@ubuntu:/# jetson-containers run $(autotag ollama) ollama run gemma3n:e2b
bash: autotag: command not found
bash: jetson-containers: command not found
root@ubuntu:/# ollama run gemma3n:e2b
>>> hyy! how are you?
⠧
Error: Post "http://0.0.0.0:11434/api/chat": EOF
root@ubuntu:/#
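Since the client only reports an EOF (the server side dropped the connection), the real cause should be in the server log, whose path the container printed at startup (`OLLAMA_LOGS /data/logs/ollama.log`). A defensive sketch for pulling its tail from inside the container:

```shell
# Inspect the ollama server log for the error behind the client-side EOF;
# the path comes from the container's startup banner above
LOG=/data/logs/ollama.log
if [ -f "$LOG" ]; then
  tail -n 50 "$LOG"
else
  echo "no log file at $LOG (check OLLAMA_LOGS in the startup output)"
fi
```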