NanoVLM Issue on Jetson Orin Nano

Hi Team

We are using a Jetson Orin Nano 8GB with the latest JetPack version (6.0), and we are trying to run the NanoVLM model. We followed the steps below:

Cloned the repository: git clone https://github.com/dusty-nv/jetson-containers
Install the jetson-containers utilities: bash jetson-containers/install.sh
Add your user to the docker group : sudo usermod -aG docker $USER.
Container images are compatible with other minor versions of JetPack/L4T:
L4T R32.7 containers can run on other versions of L4T R32.7 (JetPack 4.6+)
L4T R35.x containers can run on other versions of L4T R35.x (JetPack 5.1+)
Downloaded the Docker image using the following command: jetson-containers run $(autotag nano_llm)
Finally, used the below command to run the model on a sample offline video:
jetson-containers run $(autotag nano_llm) \
python3 -m nano_llm.vision.video \
--model Efficient-Large-Model/VILA1.5-3b \
--max-images 8 \
--max-new-tokens 48 \
--video-input /data/my_video.mp4 \
--video-output /data/my_output.mp4 \
--prompt 'What changes occurred in the video?'
Reference Link: NanoVLM - NVIDIA Jetson AI Lab
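For reference, the steps above can be summarized in one sequence (a sketch; the /data paths follow the jetson-containers default mounts mentioned in the tutorial):

```shell
# Clone jetson-containers and install its utilities
git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh

# Allow docker without sudo (log out/in afterwards for it to take effect)
sudo usermod -aG docker $USER

# Pull/run a compatible nano_llm container and process an offline video
jetson-containers run $(autotag nano_llm) \
  python3 -m nano_llm.vision.video \
    --model Efficient-Large-Model/VILA1.5-3b \
    --max-images 8 --max-new-tokens 48 \
    --video-input /data/my_video.mp4 \
    --video-output /data/my_output.mp4 \
    --prompt 'What changes occurred in the video?'
```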

It works with offline video/images. While testing with live streaming, we used the below command:
jetson-containers run $(autotag nano_llm) \
python3 -m nano_llm.agents.video_query --api=mlc \
--model Efficient-Large-Model/VILA1.5-3b \
--max-context-len 256 \
--max-new-tokens 32 \
--video-input /dev/video0 \
--video-output webrtc://@:8554/output

The model successfully loads and generates the expected output prompts. However, when accessing the live video stream via https://localhost:8050, no video is visible.
Tried to disable 'chrome://flags#enable-webrtc-hide-local-ips-with-mdns', but it shows "no matching experiments". Also tried with Firefox, but the live camera feed is still not visible in the browser.
The camera itself is working; we tested it with the below command:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink

Can you help us to solve this issue?

Regards
Karthika

Hi @dusty_nv, can you please help to resolve the above issue?

Hello,

Thanks for visiting the NVIDIA Developer forums! Your topic will be best served in the Jetson category.

I have moved this post for better visibility.

Cheers,
Tom

Hi,

Did you test the gst-launch command inside the container?
Could you help to verify if the camera is accessible inside the container first?

Thanks.
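One way to check this (a sketch; the device path and pipeline mirror the commands already in this thread) is to open a shell in the container and run the same capture pipeline there:

```shell
# Open an interactive shell in the container
jetson-containers run $(autotag nano_llm) /bin/bash

# Inside the container: confirm the device node is visible
ls -l /dev/video0

# Try the same v4l2 capture pipeline used on the host;
# a fakesink is used since the container may have no display
gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=60 ! fakesink
```

If the pipeline runs without errors inside the container, the camera device is being passed through correctly and the problem is likely on the WebRTC/browser side.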

Hi @AastaLLL , thanks for your response.

The live streaming issue is resolved. We then tried to change the style/font through the Python and HTML code, but the changes are not reflected in the output web page.
We used the below command to pull the container:
jetson-containers run $(autotag nano_llm)

Tried to rebuild the container using the following command: jetson-containers build nano_llm. After the rebuild completed, the container no longer works properly.

We used this command to mount the directory: jetson-containers run -v /path/on/host:/path/in/container $(autotag nano_llm)

Could you please help me to resolve this issue?

Thanks and regards,
Karthika

Hi,

Could you share the details about how you updated the style/font?
The nano_llm source can be found under /opt/ inside the container.

Thanks.

Hi @AastaLLL thanks for your response.

I am using the below command to test with live streaming:
jetson-containers run $(autotag nano_llm) \
python3 -m nano_llm.agents.video_query --api=mlc \
--model Efficient-Large-Model/VILA1.5-3b \
--max-context-len 256 \
--max-new-tokens 32 \
--video-input /dev/video0 \
--video-output webrtc://@:8554/output

We made changes in the video_query.py code:
NanoLLM/nano_llm/agents/video_query.py at main · dusty-nv/NanoLLM

We changed the web title and background color in this code, but the changes are not reflected in the output web page.

We didn't make any changes under /opt/; instead, we make the changes locally and mount that directory into the container.

Thanks and regards,
Karthika

Hi,

If you want to use the nano_llm source from outside of the container, please mount it with the below command:

Otherwise, it will by default use the source downloaded into the /opt/ folder.
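Presumably the intended mount command follows the same pattern used later in this thread:

```shell
# Clone the NanoLLM source locally, then bind-mount it over the copy in /opt
git clone https://github.com/dusty-nv/NanoLLM
jetson-containers run \
  -v ${PWD}/NanoLLM:/opt/NanoLLM \
  $(autotag nano_llm)
```

With this mount in place, edits made to the local NanoLLM checkout are what the container actually runs.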

Thanks.

Hi @AastaLLL

I faced an issue with the container (it was not working properly), so I tried the below steps to reinstall it.

Step 1: Removed all the Docker images.
Step 2: Deleted the cache files.
Step 3: Removed the old container.
Step 4: Cloned the git repo and installed the dependencies using the following commands:
git clone https://github.com/dusty-nv/jetson-containers
bash jetson-containers/install.sh
Step 5: Used the below command to automatically pull or build a compatible container image, and I am able to go inside the container as well:
jetson-containers run $(autotag nano_llm)
Now I am able to access the container and it is working properly.
Step 6: Found the nano_llm path under /opt/ and changed the web title in the video_query.py file, but it is not reflected in the web page UI.
Step 7: For style/font changes, I am looking into the CSS/HTML code.

Note: I am not mounting any volumes.

Can you please help me understand how to make changes inside the container so that they are reflected in the web UI page? Sometimes I also face an issue where the camera does not load properly in the web page.
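One likely explanation (an assumption about how jetson-containers invokes docker, not confirmed in this thread): jetson-containers run starts the container with --rm, so the container filesystem, including any edits made under /opt, is discarded when the container exits. Two common ways to keep edits are sketched below; nano_llm:edited is a hypothetical image tag:

```shell
# Option 1: commit the modified container to a new image before exiting
# (run from the host while the container is still up)
docker ps                                   # find the running container ID
docker commit <container-id> nano_llm:edited
docker run -it --runtime nvidia nano_llm:edited

# Option 2: keep the edited source on the host and bind-mount it,
# so changes survive independently of the container lifecycle
jetson-containers run -v ${PWD}/NanoLLM:/opt/NanoLLM $(autotag nano_llm)
```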

Thanks and regards,
Karthika

Hi @AastaLLL ,

Tried to mount the directory using the below commands:

git clone https://github.com/dusty-nv/NanoLLM
jetson-containers run \
-v ${PWD}/NanoLLM:/opt/NanoLLM \
$(autotag nano_llm)

I am able to mount the directory from my local machine into the container, and my changes are reflected there, but once I exit the container it reverts back to the original.

Can you please help me to resolve this issue?

Thanks and regards,
Karthika

Hi,

Could you share more info about the behavior?

If you mount the source from local, the source code should not be affected even if the container is terminated.
Is your change (git diff) still there once you relaunch the container?
Or is it just not applied to the web page?
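A quick way to answer these questions (a sketch reusing the mount command from earlier in the thread; "YourNewTitle" is a hypothetical placeholder for whatever string was edited):

```shell
# On the host: the mounted source lives outside the container,
# so local edits should still be visible after the container exits
cd NanoLLM && git diff

# Relaunch with the mount and confirm the edit is visible inside the container
jetson-containers run -v ${PWD}/NanoLLM:/opt/NanoLLM $(autotag nano_llm) \
  grep -rn "YourNewTitle" /opt/NanoLLM/nano_llm
```

If git diff shows the change but the web page does not, the running code is likely being loaded from somewhere other than the mounted directory (or the browser is caching the old page).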

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.