Hi,
Can we expect containers for the new Llama 3.2 vision models soon?
In particular, I would be interested in llama-3.2-11b-vision-instruct.
Or is there a way to build it locally on my hardware?
Thanks,
Luc
Hi @renambot – I don’t have an exact timeline for when the downloadable containers will be available, but it’s being actively worked on. For now, you can use the hosted API endpoint here: NVIDIA NIM | llama-3.2-11b-vision-instruct
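In case it helps while the containers are pending, here is a minimal sketch of calling the hosted endpoint from Python, assuming it follows NVIDIA's OpenAI-compatible chat-completions API. The base URL, model identifier, and message format below are assumptions, so please check the model page on build.nvidia.com for the exact invocation details and how to pass images.

```python
import os
import requests

# Minimal sketch of calling the hosted NIM endpoint for
# llama-3.2-11b-vision-instruct. The base URL and model ID below are
# assumptions based on NVIDIA's OpenAI-compatible hosted API; confirm
# them against the model page before use.

INVOKE_URL = "https://integrate.api.nvidia.com/v1/chat/completions"  # assumed endpoint
API_KEY = os.environ["NVIDIA_API_KEY"]  # API key generated from the model page

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Accept": "application/json",
}

payload = {
    "model": "meta/llama-3.2-11b-vision-instruct",  # assumed model identifier
    "messages": [
        {
            "role": "user",
            # Images are typically passed inline (e.g. as a base64 data URI);
            # this sketch sends a text-only prompt to keep things short.
            "content": "Describe what a vision-language model can do.",
        }
    ],
    "max_tokens": 256,
    "temperature": 0.2,
}

response = requests.post(INVOKE_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```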