How do you use OpenAI’s Whisper model on AGX Orin?
Hi @wfh1, please see this tutorial about Whisper on Jetson AI Lab:
Thank you for your answer. It is really convenient to run under Docker. I would like to continue discussing this issue: is there a way to convert it to run in an ARM environment?
Hi @wfh1, sorry for the delay - Docker containers aren’t virtual machines; the container is already running natively on the ARM64 architecture (with CUDA acceleration).
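If you want to confirm this for yourself from inside the container, a quick sanity check like the one below works, assuming PyTorch is present in the image (it is in the Whisper container):

```python
# Sanity check inside the container: the Python process reports the native
# CPU architecture (aarch64 on Jetson) and whether the GPU is visible to PyTorch.
import platform
import torch

print("Architecture:", platform.machine())        # expected: 'aarch64' on AGX Orin
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```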
If you meant whether it is possible to run it outside of a container, then yes - you would just need to manually install all of the dependencies that the Whisper container uses to run the Whisper code/models (like PyTorch, numba, etc.). These stacks of AI libraries can get quite complex, which is why we distribute them as all-inclusive container images. A rough sketch of what that looks like is below.
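For reference, here is a minimal sketch of transcribing with Whisper once those packages are installed natively (this assumes the openai-whisper pip package and a CUDA-enabled PyTorch build; `audio.wav` is just a placeholder path):

```python
# Minimal Whisper transcription sketch, assuming openai-whisper and a
# CUDA-enabled PyTorch build have been installed outside the container.
import torch
import whisper

device = "cuda" if torch.cuda.is_available() else "cpu"
model = whisper.load_model("base", device=device)  # downloads the model weights on first run
result = model.transcribe("audio.wav")              # placeholder path to your audio file
print(result["text"])
```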