I want to use transformers with l4t-pytorch

Hi

I created a container using the Docker image dustynv/l4t-pytorch:r35.3.1 and am running Whisper inside the container.

I want to use the transformers library inside this container.
If I install transformers with pip, will it use CUDA?

Or should I use the Docker image dustynv/transformers:r35.3.1?

Thanks

Hi,

It looks like Whisper uses PyTorch as its inference engine.

If so, it should run on the GPU, since the container ships a PyTorch build with CUDA support.
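A quick way to confirm this from inside the container is to check whether PyTorch can see the GPU. This is a minimal sketch; the try/except is only there so the snippet also runs on a machine without PyTorch installed:

```python
# Sanity check: inside the l4t-pytorch container, CUDA should be available.
try:
    import torch
    cuda_available = torch.cuda.is_available()
except ImportError:
    cuda_available = False  # PyTorch not installed in this environment

device = "cuda" if cuda_available else "cpu"
print("selected device:", device)
```

If this prints `cuda` inside the container, Whisper (and anything else built on that PyTorch) will run on the GPU.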

Thanks.


@Heartful-echo you can indeed just install transformers with pip3, and it will use CUDA, because the PyTorch in your container was built with CUDA enabled. I also have a transformers container that I use in a number of other projects here:


@AastaLLL
Thank you for the interesting information.

@dusty_nv
I will use dustynv/transformers:r35.3.1.
Thank you for the great content.
