Hello, I want to use the DLA on a Jetson with PyTorch and Transformers.
Do I have to use TensorRT to use the DLA?
torch 2.4.0a0+07cecf4168.nv24.5
JetPack 6
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="cuda",  # "cuda" or "cpu"
    torch_dtype="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
Hi,
Please find the Torch-TensorRT document for DLA below:
https://pytorch.org/TensorRT/user_guide/using_dla.html
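A minimal sketch of compiling a model for the DLA with Torch-TensorRT, following the pattern in that document (the ResNet-18 model, input shape, and DLA core index here are placeholder choices for illustration, not from this thread):

import torch
import torch_tensorrt
import torchvision.models as models

# Placeholder model for illustration; substitute your own torch.nn.Module
model = models.resnet18(weights=None).eval().half().cuda()

# Target DLA core 0 and let unsupported layers fall back to the GPU
trt_model = torch_tensorrt.compile(
    model,
    inputs=[torch_tensorrt.Input((1, 3, 224, 224), dtype=torch.half)],
    device=torch_tensorrt.Device("dla:0", allow_gpu_fallback=True),
    enabled_precisions={torch.half},  # DLA runs FP16/INT8, not FP32
)

x = torch.randn(1, 3, 224, 224, dtype=torch.half, device="cuda")
out = trt_model(x)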
Thanks.
Hi,
Thanks for replying to my message,
but it is quite hard to find the right Torch-TensorRT version for my device (Jetson AGX Orin).
Which version should I use on my device to run the DLA with TensorRT?
Hi,
Torch-TensorRT is a plugin for PyTorch.
You should be able to use it directly, since you already have our prebuilt package installed.
Please follow the sample code in the doc and let us know if you run into an issue.
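A quick way to confirm the plugin is available (assuming the prebuilt wheels are installed; the print statements are just for illustration):

import torch
import torch_tensorrt

# Confirm that both the PyTorch build and the Torch-TensorRT plugin load
print("torch:", torch.__version__)
print("torch_tensorrt:", torch_tensorrt.__version__)
print("CUDA available:", torch.cuda.is_available())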
Thanks.
Hi, I already installed PyTorch and TensorRT, but I still cannot use torch-tensorrt, so I tried installing Torch-TensorRT separately, and it is still not working.
I followed the error message, but the same error kept popping up.
Hi,
Could you give our Docker container a try?
For example: nvcr.io/nvidia/pytorch:24.08-py3-igpu
Thanks.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.