I'm getting an error from the torch2trt engine builder when running the google/bit-50 model.

The error is raised at this line: engine = builder.build_engine(network, config).

Here is the Error:
Internal Error (func:64:NORMALIZATION:GPU: scale dimensions not compatible for normalization)

For Code:

from transformers import BitImageProcessor, BitForImageClassification
import torch
from datasets import load_dataset
from PIL import Image
from torch2trt import torch2trt  # import was missing from the original snippet

dataset = load_dataset("huggingface/cats-image")
image = dataset["test"]["image"][0]
feature_extractor = BitImageProcessor.from_pretrained("google/bit-50")
model = BitForImageClassification.from_pretrained("google/bit-50")
inputs = feature_extractor(image, return_tensors="pt")

trt_model = torch2trt(model, args=inputs)

Hi,
We recommend that you check the supported features at the link below.

You can also refer to the link below for the full list of supported operators.
For unsupported operators, you need to create a custom plugin to support the operation.

Thanks!

Sorry, I didn't quite get your reply. What I'm trying to convey is that we can use torch2trt directly: I'm using the latest versions of both TensorRT and torch, and the model uses only operations that TensorRT supports, so I would assume we don't need any custom plugins here. I just need to understand the exact meaning of the above error, which is clearly related to the TensorRT engine, and I have provided the line where the error is raised. I have tried everything I can think of; please let me know if you have any solution for this.

Thanks!

The error message indicates that the scales of the normalization layer in the model are not compatible with the dimensions of the input tensor. This may happen if the model was trained on a different image size than the input tensor.

Please check the image size that the model was trained on and resize the input tensor to the same size.
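As a pure-Python sketch (no TensorRT required), the incompatibility the builder reports can be thought of as a per-channel scale whose length does not match the layout of the input tensor. The function name and shapes below are illustrative assumptions, not TensorRT internals:

```python
# Illustrative only: mimics a normalization layer applying one scale
# factor per channel. TensorRT's internal check is analogous: the scale
# tensor's dimensions must be compatible with the input tensor's.
def apply_channel_scale(channels, scale):
    # channels: list of per-channel pixel lists; scale: one factor per channel
    if len(scale) != len(channels):
        raise ValueError("scale dimensions not compatible for normalization")
    return [[v * s for v in ch] for ch, s in zip(channels, scale)]

# Three channels with three scale factors works:
apply_channel_scale([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], [0.5, 0.5, 0.5])

# Two scale factors against three channels reproduces the failure mode:
# apply_channel_scale([[1.0], [2.0], [3.0]], [0.5, 0.5])  # raises ValueError
```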
We obtained the following output after adding this line to the script:

print(inputs['pixel_values'].shape)

# Output
torch.Size([1, 3, 448, 448])
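A quick pure-Python check of whether a tensor of that shape needs resizing. The 224x224 target here is only an assumption for illustration; confirm the actual training resolution in the model's config:

```python
# Hypothetical helper: compare an (N, C, H, W) shape against the spatial
# size the model expects. 224x224 is an assumed target for illustration;
# check the model's config for the real training resolution.
def needs_resize(shape, expected_hw=(224, 224)):
    _, _, h, w = shape
    return (h, w) != expected_hw

needs_resize((1, 3, 448, 448))  # True -> the input should be resized
```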

Also, it appears that there is a problem with the usage of torch2trt in the above script. Please refer to the following document:

Thanks!
