TensorRT error when using dynamic batch: data: kMIN dimensions in profile 0 are [24,3,224,224] but input has static dimensions [48,3,224,224]

Description

Hi all. I have converted a PyTorch model to TensorRT directly in Python, without going through ONNX or trtexec (the model has some custom operations).
My Python pseudo-code is below; I used the TSM model from this repo (https://github.com/wang-xinyu/tensorrtx/blob/master/tsm/tsm_r50.py):

    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    num_segments = 24
    n_batch = 2
    batch_segments = n_batch * num_segments
    data = network.add_input(input_plob_name, data_type,
                             (batch_segments, 3, model_input_h, model_input_w))
    conv1 = network.add_convolution(input=data,
                                    num_output_maps=64,
                                    kernel_shape=(7, 7),
                                    kernel=np.asarray(
                                        weight_map["conv1.weight"],
                                        dtype="float32"),
                                    bias=trt.Weights())
    # add some layers here ...

    fc1 = network.add_fully_connected(input=pool2.get_output(0),
                                      num_outputs=output_size,
                                      kernel=np.asarray(
                                          weight_map['fc.weight'],
                                          dtype="float32"),
                                      bias=np.asarray(weight_map['fc.bias'],
                                                      dtype="float32"))

    reshape = network.add_shuffle(fc1.get_output(0))
    assert reshape
    reshape.reshape_dims = (n_batch, num_segments, output_size)
    reduce = network.add_reduce(reshape.get_output(0),
                                op=trt.ReduceOperation.AVG,
                                axes=2,  # bitmask 1 << 1: average over the num_segments axis
                                keep_dims=False)
    assert reduce
    softmax = network.add_softmax(reduce.get_output(0))
    assert softmax

    softmax.axes = 1
    softmax.get_output(0).name = output_plob_name
    network.mark_output(softmax.get_output(0))

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30
    builder.max_batch_size = n_batch
    
    profile = builder.create_optimization_profile()
    # min = num_segments (24), opt = max = batch_segments (2 * 24 = 48)
    profile.set_shape(
        input_plob_name,
        (num_segments, 3, 224, 224),
        (batch_segments, 3, 224, 224),
        (batch_segments, 3, 224, 224),
    )
    logger = trt.Logger(trt.Logger.INFO)
    config.add_optimization_profile(profile)

When I try to apply a dynamic batch with min=24, opt=48, max=48, I get this error:

[TensorRT] ERROR: 4: [network.cpp::operator()::2733] Error Code 4: Internal Error (data: kMIN dimensions in profile 0 are [24,3,224,224] but input has static dimensions [48,3,224,224].)
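For context, this is the consistency rule TensorRT is enforcing: an optimization profile may only vary dimensions that the network input declared as dynamic (-1); every static dimension must match kMIN and kMAX exactly. A minimal plain-Python sketch of that rule (`profile_compatible` is a hypothetical helper for illustration, not a TensorRT API):

```python
def profile_compatible(static_dims, profile_min, profile_max):
    """Illustrate why TensorRT rejects the profile above: for every
    non-dynamic (!= -1) input dimension, kMIN and kMAX must equal it."""
    for d, lo, hi in zip(static_dims, profile_min, profile_max):
        if d == -1:
            continue  # dynamic dimension: the profile is free to pick a range
        if not (lo == d == hi):
            return False  # profile tries to vary a static dimension
    return True

# The failing combination from the error message above:
print(profile_compatible((48, 3, 224, 224),
                         (24, 3, 224, 224), (48, 3, 224, 224)))   # False
# Declaring the batch dimension dynamic would make the same profile legal:
print(profile_compatible((-1, 3, 224, 224),
                         (24, 3, 224, 224), (48, 3, 224, 224)))   # True
```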

Environment

docker image: nvcr.io/nvidia/tensorrt:21.11-py3
TensorRT Version: 8.0.3.4
GPU Type: RTX 3060
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.8.10

any help would be appreciated

Hi,

We are checking on this issue. Will get back to you shortly.

Thank you.

Hi,

Optimization profiles are meant to be used with dynamic shape.
You have to export the model to ONNX with dynamic shape support from the original framework, using the dynamic_axes option of torch.onnx.export. Please check the following doc and search for dynamic_axes:
https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html

Thank you.

Hi @spolisetty
I cannot use an ONNX model because the TSM model has some custom operations and custom layers that ONNX does not support.
Finally, I found the solution: in the code above I had to change max_batch_size as below:

builder.max_batch_size = n_batch*num_segments

then it works and converts correctly.
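For anyone who wants a truly dynamic batch rather than a larger static one: as far as I understand the TensorRT 8 Python API, the input itself must be declared with -1 in the varying dimension so the profile is allowed to range over it. A rough, untested sketch (the layer additions are elided; "data" is a placeholder name):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.INFO)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

# -1 marks the batch dimension as dynamic, so a profile may vary it
data = network.add_input("data", trt.float32, (-1, 3, 224, 224))
# ... add the rest of the layers as in the code above ...

config = builder.create_builder_config()
profile = builder.create_optimization_profile()
profile.set_shape("data",
                  (24, 3, 224, 224),   # min
                  (48, 3, 224, 224),   # opt
                  (48, 3, 224, 224))   # max
config.add_optimization_profile(profile)
```

At runtime you would then call context.set_binding_shape (or set_input_shape on newer versions) with the actual batch before executing.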
