INTERNAL ERROR: Assertion failed: mWeights.count == C

Linux - Ubuntu 16.04
GPU type - RTX 2080 Ti
NVIDIA driver version - 435.21
CUDA version - 10.0
cuDNN version - 7.6.3
Python version - 3.6.9
TensorRT version - 6.0.1.5

Problem:

I am implementing an Instance Normalization custom layer in TensorRT using the Normalize_TRT plugin.
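
For context, PLUGIN_CREATORS is populated from the TensorRT plugin registry, roughly along these lines (this is the standard pattern from the TensorRT samples, shown here only so the snippet below is self-contained):

import numpy as np
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)
# Register the built-in TensorRT plugins (Normalize_TRT among them) with the registry.
trt.init_libnvinfer_plugins(TRT_LOGGER, '')
PLUGIN_CREATORS = trt.get_plugin_registry().plugin_creator_list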

Function for the Plugin:

def get_trt_plugin(plugin_name, weights=None):
    plugin = None
    for plugin_creator in PLUGIN_CREATORS:
        if 'Normalize_TRT' == plugin_name and plugin_creator.name == plugin_name:
            print('Using Normalize_TRT')
            weight = trt.PluginField('weights', np.array(weights, np.float32), trt.PluginFieldType.FLOAT32)
            nbweight = trt.PluginField('nbWeights', np.array([1], dtype=np.int32), trt.PluginFieldType.INT32)
            across_spatial = trt.PluginField('acrossSpatial', np.array([1], dtype=np.int32), trt.PluginFieldType.INT32)
            channel_shared = trt.PluginField('channelShared', np.array([0], dtype=np.int32), trt.PluginFieldType.INT32)
            epsilon = trt.PluginField('epsilon', np.array([0.00001], np.float32), trt.PluginFieldType.FLOAT32)
            field_collection = trt.PluginFieldCollection([weight, nbweight, across_spatial, channel_shared, epsilon])
            plugin = plugin_creator.create_plugin(name=plugin_name, field_collection=field_collection)
    return plugin

Function to create the network:

def populate_network(network, weights):
    # 'weights' holds the PyTorch state dict tensors.

    """ Input """
    input_tensor = network.add_input(name='data', dtype=trt.float32, shape=(3, 1024, 1024))

    """ conv 0 """
    down_convs_0_weight = weights['down_convs.0.weight'].numpy()
    down_convs_0_bias = weights['down_convs.0.bias'].numpy()
    # (0): Conv2d(3, 64, kernel_size=(7, 7), stride=(1, 1), padding=(3, 3))
    padding_1 = network.add_padding(input=input_tensor, pre_padding=(3, 3), post_padding=(3, 3))
    down_convs_0 = network.add_convolution(input=padding_1.get_output(0), num_output_maps=64,
                                           kernel_shape=(7, 7), kernel=down_convs_0_weight,
                                           bias=down_convs_0_bias)
    down_convs_0.stride = (1, 1)
    print(down_convs_0.kernel.shape)

    """ instance norm via Normalize_TRT plugin """
    in_1 = network.add_plugin_v2(inputs=[down_convs_0.get_output(0)],
                                 plugin=get_trt_plugin('Normalize_TRT', weights=down_convs_0.kernel))

    """ ReLU """
    # (2): ReLU(inplace=True)
    relu_2 = network.add_activation(input=in_1.get_output(0), type=trt.ActivationType.RELU)
    print(in_1.get_output(0).dtype)
    print(in_1.plugin.num_outputs, in_1.plugin.tensorrt_version, in_1.plugin.plugin_type,
          in_1.plugin.plugin_version, in_1.plugin.plugin_namespace, in_1.plugin.serialization_size)
    relu_2.get_output(0).name = 'tanh'
    network.mark_output(tensor=relu_2.get_output(0))

Error:
[TensorRT] INTERNAL ERROR: Assertion failed: mWeights.count == C
normalizePlugin.cpp:124
Aborting…

Aborted (core dumped)

Hi,

It seems that the weights passed to the plugin do not contain the number of values the plugin expects.

As per the plugin documentation:
weights → A pointer to weights which contains the scale factors for normalization. The definition of Weights can be found in the NvInfer.h header. The number of values in weights is 1 if channelShared = true; otherwise it is the number of channels (C) of the input tensor.

Please refer to the link below for more details:
https://github.com/NVIDIA/TensorRT/blob/07ed9b57b1ff7c24664388e5564b17f7ce2873e5/plugin/normalizePlugin/README.md#parameters
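
For example, since you set channelShared = 0, the plugin expects one scale value per channel of its input tensor, which is 64 after your first convolution. A minimal sketch of the weights field with placeholder scales of 1.0:

num_channels = 64  # channels of the tensor feeding the plugin (output of down_convs_0)
scales = np.ones(num_channels, dtype=np.float32)  # placeholder per-channel scale factors
weight = trt.PluginField('weights', scales, trt.PluginFieldType.FLOAT32)
nbweight = trt.PluginField('nbWeights', np.array([1], dtype=np.int32), trt.PluginFieldType.INT32)

Note that down_convs_0.kernel in your code is the full 64x3x7x7 convolution kernel, so its element count will not match C.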

Thanks

Hi,

Thanks for your reply

But I have a question about the pointer to weights. Since I am using the Python interface, how can I pass a pointer to weights to the plugin?

Thanks.

Hi,
I haven't tried it, but I think in Python its value will be a NumPy array.
Could you please try passing a NumPy array and let me know if it works?
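
For example, instead of passing down_convs_0.kernel, you could pass a flat float32 NumPy array with one value per channel, e.g. the InstanceNorm weight (gamma) from your PyTorch state dict. The key name below is just an assumption; adjust it to your model:

# Hypothetical state-dict key for the InstanceNorm scale (gamma); adjust to your model.
in_scale = weights['down_norms.0.weight'].numpy().astype(np.float32)  # shape (64,)
in_1 = network.add_plugin_v2(inputs=[down_convs_0.get_output(0)],
                             plugin=get_trt_plugin('Normalize_TRT', weights=in_scale))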

Thanks