Weight sharing in TensorRT?

I have a multi-stream imaging application that requires reshaping over the 0th axis (the batch axis) and works well in Caffe.
According to the TensorRT documentation, reshaping over the batch axis is not supported.
This limitation could be bypassed by using weight sharing instead of the unsupported reshape over the 0th axis.
Is there any method that allows weight sharing in TensorRT?

The following lines were copied from a toy example that works well in Caffe but yields the following error in TensorRT:

Segmentation fault (core dumped)

The first stream contains:

layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param { name: "conv1_w" }  # Allow weight sharing under the name conv1_w
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
  }
}

The second stream contains (the weights are shared from the layer in the first stream):

layer {
  name: "conv1__1"
  type: "Convolution"
  bottom: "data"
  top: "conv1__1"
  param { name: "conv1_w" }  # Weights are shared from the first layer
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
  }
}
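
In case it helps clarify what I am after, below is a rough sketch of how I would hope to express the same sharing directly with the (Caffe-era) TensorRT C++ builder API: handing the same nvinfer1::Weights buffers to two addConvolution calls. The input names, dimensions, and dummy weight values are made up for illustration, and I do not actually know whether reusing the same Weights struct like this is supported or whether something like it is what triggers the crash.

#include <NvInfer.h>
#include <cstdint>
#include <iostream>
#include <vector>

using namespace nvinfer1;

// Minimal logger required by createInferBuilder.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();

    // Two input streams; the 1x28x28 shape is arbitrary for this toy example.
    ITensor* data1 = network->addInput("data1", DataType::kFLOAT, DimsCHW{1, 28, 28});
    ITensor* data2 = network->addInput("data2", DataType::kFLOat == DataType::kFLOAT ? DataType::kFLOAT : DataType::kFLOAT, DimsCHW{1, 28, 28});

    // One set of 32 5x5 kernels (single input channel) plus 32 biases,
    // held in buffers that both convolutions point at.
    std::vector<float> kernelData(32 * 1 * 5 * 5, 0.01f);
    std::vector<float> biasData(32, 0.0f);
    Weights kernel{DataType::kFLOAT, kernelData.data(), static_cast<int64_t>(kernelData.size())};
    Weights bias{DataType::kFLOAT, biasData.data(), static_cast<int64_t>(biasData.size())};

    // "conv1" and "conv1__1" reuse the same Weights structs -- this is the
    // part I am unsure TensorRT accepts (my assumption, not documented behavior).
    IConvolutionLayer* conv1 = network->addConvolution(*data1, 32, DimsHW{5, 5}, kernel, bias);
    conv1->setStride(DimsHW{1, 1});
    conv1->setPadding(DimsHW{2, 2});

    IConvolutionLayer* conv1_1 = network->addConvolution(*data2, 32, DimsHW{5, 5}, kernel, bias);
    conv1_1->setStride(DimsHW{1, 1});
    conv1_1->setPadding(DimsHW{2, 2});

    network->markOutput(*conv1->getOutput(0));
    network->markOutput(*conv1_1->getOutput(0));

    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 20);
    ICudaEngine* engine = builder->buildCudaEngine(*network);

    // ... serialize or execute the engine here ...

    if (engine) engine->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}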

Thanks,

Itai