How can I inference with multiple input network on TensorRT?

I would like to run GQ-CNN, the network used in Dex-Net, on TensorRT.
I successfully converted the tflite file to a UFF file, but when I try to run inference with that network, I get an error I can't figure out.

[TensorRT] ERROR: Parameter check failed at: .../builder/Network.cpp::addLRN::149, condition: lrnWindow & 0x1
python3: uff/orders.cpp:330: void UffParser::addTranspose(ParserLayer&, std::vector): Assertion `outputs.size() == 1' failed.

The error appears while building the model.
I searched Google for clues, but found no relevant code or references.

There is only one difference from the example code that works fine:
because GQ-CNN has multiple inputs, I registered the input twice, as shown below.
So I suspect that registering multiple inputs with the UFF parser may be the cause of the error.

parser.register_input("Placeholder", (1,32,32))
parser.register_input("Placeholder_1", (2,))
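For context, here is the full build sequence I am using around those two calls. This is a minimal sketch based on the TensorRT UFF Python API; the output node name (`softmax/output`) and the UFF file name (`gqcnn.uff`) are placeholders for my actual values.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:
    # Register both inputs before parsing; the names must match
    # the placeholder node names in the frozen TensorFlow graph.
    parser.register_input("Placeholder", (1, 32, 32))
    parser.register_input("Placeholder_1", (2,))
    parser.register_output("softmax/output")   # placeholder output node name
    parser.parse("gqcnn.uff", network)         # placeholder UFF file name

    builder.max_batch_size = 1
    builder.max_workspace_size = 1 << 28
    # The errors above are raised inside this call, while the
    # network is being built into an engine.
    engine = builder.build_cuda_engine(network)
```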

Has anyone succeeded in running inference with a multiple-input model?