UFF parser error

• Hardware Platform (Jetson / GPU): 2080 Ti
• DeepStream Version: 5.0 (Docker)
• TensorRT Version: 7.0

I have a UFF model, but parsing it and building the engine file fails every time. The error is:

I made sure that I set the correct uff-input-blob-name and output-blob-names in the nvinfer config.
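
For reference, the relevant keys in my nvinfer [property] group look roughly like this (file names, blob names, and dimensions are placeholders, assuming the DeepStream 5.0 key names):

[property]
uff-file=model.uff
uff-input-blob-name=input_x_tensor
uff-input-order=1
# infer-dims is channel;height;width
infer-dims=3;480;640
output-blob-names=output_tensor
batch-size=1
network-mode=0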

Hi,

We have seen a similar issue before, caused by the implicit batch dimension.
We fixed it by specifying the input tensor size.

Ex. config.py

import graphsurgeon
import tensorflow as tf

...
# Replace the model's input with a Placeholder that has an explicit
# batch dimension and input shape (NHWC: 1x480x640x3).
Input = graphsurgeon.create_node(name="input_x_tensor",
    op="Placeholder",
    dtype=tf.float32,
    shape=[1, 480, 640, 3])

# Map the original input node onto the new Placeholder.
namespace_plugin_map = {
    "image_x_tensor": Input,
}

def preprocess(dyn_graph):
    dyn_graph.collapse_namespaces(namespace_plugin_map)
    ...
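
If it helps, this is a minimal sketch of how such a config.py is usually passed to the UFF converter (the frozen-graph path, output node name, and file names are placeholders, assuming the TensorRT uff Python package):

import uff

uff.from_tensorflow_frozen_model(
    "frozen_graph.pb",               # placeholder: your frozen TensorFlow model
    output_nodes=["output_tensor"],  # placeholder: your real output node name
    preprocessor="config.py",        # the graphsurgeon script shown above
    output_filename="model.uff")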

If the error persists, would you mind sharing your model so we can check it?
Thanks.