Error Parsing UFF to TRT

I am trying to parse a UFF model (converted from a TensorFlow one) with TensorRT, but I get these errors:

[libprotobuf WARNING google/protobuf/io/coded_stream.cc:604] Reading dangerously large protocol message.
 If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons.  To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[TensorRT] ERROR: Parameter check failed at: ../builder/Network.cpp::addInput::432, condition: isValidDims(dims)
[TensorRT] ERROR: UFFParser: Failed to parseInput for node input_1
[TensorRT] ERROR: UffParser: Parser error: input_1: Failed to parse node - Invalid Tensor found at node input_1
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:81] The total number of bytes read was 553498272
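
For context, this is roughly what my parsing code looks like (a minimal sketch; "model.uff" is a placeholder path, and I am not calling register_input, so the parser takes the input shape straight from the UFF graph):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Minimal sketch of the failing parse; "model.uff" is a placeholder path.
with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:
    builder.max_workspace_size = 1 << 30
    parser.register_output("predictions/Softmax")
    # No register_input() call here, so the parser uses the deduced
    # (-1, 224, 224, 3) shape from the UFF file.
    if not parser.parse("model.uff", network):
        print("UFF parsing failed")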

The automatically deduced input node is the following:

UFF Version 0.6.3
=== Automatically deduced input nodes ===
[name: "input_1"
op: "Placeholder"
attr {
  key: "dtype"
  value {
    type: DT_FLOAT
  }
}
attr {
  key: "shape"
  value {
    shape {
      dim {
        size: -1
      }
      dim {
        size: 224
      }
      dim {
        size: 224
      }
      dim {
        size: 3
      }
    }
  }
}
]
=========================================
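
If it helps, I believe the -1 in the first dim corresponds to the batch dimension being left as None on the TensorFlow side; a placeholder created roughly like this (TF 1.x sketch) gets serialized with size: -1:

import tensorflow as tf

# TF 1.x sketch: an unspecified (None) batch dimension is stored as -1
# in the serialized shape, which is what shows up in the UFF dump above.
x = tf.placeholder(tf.float32, shape=[None, 224, 224, 3], name="input_1")
print(x.shape)  # (?, 224, 224, 3)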

And here you can see my output nodes and the total node count:

Using output node predictions/Softmax
Using output node block1_conv1/kernel
Using output node block1_conv1/bias
Using output node block1_conv2/kernel
Using output node block1_conv2/bias
Using output node block2_conv1/kernel
Using output node block2_conv1/bias
Using output node block2_conv2/kernel
Using output node block2_conv2/bias
Using output node block3_conv1/kernel
Using output node block3_conv1/bias
Using output node block3_conv2/kernel
Using output node block3_conv2/bias
Using output node block3_conv3/kernel
Using output node block3_conv3/bias
Using output node block4_conv1/kernel
Using output node block4_conv1/bias
Using output node block4_conv2/kernel
Using output node block4_conv2/bias
Using output node block4_conv3/kernel
Using output node block4_conv3/bias
Using output node block5_conv1/kernel
Using output node block5_conv1/bias
Using output node block5_conv2/kernel
Using output node block5_conv2/bias
Using output node block5_conv3/kernel
Using output node block5_conv3/bias
Using output node fc1/kernel
Using output node fc1/bias
Using output node fc2/kernel
Using output node fc2/bias
Using output node predictions/kernel
Using output node predictions/bias

No. nodes: 136
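
I have also been wondering whether I should restrict the conversion to just the softmax instead of listing every kernel/bias variable as an output; roughly something like this (sketch; the frozen-graph path is a placeholder):

import uff

# Sketch: convert the frozen graph keeping only the real network output,
# instead of registering every kernel/bias variable as an output node.
uff.from_tensorflow_frozen_model(
    "frozen_model.pb",
    output_nodes=["predictions/Softmax"],
    output_filename="model.uff",
    text=False)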

It seems like the dims check didn’t pass. May I ask why the first dimension is -1? Is it because the batch dimension is set to “None”?

I am using the https://github.com/keras-team/keras-applications/blob/master/keras_applications/resnet50.py model (the one that was converted to UFF from the frozen graph), so the model’s input dimensions are (3, 224, 224).
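
In case it matters, the freezing step was along these lines (a TF 1.x-style sketch; the .h5 path and load_model call are placeholders for however the model is actually built):

import tensorflow as tf
from tensorflow import keras

keras.backend.set_learning_phase(0)  # make sure the graph is in inference mode
model = keras.models.load_model("model.h5")  # placeholder path
sess = keras.backend.get_session()
frozen_graph = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), ["predictions/Softmax"])
with open("frozen_model.pb", "wb") as f:
    f.write(frozen_graph.SerializeToString())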

I am not 100% sure why I get the -1, but the TensorFlow-to-TensorRT example they provide hits the same “error”. However, if you do the UFF conversion with the

convert-to-uff [file.pb]

command-line utility, it magically works with no error (even with the dim at -1).
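
Is explicitly registering the input with fixed CHW dims the right workaround for the Python parser? I was thinking of something roughly like this (unconfirmed sketch; paths are placeholders):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Possible workaround (unconfirmed): override the deduced (-1, 224, 224, 3)
# shape by registering explicit CHW dims and let TensorRT's implicit batch
# handle the batch dimension.
with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network() as network, \
     trt.UffParser() as parser:
    parser.register_input("input_1", (3, 224, 224))
    parser.register_output("predictions/Softmax")
    if parser.parse("model.uff", network):
        print("Parsed OK")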