TensorFlow MobileNetV2 doesn't work with DeepStream

I followed the official instructions provided for Inception_v2, but I am using ssd_mobilenet_v2 instead of Inception. I converted the model to UFF format, but when I run it with DeepStream it throws this error:

[libprotobuf FATAL /home/erisuser/p4sw/sw/gpgpu/MachineLearning/DIT/externals/protobuf/aarch64/10.0/include/google/protobuf/repeated_field.h:1408] CHECK failed: (index) < (current_size_): 
terminate called after throwing an instance of 'google_private::protobuf::FatalException'
  what():  CHECK failed: (index) < (current_size_): Aborted (core dumped)

Anyone else faced this issue? Stuck at this for two days and I am unable to figure out what is wrong. Any help would be highly appreciated.


Would you mind sharing the complete log with us?
It looks like some operations in your model don't create the MemoryData correctly.

A possible cause is that ssd_mobilenet_v2 requires a TensorRT plugin implementation and custom compilation.
Have you successfully loaded your model with our TensorRT sample: /usr/src/tensorrt/samples/python/uff_ssd/
If not, we recommend trying it first. It verifies that your UFF model and plugin operations are correct.
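For reference, SSD-style detection graphs are usually converted with a graphsurgeon config passed to `convert-to-uff`, which maps the TensorFlow pre/post-processing subgraphs onto TensorRT plugin nodes. Below is a minimal sketch of such a config; the node names, shapes, and plugin parameters are assumptions based on a stock 300x300 SSD checkpoint and must be checked against your frozen ssd_mobilenet_v2 graph (a mismatched `inputOrder` or namespace map is a common cause of parse-time protobuf CHECK failures like the one above):

```python
# config.py -- a sketch of a graphsurgeon preprocessing config for
# convert-to-uff ("-p config.py"). All parameter values below are
# assumptions for a generic 300x300 SSD model; verify them against
# your ssd_mobilenet_v2 pipeline config and frozen graph.
import graphsurgeon as gs
import tensorflow as tf

# Replace the TensorFlow input/preprocessing with a plain placeholder.
Input = gs.create_node("Input", op="Placeholder",
                       dtype=tf.float32, shape=[1, 3, 300, 300])

# GridAnchor_TRT stands in for the TensorFlow anchor generator.
PriorBox = gs.create_plugin_node(name="GridAnchor", op="GridAnchor_TRT",
    minSize=0.2, maxSize=0.95,
    aspectRatios=[1.0, 2.0, 0.5, 3.0, 0.33],
    variance=[0.1, 0.1, 0.2, 0.2],
    featureMapShapes=[19, 10, 5, 3, 2, 1],
    numLayers=6)

# NMS_TRT stands in for the TensorFlow post-processing subgraph.
# inputOrder in particular often differs between inception and
# mobilenet graphs -- inspect your graph to confirm it.
NMS = gs.create_plugin_node(name="NMS", op="NMS_TRT",
    shareLocation=1, varianceEncodedInTarget=0,
    backgroundLabelId=0, confidenceThreshold=0.3,
    nmsThreshold=0.6, topK=100, keepTopK=100,
    numClasses=91, inputOrder=[0, 2, 1],
    confSigmoid=1, isNormalized=1)

concat_priorbox = gs.create_node(name="concat_priorbox",
                                 op="ConcatV2", dtype=tf.float32, axis=2)
concat_box_loc = gs.create_plugin_node("concat_box_loc",
                                       op="FlattenConcat_TRT")
concat_box_conf = gs.create_plugin_node("concat_box_conf",
                                        op="FlattenConcat_TRT")

# Map the TensorFlow namespaces onto the plugin nodes above; the
# namespace names are assumptions taken from stock object-detection
# exports and may differ in your graph.
namespace_plugin_map = {
    "MultipleGridAnchorGenerator": PriorBox,
    "Postprocessor": NMS,
    "Preprocessor": Input,
    "image_tensor": Input,
    "Concatenate": concat_priorbox,
    "concat": concat_box_loc,
    "concat_1": concat_box_conf,
}

def preprocess(dynamic_graph):
    # Collapse the mapped namespaces into single plugin nodes and
    # drop the original TensorFlow graph outputs.
    dynamic_graph.collapse_namespaces(namespace_plugin_map)
    dynamic_graph.remove(dynamic_graph.graph_outputs,
                         remove_exclusive_dependencies=False)
```

The conversion would then look something like `convert-to-uff frozen_inference_graph.pb -O NMS -p config.py`. If the plugin parameters or namespace map don't match the model, the resulting UFF typically fails at parse time, which is consistent with the repeated_field.h CHECK failure you posted.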