I used the convert-to-uff utility successfully on this model: http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v2_coco_2018_03_29.tar.gz. Then, to run inference with this model, I started with the tensorRT_optimization tool to produce a .bin file like the ones used in the DNN samples.
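For reference, the conversion step looked roughly like this (a sketch, not my exact invocation: the file paths are hypothetical, and I'm assuming the stock frozen graph from the downloaded archive with the output nodes passed via repeated -O flags):

```shell
# Hypothetical paths; assumes frozen_inference_graph.pb extracted from the
# ssd_mobilenet_v2_coco_2018_03_29 archive linked above.
convert-to-uff frozen_inference_graph.pb \
    -o /home/nvidia/Desktop/ssd_mobilenet_coco.uff \
    -O detection_scores \
    -O detection_boxes \
    -O detection_classes \
    -O num_detections
```

Note that TensorRT's sampleUffSSD instead replaces the TensorFlow postprocessor with an NMS plugin node via a preprocessing config.py, which may be relevant if the raw graph contains ops the UFF parser cannot handle.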
I ran this command:
./tensorRT_optimization --modelType=uff --uffFile=/home/nvidia/Desktop/ssd_mobilenet_coco.uff --outputBlobs=detection_scores,detection_boxes,detection_classes,num_detections --inputDims=3x300x300
and got this error:
nvrm_gpu: Bug 200215060 workaround enabled.
Initializing network optimizer on model /home/nvidia/Desktop/ssd_mobilenet_coco.uff.
Error: DW_INTERNAL_ERROR: Unable to parse uff file.
I am confident the inputs I provided are correct, so the issue likely lies elsewhere.