Hello,
I am trying to convert a retrained ssd_inception_v2_coco model.
The original model I downloaded was this one:
http://download.tensorflow.org/models/object_detection/ssd_inception_v2_coco_2018_01_28.tar.gz
I used the “uff_ssd” sample and get the following errors:
[TensorRT] ERROR: UFFParser: Validator error: Cast: Unsupported operation _Cast
Building TensorRT engine. This may take few minutes.
[TensorRT] ERROR: Network must have at least one output
Traceback (most recent call last):
File "custommodel.py", line 218, in <module>
main()
File "custommodel.py", line 215, in main
batch_size=parsed['max_batch_size'])
File "/usr/src/tensorrt/samples/python/uff_ssd/utils/inference.py", line 70, in __init__
engine_utils.save_engine(self.trt_engine, trt_engine_path)
File "/usr/src/tensorrt/samples/python/uff_ssd/utils/engine.py", line 83, in save_engine
buf = engine.serialize()
AttributeError: 'NoneType' object has no attribute 'serialize'
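If I read the traceback right, the second error is just a consequence of the first: the UFF parser rejects the graph, so the engine build returns None and save_engine() then crashes on serialize(). Roughly this kind of guard in the build step (only a sketch; build_and_save_engine is my own name, the rest follows what the sample's engine.py helpers do) would fail earlier with a clearer message:

def build_and_save_engine(builder, network, trt_engine_path):
    # build_cuda_engine() returns None when the network could not be
    # parsed/built, which is what later triggers the 'NoneType' error.
    engine = builder.build_cuda_engine(network)
    if engine is None:
        raise RuntimeError("Engine build failed - see the UFF parser errors above")
    # Same serialization the sample does, just guarded.
    with open(trt_engine_path, "wb") as f:
        f.write(engine.serialize())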
I have uploaded my source code as well as my retrained model:
https://drive.google.com/open?id=1qMuaAnCnkUl6B8uSttSv5Cv3fX-cCJB1
I do have an input node named Cast, but I cannot remove it.
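From what I've found, newer TensorFlow exports put a Cast node where older ones had ToFloat, so the sample's graphsurgeon preprocessing never folds it into the Input placeholder and the UFF parser hits the unsupported _Cast op. I assume something like the following mapping is needed in the sample's preprocessing (utils/model.py in my copy, leaving the other mappings such as NMS and PriorBox as they are); this is only a sketch and the node names are guesses from my frozen graph:

import graphsurgeon as gs
import tensorflow as tf

# Input placeholder that replaces the preprocessing subgraph
# (NCHW, 300x300, as in the original uff_ssd sample).
Input = gs.create_plugin_node(
    name="Input",
    op="Placeholder",
    dtype=tf.float32,
    shape=[1, 3, 300, 300])

# Fold Cast (ToFloat in older exports) into the Input node so the
# unsupported _Cast op never reaches the UFF parser.
namespace_plugin_map = {
    "Preprocessor": Input,
    "ToFloat": Input,
    "Cast": Input,        # my retrained graph has Cast instead of ToFloat
    "image_tensor": Input,
}

def preprocess(dynamic_graph):
    # Collapse the mapped namespaces/nodes into the plugin nodes above.
    dynamic_graph.collapse_namespaces(namespace_plugin_map)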
Any idea what I can do?
Thanks.