Failure converting yolov3.weights to yolov3.onnx

Hi:
I followed the method described in the yolov3_onnx sample of the TensorRT-5.1.5.0 SDK: I installed the onnx-tensorrt module, downloaded yolov3.weights from the darknet site, and ran "python yolov3_to_onnx.py" to convert it to ONNX format, but the Python script reports the errors below:

Traceback (most recent call last):
  File "yolov3_to_onnx.py", line 812, in <module>
    main()
  File "yolov3_to_onnx.py", line 805, in main
    onnx.checker.check_model(yolov3_model_def)
  File "/usr/local/lib/python2.7/dist-packages/onnx/checker.py", line 82, in check_model
    C.check_model(model.SerializeToString())
onnx.onnx_cpp2py_export.checker.ValidationError: Input size 2 not in range [min=1, max=1].

==> Context: Bad node spec: input: "085_convolutional_lrelu" input: "086_upsample_scale" output: "086_upsample" name: "086_upsample" op_type: "Upsample" attribute { name: "mode" s: "nearest" type: STRING }

I have already installed the onnx-tensorrt module, so why are the Upsample and LeakyReLU ops reported as invalid?

Below is my system's "pip list" output:

Package Version
---------------
absl-py 0.7.0
appdirs 1.4.3
asn1crypto 0.24.0
astor 0.7.1
atomicwrites 1.3.0
attrs 19.1.0
backports.weakref 1.0.post1
configparser 3.7.4
contextlib2 0.5.5
cryptography 2.1.4
decorator 4.4.0
enum34 1.1.6
funcsigs 1.0.2
futures 3.2.0
gast 0.2.2
graphsurgeon 0.3.2
grpcio 1.19.0
h5py 2.9.0
idna 2.6
importlib-metadata 0.19
ipaddress 1.0.17
Keras-Applications 1.0.7
Keras-Preprocessing 1.0.9
keyring 10.6.0
keyrings.alt 3.0
Mako 1.0.14
Markdown 3.0.1
MarkupSafe 1.1.1
mock 2.0.0
more-itertools 5.0.0
numpy 1.16.4
olefile 0.45.1
onnx 1.2.2
onnx-tensorrt 0.1.0
packaging 19.1
pathlib2 2.3.4
pbr 5.1.3
Pillow 6.1.0
pip 19.2.1
pluggy 0.12.0
protobuf 3.9.0
py 1.8.0
pycrypto 2.6.1
pycuda 2019.1.1
pycurl 7.43.0.1
pygobject 3.26.1
pyliblzma 0.5.3
pyparsing 2.4.2
pysqlite 1.0.1
pytest 4.6.4
pytools 2019.1.1
pyxdg 0.25
rpm 4.14.1
scandir 1.10.0
SecretStorage 2.3.1
setuptools 41.0.1
six 1.12.0
tensorboard 1.13.0
tensorflow-estimator 1.13.0
tensorflow-gpu 1.13.1
tensorrt 5.1.5.0
termcolor 1.1.0
typing 3.7.4
typing-extensions 3.7.4
uff 0.5.5
urlgrabber 3.10.2
wcwidth 0.1.7
Werkzeug 0.14.1
wget 3.2
wheel 0.33.1
yum-metadata-parser 1.1.4
zipp 0.5.2

OK, I have solved the problem by myself: onnx has to be upgraded to v1.4.1. Why isn't the required onnx version stated clearly in the readme.md?
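For what it's worth, the cause seems to be that yolov3_to_onnx.py emits the opset-9 form of Upsample, where the scale factors are a second input tensor, while the checker shipped with onnx 1.2.2 only knows the older form with a single input and a scales attribute, hence "Input size 2 not in range [min=1, max=1]". Below is a minimal sketch of the two node forms (reusing the node names from the error message); it only builds the nodes, it is not the converter itself:

from onnx import helper

# Older (opset 7) form: one input, scale factors passed as an attribute.
# This is the only form the checker bundled with onnx 1.2.2 accepts.
upsample_opset7 = helper.make_node(
    "Upsample",
    inputs=["085_convolutional_lrelu"],
    outputs=["086_upsample"],
    name="086_upsample",
    mode="nearest",
    scales=[1.0, 1.0, 2.0, 2.0],
)

# Opset-9 form: the scales are a second *input* tensor. yolov3_to_onnx.py
# emits this form, so the onnx 1.2.2 checker rejects it; onnx 1.4.1 accepts it.
upsample_opset9 = helper.make_node(
    "Upsample",
    inputs=["085_convolutional_lrelu", "086_upsample_scale"],
    outputs=["086_upsample"],
    name="086_upsample",
    mode="nearest",
)

print(upsample_opset7)
print(upsample_opset9)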

https://github.com/Rapternmn/PyTorch-Onnx-Tensorrt

Check out this repo for a simplified conversion.

Installing onnx 1.5.0 may also solve your problem. Note that you also need to run this script with Python 2.7.
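As a quick sanity check (a hypothetical helper, not part of the TensorRT sample), you can verify the installed onnx release before running yolov3_to_onnx.py:

from pkg_resources import parse_version
import onnx

# 1.4.1 and 1.5.0 are both reported to work in this thread.
MINIMUM_ONNX = "1.4.1"
if parse_version(onnx.__version__) < parse_version(MINIMUM_ONNX):
    raise SystemExit("onnx %s found, need at least %s; run: pip install onnx==%s"
                     % (onnx.__version__, MINIMUM_ONNX, MINIMUM_ONNX))
print("onnx %s is OK" % onnx.__version__)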
Best,
Mary

Hi, the UFF and Caffe parsers have been deprecated from TensorRT 7 onwards, so please try the ONNX parser instead.
Please check the link below for the same.
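For reference, here is a minimal sketch of the ONNX-parser path, assuming the TensorRT 7 Python API and a yolov3.onnx produced by the sample (this is not the exact code from the sample):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)

with trt.Builder(TRT_LOGGER) as builder, \
        builder.create_network(EXPLICIT_BATCH) as network, \
        trt.OnnxParser(network, TRT_LOGGER) as parser:
    # Parse the ONNX model and report any parser errors.
    with open("yolov3.onnx", "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise SystemExit("failed to parse yolov3.onnx")
    # Build and serialize the engine.
    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 28  # 256 MiB workspace
    engine = builder.build_engine(network, config)
    with open("yolov3.trt", "wb") as f:
        f.write(engine.serialize())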

Thanks!