Can't load InsightFace ONNX models on TX2 using TensorRT 4.1.3

Hi,

I am using InsightFace (https://github.com/deepinsight/insightface) on Windows with TensorFlow, and now I want to run it on a TX2.

I converted the face-detect model and the gender-age model with the TensorFlow-to-ONNX converter from https://github.com/onnx/tensorflow-onnx, using the following commands:

python -m tf2onnx.convert --graphdef ./saved_model.pb --output ./frozen.onnx --fold_const --inputs pnet/input:0,rnet/input:0,onet/input:0 --outputs pnet/conv4-2/BiasAdd:0,pnet/prob1:0,rnet/conv5-2/conv5-2:0,rnet/prob1:0,onet/conv6-3/conv6-3:0,onet/prob1:0
python -m tf2onnx.convert --graphdef ./AGE_GENDER.pb --output ./frozen.onnx --fold_const --inputs data:0 --outputs output/BiasAdd:0
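
For reference, the exported graphs can be inspected with the onnx Python package to see the input shapes the converter recorded. This is only a minimal sketch (the file name matches the --output argument above):

import onnx

# Load the exported graph and run the basic checker.
model = onnx.load("./frozen.onnx")
onnx.checker.check_model(model)

# Print every graph input with the dimensions recorded by tf2onnx.
# Dimensions that show up as 0 or as symbolic names are not fully
# specified, which older TensorRT ONNX parsers cannot handle.
for inp in model.graph.input:
    dims = [d.dim_value if d.HasField("dim_value") else d.dim_param
            for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)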

Then I load the ONNX models on my TX2. The face-detect model gives the following error:

[2019-09-11 02:41:05   ERROR] Parameter check failed at: ../builder/Network.cpp::addInput::364, condition: isValidDims(dims)
rtspframebuffer: onnx/converterToTRT.h:211: nvinfer1::ITensor* nvonnxparser::Converter::convert_input(std::string): Assertion `input_tensor' failed.

and the gender-age model gives the following error:

[2019-09-11 02:41:47   ERROR] Parameter check failed at: ../builder/Network.cpp::addScale::113, condition: scale.count == 0 || scale.count == weightCount
rtspframebuffer: onnx/converterToTRT.h:156: nvonnxparser::TRT_LayerOrWeights nvonnxparser::Converter::convert_node(const onnx::NodeProto*): Assertion `layer' failed.
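
The first error looks like the parser is rejecting the network inputs because their dimensions are not fully specified (the pnet/rnet/onet inputs have dynamic height and width in the original MTCNN graph). If that is the cause, I guess something like the following could pin the shapes in the exported file before parsing, but I have not verified it, and the input names and sizes here are just assumptions:

import onnx

model = onnx.load("./frozen.onnx")

# Assumed fixed shapes for illustration only; the real values depend on
# how the detector is run (the MTCNN nets normally take variable sizes).
fixed_shapes = {
    "pnet/input:0": [1, 12, 12, 3],
    "rnet/input:0": [1, 24, 24, 3],
    "onet/input:0": [1, 48, 48, 3],
}

for inp in model.graph.input:
    if inp.name in fixed_shapes:
        dims = inp.type.tensor_type.shape.dim
        for d, value in zip(dims, fixed_shapes[inp.name]):
            d.dim_value = value  # overwrite dynamic/unknown dims

onnx.save(model, "./frozen_fixed.onnx")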

Does anyone know how to solve these errors?

Thanks.