Failed to parse onnx file in TensorRT-5.0.4.3

Here is the output from the terminal:

ONNX IR version: 0.0.4
Opset version: 8
Producer name: OnnxMLTools
Producer version: 1.3.2
Domain: onnxml
Model version: 0
Doc string:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
While parsing node number 3 [Pad -> "legacy_padded_tensor"]:
--- Begin node ---
input: "eltwise1"
output: "legacy_padded_tensor"
name: "Pad"
op_type: "Pad"
attribute {
  name: "pads"
  ints: 0
  ints: 0
  ints: 0
  ints: 0
  ints: 0
  ints: 0
  ints: 0
  ints: 0
  type: INTS
}
attribute {
  name: "value"
  f: -3.4028235e+38
  type: FLOAT
}
domain: ""

--- End node ---
ERROR: C:\p4sw\sw\gpgpu\MachineLearning\DIT\release\5.0\parsers\onnxOpenSource\builtin_op_importers.cpp:1116 In function importPad:
[8] Assertion failed: mode == "constant" && value == 0
ERROR: failed to parse onnx file

I am using an ONNX model converted from a Caffe model. (TensorRT doesn't support the Caffe Slice layer, so I couldn't use the Caffe model directly.)

Original Caffe model proto: Conv -> Slice -> Eltwise -> Pooling -> …
Using onnxmltools, it was converted to: Conv -> Split -> Max -> Pad -> MaxPool -> …
When the parser reaches the Pad layer, the error above occurs.
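The assertion in `importPad` shows that the TensorRT 5 ONNX parser only accepts constant zero padding, while onnxmltools emitted a Pad node with `value` = -3.4028235e+38 (roughly `-FLT_MAX`, a conventional fill value ahead of MaxPool). Since every entry of `pads` is 0 in the log, the node pads nothing at all, so one possible workaround is to strip it from the graph and rewire its consumer before handing the model to TensorRT. A minimal sketch of that rewiring logic, using plain dicts in place of ONNX protobuf nodes (the helper `remove_noop_pads` and the `split1`/`split2`/`pool1` tensor names are illustrative; `eltwise1` and `legacy_padded_tensor` come from the log):

```python
def remove_noop_pads(nodes):
    """Drop Pad nodes whose 'pads' attribute is all zeros and
    rewire their consumers to read the Pad's input directly."""
    kept, rename = [], {}
    for node in nodes:
        if node["op_type"] == "Pad" and not any(node["attrs"].get("pads", [])):
            # A zero-pad node is a no-op: map its output name back to its input.
            rename[node["output"][0]] = node["input"][0]
        else:
            kept.append(node)
    # Redirect any consumer that referenced a removed Pad's output.
    for node in kept:
        node["input"] = [rename.get(name, name) for name in node["input"]]
    return kept

# The converted graph fragment from the log: ... -> Max -> Pad -> MaxPool -> ...
graph = [
    {"op_type": "Max", "input": ["split1", "split2"],
     "output": ["eltwise1"], "attrs": {}},
    {"op_type": "Pad", "input": ["eltwise1"],
     "output": ["legacy_padded_tensor"],
     "attrs": {"pads": [0] * 8, "value": -3.4028235e+38}},
    {"op_type": "MaxPool", "input": ["legacy_padded_tensor"],
     "output": ["pool1"], "attrs": {}},
]
graph = remove_noop_pads(graph)
# The Pad node is gone and MaxPool now reads "eltwise1" directly.
```

The same idea can be applied to the real model with the `onnx` Python package by iterating over `model.graph.node`, deleting the all-zero Pad, and renaming the downstream input before re-serializing the file.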

Any suggestions would be appreciated.