Provide details on the platforms you are using:
O/S: Windows 10
GPU type: 1080
nvidia driver version:
CUDA version: N/A
CUDNN version: N/A
TensorRT version: 5.1.5.0
Describe the problem
When trying to load an ONNX file exported from CNTK (see link below) into TensorRT, the import fails on the first node (a Slice transforming 1x3x1024x1024 → 1x3x1024x512):
C:\Software\TensorRT-5.1.5.0\bin>sample_HeteroGenius.exe
&&&& RUNNING TensorRT.sample_onnx_mnist # sample_HeteroGenius.exe
Input filename: …/data/HGdata/TISSUETYPE_8X_DSNET_12D.cntk2.7.onnx
ONNX IR version: 0.0.4
Opset version: 9
Producer name: CNTK
Producer version: 2.7
Domain: ai.cntk
Model version: 1
Doc string:
WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
While parsing node number 0 [Slice]:
ERROR: builtin_op_importers.cpp:2046 In function importSlice:
[4] Assertion failed: std::all_of(axes.begin(), axes.end(), [nbDims](int d)->bool{return d < nbDims;})
[E] Failure while parsing ONNX file
&&&& FAILED TensorRT.sample_onnx_mnist # sample_HeteroGenius.exe
Assertion failed: trtModelStream != nullptr, file c:\software\tensorrt-5.1.5.0\samples\sampleheterogenius\sampleonnxmnist.cpp, line 214
The ONNX file loads fine in Netron. It also loads and passes the checker with the onnx Python module (from Microsoft?):
model = onnx.load("TISSUETYPE_8X_DSNET_12D.cntk2.7.onnx")
onnx.checker.check_model(model)
Files