Hi team,
I successfully converted the model to TensorRT on JetPack 4.3.
But since I flashed my device to JetPack 4.4, I am neither able to convert that model to TensorRT nor able to install onnxruntime.
Any suggestion?
Thanks.
Hi,
Do you get any error when converting the model into TensorRT?
If yes, would you mind sharing the error with us?
For onnxruntime installation, please check this comment:
Hi,
We can fix this onnxruntime build issue with this update:
diff --git a/onnxruntime/core/providers/cuda/rnn/cudnn_rnn_base.h b/onnxruntime/core/providers/cuda/rnn/cudnn_rnn_base.h
index 5281904a2..75131db39 100644
--- a/onnxruntime/core/providers/cuda/rnn/cudnn_rnn_base.h
+++ b/onnxruntime/core/providers/cuda/rnn/cudnn_rnn_base.h
@@ -42,16 +42,16 @@ class CudnnRNN {
if (!cudnn_rnn_desc_)
CUDNN_RETURN_IF_ERROR(cudnnCreateRNNDescriptor(&cudnn_rnn_desc_));
- CUDNN_RETURN_IF_ERROR(…
Thanks.
Hi @AastaLLL ,
Thanks for the response. I am using the ONNX parser. The error while converting is:
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
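For context, this AttributeError usually means the engine build itself failed: in the TensorRT Python API, `builder.build_cuda_engine(network)` returns `None` when the build fails (for example, after a parser error), and the subsequent `create_execution_context()` call then raises exactly this error. A minimal sketch of a guard that surfaces the real failure (the helper name `create_context` is illustrative, not from the original code):

```python
# Minimal sketch: guard against a failed engine build before creating
# an execution context. In the TensorRT Python API, a failed build
# yields engine == None, which later raises the AttributeError above.
def create_context(engine):
    if engine is None:
        raise RuntimeError(
            "engine build returned None; check parser/builder error logs")
    return engine.create_execution_context()

# Demonstrate the guard with a failed build (engine is None):
try:
    create_context(None)
except RuntimeError as err:
    print(err)
```

With this guard in place, a parser or builder failure is reported directly instead of surfacing later as a confusing `NoneType` AttributeError.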
Hi,
Could you check if this comment helps first?
Hi,
The issue seems to be due to the “EXPLICIT_BATCH” setting in the code.
In TRT 7, the ONNX parser supports full-dimensions mode only, so your network definition must be created with the explicitBatch flag set when using the ONNX parser.
Since you are using TRT 6, please replace it with the code below:
with trt.Builder(TRT_LOGGER) as builder, builder.create_network() as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
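For reference, on TRT 7 the difference is that `create_network()` takes an explicit-batch flag bitmask. A sketch of how the TRT 7 Python samples compute that bitmask, using a stand-in enum here since the real flag lives in the `tensorrt` module (assumption: `EXPLICIT_BATCH` is flag bit 0, as in the TRT 7 samples):

```python
from enum import IntEnum

# Stand-in for tensorrt.NetworkDefinitionCreationFlag, to illustrate
# how the explicit-batch bitmask is derived from the flag's bit index.
class NetworkDefinitionCreationFlag(IntEnum):
    EXPLICIT_BATCH = 0

# TRT 7 samples build the flags value like this before calling
# builder.create_network(EXPLICIT_BATCH):
EXPLICIT_BATCH = 1 << int(NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
print(EXPLICIT_BATCH)
```

So on TRT 7 the network would be created with `builder.create_network(EXPLICIT_BATCH)`, while the TRT 6 line above simply omits the flag.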
I tested on both TRT 6 (After code changes) and TRT 7 (without changes), it seems to b…
Thanks.