I am trying to use TensorRT for an instance segmentation application. I saved the built engine, but when I try to load the saved .engine file it fails with "Deserialization Failed. Internal error. Magic Tag assertion failed."
Interestingly, if I build the engine and run inference in one go, without saving it first, no error is thrown.
For the record, my environment is the same when building and saving the engine as when running inference.
Environment
JetPack 4.6 on Xavier NX with CUDA 10.2, Ubuntu 18.04, TensorFlow 1.15, Python 3.6, and TensorRT 8.2
If not, would you mind sharing the steps/source you use to serialize and deserialize the TensorRT engine?
Also, are you using TensorRT v8.0, the default version in JetPack 4.6?