Magic tag assertion failed! Deserialization of engine failed!

Description

I am trying to use TensorRT for an instance segmentation application. For this, I saved the built engine, and when I subsequently try to load the saved .engine file it says "Deserialization Failed. Internal error. Magic Tag assertion failed."

Interestingly, if I do not save the engine first but instead build and run inference in one go, it does not throw any error.
For the record, my environment is the same while building and saving the engine as while running inference.

Environment

JetPack 4.6 on Xavier NX with CUDA 10.2, Ubuntu 18.04, TensorFlow 1.15, Python 3.6, and TensorRT 8.2

Hi,

Would you mind sharing the model so we can reproduce this issue in our environment?
Thanks.

Hey, sorry for the late reply.

By model, do you mean the .engine file?
Here it is - mowito_vision_engine.engine - Google Drive

P.S. I was unable to upload it here directly.

Hi,

Could you share the original model file? Ex. ONNX model.
Thanks.

Please find the UFF and pbtxt files for the model here:
pbtxt file
uff file

Hi,

Do you use our uff-based MaskRCNN sample?

/usr/src/tensorrt/samples/sampleUffMaskRCNN/

If not, would you mind sharing the steps/source that you serialize and deserialize the TensorRT engine?
Also, do you use TensorRT v8.0, the default version in JetPack 4.6?

Thanks.

Hi,

Yes, we use the UFF-based MaskRCNN sample, but modified to suit our use case.
We actually built the sample as a library and use it in a ROS node.

We use TensorRT v8.2.0.6.

    if (this->bBuildEngine)
    {
        this->bBuildEngine = false;

        // Serialize the network into an engine plan.
        SampleUniquePtr<nvinfer1::IHostMemory> plan{builder->buildSerializedNetwork(*network, *config)};
        if (!plan)
        {
            return false;
        }

        // Write the plan to disk (binary mode, so no byte translation occurs).
        std::ofstream p("/home/mowito/kewal/TensorRT/build/out/mowito_vision_engine.engine", std::ios::binary);
        p.write((const char*)plan->data(), plan->size());
        p.close();

        SampleUniquePtr<IRuntime> runtime{createInferRuntime(sample::gLogger.getTRTLogger())};
        if (!runtime)
        {
            return false;
        }

        mEngine = std::shared_ptr<nvinfer1::ICudaEngine>(
            runtime->deserializeCudaEngine(plan->data(), plan->size()), samplesCommon::InferDeleter());
    }
    else
    {
        // Load the previously saved engine from disk.
        char* ch = (char*)malloc(102604336);
        std::ifstream p("/home/mowito/kewal/Tensorrt/build/out/mowito_vision_engine.engine", std::ios::binary);
        p.read(ch, 102604336);
        p.close();

        SampleUniquePtr<IRuntime> runtime{createInferRuntime(sample::gLogger.getTRTLogger())};
        if (!runtime)
        {
            return false;
        }

        mEngine = std::shared_ptr<nvinfer1::ICudaEngine>(
            runtime->deserializeCudaEngine((void*)ch, (std::size_t)102604336), samplesCommon::InferDeleter());
    }
Please let me know if this helps. This is the modified code, adapted from the UFF MaskRCNN sample.

Hi,

May I know how you installed the TensorRT v8.2 package?

Please note that the latest TensorRT version for Jetson is 8.0 (JetPack 4.6).
Installing other packages may lead to unexpected behavior.

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.