Trt_yolo_app

Hi all,
I am using TensorRT 6.0.1.5 and I tested the trt-yolo app (https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps/tree/restructure/yolo/apps/trt-yolo) on Ubuntu 16.04 and on a Jetson Nano, and it works perfectly on both.
On Windows with CUDA 10.1, TensorRT 6.0.1.5, and cuDNN 7.6 it works up to serialization: it creates the engine, but when it tries to deserialize the cached engine it gives the error below:
Can someone help?

Loading TRT Engine…
ERROR: C:\source\rtSafe\coreReadArchive.cpp (55) - Serialization Error in nvinfer1::rt::CoreReadArchive::verifyHeader: 0 (Length in header does not match remaining archive length)
ERROR: INVALID_STATE: Unknown exception
ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.

I tried different versions of TensorRT (5.1, 5.0) but got the same result. TensorRT is built against the same CUDA version (10.1) that is installed on the machine.

Thanks in advance
If I should post this error in another thread, please guide me, as this is very urgent for me!


Hi again,

Solved. First of all, kudos to “joestump”. It now works like a charm. In case anyone has the same problem, the solution is:

in loadTRTEngine:
std::ifstream cache(planFilePath); → std::ifstream cache(planFilePath, std::ios::binary | std::ios::in);

and

in writePlanFileToDisk:
outFile.open(m_EnginePath); → outFile.open(m_EnginePath, std::ios::binary | std::ios::out);

plus

in loadWeights
std::ifstream file(weightsFilePath, std::ios_base::binary); → std::ifstream file(weightsFilePath, std::ios_base::binary | std::ios::in);

Greetings


I was getting the exact same error while loading a saved engine, and the following change provided by @kkourkounis fixed the issue.
Thank you very much.

in loadTRTEngine:
std::ifstream cache(planFilePath); → std::ifstream cache(planFilePath, std::ios::binary | std::ios::in);

Environment

TensorRT Version: TensorRT-7.1.3.4
CUDA Version: CUDA 11.0
cuDNN Version: cudnn-v8.0.2.39
Operating System + Version: Windows 10 64-bit

It works. Thank you very much.