TensorRT model not loaded on Jetson Nano


I have been developing a project to estimate the age and gender of a person.
The project runs on a Jetson Nano, and its operating system is balenaOS.

I have been using the following image as the base image:

FROM bouwe/jetson-nano-l4t-cuda-cudnn-nvinfer-tensorrt-opencv:latest

where CUDA: 10.0, cuDNN: 7.3, TensorRT: 5.1.6, OpenCV: 4.1.1, tensorflow-gpu: 1.13.1.

To use TensorRT on the Jetson Nano, I converted the Keras models (.h5) to TensorRT models on my local PC, where the TensorFlow version is also 1.13.1.
When I run the project on my local PC, it works well and gives good results.
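For reference, here is a minimal sketch of what a get_frozen_graph-style loader looks like under TF 1.13 (the actual code in age_gender_trt_loader.py may differ; the pre-check for an empty or text file is my own addition, since an un-pulled Git LFS pointer or a truncated download also triggers the same protobuf DecodeError):

```python
def get_frozen_graph(pb_path):
    """Parse a frozen (TF-TRT) GraphDef from disk; run under TF 1.13."""
    import os
    import tensorflow as tf

    # An empty or text file (e.g. a Git LFS pointer that was never pulled)
    # also produces a protobuf DecodeError, so sanity-check the file first.
    if os.path.getsize(pb_path) == 0:
        raise IOError("empty model file: %s" % pb_path)

    graph_def = tf.GraphDef()
    with tf.gfile.GFile(pb_path, "rb") as f:
        # ParseFromString raises DecodeError if the bytes are corrupt
        # or are not actually a serialized GraphDef.
        graph_def.ParseFromString(f.read())
    return graph_def
```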

Then, when I pushed the project to a Balena Cloud application, updated the Jetson Nano board connected to that application, and ran the project, the TensorRT models could not be loaded.
The error is as follows:

File "/usr/src/app/source/age_gender/age_gender_detection.py", line 44, in __init__
    self.age_mdl = TRTPredictor(AGE_TRT_MODEL_PATH, sess_name='age')
  File "/usr/src/app/source/age_gender/age_gender_trt_loader.py", line 20, in __init__
    trt_graph = get_frozen_graph(trt_pb_path)
  File "/usr/src/app/source/age_gender/age_gender_trt_loader.py", line 12, in get_frozen_graph
  File "/root/.local/lib/python3.6/site-packages/google/protobuf/message.py", line 187, in ParseFromString
    return self.MergeFromString(serialized)
  File "/root/.local/lib/python3.6/site-packages/google/protobuf/internal/python_message.py", line 1130, in MergeFromString
    raise message_mod.DecodeError('Unexpected end-group tag.')
google.protobuf.message.DecodeError: Unexpected end-group tag.

Please help me!


A TensorRT model (engine) is not portable across platforms. A model serialized on your x86 PC cannot be deserialized on the Nano's GPU, so you should generate the TRT model on the Jetson device itself.
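Since you are on TF 1.13, the conversion can be redone directly on the Nano with the TF-TRT API (tensorflow.contrib.tensorrt). A minimal sketch, assuming you still have the plain frozen graph; the paths, output node names, and tuning values below are placeholders, not taken from your project:

```python
def build_trt_graph_on_device(frozen_pb_path, output_names, out_path):
    """Run ON the Jetson Nano: convert a plain frozen graph to TF-TRT."""
    import tensorflow as tf                      # TF 1.13.x, GPU build
    import tensorflow.contrib.tensorrt as trt    # TF-TRT (contrib in 1.x)

    # Load the device-independent frozen graph.
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(frozen_pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())

    # Rebuild the TRT-optimized graph for THIS device's GPU.
    trt_graph = trt.create_inference_graph(
        input_graph_def=graph_def,
        outputs=output_names,                # e.g. ["age/Softmax"] (placeholder)
        max_batch_size=1,
        max_workspace_size_bytes=1 << 28,    # keep modest on the Nano's 4 GB
        precision_mode="FP16",               # the Nano benefits from FP16
    )

    # Save the device-specific graph for the loader to consume.
    with tf.gfile.GFile(out_path, "wb") as f:
        f.write(trt_graph.SerializeToString())
```

You could run this once at container start-up on the device and cache the resulting .pb, so the Balena build itself stays platform-independent.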

Check here: