Error when customizing the deepstream_tlt_apps

Hi, I was running the sample app in this git repo. I ran the sample config with the given pretrained model on Xavier successfully. But when I moved my custom TLT 2.0 trained model into the sample app and ran the same command

dewei@dewei-desktop:~/deepstream_tlt_apps$ ./deepstream-custom -c pgie_yolo_tlt_config.txt -i /home/dewei/deepstream_tlt_apps/outfile.h264 -b 1 -d
Now playing: pgie_yolo_tlt_config.txt

with my config and model, I got stuck on the following error message:

Using winsys: x11 
Opening in BLOCKING MODE 
0:00:00.211150899 24086   0x55876bd4c0 INFO                 nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files
ERROR: [TRT]: UffParser: Could not read buffer.
parseModel: Failed to parse UFF model
ERROR: failed to build network since parsing model errors.
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:01.526612825 24086   0x55876bd4c0 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed
Bus error (core dumped)

I attached my config and labels below: pgie_yolo_tlt_config.txt (2.6 KB) and yolo_labels.txt (136 Bytes). Thanks.

I cannot upload my custom TLT trained model because the forum system does not allow it.

Please try again with your own key. The key below should be the one you used during training:

tlt-model-key=nvidia_tlt
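For reference, this is roughly how the key fits into the nvinfer config. A minimal sketch, assuming a YOLO .etlt model; the file paths here are placeholders, not the actual paths from the attached config:

```ini
[property]
# Must match the key passed to tlt-export when the .etlt was created.
# A mismatch makes the UFF parser fail with "Could not read buffer".
tlt-model-key=nvidia_tlt
# Encrypted model exported from TLT (placeholder path)
tlt-encoded-model=./models/yolo/yolov3_resnet18.etlt
```

If the key does not match the one used at export time, DeepStream cannot decrypt the model, which is why the failure shows up as a parsing error rather than a key error.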

Hi! I hit the same error:

ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:31 [TRT]: UffParser: Could not read buffer.
parseModel: Failed to parse UFF model
ERROR: tlt/tlt_decode.cpp:274 failed to build network since parsing model errors.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:797 Failed to create network using custom network creation function
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:862 Failed to get cuda engine from custom library API
0:00:00.545433637 22197 0x557ef7243780 ERROR nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed

I set my API key in the config file:
tlt-model-key=bTVydDMwZzRlNjgz####################0OGI4LTk3OGItZmVjZGNlZDQxOWU3

Your error seems similar to the one in the TLT-deepstream sample app error thread.
Please check if that helps.