Error running EmotionNet deployable model with DeepStream

Please provide complete information as applicable to your setup.

• Hardware Platform: Jetson
• DeepStream Version: 5.0
• JetPack Version: 4.4.1

I am trying to run the EmotionNet deployable model with DeepStream 5.0 on a Jetson Xavier.
I have the .etlt file and labels.txt file for EmotionNet, but I get the following error when trying to run it with DeepStream:
ERROR: ERROR: [TRT]: UffParser: Could not read buffer.
parseModel: Failed to parse UFF model

I am attaching the config file that I am using.
Please help.
dstest3_pgie_config.txt (3.1 KB)
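For context, this is a minimal sketch of the `[property]` group I would expect in a pgie config for an encrypted TLT model; the file paths, key, and mode values below are assumptions for illustration, not taken from the attached file:

```
[property]
gpu-id=0
# Encrypted TLT model and its encoding key (paths/key are assumed here)
tlt-encoded-model=./model.etlt
tlt-model-key=nvidia_tlt
labelfile-path=./labels.txt
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
# 1 = classifier (EmotionNet is a classification network)
network-type=1
```

If `tlt-encoded-model`/`tlt-model-key` are missing and a `model-file` (UFF) path is used instead, nvinfer will try to parse the .etlt as a plain UFF buffer, which can produce a "Could not read buffer" error.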

@Morganh @pcjaked

Now I am trying to convert the model file using tlt-converter with the following command:

./tlt-converter -k nvidia_tlt -d 1,136,1 ./model.etlt -t fp16 -e model.trt

and I get the following error:

[ERROR] UffParser: Could not parse MetaGraph from /tmp/fileAFnIO5
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct
[ERROR] Network must have at least one output
[ERROR] Network validation failed.
[ERROR] Unable to create engine
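The "Network must have at least one output" line suggests no output node was given to the converter; if the encoding key itself is correct, a sketch of the invocation with an explicit `-o` flag and the .etlt file as the trailing positional argument might look like this (the output node name is a placeholder assumption, check the model card for the real name):

```
# <output_node_name> is a placeholder, not taken from the source
./tlt-converter -k nvidia_tlt \
                -d 1,136,1 \
                -o <output_node_name> \
                -t fp16 \
                -e model.trt \
                ./model.etlt
```

If the key is wrong, the "Could not parse MetaGraph" and "check the encoding key" errors will persist regardless of the other flags.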
@Morganh @pcjaked

Moving this topic into the TLT forum.

Can you refer to the NVIDIA TAO documentation for deployment?