Cannot run inference with fpenet with TensorRT 8.0

After checking, the previous docker image works: nvcr.io/nvidia/tlt-streamanalytics:v3.0-dp-py3

Additionally, for the current version, can you run inference the official way with DeepStream? See the NVIDIA-AI-IOT/deepstream_tao_apps repository (sample apps demonstrating how to deploy models trained with TAO on DeepStream), and in particular its faciallandmark_sgie_config.txt. A rough sketch of such a secondary-GIE config follows below.
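
For orientation only, a secondary-GIE nvinfer config for the facial landmarks model generally looks like the minimal sketch below. This is not the repository's file: the model path, engine path, and model key are placeholders, and the exact input/output blob settings should be taken from faciallandmark_sgie_config.txt in the repo.

[property]
gpu-id=0
# TAO/TLT models ship encrypted; placeholder key, replace with the key used when exporting
tlt-model-key=nvidia_tlt
# Placeholder paths; point these at the downloaded fpenet .etlt and the generated engine
tlt-encoded-model=./models/faciallandmark/model.etlt
model-engine-file=./models/faciallandmark/model.etlt_b1_gpu0_fp16.engine
batch-size=1
# 2 = FP16 precision
network-mode=2
# 2 = secondary mode, operating on face objects from the primary detector
process-mode=2
gie-unique-id=2
operate-on-gie-id=1
# 100 = custom/other network type; raw output tensors are attached as metadata
network-type=100
output-tensor-meta=1

The key points are that the model runs as a secondary GIE on the faces found by the primary detector, and that network-type=100 with output-tensor-meta=1 exposes the raw landmark tensors for the app's post-processing; the remaining values above are assumptions to be checked against the repo's config.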