Deepstream_pose_estimation fails to run after compilation

Hardware Platform : Jetson Nano
DeepStream Version : 5.1
JetPack Version : 4.5.1

Hello,
I want to test deepstream_pose_estimation.
I followed the guide at GitHub - NVIDIA-AI-IOT/deepstream_pose_estimation (a sample DeepStream application demonstrating a human pose estimation pipeline), but I ran into a problem when running the app after compilation.

I flashed JetPack 4.5.1 and installed DeepStream 5.1 normally.
Then I compiled and ran deepstream_pose_estimation, but I encountered the error below.
Could someone please help?

The following are the steps I took, from installing DeepStream 5.1 to running deepstream_pose_estimation.

$ sudo apt-get install \
    libssl1.0.0 \
    libgstreamer1.0-0 \
    gstreamer1.0-tools \
    libgstreamer-plugins-base1.0-dev \
    libgstreamer1.0-dev \
    gstreamer1.0-plugins-good \
    gstreamer1.0-plugins-bad \
    gstreamer1.0-plugins-ugly \
    gstreamer1.0-libav \
    gstreamer1.0-alsa \
    libgstrtspserver-1.0-0 \
    libgstrtspserver-1.0-dev \
    libjansson4 \
    libjson-glib-dev \
    libx11-dev

$ sudo tar xvf deepstream_sdk_v5.1.0_jetson.tbz2 -C /
$ cd /opt/nvidia/deepstream/deepstream-5.1
$ sudo ./install.sh
$ sudo ldconfig
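To confirm the SDK installed correctly before building the sample, the DeepStream version can be queried (this assumes the binaries are on the PATH after install.sh; it requires a Jetson with DeepStream installed):

```shell
# Print the DeepStream SDK and dependency versions.
# Only works on a device where DeepStream 5.1 is installed.
deepstream-app --version-all
```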

deepstream_pose_estimation compiled successfully, but when I ran it I got the following error:

$ sudo ./deepstream-pose-estimation-app …/…/…/…/samples/streams/sample_1080p_h264.mp4 ./
Now playing: …/…/…/…/samples/streams/sample_1080p_h264.mp4
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Opening in BLOCKING MODE
ERROR: Deserialize engine failed because file path: /home/robopia/deepstream_sdk_v5.1.0_jetson/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream_pose_estimation/pose_estimation.onnx_b1_gpu0_fp16.engine open error
0:00:06.576832978 31873 0x55981a34f0 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1691> [UID = 1]: deserialize engine from file :/home/robopia/deepstream_sdk_v5.1.0_jetson/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream_pose_estimation/pose_estimation.onnx_b1_gpu0_fp16.engine failed
0:00:06.576938606 31873 0x55981a34f0 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1798> [UID = 1]: deserialize backend context from engine from file :/home/robopia/deepstream_sdk_v5.1.0_jetson/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream_pose_estimation/pose_estimation.onnx_b1_gpu0_fp16.engine failed, try rebuild
0:00:06.576971940 31873 0x55981a34f0 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1716> [UID = 1]: Trying to create engine from model files

Input filename: /home/robopia/deepstream_sdk_v5.1.0_jetson/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream_pose_estimation/pose_estimation.onnx
ONNX IR version: 0.0.4
Opset version: 7
Producer name: pytorch
Producer version: 1.3
Domain:
Model version: 0
Doc string:

WARNING: [TRT]: onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
Killed

Hi,

"Killed" usually indicates an out-of-memory (OOM) error.
Please monitor memory usage on your device with tegrastats while the engine is being built:

$ sudo tegrastats
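If tegrastats shows RAM running out while TensorRT builds the engine, a common workaround on the 4 GB Nano is to add a temporary swap file so the build can complete. A sketch (the 4 GB size and /swapfile path are only examples, not requirements):

```shell
# Create and enable a swap file (example size and path).
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Verify the swap space is active.
free -h
```

Once the engine file has been serialized to disk, subsequent runs deserialize it directly and need far less memory.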

Thanks.