Unable to build/run jetson-inference application via Docker on NVIDIA Jetson AGX Xavier

Dear NVIDIA Team,

We are unable to build/run the jetson-inference application via Docker on our NVIDIA Jetson AGX Xavier and are getting the error below:

Environment:
Jetson AGX Xavier
JetPack: 4.6
CUDA Version: cuda_10.2_r440
Operating System + Version: Ubuntu 18.04.6 LTS
TensorRT Version: 8.0.1-1+cuda10.2
Python Version (if applicable): 3.6.9
Package: dusty-nv/jetson-inference

After starting the Docker container from the jetson-inference repository with:
docker/run.sh
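
For context, the standard setup from the jetson-inference README looks roughly like this (shown only for reference; default clone location assumed, not necessarily verbatim what we ran):

# clone the repository and start the prebuilt container
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
docker/run.sh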

we get the following error when running the imagenet sample:
/jetson-inference/build/aarch64/bin# ./imagenet images/jellyfish.jpg images/test/jellyfish.jpg
./imagenet: error while loading shared libraries: /usr/lib/aarch64-linux-gnu/libnvinfer.so.8: file too short
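
The "file too short" message from the dynamic loader usually indicates that the library file is truncated or zero bytes. If it helps with the diagnosis, we can run checks along these lines (both on the host and inside the container) and share the output; the path is taken from the error message and package names may differ:

# check the size and symlink targets of the TensorRT runtime library
ls -l /usr/lib/aarch64-linux-gnu/libnvinfer.so.8*
# list the installed TensorRT packages
dpkg -l | grep -i nvinfer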

Please let us know how to resolve this issue.