I get "Could not find 8.pgm" when I execute ./sample_onnx_mnist


jetson7@jetson7-desktop:/usr/src/tensorrt/bin$ ./sample_onnx_mnist
&&&& RUNNING TensorRT.sample_onnx_mnist # ./sample_onnx_mnist
[01/26/2021-13:20:18] [I] Building and running a GPU inference engine for Onnx MNIST

Input filename: …/data/mnist/mnist.onnx
ONNX IR version: 0.0.3
Opset version: 8
Producer name: CNTK
Producer version: 2.5.1
Domain: ai.cntk
Model version: 1
Doc string:

[01/26/2021-13:20:22] [W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[01/26/2021-13:20:39] [I] [TRT] Detected 1 inputs and 1 output network tensors.
Could not find 8.pgm in data directories:

What am I supposed to do to provide the PGM files the sample is looking for?
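For context: sample_onnx_mnist loads handwritten-digit test images (0.pgm through 9.pgm) from its data directories, and my installation appears to be missing them. Some TensorRT installs ship a fetch script for these (a path like /usr/src/tensorrt/data/mnist/download_pgms.py — that exact path is an assumption on my part). As a stopgap just to see the pipeline run, I understand a placeholder could be generated by hand, assuming the sample accepts any valid 28×28 binary (P5) PGM:

```python
# Sketch: write a placeholder 28x28 grayscale PGM (binary "P5") named 8.pgm.
# Assumption: the sample accepts any valid 28x28 P5 image; a real MNIST digit
# from the dataset would be needed for a meaningful classification result.
width, height = 28, 28
pixels = bytes([0] * (width * height))  # all-black image

with open("8.pgm", "wb") as f:
    # PGM header: magic number, dimensions, max gray value
    f.write(b"P5\n%d %d\n255\n" % (width, height))
    f.write(pixels)
```

The file would then need to go into one of the data directories the sample searches (or be pointed to with the sample's -d/--datadir option). Is that the right approach, or is there an official way to get the real MNIST PGMs?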

Thank you.