Error while using tlt-infer

I am trying to perform inference with the unpruned model that I have retrained, but I get the following error whenever I run tlt-infer. The paths are correct and the model resides in the specified directory.

Command used -
!tlt-infer detectnet_v2 -i $USER_EXPERIMENT_DIR/data/testing/image_2 \
    -o $USER_EXPERIMENT_DIR/tlt_infer_testing \
    -m $USER_EXPERIMENT_DIR/experiment_dir_unpruned/weights/resnet18_detector.tlt \
    -cp $SPECS_DIR/detectnet_v2_clusterfile_kitti.json \
    -k $KEY \
    --kitti_dump \
    -lw 3 \
    -g 0 \
    -bs 64

Output -

Using TensorFlow backend.
2020-01-06 05:45:36,344 [INFO] iva.detectnet_v2.scripts.inference: Overlain images will not be saved in the output path.
2020-01-06 05:45:36,344 [INFO] iva.detectnet_v2.inferencer.build_inferencer: Constructing inferencer
2020-01-06 05:45:36.345145: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-01-06 05:45:36.436650: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:998] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-06 05:45:36.437045: I tensorflow/compiler/xla/service/service.cc:150] XLA service 0x6b25bf0 executing computations on platform CUDA. Devices:
2020-01-06 05:45:36.437063: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (0): GeForce GTX 1080 Ti, Compute Capability 6.1
2020-01-06 05:45:36.458594: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 4008000000 Hz
2020-01-06 05:45:36.459101: I tensorflow/compiler/xla/service/service.cc:150] XLA service 0x6c41220 executing computations on platform Host. Devices:
2020-01-06 05:45:36.459117: I tensorflow/compiler/xla/service/service.cc:158] StreamExecutor device (0): <undefined>, <undefined>
2020-01-06 05:45:36.459194: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1433] Found device 0 with properties:
name: GeForce GTX 1080 Ti major: 6 minor: 1 memoryClockRate(GHz): 1.582
pciBusID: 0000:01:00.0
totalMemory: 10.91GiB freeMemory: 10.22GiB
2020-01-06 05:45:36.459208: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1512] Adding visible gpu devices: 0
2020-01-06 05:45:36.459649: I tensorflow/core/common_runtime/gpu/gpu_device.cc:984] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-01-06 05:45:36.459661: I tensorflow/core/common_runtime/gpu/gpu_device.cc:990] 0
2020-01-06 05:45:36.459667: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1003] 0: N
2020-01-06 05:45:36.459715: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9939 MB memory) -> physical GPU (device: 0, name: GeForce GTX 1080 Ti, pci bus id: 0000:01:00.0, compute capability: 6.1)
2020-01-06 05:45:36,460 [INFO] iva.detectnet_v2.inferencer.tlt_inferencer: Loading model from /workspace/tlt-experiments/experiment_dir_unpruned/weights/resnet18_detector.tlt:
Traceback (most recent call last):
File "/usr/local/bin/tlt-infer", line 8, in <module>
sys.exit(main())
File "./common/magnet_infer.py", line 35, in main
File "./detectnet_v2/scripts/inference.py", line 222, in main
File "./detectnet_v2/scripts/inference.py", line 147, in inference_wrapper_batch
File "./detectnet_v2/inferencer/tlt_inferencer.py", line 77, in network_init
File "./detectnet_v2/model/utilities.py", line 96, in model_io
File "./common/utils.py", line 154, in decode_to_keras
IOError: Invalid decryption. Unable to open file (File signature not found)

This error most likely results from your key.

Please set $KEY explicitly in the command and retry.
Please also make sure the key is exactly the same key that was used for tlt-train.

Please also refer to the topics below.
https://devtalk.nvidia.com/default/topic/1068599/transfer-learning-toolkit/tlt-prune-error-argument-k-key-expected-one-argument/1

https://devtalk.nvidia.com/default/topic/1063996/transfer-learning-toolkit/tlt-prune-error-ioerror-invalid-decryption-unable-to-open-file-file-signature-not-found-/post/5390072/#5390072

https://devtalk.nvidia.com/default/topic/1067915/transfer-learning-toolkit/tlt-infer-with-ssd-fails-with-ioerror-unable-to-open-file-file-signature-not-found-/post/5409323/#5409323

https://devtalk.nvidia.com/default/topic/1064436/transfer-learning-toolkit/ioerror-invalid-decryption-unable-to-open-file-file-signature-not-found-tlt-prune-command/post/5410651/#5410651