I followed https://ngc.nvidia.com/catalog/models/nvidia:tlt_lpdnet to explore the License Plate Detection & Recognition models made available in the Transfer Learning Toolkit.
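The catalog page provides the pretrained LPDNet weights as .tlt files, which I downloaded with the NGC CLI roughly as follows (the version tag and destination folder below are placeholders; the real ones are whatever the catalog page lists):

# pull the pretrained LPDNet .tlt weights from NGC
# version tag "pruned_v1.0" and the --dest folder are placeholders from my setup
ngc registry model download-version "nvidia/tlt_lpdnet:pruned_v1.0" --dest ./lpdnet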
For inference, I used the TLT container (Transfer Learning Toolkit for Video Streaming Analytics | NVIDIA NGC).
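I launch the container with my experiment directory mounted, roughly like this (the image tag and host path below are placeholders for my own setup and may differ on yours):

# start the TLT container with the local experiment folder mounted at /workspace/tlt-experiments
# image tag and host path are placeholders; substitute your own
docker run --runtime=nvidia -it --rm \
    -v /path/to/tlt-experiments:/workspace/tlt-experiments \
    nvcr.io/nvidia/tlt-streamanalytics:v2.0_py3 /bin/bash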
tlt detectnet_v2 infer -e inference_spec.txt -o output_folder -i <image or image folder> -k nvidia_tlt
The above command is given in the Examples section of https://ngc.nvidia.com/catalog/models/nvidia:tlt_lpdnet, but it does not work in my setup, so I used the following command instead:
tlt-infer detectnet_v2 -e inference_spec.txt -o output_folder -i <image or image folder> -k nvidia_tlt
This tries to load a pruned ResNet-18 .tlt model, which cannot be found.
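For reference, my inference_spec.txt started from the detectnet_v2 sample spec, and its tlt_config block still points at that notebook's retrained model (only the relevant part is shown below; target_classes, image dimensions, bbox_handler_config and the other fields are omitted):

inferencer_config {
  # model handler config: this path comes from the detectnet_v2 sample spec, not from the LPDNet download
  tlt_config {
    model: "/workspace/tlt-experiments/detectnet_v2/experiment_dir_retrain/weights/resnet18_detector_pruned.tlt"
  }
}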
Output:
at least one NUMA node, so returning NUMA node zero
2021-03-06 07:34:02.673113: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1351] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 13748 MB memory) -> physical GPU (device: 0, name: Tesla T4, pci bus id: 0000:00:1e.0, compute capability: 7.5)
2021-03-06 07:34:02,673 [INFO] iva.detectnet_v2.inferencer.tlt_inferencer: Loading model from /workspace/tlt-experiments/detectnet_v2/experiment_dir_retrain/weights/resnet18_detector_pruned.tlt:
Traceback (most recent call last):
  File "/usr/local/bin/tlt-infer", line 8, in <module>
    sys.exit(main())
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/216c8b41e526c3295d3b802489ac2034/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/common/magnet_infer.py", line 54, in main
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/216c8b41e526c3295d3b802489ac2034/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 201, in main
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/216c8b41e526c3295d3b802489ac2034/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/inference.py", line 124, in inference_wrapper_batch
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/216c8b41e526c3295d3b802489ac2034/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/inferencer/tlt_inferencer.py", line 89, in network_init
  File "/home/vpraveen/.cache/dazel/_dazel_vpraveen/216c8b41e526c3295d3b802489ac2034/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/model/utilities.py", line 98, in model_io
AssertionError: Pretrained model not found at /workspace/tlt-experiments/detectnet_v2/experiment_dir_retrain/weights/resnet18_detector_pruned.tlt
Can you please advise on what is missing and how we can run inference successfully?