Hi,
when I run tlt-infer with a .tlt model it works, but with the .etlt model I get this error:
2020-06-22 13:00:55,383 [INFO] iva.detectnet_v2.scripts.inference: Overlain images will be saved in the output path.
2020-06-22 13:00:55,383 [INFO] iva.detectnet_v2.inferencer.build_inferencer: Constructing inferencer
2020-06-22 13:00:55,581 [INFO] iva.detectnet_v2.inferencer.trt_inferencer: Reading from engine file at: /workspace/tmp2/experiment_dir_final/resnet18_detector.etlt
[TensorRT] ERROR: ../rtSafe/coreReadArchive.cpp (31) - Serialization Error in verifyHeader: 0 (Magic tag does not match)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
Traceback (most recent call last):
File "/usr/local/bin/tlt-infer", line 8, in <module>
sys.exit(main())
File "./common/magnet_infer.py", line 56, in main
File "./detectnet_v2/scripts/inference.py", line 194, in main
File "./detectnet_v2/scripts/inference.py", line 117, in inference_wrapper_batch
File "./detectnet_v2/inferencer/trt_inferencer.py", line 380, in network_init
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
I am running this command in TLT 2.0:
tlt-infer detectnet_v2 -e /workspace/tmp2/detectnet_v2/specs/detectnet_v2_inference_kitti_etlt.txt \
-o /workspace/tmp2/output \
-i /workspace/tmp2/trainval/image \
-k KEY
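My understanding is that tlt-infer can only deserialize a .tlt model or a serialized TensorRT engine, and a .etlt file is the exported (encrypted) model rather than an engine, which would explain the "Magic tag does not match" header check failure. Is the expected workflow to first convert the .etlt with tlt-converter, roughly like the sketch below? (The output node names and input dimensions here are guesses for a DetectNet_v2 ResNet-18 model, not values from my spec.)

```shell
# Sketch only, assuming tlt-converter is available inside the TLT 2.0 container.
# -o: output node names (typical for DetectNet_v2; adjust to your model)
# -d: input dims C,H,W (example values; adjust to your training resolution)
tlt-converter /workspace/tmp2/experiment_dir_final/resnet18_detector.etlt \
  -k KEY \
  -o output_cov/Sigmoid,output_bbox/BiasAdd \
  -d 3,384,1248 \
  -e /workspace/tmp2/experiment_dir_final/resnet18_detector.trt
```

And then, if I understand correctly, the inference spec would point at the generated .trt engine instead of the .etlt file?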