"Invalid model file extension" error for inference using TensorRT engine

Thank you. The issue is solved. In the model handler config of the inference_kitti_etlt spec file, replacing "tlt_config" with "tensorrt_config" and "model" with "trt_engine" fixed the error. I must have edited it incorrectly at some point.
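
For reference, a minimal sketch of the relevant part of the spec file after the fix, assuming a DetectNet_v2-style `inferencer_config`; the engine path is a placeholder:

```
inferencer_config {
  # ... other inferencer settings (target classes, image dimensions, etc.) ...

  # Broken: tlt_config { model: ... } expects a .tlt model file, so pointing
  # it at a serialized TensorRT engine raises the "Invalid model file
  # extension" error.
  # Working: use tensorrt_config with trt_engine for a TensorRT engine.
  tensorrt_config {
    trt_engine: "/path/to/trt.engine"  # placeholder path to the engine file
  }
}
```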