I really liked the lite API in version 4.0: the Engine class was easy to initialize and run inference on. Now that the lite API has been removed (apparently folded into the top-level tensorrt module), is there still a way to perform inference in a manner similar to trt.lite.Engine(…).infer(data)?
We currently don’t have a direct replacement for the tensorrt.lite API, but the new API is designed to be more concise. Please give it a try; community feedback is always appreciated.
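For anyone landing here: a rough equivalent of trt.lite.Engine(…).infer(data) can be assembled from the standard TensorRT Python API plus pycuda. This is a minimal sketch, not an official replacement; it assumes a serialized engine file on disk, a single float32 input binding and a single float32 output binding, and the file name "model.engine" is a placeholder.

```python
import numpy as np
import pycuda.autoinit   # creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(engine_path):
    """Deserialize a previously built engine from disk."""
    with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

def infer(engine, data):
    """Sketch of trt.lite.Engine(...).infer(data) for a
    single-input, single-output engine (assumed float32)."""
    with engine.create_execution_context() as context:
        # Host-side buffers: the input as a contiguous array,
        # the output sized from binding 1's shape.
        h_input = np.ascontiguousarray(data, dtype=np.float32)
        h_output = np.empty(trt.volume(engine.get_binding_shape(1)),
                            dtype=np.float32)

        # Device-side buffers and a stream for async execution.
        d_input = cuda.mem_alloc(h_input.nbytes)
        d_output = cuda.mem_alloc(h_output.nbytes)
        stream = cuda.Stream()

        # Copy input to the GPU, run inference, copy the result back.
        cuda.memcpy_htod_async(d_input, h_input, stream)
        context.execute_async(bindings=[int(d_input), int(d_output)],
                              stream_handle=stream.handle)
        cuda.memcpy_dtoh_async(h_output, d_output, stream)
        stream.synchronize()
        return h_output

# Hypothetical usage:
# engine = load_engine("model.engine")
# output = infer(engine, input_array)
```

It is more verbose than the lite API, but the buffer allocation and stream handling are explicit, which is what the new API trades conciseness for.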