Does the inference time from profiling include engine loading time?

I am using this command to profile engine

/usr/src/tensorrt/bin/trtexec --loadEngine=<engine> --exportLayerInfo=graph.json --exportProfile=profile.json --warmUp=0 --duration=0 --iterations=10

I got the inference time from profile.json, but I don't know whether this inference time includes engine loading time or not. Please let me know. Thanks.

@spolisetty please share the information with me. Thanks.

No, the inference time in profile.json does not include engine loading time. trtexec first loads and deserializes the engine, and only then runs the timed inference iterations; profile.json records the execution time of those inference iterations (broken down per layer), not the deserialization step.
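For reference, the per-layer entries in profile.json can be summed to estimate the per-iteration execution time. The sketch below assumes a simplified version of the exported schema (field names such as `averageMs`, and the leading `count` entry, are assumptions based on typical trtexec output; check your own file):

```python
import json

# Hypothetical excerpt of a trtexec --exportProfile output.
# The exact schema is an assumption here; inspect your own
# profile.json to confirm the field names.
sample_profile = json.loads("""
[
  {"count": 10},
  {"name": "conv1", "timeMs": 5.0, "averageMs": 0.50, "percentage": 60.0},
  {"name": "fc1",   "timeMs": 3.3, "averageMs": 0.33, "percentage": 40.0}
]
""")

def total_average_ms(profile):
    """Sum per-layer average execution times (ms), skipping non-layer entries."""
    return sum(e["averageMs"] for e in profile if "averageMs" in e)

print(f"Total per-iteration layer time: {total_average_ms(sample_profile):.2f} ms")
```

Note that this total covers only layer execution; engine deserialization happens before profiling starts and never appears in these entries.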

Thanks.
