I am using this command to profile an engine:
/usr/src/tensorrt/bin/trtexec --loadEngine=<engine> --exportLayerInfo=graph.json --exportProfile=profile.json --warmUp=0 --duration=0 --iterations=10
I got the inference time from profile.json, but I don't know whether this inference time includes engine loading time or not. Please let me know. Thanks.
@spolisetty could you please share this information? Thanks.
No, the inference time in profile.json does not include engine loading (deserialization) time. The profile.json file contains only the per-layer execution timings collected during the timed inference iterations, after the engine has already been loaded.
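As a quick sanity check, you can inspect the per-layer timings yourself. The sketch below sums them from an exported profile; the field names (`averageMs`, etc.) and the embedded sample data are assumptions for illustration and may vary across TensorRT versions, so check your own profile.json before relying on them.

```python
import json

# Illustrative stand-in for a trtexec-exported profile.json.
# Real files typically hold one entry per layer; field names
# (e.g. "averageMs", "timeMs") may differ by TensorRT version.
sample = """
[
  {"count": 10},
  {"name": "conv1", "timeMs": 5.0,  "averageMs": 0.50, "percentage": 40.0},
  {"name": "fc1",   "timeMs": 7.5,  "averageMs": 0.75, "percentage": 60.0}
]
"""

# Keep only per-layer entries (skip the iteration-count header entry).
layers = [e for e in json.loads(sample) if "averageMs" in e]

# Sum of per-layer averages: compute time per iteration only,
# with no engine loading/deserialization included.
total_ms = sum(e["averageMs"] for e in layers)
print(f"layers: {len(layers)}, total average latency: {total_ms:.2f} ms")
```

With a real file, replace `sample` with `open("profile.json").read()`; the total should roughly match the per-iteration latency trtexec reports, confirming it covers execution only.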
system — Closed December 5, 2023, 9:17am
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.