Hi, I am developing a multi-channel TensorRT inference application with dynamic batching.
(TensorRT version 8.6.1.6)
While working on it, I found that when creating an execution context, a dynamic-batch engine uses more memory than a static (batch=1) one. A rough sketch of how I'm observing this is included after the questions. So, my questions are:
- What happens when an execution context is created for a dynamic-batch engine?
- Where can I find the implementation of createExecutionContext?
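For reference, this is roughly how I'm measuring the difference. It is a minimal sketch, assuming the engines are already serialized to disk; the file name ("dynamic_batch.plan") and the use of cudaMemGetInfo before/after context creation are just my way of observing the allocation, not anything prescribed by TensorRT.

```cpp
// Sketch: deserialize an engine, report getDeviceMemorySize(), and measure
// how much device memory createExecutionContext() actually consumes.
#include <NvInferRuntime.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <memory>
#include <vector>

// Simple logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

static size_t freeDeviceMem() {
    size_t freeBytes = 0, totalBytes = 0;
    cudaMemGetInfo(&freeBytes, &totalBytes);
    return freeBytes;
}

int main() {
    Logger logger;
    std::unique_ptr<nvinfer1::IRuntime> runtime(nvinfer1::createInferRuntime(logger));

    // Load a serialized engine (built with a dynamic-batch optimization profile).
    // "dynamic_batch.plan" is a placeholder path.
    std::ifstream file("dynamic_batch.plan", std::ios::binary | std::ios::ate);
    const size_t size = file.tellg();
    file.seekg(0);
    std::vector<char> blob(size);
    file.read(blob.data(), size);

    std::unique_ptr<nvinfer1::ICudaEngine> engine(
        runtime->deserializeCudaEngine(blob.data(), blob.size()));

    // Activation/scratch memory TensorRT says a context will need.
    std::cout << "getDeviceMemorySize(): " << engine->getDeviceMemorySize() << " bytes\n";

    // Device memory actually consumed by creating the execution context.
    const size_t before = freeDeviceMem();
    std::unique_ptr<nvinfer1::IExecutionContext> context(engine->createExecutionContext());
    const size_t after = freeDeviceMem();
    std::cout << "createExecutionContext() used: " << (before - after) << " bytes\n";

    return 0;
}
```

Running this against the dynamic-batch engine and the static (batch=1) engine is how I see the difference in memory usage at context creation time.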