Does a dynamic-batch engine require more memory when creating an execution context?

Hi, I am developing a multi-channel TensorRT inference application with dynamic batching.
(TensorRT version 8.6.1.6)

While working on it, I found that when creating an execution context, a dynamic-batch engine uses more memory than a static (batch=1) one. So, my questions are:

  1. What exactly happens when an execution context is created for a dynamic-batch engine?

  2. Where can I find the implementation of createExecutionContext?
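
For reference, here is a minimal sketch of how the difference can be observed; the engine file name and the logger are placeholder assumptions, not my actual application code. It prints getDeviceMemorySize() (the activation/scratch memory a context will need) and the drop in free device memory around createExecutionContext().

```cpp
// Minimal sketch: measure the device memory claimed when an execution
// context is created for a serialized engine. File name is a placeholder.
#include <NvInferRuntime.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <vector>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // Load a serialized engine, e.g. the dynamic-batch or the static one.
    std::ifstream file("dynamic_batch.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    auto* runtime = nvinfer1::createInferRuntime(logger);
    auto* engine  = runtime->deserializeCudaEngine(blob.data(), blob.size());

    // Scratch/activation memory the context will need. For a dynamic-shape
    // engine this is, as far as I understand, sized for the largest shapes
    // in the optimization profile.
    std::cout << "getDeviceMemorySize(): " << engine->getDeviceMemorySize()
              << " bytes" << std::endl;

    // Compare free device memory before and after context creation.
    size_t freeBefore = 0, freeAfter = 0, total = 0;
    cudaMemGetInfo(&freeBefore, &total);
    auto* context = engine->createExecutionContext();
    cudaMemGetInfo(&freeAfter, &total);
    std::cout << "createExecutionContext() consumed roughly "
              << (freeBefore - freeAfter) << " bytes" << std::endl;

    delete context;
    delete engine;
    delete runtime;
    return 0;
}
```

If relevant, I understand TensorRT also provides createExecutionContextWithoutDeviceMemory() together with IExecutionContext::setDeviceMemory(), so that multiple contexts can share a caller-provided activation buffer, but I have not tried that yet.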