Is a TensorRT execution context thread-safe?

Hi!

Description

I want to know whether TensorRT is thread-safe.

For example, if I load a model file, build an engine from it, and create a single execution context, and then start multiple threads that all share that one context and process inference requests simultaneously, is this usage thread-safe?
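To make the scenario concrete, here is a minimal sketch of the pattern I mean: one shared context object, many worker threads calling into it at once. The `FakeContext` class below is a hypothetical stand-in for a real `tensorrt.IExecutionContext` (a real one needs an engine file and a GPU), and the lock inside it is just my guess at what would be required if the real context is not safe for concurrent execution:

```python
import threading

class FakeContext:
    """Hypothetical stand-in for tensorrt.IExecutionContext."""
    def __init__(self):
        # Guess: if the real context is not thread-safe, calls into it
        # would have to be serialized with a lock like this one.
        self.lock = threading.Lock()
        self.results = []

    def execute(self, request):
        with self.lock:
            # Placeholder for a real inference call on the context.
            self.results.append(request * 2)

# One shared context, many worker threads -- the scenario in question.
context = FakeContext()
threads = [threading.Thread(target=context.execute, args=(i,))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(context.results))  # doubled inputs 0..7
```

My question is essentially whether a real execution context can be shared like this without the lock, or whether each thread needs its own context.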

Environment

TensorRT Version : 10.6.0.26
GPU Type : Tesla T4
Nvidia Driver Version : 565.57.01
CUDA Version : 12.6
CUDNN Version : 9
Operating System + Version : Ubuntu 22.04
Python Version (if applicable) :
TensorFlow Version (if applicable) :
PyTorch Version (if applicable) :
Baremetal or Container (if container which image + tag) :

Looking forward to your reply! Thanks a lot!

Oh, I think I have found the answer now. Thanks!