TensorRT: single input, multiple engines

I am currently running a pipeline that reads images from a single camera at 30 fps and performs inference on them with my TensorRT engine. Recently I added a second engine, so I now have two engines. At the moment I use a single thread that both reads the video and runs inference.
I wonder whether I can simply add another enqueue call for the second engine within that single thread, or whether I should separate reading the video from inference by creating two inference threads. If I do need two inference threads, do I have to use techniques such as a mutex to avoid race conditions, or is it enough to protect only the part where the images are passed in?
If you have any relevant examples, please share them with me as well.
Your help is much appreciated. Thank you.

Hi @reewoo4674 ,
I am afraid we do not have an example for this, but you can read images in a separate thread, and it is fine to run both engines from the same thread as long as each engine uses its own CUDA stream.
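
To illustrate the suggestion above, here is a minimal sketch of driving two engines from one thread on separate streams. It assumes the two `IExecutionContext`s, the device bindings, and the streams have already been created elsewhere, and that binding index 0 is the input for each engine; the function and variable names are illustrative, not from the original post, and output copies are omitted for brevity.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>

// Run both engines on the same host thread, each on its own CUDA stream,
// so their kernels and copies can overlap on the GPU.
void inferBothEngines(nvinfer1::IExecutionContext* ctx1,
                      nvinfer1::IExecutionContext* ctx2,
                      void** bindings1, void** bindings2,   // device buffers, allocated elsewhere
                      const void* hostFrame, size_t frameBytes,
                      cudaStream_t stream1, cudaStream_t stream2)
{
    // Copy the same camera frame into each engine's input buffer asynchronously.
    // (Assumes binding index 0 is the input binding of each engine.)
    cudaMemcpyAsync(bindings1[0], hostFrame, frameBytes,
                    cudaMemcpyHostToDevice, stream1);
    cudaMemcpyAsync(bindings2[0], hostFrame, frameBytes,
                    cudaMemcpyHostToDevice, stream2);

    // enqueueV2 returns immediately; the work is queued on each stream.
    ctx1->enqueueV2(bindings1, stream1, nullptr);
    ctx2->enqueueV2(bindings2, stream2, nullptr);

    // Wait for both streams before reusing hostFrame or reading outputs.
    cudaStreamSynchronize(stream1);
    cudaStreamSynchronize(stream2);
}
```

With this pattern a mutex is only needed around host-side data that the camera-reading thread and the inference thread share (for example the frame buffer handed to `inferBothEngines`); the two engines themselves do not share state, so no locking is required between them.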

Thanks