Is it possible to start TensorRT from an intermediate layer?


I have been looking into getting the results of an intermediate layer using the following link:

However, before I start implementing it, I wanted to know: if I save the results of the intermediate layer, can I feed that data back into TensorRT and resume computation from that same layer?



Could you please let us know why you need to feed the data back into TensorRT?
You can mark an intermediate tensor as an additional output, but TensorRT will still compute all of the layers.

Thank you

Let's say I wanted to get the output of an intermediate layer, transfer that output to another computer, and finish the rest of the computation there.

Would that be possible with TensorRT?


The only possibility is to split the network into two and build a separate engine for each part: the first engine ends at the intermediate layer, and the second engine takes that layer's output as its input.
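To illustrate the split idea, here is a minimal plain-Python sketch (no actual TensorRT calls; `first_half` and `second_half` are hypothetical stand-ins for two engines built from the split network). Machine A runs the first part and serializes the intermediate tensor; machine B deserializes it and runs the remaining layers, producing the same result as running the whole network on one machine.

```python
import json

# Hypothetical stand-ins for two engines built from a split network:
# first_half covers layers 1..k, second_half covers layers k+1..n.
def first_half(x):
    # toy "layers": scale then shift each element
    return [2.0 * v + 1.0 for v in x]

def second_half(h):
    # remaining toy "layers": square each element and sum
    return sum(v * v for v in h)

# --- Machine A: run the first engine, serialize the intermediate tensor ---
x = [1.0, 2.0, 3.0]
intermediate = first_half(x)
payload = json.dumps(intermediate)   # send over the network / save to disk

# --- Machine B: deserialize and finish with the second engine ---
received = json.loads(payload)
result = second_half(received)

# Same result as running the full network end to end on one machine
assert result == second_half(first_half(x))
print(result)  # prints 83.0
```

With real TensorRT you would build one engine per sub-network (for example, from two split ONNX files) and transfer the intermediate tensor between the machines in place of the JSON payload above.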

Thank you.