Hi,
You can check this through the TensorRT log.
For example:
import tensorrt as trt

# Use INFO severity so the builder reports which layers run on DLA vs. GPU
TRT_LOGGER = trt.Logger(trt.Logger.INFO)

def build_engine():
    """Takes an ONNX file and creates a TensorRT engine to run inference with."""
    EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(TRT_LOGGER) as builder, \
    ...
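The elided part of build_engine() is where the builder config targets DLA; those lines are not shown in the original, so the following is only a hedged sketch of a typical DLA-targeted build (the function name, ONNX path, and DLA core number are illustrative assumptions, not the exact code above):

```python
# Hedged sketch of a DLA-targeted build; requires TensorRT and DLA hardware.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

def build_engine_dla(onnx_path="model.onnx", dla_core=0):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        parser.parse(f.read())

    config = builder.create_builder_config()
    # Prefer DLA for every layer; layers DLA cannot run fall back to the GPU,
    # which is what the placement log reports.
    config.default_device_type = trt.DeviceType.DLA
    config.DLA_core = dla_core
    config.set_flag(trt.BuilderFlag.GPU_FALLBACK)

    return builder.build_serialized_network(network, config)
```

With GPU_FALLBACK set, the INFO-level log prints the "Layers running on DLA" / "Layers running on GPU" split during the build.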
Running the build should then print detailed layer placement information like the following:
…
[07/12/2021-13:54:22] [I] [TRT] --------------- Layers running on DLA:
[07/12/2021-13:54:22] [I] [TRT] {Convolution28}, {ReLU32,Pooling66,Convolution110}, {ReLU114,Pooling160}, {Plus214},
[07/12/2021-13:54:22] [I] [TRT] --------------- Layers running on GPU:
[07/12/2021-13:54:22] [I] [TRT] (Unnamed Layer* 0) [Constant] + Times212_reshape1, (Unnamed Layer* 16) [Constant] + shuffle_(Unnamed Layer* 16) [Constant]_output, (Unnamed Layer* 3) [Constant] + (Unnamed Layer* 4) [Shuffle] + Plus30, (Unnamed Layer* 9) [Constant] + (Unnamed Layer* 10) [Shuffle] + Plus112, Times212_reshape0, Times212, shuffle_Times212_Output_0,
…
Thanks.