Description
Hi,
I am wondering why the GPU is active during inference even though I set the default device type to DLA.
I checked that all of the layers were allocated to DLA.
[08/18/2020-03:18:54] [I] [TRT] --------------- Layers running on DLA:
[08/18/2020-03:18:54] [I] [TRT] {conv1,relu1,norm1,pool1,conv2,relu2,norm2,pool2,conv3,relu3,conv4,relu4,conv5,relu5,pool5,fc6,relu6,fc7,relu7,fc8},
[08/18/2020-03:18:54] [I] [TRT] --------------- Layers running on GPU:
[08/18/2020-03:18:54] [I] [TRT]
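For reference, an engine producing a log like the above can be built with a trtexec invocation along these lines (this is illustrative, not my exact command; the model file and output name are placeholders):

```shell
# Illustrative only: build and time an engine with DLA as the default device.
# --useDLACore selects which DLA engine to use; --fp16 is needed because the
# DLA does not run FP32 layers. Model path and output blob are placeholders.
trtexec --deploy=alexnet.prototxt --output=fc8 \
        --useDLACore=0 --fp16 --allowGPUFallback
```

With `--allowGPUFallback`, any layer the DLA cannot handle would fall back to the GPU, but the log above shows the GPU list is empty in my case.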
However, I found that the GPU is active during inference (I checked with jtop).
Is there any work the GPU must do during inference even when all of the layers are allocated to the DLA?
Thanks in advance.
Environment
TensorRT Version: 7.0.0