Why is the GPU working even though I set defaultDeviceType to DLA?

Description

Hi,

I am wondering why the GPU is active during inference even though I set the default device type to DLA.

I checked that all of the layers were assigned to DLA:

[08/18/2020-03:18:54] [I] [TRT] --------------- Layers running on DLA: 
[08/18/2020-03:18:54] [I] [TRT] {conv1,relu1,norm1,pool1,conv2,relu2,norm2,pool2,conv3,relu3,conv4,relu4,conv5,relu5,pool5,fc6,relu6,fc7,relu7,fc8}, 
[08/18/2020-03:18:54] [I] [TRT] --------------- Layers running on GPU: 
[08/18/2020-03:18:54] [I] [TRT] 
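For reference, I set the device type roughly like this (a simplified sketch using the TensorRT 7 C++ builder API; the builder and config variable names are placeholders):

// Build-time configuration (sketch): run everything on DLA core 0,
// with no GPU fallback. DLA requires FP16 or INT8 precision.
nvinfer1::IBuilderConfig* config = builder->createBuilderConfig();
config->setDefaultDeviceType(nvinfer1::DeviceType::kDLA);
config->setDLACore(0);
config->setFlag(nvinfer1::BuilderFlag::kFP16);
// nvinfer1::BuilderFlag::kGPU_FALLBACK is intentionally not set.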

However, I found that the GPU is active during inference (I checked with jtop).

Is there any work the GPU has to do during inference even if all of the layers are allocated to DLA?

Thanks in advance.

Environment

TensorRT Version: 7.0.0

Hi @yjkim2,
If GPUFallbackMode is set to false, a layer that was set to execute on DLA but cannot run there will result in a build error. With GPUFallbackMode set to true, however, such a layer will execute on the GPU instead, after a warning is reported.
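For illustration, the fallback mode maps to a single builder flag (a minimal sketch, assuming the TensorRT 7 C++ API and an existing IBuilderConfig named config; with trtexec the equivalent option is --allowGPUFallback):

// GPUFallbackMode == true: a layer that cannot run on DLA is moved to the
// GPU with a build-time warning instead of failing the build.
config->setFlag(nvinfer1::BuilderFlag::kGPU_FALLBACK);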
Please check the link below for details.

Thanks!