Hello, I'm running an ONNX model with TensorRT on both the GPU and the DLA.
The network is a small UNet built from conv/bn/relu layers.
When I run it on the GPU, everything works as expected.
But when I run it on the DLA, TensorRT fuses all the layers into a single node and the build fails like this:
[10/12/2020-19:40:28] [E] [TRT] …/builder/tacticOptimizer.cpp (1715) - TRTInternal Error in computeCosts: 0 (Could not find any implementation for node {(Unnamed Layer* 0) [Convolution],(Unnamed Layer* 1) [Scale],(Unnamed Layer* 2) [Activation],(Unnamed Layer* 3) [Convolution],(Unnamed Layer* 4) [Scale],(Unnamed Layer* 5) [Activation],(Unnamed Layer* 6) [Convolution],(Unnamed Layer* 7) [Scale],(Unnamed Layer* 8) [Activation],(Unnamed Layer* 9) [Convolution],(Unnamed Layer* 10) [Scale],(Unnamed Layer* 11) [Activation],(Unnamed Layer* 12) [Convolution],(Unnamed Layer* 13) [Scale],(Unnamed Layer* 14) [Activation],(Unnamed Layer* 15) [Concatenation],(Unnamed Layer* 16) [Convolution],(Unnamed Layer* 17) [Scale],(Unnamed Layer* 18) [Activation],(Unnamed Layer* 19) [Convolution],(Unnamed Layer* 20) [Scale],(Unnamed Layer* 21) [Activation],(Unnamed Layer* 22) [Concatenation],(Unnamed Layer* 23) [Convolution],(Unnamed Layer* 24) [Scale],(Unnamed Layer* 25) [Activation],(Unnamed Layer* 26) [Concatenation],(Unnamed Layer* 27) [Convolution],(Unnamed Layer* 28) [Scale],(Unnamed Layer* 29) [Activation],(Unnamed Layer* 31) [Pooling],(Unnamed Layer* 32) [Convolution],(Unnamed Layer* 33) [Scale],(Unnamed Layer* 34) [Activation],(Unnamed Layer* 35) [Concatenation],(Unnamed Layer* 36) [Convolution],(Unnamed Layer* 37) [Scale],(Unnamed Layer* 38) [Activation],(Unnamed Layer* 39) [Convolution],(Unnamed Layer* 40) [Scale],(Unnamed Layer* 41) [Activation],(Unnamed Layer* 42) [Concatenation],(Unnamed Layer* 43) [Convolution],(Unnamed Layer* 44) [Scale],(Unnamed Layer* 45) [Activation],(Unnamed Layer* 46) [Concatenation],(Unnamed Layer* 47) [Convolution],(Unnamed Layer* 48) [Scale],(Unnamed Layer* 49) [Activation],(Unnamed Layer* 51) [Pooling],(Unnamed Layer* 52) [Convolution],(Unnamed Layer* 53) [Scale],(Unnamed Layer* 54) [Activation],(Unnamed Layer* 55) [Concatenation],(Unnamed Layer* 56) [Convolution],(Unnamed Layer* 57) [Scale],(Unnamed Layer* 58) [Activation],(Unnamed Layer* 59) [Convolution],(Unnamed Layer* 60) [Scale],(Unnamed Layer* 61) [Activation],(Unnamed Layer* 62) [Concatenation],(Unnamed Layer* 63) [Convolution],(Unnamed Layer* 64) [Scale],(Unnamed Layer* 65) [Activation],(Unnamed Layer* 66) [Convolution],(Unnamed Layer* 67) [Scale],(Unnamed Laye
… (the node list continues like this; truncated for brevity)
This failure then leads to a memory crash.
How can I solve this?
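For reference, here is a minimal sketch of how I build the engine, assuming the usual IBuilder/ONNX-parser flow on TensorRT 7; the model path, workspace size, and DLA core index below are placeholders, not my exact values:

```cpp
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <iostream>

// Minimal logger required by the TensorRT builder API.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
};

int main()
{
    Logger logger;
    auto builder = nvinfer1::createInferBuilder(logger);
    const auto explicitBatch = 1U << static_cast<uint32_t>(
        nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    auto network = builder->createNetworkV2(explicitBatch);

    // Parse the ONNX model ("unet.onnx" is a placeholder path).
    auto parser = nvonnxparser::createParser(*network, logger);
    parser->parseFromFile("unet.onnx",
        static_cast<int>(nvinfer1::ILogger::Severity::kWARNING));

    auto config = builder->createBuilderConfig();
    config->setMaxWorkspaceSize(1U << 30);          // 1 GiB, placeholder
    config->setFlag(nvinfer1::BuilderFlag::kFP16);  // DLA requires FP16 or INT8
    config->setDefaultDeviceType(nvinfer1::DeviceType::kDLA);
    config->setDLACore(0);
    // Layers the DLA cannot run should fall back to the GPU with this flag;
    // without it the build fails on any unsupported layer.
    config->setFlag(nvinfer1::BuilderFlag::kGPU_FALLBACK);

    auto engine = builder->buildEngineWithConfig(*network, *config);
    if (!engine)
    {
        std::cerr << "engine build failed" << std::endl;
        return 1;
    }
    // ... serialize the engine / run inference ...
    return 0;
}
```

With trtexec the equivalent invocation would be something like `trtexec --onnx=unet.onnx --fp16 --useDLACore=0 --allowGPUFallback`.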