Thank you for answering.
I have another question.
For that reason, I am now trying to use the ReLU function as a replacement.
However, I am having a problem running the ReLU layer on the NVDLA.
On the GPU (with the device type set to kGPU), my code works well.
But on the NVDLA, the results are wrong.
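For context, here is roughly how I switch between the two devices when building the engine (a simplified sketch; I am assuming the IBuilderConfig API here, and `config` is just a placeholder name):

```cpp
#include "NvInfer.h"

// Build-time device selection: kGPU for the working GPU run,
// kDLA for the run that produces wrong results.
void targetDla(nvinfer1::IBuilderConfig* config)
{
    config->setDefaultDeviceType(nvinfer1::DeviceType::kDLA);
    config->setDLACore(0);                                  // first DLA core
    config->setFlag(nvinfer1::BuilderFlag::kGPU_FALLBACK);  // fall back for unsupported layers
}
```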
I am using the code below for the activation after the batchnorm layer.
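This is a simplified sketch rather than my exact code; `network` and `bn` are placeholder names for my nvinfer1::INetworkDefinition and the preceding batchnorm layer:

```cpp
#include "NvInfer.h"

// Simplified sketch: appending a ReLU activation after the batchnorm layer.
// kRELU computes max(0, x) element-wise.
nvinfer1::IActivationLayer* addReluAfterBn(nvinfer1::INetworkDefinition* network,
                                           nvinfer1::ILayer* bn)
{
    return network->addActivation(*bn->getOutput(0),
                                  nvinfer1::ActivationType::kRELU);
}
```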
Actually, when I used LeakyReLU as the activation function on the NVDLA, it worked well.
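That version looked roughly like this (again a simplified sketch with the same placeholder names; the alpha value is just an example):

```cpp
#include "NvInfer.h"

// Simplified sketch: the LeakyReLU variant that gave correct results on the
// NVDLA. kLEAKY_RELU computes x for x >= 0 and alpha * x for x < 0.
nvinfer1::IActivationLayer* addLeakyReluAfterBn(nvinfer1::INetworkDefinition* network,
                                                nvinfer1::ILayer* bn)
{
    nvinfer1::IActivationLayer* leaky =
        network->addActivation(*bn->getOutput(0),
                               nvinfer1::ActivationType::kLEAKY_RELU);
    leaky->setAlpha(0.1f);  // negative-slope coefficient (example value)
    return leaky;
}
```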
I have no idea what is causing this problem.
Please help me!
Device: Jetson Xavier AGX