TensorRT leaky ReLU activation function update

Hello.

Right now I’m using a Jetson AGX Xavier to test some deep learning networks.
To utilize the NVDLA in my tests, I’m using the TensorRT API, but the leaky ReLU function is not supported on the NVDLA.

Is there any plan to extend the NVDLA-supported functions in a later Jetson SDK update?

Hi,

Sorry, right now I don’t have any updates regarding a plan/roadmap to extend the NVDLA-supported functions to include leaky ReLU.

Thanks

Thank you for answering.

I have another question.

For that reason, I’m now trying to use the ReLU function as a replacement.
But I’m having a problem processing the ReLU layer on the NVDLA.
On the GPU (with the device type set to kGPU), my code works well.
On the NVDLA, however, the results are wrong.
I add the activation after the batch norm layer with the code below:

network->addActivation(*bn->getOutput(0), nvinfer1::ActivationType::kRELU);
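As a minimal sketch of what I could check at build time (assuming the TensorRT 6 IBuilder API; the names builder, network, and bn are illustrative, not my exact code), the builder can be asked whether the DLA can actually run the layer before placing it there:

// Sketch only: "network" is the INetworkDefinition*, "bn" the batch-norm
// layer, and "builder" the TensorRT 6 IBuilder* (illustrative names).
nvinfer1::IActivationLayer* relu =
    network->addActivation(*bn->getOutput(0), nvinfer1::ActivationType::kRELU);

// Ask the builder whether the DLA can run this layer; if it cannot,
// pin the layer to the GPU explicitly.
if (builder->canRunOnDLA(relu))
    builder->setDeviceType(relu, nvinfer1::DeviceType::kDLA);
else
    builder->setDeviceType(relu, nvinfer1::DeviceType::kGPU);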

Actually, when I targeted the NVDLA and used leaky ReLU as the activation function, it worked well, with GPU fallback enabled via:

allowGPUFallback(true)
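For reference, the build-time setup is roughly the following, as a sketch assuming a TensorRT 6 IBuilder* named builder (illustrative, not my exact code). Note that with fallback enabled, the leaky ReLU layer presumably ran on the GPU rather than on the DLA, since the DLA does not support it:

// Sketch: DLA build configuration with GPU fallback (TensorRT 6 IBuilder).
builder->setDefaultDeviceType(nvinfer1::DeviceType::kDLA); // run on the DLA by default
builder->setDLACore(0);                                    // select DLA core 0
builder->allowGPUFallback(true); // layers the DLA cannot run fall back to the GPU

// Leaky ReLU is added like ReLU, plus a negative-side slope:
nvinfer1::IActivationLayer* lrelu =
    network->addActivation(*bn->getOutput(0), nvinfer1::ActivationType::kLEAKY_RELU);
lrelu->setAlpha(0.1f); // slope for x < 0 (illustrative value)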

I have no idea what is causing this problem.
Please help me!

Device: Jetson AGX Xavier
JetPack: 4.3.0
TensorRT: 6.0.1.10