EfficientNetB5 on Jetson Nano?

Please provide complete information as applicable to your setup.

• Hardware Platform Jetson Nano
• DeepStream Version 5.1
• JetPack Version 4.5
• TensorRT Version 7.1.3

Hello, I’ve been training an EfficientNetB5 model on a custom dataset and I tried to export it to my Jetson Nano. The problem is that I need it for real-time classification, but the speed is only 2 fps. I know it’s a heavy model, but I need more accuracy than B0/B1 can provide. I tried to prune the model, but the size remains the same after compression. Has anybody tried this before, or does anyone have an idea of what I could do? Thanks in advance.

Hi,

Do you run it with TensorRT or another framework?
If you don’t use TensorRT acceleration, please give it a try.

Below is a relevant example for your reference:

Thanks.

Yes, the process I’ve been following is (rough commands sketched below):

  1. Train the model with TensorFlow/Keras
  2. Convert it to ONNX
  3. Use trtexec to generate a TensorRT engine
  4. Use the DeepStream app
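
Concretely, steps 2 and 3 look roughly like this; the model file names, the opset and the workspace value are placeholders rather than anything specific to my setup:

# Step 2: export the Keras model to ONNX with tf2onnx
$ python -m tf2onnx.convert --keras model.h5 --output model.onnx --opset 11

# Step 3: build a TensorRT engine from the ONNX file (adding --fp16 or --int8 as needed)
$ /usr/src/tensorrt/bin/trtexec --onnx=model.onnx --saveEngine=model.engine --workspace=1024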

Hi,

Could you try it with trtexec in FP16 mode to check the inference performance?

$ /usr/src/tensorrt/bin/trtexec --onnx=[file] --fp16

Thanks.

I have tried INT8 before; with FP16 I get around 2.7 fps instead of 2.36 fps. Is there any option for training EfficientNetB5 with NVIDIA TAO? Or, if that’s not possible, is there any other possibility for the B5 models on the Nano?

Hi,

Sorry, TAO only supports the B0/B1 models currently.

Thanks.

Ok, thank you, but do you know if there is any possibility of improving the model’s performance? For example, I trained the B1 with Keras and then used trtexec to generate the engine, and its performance is about 9 fps slower than the engine I generate using the TAO Toolkit.

Hi,

In TAO, there is a prune stage to reduce the model complexity.

Do you need to apply the inference to every single frame?
If not, you can try setting the interval parameter for periodic inference.

The intermediate results can be generated by the tracker algorithm.
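
For example, with deepstream-app the interval and the tracker can be set in the application config file; the group names, values and library path below are only an illustration for a DeepStream 5.1 setup:

# Run inference only on every 5th frame; the tracker propagates results in between
[primary-gie]
enable=1
interval=4
config-file=config_infer_primary.txt

[tracker]
enable=1
tracker-width=640
tracker-height=384
ll-lib-file=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_mot_klt.so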

Thanks.
