Jetson nano benchmark

Hi, based on the DeepStream benchmark on Jetson Nano, I wanted to understand whether it would be possible to support more camera streams if we:

  1. Use skip intervals for faster processing
  2. Use INT8 precision in the models
  3. Use a BNN somehow: https://github.com/larq/larq

Hi,

1. A skip interval will save GPU resources, since you don’t need to run inference on every frame.
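For reference, in DeepStream the skip interval is controlled by the `interval` property in the nvinfer config file; a minimal sketch (the value 4 is an illustrative choice, not a recommendation):

```
[property]
# Run inference on every 5th frame; detector output is
# propagated by the tracker for the skipped frames.
interval=4
```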

2. Nano doesn’t support INT8 mode.
You need a GPU with compute capability 7.x or later for the INT8 feature, e.g. Xavier.
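For context, here is a small Python sketch of that check. The compute capability values come from NVIDIA’s CUDA documentation; the helper function is illustrative and simply encodes the 7.x rule from this post:

```python
# Compute capability per Jetson module (values from NVIDIA's CUDA docs)
JETSON_COMPUTE_CAPABILITY = {
    "Nano": (5, 3),    # Maxwell
    "TX2": (6, 2),     # Pascal
    "Xavier": (7, 2),  # Volta
}

def supports_int8(cc):
    """Heuristic from this post: INT8 needs compute capability 7.x or later."""
    return cc >= (7, 0)

print(supports_int8(JETSON_COMPUTE_CAPABILITY["Nano"]))    # False
print(supports_int8(JETSON_COMPUTE_CAPABILITY["Xavier"]))  # True
```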

3. BNN
Low precision is similar to TensorRT’s precision modes.
Nano can only support up to FP16 mode.

It looks like BNN can also quantize the network.
It may help if the amount of computation becomes lower (e.g. the number of zero-cost weights increases).
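To illustrate the idea, a BNN (as in Larq) binarizes weights to ±1, so multiply-accumulates reduce to sign flips and additions. A minimal NumPy sketch of weight binarization (the function names are hypothetical, not the Larq API):

```python
import numpy as np

def binarize(w):
    """Binarize weights to +1/-1 with the sign function (0 maps to +1)."""
    return np.where(w >= 0, 1.0, -1.0)

def binary_dense(x, w):
    """Dense layer with binarized weights: multiplies become sign flips."""
    return x @ binarize(w)

# Full-precision weights collapse to +/-1 at inference time
w = np.array([[0.3, -1.2], [-0.5, 0.8]])
x = np.array([[1.0, 2.0]])
print(binarize(w))         # [[ 1. -1.] [-1.  1.]]
print(binary_dense(x, w))  # [[-1.  1.]]
```

In Larq this is done with its `ste_sign` quantizer, which keeps a straight-through estimator for the backward pass so the network remains trainable.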

Thanks.

Thank you for your response.