1. No. We don’t have a parser for PaddlePaddle. However, you can run inference with TensorRT from within the PaddlePaddle framework itself, or export your model to ONNX first. Currently, we provide parsers for Caffe, TensorFlow, and ONNX.
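As a rough sketch of the ONNX route: convert the PaddlePaddle model to ONNX with the `paddle2onnx` tool, then build a TensorRT engine from the ONNX file, for example with `trtexec`. The file names and directory layout below are placeholders, and the exact flags may differ between tool versions, so check each tool’s `--help` output.

```shell
# Convert a saved PaddlePaddle inference model to ONNX
# (model directory and output name are hypothetical examples)
paddle2onnx --model_dir ./paddle_model --save_file model.onnx

# Parse the ONNX file with TensorRT and serialize an engine
trtexec --onnx=model.onnx --saveEngine=model.plan
```

The resulting `model.plan` engine can then be loaded and executed with the TensorRT runtime API.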
2. Presumably yes. PaddleMobile officially supports the ARM platform:
<b>High performance on ARM CPUs</b>
Supports Mali GPUs
Supports Adreno GPUs
Supports Metal GPUs on Apple devices
Supports ZU5, ZU9, and other FPGA-based development boards
Supports Raspberry Pi and other <b>arm-linux</b> development boards