TensorFlow Serving on Jetson Nano?

Hi, I want to serve a saved TF model on a Jetson Nano. Has anybody done that, either by compiling TensorFlow Serving on the Nano or by getting nvidia-docker installed on the Nano? Any help will be appreciated. Thx.
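For reference, TensorFlow Serving expects a SavedModel exported into a versioned directory; a minimal TF 2.x-style export sketch (the model, path, and version number below are just placeholders) looks roughly like this:

```python
import tensorflow as tf

# Placeholder model; substitute whatever model you actually want to serve.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(4,)),
])

# TensorFlow Serving expects <base_dir>/<model_name>/<version>/saved_model.pb,
# so export into a numbered version subdirectory.
export_dir = "/models/my_model/1"  # hypothetical path and version number
tf.saved_model.save(model, export_dir)
```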

Hi,

TensorFlow Serving is not officially supported on the Jetson platform.
Some users have tried it but found lots of issues when compiling.
Maybe others can share more experience.

nvidia-docker doesn’t support Jetson yet.
But it will be supported in JetPack 4.2.1, which is our next release for the Nano.

Thanks.

Thanks AastaLLL. I think serving a model using the GPU capability of the Jetson is a very common scenario. Is there any other way to do that?

You might want to have a look at https://github.com/helmuthva/jetson/tree/master/workflow/deploy/tensorflow-serving-base/src, which contains a Dockerfile and .bazelrc to build the latest TensorFlow Serving (incl. TensorFlow Core) on Jetson devices.

https://github.com/helmuthva/jetson contains more information.
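Once an image built from that repo is running on the Nano, querying it should work like stock TensorFlow Serving over its REST API; here is a minimal client sketch, where the hostname, model name, and input shape are assumptions to adjust for your setup (8501 is the default REST port):

```python
import json
import requests

# Default TensorFlow Serving REST port is 8501; the hostname, model name,
# and the 4-float input row below are placeholders for your setup.
url = "http://jetson-nano.local:8501/v1/models/my_model:predict"
payload = {"instances": [[1.0, 2.0, 3.0, 4.0]]}

resp = requests.post(url, data=json.dumps(payload))
resp.raise_for_status()
print(resp.json()["predictions"])
```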

THX