Since our models heavily rely on unsupported TF layers, converting our TF model to UFF does not seem feasible. Instead, we were thinking of getting TensorFlow Serving working on the Jetson, to act as a mini server for model inference.
Has anyone done this yet? I’ve seen examples of installing TensorFlow on the Jetson so I assumed it might be possible to install TensorFlow Serving as well.
However, I ran into issues building TF Serving with Bazel, and have exhausted my ability to narrow down the problem.
So far I have:
Installed all prerequisites
Cloned TF Serving and attempted to build it from source. I run into an issue similar to the memory problems I've seen around the forums/GitHub pages, and have tried to limit the resources used during the build, but nothing has worked.
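For reference, the resource-limited build I attempted looked roughly like the following. This is a sketch, not a known-good recipe: the exact values are guesses for a Jetson-class board, and the flag spelling depends on the Bazel version (older releases take the combined `--local_resources`, newer ones split it into `--local_ram_resources` and `--local_cpu_resources`):

```shell
# Limit Bazel's parallelism and resource estimates so the build doesn't
# exhaust the Jetson's RAM. Values here (2 jobs, 2048 MB RAM, 2 CPUs) are
# illustrative; tune them for your board.
bazel build \
    --jobs=2 \
    --local_resources=2048,2.0,1.0 \
    --verbose_failures \
    //tensorflow_serving/model_servers:tensorflow_model_server
```

Even with these limits in place the link step still fails, so the error below does not appear to be a plain out-of-memory problem.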
The error is:
Linking of rule '//tensorflow_serving/model_servers:tensorflow_model_server' failed (Exit 1).
bazel-out/local-opt/bin/external/aws/_objs/aws/external/aws/aws-cpp-sdk-core/source/client/ClientConfiguration.o:ClientConfiguration.cpp:function Aws::Client::ComputeUserAgentString(): error: undefined reference to 'Aws::OSVersionInfo::ComputeOSVersionString[abi:cxx11]'
collect2: error: ld returned 1 exit status
Does anyone have experience attempting, or successfully completing, an installation of TensorFlow Serving on a Jetson?