Converting TF 2.0 saved model for TensorRT on Jetson Nano

Hi,

I am trying to convert a TF 2.0 saved_model to TensorRT on the Jetson Nano.

The model was saved in TF 2.0.0. The Nano has JetPack 4.2.2 with TensorRT __ and TensorFlow 1.14 (the latest TensorFlow release for Jetson).

I have been following the instructions from here, which describe how to convert a TF 2.0.0 saved_model into TensorRT.

I had a previous post where I had an issue which I solved, but now I am having a new issue.

I am able to convert my model (via converter.convert()), then save it, load it, and run trt.convert_to_constants, all without error. However, when I then run the model on an actual input, I get the following error:

BaseCollectiveExecutor::StartAbort Internal: Native FunctionDef StatefulPartitionedCall/TRTEngineOp_1_native_segment can't be found in function library
         [[{{node PartitionedCall/StatefulPartitionedCall/TRTEngineOp_1}}]]

I have uploaded my full code sample here, the full error printout here, and the input saved_model here.

Please help. Thank you!

Hi,

Please note that the instructions shared here are for TF-TRT rather than pure TensorRT.
This means you will still need to run TensorRT through the TensorFlow interface.
https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html#worflow-with-savedmodel

However, as you know, the latest TensorFlow package we provide for Jetson is v1.14.0.
We are afraid you may not be able to read a v2.0 model via the v1.14.0 API, let alone convert it with TF-TRT.

So it’s recommended to first check whether the v2.0 model can be read with TensorFlow v1.14.0.
Alternatively, you could compile TensorFlow v2.0 from source with TRT support directly.

Thanks.

Thanks AsstaLLL! That is what I suspected.

Is it possible to compile TF 2.0 w/ TRT on the Nano? How would I go about doing that?

@aastalll I have now switched to TF 1.14.0 on both my workstation and Nano. I saved my model via tf.saved_model and am trying to convert it on the Nano. However, I get another error. In order to keep the forum clean, I posted it in its own separate topic here: https://devtalk.nvidia.com/default/topic/1066890/tensorrt/error-converting-tf-model-for-jetson-nano-using-tf-trt/

This depends on the operations you used.
In general, TensorRT should work with a TensorFlow v2.0 model.

Thanks.