I would like to fine-tune and deploy an SSD MobileNet V2 (with FPNLite, 640x640) on a Jetson Nano (4 GB). I am using the official SD card image with JetPack 4.6 (TensorRT 8.0.1.6).
After days and days of trying to find any information on support for models from the TF2 Object Detection API model zoo, I have found this GitHub repo: https://github.com/pskiran1/TensorRT-support-for-Tensorflow-2-Object-Detection-Models
(which was also recommended on the NVIDIA Developer Forum)
I would like to emphasize that this repo is the only source that documents that float inputs are needed during export to successfully get through the saved_model → ONNX → TRT conversion and build chain. I can create the ONNX model, but building the engine always fails because of the EfficientNMS_TRT plugin, which is not supported by TensorRT 8.0.1 (a sketch of the build step I am running follows the issue links below).
See the related issues on GitHub:
- https://github.com/pskiran1/TensorRT-support-for-Tensorflow-2-Object-Detection-Models/issues/6
- https://github.com/NVIDIA/TensorRT/issues/1538 (EfficientNMS_TRT not working on Jetson Nano with TensorRT 8.0.1)
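For reference, this is roughly the build step I am running on the Nano, as a minimal Python sketch of the repo's engine-build flow (the ONNX/engine file names and the workspace size are just my values):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
# Register the built-in TensorRT plugins (EfficientNMS_TRT should come from here).
trt.init_libnvinfer_plugins(logger, "")

builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# The ONNX file produced by the repo's create_onnx.py (file name is mine).
with open("ssd_mobilenet_v2_fpnlite_640x640.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parsing failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 28  # 256 MB, conservative for the Nano's 4 GB
config.set_flag(trt.BuilderFlag.FP16)

# This is the step that fails for me on TensorRT 8.0.1.6.
engine = builder.build_engine(network, config)
if engine is None:
    raise SystemExit("Engine build failed")
with open("ssd_mobilenet_v2_fpnlite_640x640.trt", "wb") as f:
    f.write(engine.serialize())
```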
My first question: is this really my problem, or should TF2 OD models be supported/compatible (and, more importantly, buildable) on Jetson Nano devices with the newest JetPack 4.6? If they should be compatible (and I am wrong), could you please help me build the given model on my Nano? (As a demonstration, a pretrained model from the TF2 OD model zoo would be perfect.)
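For what it is worth, I used this quick Python check to see what the installed TensorRT actually registers on the device:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
trt.init_libnvinfer_plugins(logger, "")

# List every plugin creator the installed TensorRT knows about.
names = sorted(c.name for c in trt.get_plugin_registry().plugin_creator_list)
print("TensorRT version:", trt.__version__)
print("\n".join(names))
print("EfficientNMS_TRT available:", "EfficientNMS_TRT" in names)
```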
In case I am right, and my problem is indeed that my Jetson Nano needs a newer TensorRT than 8.0.1: I have seen that the new JetPack 4.6.1 (with TensorRT 8.2+) is planned for release soon. Could you please provide any hints on the release date beyond what is on your roadmap (Jetson Roadmap | NVIDIA Developer)?
In case the release is a long time away, could you provide a fallback mechanism like the one provided for EfficientDet in the TensorRT samples? Something along the lines of the onnx-graphsurgeon sketch below is what I have in mind:
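(This is only a rough sketch of the idea: swap the EfficientNMS_TRT node for the older BatchedNMS_TRT plugin. The class count, thresholds, and tensor/file names are assumptions from my own pipeline, and I have not verified the shape/dtype details.)

```python
import numpy as np
import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("model.onnx"))

# Locate the NMS node that create_onnx.py inserted.
nms = next(n for n in graph.nodes if n.op == "EfficientNMS_TRT")
boxes, scores = nms.inputs[0], nms.inputs[1]

# BatchedNMS_TRT expects boxes as [batch, num_boxes, 1, 4] when shareLocation=1,
# so add an Unsqueeze on axis 2 in front of it (axes attribute, i.e. opset <= 12).
boxes_4d = gs.Variable(name="nms_boxes_4d", dtype=np.float32)
graph.nodes.append(gs.Node(op="Unsqueeze", inputs=[boxes], outputs=[boxes_4d],
                           attrs={"axes": [2]}))

attrs = {
    "shareLocation": 1,
    "backgroundLabelId": -1,
    "numClasses": 90,       # assumption: my label map size
    "topK": 1024,
    "keepTopK": 100,
    "scoreThreshold": 0.3,  # assumption: from my pipeline config
    "iouThreshold": 0.5,
    "isNormalized": 1,
    "clipBoxes": 1,
}

# Reuse the four existing graph outputs (same order for both plugins). Note that
# BatchedNMS_TRT emits class IDs as float32 while EfficientNMS_TRT emits int32,
# so downstream code may need a cast.
outputs = nms.outputs
nms.outputs = []  # detach the old node so cleanup() removes it
graph.nodes.append(gs.Node(op="BatchedNMS_TRT", inputs=[boxes_4d, scores],
                           outputs=outputs, attrs=attrs))

graph.cleanup().toposort()
onnx.save(gs.export_onnx(graph), "model_batchednms.onnx")
```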
Or can I install a newer TensorRT individually, without replacing my whole JetPack installation? Or is waiting for the release the recommended path?
Thank you in advance for your help and your work!