I'm trying to run inference in TensorRT on my Jetson AGX Xavier with a custom TF2 model that predicts age and gender from a face image, built and trained by me.
I exported the model with tf.saved_model.save, then converted the resulting .pb into a TF1-style frozen graph .pb, because the UFF converter can't read the TF2 SavedModel format.
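For reference, the freezing step looked roughly like this. I'm showing it with a tiny stand-in tf.Module instead of my real age/gender model, and the file name is just a placeholder:

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Tiny stand-in for my real model, just to illustrate the freezing step.
class Model(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.ones([4, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

m = Model()
concrete = m.__call__.get_concrete_function()

# Inline all variables as constants -> TF1-style frozen GraphDef.
frozen = convert_variables_to_constants_v2(concrete)
graph_def = frozen.graph.as_graph_def()

tf.io.write_graph(graph_def, ".", "frozen_graph.pb", as_text=False)
```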
After that, I converted the frozen graph into a UFF file, but when I tried to parse it with trt.UffParser(), I first got the following error:
“Unsupported operation: _AddV2”
I fixed this by replacing all 'AddV2' ops with "Add" using graphsurgeon, but now another error appears:
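In case it helps, the renaming I did amounts to the following. I'm showing it directly on a minimal stand-in GraphDef (my actual script used graphsurgeon's find_nodes_by_op, which does the equivalent):

```python
import tensorflow as tf

# Minimal stand-in GraphDef containing an AddV2 node; my real graph is the
# frozen model. Add and AddV2 have the same semantics for float tensors,
# so renaming the op is safe.
gdef = tf.compat.v1.GraphDef()
a = gdef.node.add()
a.name, a.op = "a", "Const"
b = gdef.node.add()
b.name, b.op = "b", "Const"
s = gdef.node.add()
s.name, s.op = "sum", "AddV2"
s.input.extend(["a", "b"])

# Rename every AddV2 to Add so the UFF parser accepts it.
for node in gdef.node:
    if node.op == "AddV2":
        node.op = "Add"
```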
“Unsupported operation: _NoOp”
As I understand it, this operation does nothing, but I couldn't find a way to remove it or replace it with a compatible op (I tried Placeholder, Constant, and Identity, but got more errors).
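One of the things I tried was stripping the NoOp nodes outright, together with the control-dependency inputs ("^name") that reference them. A minimal sketch with a stand-in GraphDef (node names are made up):

```python
import tensorflow as tf

# Stand-in GraphDef: a NoOp plus a node with a control-dependency edge
# ("^init"), mimicking what the TF2 export leaves behind.
gdef = tf.compat.v1.GraphDef()
init = gdef.node.add()
init.name, init.op = "init", "NoOp"
out = gdef.node.add()
out.name, out.op = "out", "Identity"
out.input.extend(["x", "^init"])

# Drop NoOp nodes and any control inputs that point at them.
noop_names = {n.name for n in gdef.node if n.op == "NoOp"}
new = tf.compat.v1.GraphDef()
for n in gdef.node:
    if n.op == "NoOp":
        continue
    copied = new.node.add()
    copied.CopyFrom(n)
    inputs = [i for i in copied.input if i.lstrip("^") not in noop_names]
    del copied.input[:]
    copied.input.extend(inputs)
```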
Any idea how I can fix this? I'd really rather not retrain my model in TF1 or Keras.
Thank you very much. Regards.