Failed to generate TRT engine from ONNX model (Error parsing text-format onnx2trt_onnx.ModelProto: 1:1: Invalid control characters encountered in te)

Hi @AastaLLL,
To run an object detection model on the Jetson Nano, I'm trying to generate a TensorRT engine from a TensorFlow 2 model (trained with the Object Detection API 2). I followed the steps below:
1- Convert the TensorFlow 2 SavedModel (.pb) to ONNX using tf2onnx.
2- Modify the ONNX model to work around the "Unsupported ONNX data type: UINT8 (2)" issue (Exporting Tensorflow models to Jetson Nano - #11 by AastaLLL).
3- Generate the TensorRT engine using either the trtexec tool or jetson.inference.detectNet (Hello AI World).
In step 2, I save the modified ONNX model with:
onnx.save(gs.export_onnx(graph), "updated_model.onnx")
How can I save the model in binary ONNX format?

Thanks