I am trying to convert a TensorFlow classification model to TensorRT.
At first the conversion process was killed (memory was full), so I increased the swap file size.
After that it started, but it has now been running for more than 2 hours and still has not finished.
Memory is completely full and swap usage is at 4.5 GB.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

print('Converting to TF-TRT FP16...')
conversion_params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
    precision_mode=trt.TrtPrecisionMode.FP16,
    max_workspace_size_bytes=4000000000)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir='resnet50_saved_model',
    conversion_params=conversion_params)
converter.convert()
converter.save(output_saved_model_dir='resnet50_saved_model_TFTRT_FP16')
print('Done Converting to TF-TRT FP16')
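One thing I am wondering about: the TX2 has 8 GB of memory shared between CPU and GPU, so a 4 GB TensorRT workspace on top of TensorFlow's own allocations may be part of the pressure. As a sketch of what I could try (the 2 GB value is just an assumption, not something I have verified works for this model):

```python
GB = 1 << 30  # bytes per gibibyte

# Hypothetical smaller workspace for the TX2's shared 8 GB (assumed value),
# to be passed as max_workspace_size_bytes instead of 4000000000.
max_workspace_size_bytes = 2 * GB
print(max_workspace_size_bytes)
```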
GPU: Jetson TX2
Operating System + Version: JetPack 4.3