TF consuming too much RAM

here is my py code

import tensorflow as tf
from PIL import Image

width = 4
height = 4
img = Image.open('./test.png')
img = img.convert('RGB')
image = tf.image.resize(img, [height, width], method='bicubic', preserve_aspect_ratio=True)
image = tf.cast(image, tf.uint8)
input_tensor = image[tf.newaxis, ...]

and input_tensor is

<tf.Tensor: shape=(1, 2, 4, 3), dtype=uint8, numpy=
array([[[[209, 215, 213],
         [227, 230, 223],
         [255, 255, 255],
         [255, 255, 255]],

        [[248, 246, 247],
         [255, 255, 255],
         [143, 136, 100],
         [255, 255, 255]]]], dtype=uint8)>

I would like to know why running this code consumes about 18 GiB of RAM. Is it normal for it to consume so much?


Yes, TensorFlow itself occupies a significant amount of memory just to load the library.
To reduce memory usage, it's recommended to try our TensorRT library.
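If the goal is only lightweight preprocessing like the 4x4 resize in the question, another option is to avoid importing TensorFlow at all. A minimal sketch, assuming Pillow and NumPy are installed (the `preprocess` helper name is ours, not from the original code), that reproduces the same bicubic, aspect-ratio-preserving resize:

```python
import numpy as np
from PIL import Image

def preprocess(img: Image.Image, width: int = 4, height: int = 4) -> np.ndarray:
    """Bicubic resize preserving aspect ratio, returning an NHWC uint8 batch."""
    img = img.convert("RGB")
    # preserve_aspect_ratio=True in tf.image.resize scales the image so it
    # fits inside (height, width); taking the min scale reproduces that.
    scale = min(width / img.width, height / img.height)
    new_size = (max(1, round(img.width * scale)), max(1, round(img.height * scale)))
    img = img.resize(new_size, Image.BICUBIC)
    # Add a batch dimension, matching the shape of the TF input_tensor.
    return np.asarray(img, dtype=np.uint8)[np.newaxis, ...]

# Usage with the image from the question:
# input_tensor = preprocess(Image.open("./test.png"))
```

For a 2:1 source image this yields the same (1, 2, 4, 3) shape shown above, without TensorFlow's large library and runtime allocations.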


Before executing, running the command 'tegrastats' reports RAM 5493/30536MB.

Importing the required libraries and loading the image → RAM 24379/30536MB

Load TRT_FP16 Model → RAM 26675/30536MB

Predict input → RAM 29715/30536MB
It takes a long time, and I get an out-of-memory error.

Is there any way for me to get a prediction from the TRT model?


Yes, please follow the sample below to convert a TensorFlow model into a TensorRT engine:
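The original sample link is not preserved in this thread. As a rough sketch of the usual TF-TRT conversion path (assuming a TF2 SavedModel and a TensorFlow build with TensorRT support, such as the JetPack wheel; the directory paths and the function name are placeholders):

```python
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

def convert_saved_model_to_trt_fp16(saved_model_dir: str, output_dir: str) -> None:
    """Convert a TF2 SavedModel into a TF-TRT FP16 model (paths are placeholders)."""
    params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.FP16)
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=saved_model_dir,
        conversion_params=params,
    )
    converter.convert()          # builds the TensorRT-optimized graph
    converter.save(output_dir)   # writes a SavedModel with TRT engines embedded

# Example (hypothetical paths):
# convert_saved_model_to_trt_fp16("./saved_model", "./saved_model_trt_fp16")
# trt_model = tf.saved_model.load("./saved_model_trt_fp16")
```

Converting offline and loading only the TRT SavedModel at inference time avoids rebuilding engines on the device, which is where much of the memory pressure during prediction comes from.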

