Jetson Xavier NX not able to run segmentation using GPU

Description

I am trying to run Matterport's segmentation as shown in this repo: Segmentation. I have a Jetson Xavier NX, and I am not able to run a basic program on it.
I get the following error towards the end:

Stats: 
Limit:                   195117056
InUse:                   195117056
MaxInUse:                195117056
NumAllocs:                     426
MaxAllocSize:             70344448

2021-06-17 15:09:45.539596: W tensorflow/core/common_runtime/bfc_allocator.cc:424] *******************************************************************************************xxxxxxxxx
2021-06-17 15:09:45.539742: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at assign_op.h:117 : Resource exhausted: OOM when allocating tensor with shape[256] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc

These are the specs of my Jetson:
Python - 3.6.9
Tensorflow - 1.15.4
CUDA - 10.2.89
Cudnn - 8.0.0
JetPack - 4.5

Environment

I feel the GPU isn't being used, and hence it's taking forever. There were a few times when it did execute, but it took close to 5 minutes to process one image, which is highly inefficient, and the Jetson used to hang. I want to know if there's some issue with the TensorFlow version for this particular segmentation application. Also, how do I check whether the GPU is being used or not?
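To answer the last question, a quick way to see whether TensorFlow can reach the GPU at all is to list the devices it detects. This is only a sketch: it assumes the stock `tensorflow` wheel is importable; if no `GPU` entry shows up, the installed build is CPU-only and that alone would explain the ~5 minutes per image.

```python
# Check which devices TensorFlow can actually see.
# If no "GPU" entry appears, inference is falling back to the CPU.
from tensorflow.python.client import device_lib

devices = device_lib.list_local_devices()
gpus = [d.name for d in devices if d.device_type == "GPU"]
if gpus:
    print("GPU visible to TensorFlow:", gpus)
else:
    print("No GPU visible - TensorFlow is running on CPU only")
```

On the Jetson itself you can also watch live GPU utilization with `sudo tegrastats` (the GR3D_FREQ field) while the script runs.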

Relevant Files

I have attached the code of my Python script for reference.
segmentation.py (3.5 KB)

Hi,

The error indicates an out-of-memory condition.
If swap memory is used, performance is expected to be slow.

To get optimal performance on Jetson, please convert the model into TensorRT.
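As a stop-gap before the TensorRT conversion, you can also keep TensorFlow from grabbing all GPU memory up front, which often avoids the bfc_allocator OOM on memory-constrained boards like the Xavier NX (where CPU and GPU share RAM). A minimal sketch using the `tf.compat.v1` API (the 0.6 memory fraction is an assumption to tune for your model):

```python
import tensorflow as tf

# Let the GPU allocator grow on demand instead of pre-allocating,
# and cap the fraction of GPU memory the process may use.
config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True
config.gpu_options.per_process_gpu_memory_fraction = 0.6  # assumption: tune as needed

sess = tf.compat.v1.Session(config=config)
# ... build and run the segmentation graph inside this session ...
```

With a Keras-based model such as Matterport's, this session can be installed globally via `keras.backend.set_session(sess)` before the model is built.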

Thanks.

As I want to modify the segmentation program and add a RealSense camera to it as well, is it okay if I use the TensorRT model? Will it affect the performance later?

Hi,

TensorRT is an inference engine.
You can prepare the data buffer with the frameworks/API you prefer.

Thanks.
