Hi kayccc, thank you for your info.
But what if I’d like to know a specific process’s resource consumption (RAM and GPU memory)? I mean, I want to record a specific model’s RAM and GPU memory consumption during inference, and the more accurate the better.
So are there any APIs I can call for this purpose?
Thank you @AastaLLL for your quick response.
From the link you provided, I see we can get the free GPU memory and the total GPU memory in the system via the cudaMemGetInfo API, just as I mentioned from the link I posted.
Now the question is how I can get memory usage in terms of RAM. Have I got the right way to query RAM usage?
Thanks.
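For reference, here is a minimal sketch of what I have in mind, assuming a Linux Jetson target: it reads the system-wide GPU figures from cudaMemGetInfo and the process’s resident RAM (VmRSS) from /proc/self/status. The helper name currentRssKb and the build command are my own, not from NVIDIA docs.

```cpp
// Minimal sketch (assumptions noted above).
// Build e.g.: nvcc mem_query.cu -o mem_query
#include <cuda_runtime.h>
#include <cstdio>
#include <cstring>

// Parse VmRSS (resident set size, in kB) of the current process.
static long currentRssKb() {
    FILE* f = fopen("/proc/self/status", "r");
    if (!f) return -1;
    char line[256];
    long rssKb = -1;
    while (fgets(line, sizeof(line), f)) {
        if (strncmp(line, "VmRSS:", 6) == 0) {
            sscanf(line + 6, "%ld", &rssKb);
            break;
        }
    }
    fclose(f);
    return rssKb;
}

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    // cudaMemGetInfo reports free/total device memory for the whole system,
    // not a per-process figure.
    if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
        fprintf(stderr, "cudaMemGetInfo failed\n");
        return 1;
    }
    printf("GPU memory: free %zu MB / total %zu MB\n",
           freeBytes >> 20, totalBytes >> 20);
    printf("Process RAM (VmRSS): %ld kB\n", currentRssKb());
    return 0;
}
```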
Jetson is a shared-memory system, which means the physical memory is used by both the CPU and the GPU.
So the memory available to CUDA will be similar to the remaining free physical memory.
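A small sketch of that point, assuming a Linux Jetson: comparing the free value from cudaMemGetInfo against MemAvailable in /proc/meminfo should show the two numbers tracking each other, since they draw from the same physical memory. The helper name memAvailableKb is illustrative, not an NVIDIA API.

```cpp
// Sketch: compare CUDA free memory with system MemAvailable on a shared-memory Jetson.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstring>

// Read MemAvailable (in kB) from /proc/meminfo.
static long memAvailableKb() {
    FILE* f = fopen("/proc/meminfo", "r");
    if (!f) return -1;
    char line[256];
    long kb = -1;
    while (fgets(line, sizeof(line), f)) {
        if (strncmp(line, "MemAvailable:", 13) == 0) {
            sscanf(line + 13, "%ld", &kb);
            break;
        }
    }
    fclose(f);
    return kb;
}

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    cudaMemGetInfo(&freeBytes, &totalBytes);
    printf("CUDA free:    %zu kB\n", freeBytes >> 10);
    printf("MemAvailable: %ld kB\n", memAvailableKb());
    // On a shared-memory system these two values should be of the same order.
    return 0;
}
```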