Hi Team,
We want to purchase a 13–14 inch laptop for AI learning that supports CUDA. I checked, and many laptops have an NVIDIA GeForce MX150 card. While going through the forum I saw that users have faced issues running CUDA on the NVIDIA GeForce MX150, but your link says the NVIDIA GeForce MX150 supports CUDA. I am a little confused, so please tell me whether we should buy a laptop with the NVIDIA GeForce MX150 or not, as the laptop is very costly.
The laptop we are looking for is 13 to 14 inches in size.
We want to run CUDA with TensorFlow on the GPU, so please advise.
Hi, please reply to the above thread.
GeForce MX150 supports CUDA.
Does it support high-end GPU applications?
It has various limitations, such as memory size.
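As a rough illustration of that memory limitation, here is a back-of-envelope sketch (assumptions: an MX150 with 2 GB of VRAM, which is the common variant, and FP32 weights at 4 bytes per parameter — activations, workspace, and CUDA context overhead come on top of this):

```python
# Hypothetical sketch: estimate whether a model's weights alone fit in
# the MX150's VRAM. FP32 weights take 4 bytes per parameter.

def weights_bytes(num_params: int, bytes_per_param: int = 4) -> int:
    """Approximate GPU memory needed just for the model weights."""
    return num_params * bytes_per_param

vram = 2 * 1024**3  # 2 GiB MX150 (assumption; some variants ship 4 GiB)

# e.g. a ~25.6M-parameter CNN -> ~98 MiB of FP32 weights: fits easily.
print(weights_bytes(25_600_000) / 1024**2)   # ≈ 97.66 MiB

# A 500M-parameter model -> ~2 GB of weights alone: already exceeds
# what is usable on a 2 GiB card once overhead is accounted for.
print(weights_bytes(500_000_000) >= vram * 0.9)  # True
```

In practice the usable memory is less than the nominal 2 GiB, since the CUDA context and framework runtime reserve a few hundred megabytes before your model loads.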
I’ve installed TensorRT 8.2 with CUDA Toolkit 10.2 on a GeForce MX150.
But running some TensorRT functions causes an out-of-memory (OOM) error.
Even though I changed the argument of IBuilder::setMaxWorkspaceSize, it didn’t work.
Any suggestion or advice for this issue?
Please…
Should I install TensorRT 7.x? Would that solve this issue?
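One likely reason changing the workspace size didn’t help: the workspace is only TensorRT’s *scratch* pool for tactic selection, while the engine’s weights and activation buffers are allocated separately. A small arithmetic sketch (the byte figures are illustrative assumptions, not measurements from your model):

```python
# Hypothetical sketch: on a 2 GiB MX150, if weights + activation buffers
# already consume most of the VRAM, no workspace setting can avoid OOM,
# because the workspace only governs TensorRT's scratch memory.

GIB = 1 << 30

total_vram = 2 * GIB          # typical GeForce MX150 memory (assumption)
weights = int(1.4 * GIB)      # example: a large FP32 model's weights
activations = int(0.5 * GIB)  # example: activation/binding buffers

free_for_workspace = total_vram - weights - activations
print(free_for_workspace / GIB)  # ~0.1 GiB left, regardless of the setting
```

So if the model itself is near the card’s capacity, tuning the workspace argument up or down cannot make it fit.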
It’s entirely possible that attempting to serialize and use a large model in TensorRT would fail due to running out of memory. To perform inference on a model, TensorRT needs to be able to load the model into GPU memory. If the model is large enough, a GPU with a very small memory size like the MX150 might not be able to hold it.
I doubt an earlier (or other) version of TensorRT would help here if model size is the issue. The recommendation at that point would be to use a GPU with more memory.