I literally just joined this developer community 10 minutes ago. I recently started a new job in the AI/machine learning field, and I have a question about measuring performance on datasets such as MNIST-Digits, MNIST-Fashion, CIFAR-10, and CIFAR-100. I'd like to gather some benchmark performance metrics comparing CPU vs. GPU.
In simple terms, I'd like to be able to run my test programs with and without the NVIDIA GPU. I want to measure the performance of the CPU alone (with the NVIDIA GPU disabled), then compare that to the performance with the GPU enabled.
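For context, here's roughly what I had in mind: a small harness that hides or exposes the GPU via the `CUDA_VISIBLE_DEVICES` environment variable and times a workload either way. This is just a sketch with a toy stand-in workload; I'm assuming the real version would import a framework like PyTorch or TensorFlow inside the workload function (the env var has to be set before the framework initializes CUDA). The function and variable names here are mine, not from any library.

```python
import os
import time

def run_benchmark(workload, use_gpu: bool) -> float:
    """Time workload() with the NVIDIA GPU either visible or hidden.

    Hiding the GPU via CUDA_VISIBLE_DEVICES only works if it is set
    before the ML framework (e.g. PyTorch or TensorFlow) initializes
    CUDA, so the framework import should happen inside `workload`.
    """
    os.environ["CUDA_VISIBLE_DEVICES"] = "0" if use_gpu else ""
    start = time.perf_counter()
    workload()
    return time.perf_counter() - start

def toy_workload():
    # Stand-in for a real training/evaluation run on MNIST or CIFAR;
    # the real version would import the framework and run the model here.
    return sum(i * i for i in range(100_000))

cpu_time = run_benchmark(toy_workload, use_gpu=False)
print(f"CPU-only run: {cpu_time:.4f}s")
```

In practice I'd expect to run the whole script twice (once per setting) rather than toggling the env var mid-process, since most frameworks latch onto the visible devices at import time.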
What would be considered a “best practices” way to accomplish this testing?