Which mobile GPU do you recommend for deep learning work?

Hi, I have a powerful workstation at home with an Nvidia RTX 2080 Ti. I am planning to buy a laptop for basic office productivity work as well as some deep learning prototyping while on the go. Since I dislike noisy fans, the GPU in the laptop does not have to be a top-tier one, as I already have the workstation. Could you please let me know the following?

  1. How do the following mobile video cards rank in terms of performance for deep learning work?
    Intel UHD620, GTX 1050 Ti Max-Q, GTX 1660 Ti, GTX 1650/1660 (Ti), Quadro P1000 4GB, Quadro P2000 4GB, RTX?

  2. How many times faster are the UHD620, GTX, Quadro and RTX cards compared with each other when doing TensorFlow-related work?

  3. How many times faster are the UHD620, GTX, Quadro and RTX cards compared with each other when doing TensorRT-related work?

Thank you

My answer? How about none of the above. I’d look for a laptop with a GTX 1060 or GTX 1070; if you have the cash, there are some starting to hit the market with RTX 2060s in them. Okay, the GTX 1660 is also a good choice and is slightly faster than the GTX 1060. The nice thing about the RTX cards is the Tensor cores, which really speed things up; however… some advances have been made in CNN algorithms that reduce the resources needed by up to 90%!
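One thing worth knowing: the Tensor cores only help if your framework actually uses them. Here’s a minimal sketch of turning on mixed precision in TensorFlow (assuming TensorFlow 2.x; the layer sizes are just placeholders, not a recommendation):

```python
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

# Run matmuls/convolutions in float16 where the GPU supports it (RTX and newer).
# On a GTX card or the UHD620 this gives little or no speedup.
mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
    layers.Flatten(),
    # Keep the final softmax in float32 for numerical stability.
    layers.Dense(10, activation="softmax", dtype="float32"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

On an RTX card that one-line policy change is usually where most of the Tensor-core speedup comes from.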

I would STAY AWAY FROM THE QUADRO line if your main purpose is machine learning/TensorFlow. I work for an engineering company and my work PC has the P1000 card. The CAD guys love them, BUT… as far as cores and speed go, my $300 GTX 1060 at home has roughly double the compute performance of this overpriced P1000 at $450.
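If you’d rather measure the gap yourself than take my word for it, a crude matrix-multiply timing in TensorFlow is enough to compare any two of these cards. This is just an illustrative sketch; the matrix size and iteration count are arbitrary choices:

```python
import time
import tensorflow as tf

def time_matmul(n=4096, iters=20):
    """Crude GPU throughput check: time a run of large matrix multiplies."""
    a = tf.random.normal((n, n))
    b = tf.random.normal((n, n))
    _ = tf.matmul(a, b).numpy()  # warm-up so setup cost isn't counted
    start = time.time()
    for _ in range(iters):
        c = tf.matmul(a, b)
    _ = c.numpy()  # force execution to finish before stopping the clock
    return iters / (time.time() - start)

print("Devices:", tf.config.list_physical_devices("GPU"))
print("matmuls/sec:", time_matmul())
```

Run it on the laptop you’re eyeing and on your 2080 Ti and you’ll get a rough answer to your “how many times faster” questions for your own workloads.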

Hope this helps.