Vague assertions layered on a vague description, with some conspiracy theories thrown in, do not convey enough information for anyone to provide technical assistance. If this is your own code, use the CUDA profiler to find the bottlenecks. If this is a third-party application, request assistance from the software vendor.
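In case it helps to see what "find the bottlenecks" means in practice, here is a minimal sketch of bracketing a suspect kernel with CUDA events to get a first timing number before reaching for the full profiler. The kernel `myKernel` and its workload are hypothetical stand-ins, not anything from the application discussed in this thread:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel standing in for whatever stage you suspect is slow.
__global__ void myKernel(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * 2.0f;
}

int main() {
    const int n = 1 << 20;
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));

    // CUDA events record timestamps on the GPU, so the elapsed time
    // reflects actual device execution rather than host-side overhead.
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    myKernel<<<(n + 255) / 256, 256>>>(d_data, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);  // wait until the stop event has occurred

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("kernel time: %.3f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_data);
    return 0;
}
```

Once a coarse timing like this points at a hot spot, the profiler can break it down further (occupancy, memory throughput, and so on).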
Presumably because system integrators find that the features provided by Xeon processors and Quadro GPUs target roughly the same markets, e.g. engineering workstations. It also maximizes their profits: integrators add a surcharge to the "raw" price of components, so using premium building blocks means more money for the integrator, and the target market will bear the higher price.
As for power consumption, that is a question of GPU architecture, the number of CUDA cores in the GPU, the memory used on the card, and operating frequencies. If you look closely, you will find that a Quadro K5200 and a Titan Xp (Pascal) have quite different specifications. Wikipedia's article on Quadro provides a reasonably accurate matching of Quadro and GeForce models; for the K5200, it lists the GeForce GTX 780 as the closest consumer-grade equivalent.
I'm also having a similar issue with Pix4D, so I will likely be filing a ticket with them as well. I posted this info in the consumer community forum and was directed here; I saw this thread and thought I'd chime in.
Although the Titan Xp looks better on paper, it is being slaughtered by the 1080 Ti in Pix4D. We've tried a number of different drivers and driver configurations for the Titan Xp with no luck. Others are having this issue:
We were having a similar issue with our Quadros underperforming, but the driver has a simple option that seems to fix it: setting "3D App - Game Development" in the global options. After that, it runs in Pix4D like a boss. This option is not available in the drivers for the Titan Xp. Is there something we are missing for the Titan Xp, or something else we could try?