Hello!
I’m looking for a way to learn how much different HLSL shader operations cost, in instructions and in GPU cycles, so I know what is actually inefficient to use and what should be avoided.
What I’ve learned is that this is hardware-specific, and I’ve found a tool that shows this information for AMD: https://shader-playground.timjones.io. Here’s an example of how I use it below.
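(Just a rough sketch, not my exact shader: I paste something like this into Shader Playground, compile it with DXC as a ps_6_0 pixel shader, and look at the AMD ISA output to compare instruction counts between different math ops.)

```hlsl
// Illustrative snippet: mixes cheap ALU ops with transcendentals so the
// generated ISA makes the relative instruction counts easy to compare.
float4 PSMain(float4 uv : TEXCOORD0) : SV_Target
{
    float a = uv.x * uv.y + uv.z;       // usually a single fused multiply-add
    float b = sqrt(abs(uv.x));          // square root
    float c = rsqrt(abs(uv.y) + 1e-6);  // reciprocal square root
    float d = sin(uv.z);                // transcendental, typically more expensive
    return float4(a, b, c, d);
}
```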
But is there a way to learn those costs for NVIDIA? I understand that it very much depends on the exact hardware used for the tests, but I need either a way to get a rough understanding of the topic or a way to check this information for specific hardware, so I can develop an overall understanding.
Any suggestions are welcome, thanks.
