Is there any way to get Turing to consume SPIR-V shaders which use Float16?
I know the hardware should be able to do it, but there doesn’t seem to be an option to actually enable it.
(I run into “SPIR-V module not valid: Using a 16-bit floating point type requires the Float16 or Float16Buffer capability, or an extension that explicitly enables 16-bit floating point.”)
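For context on what that validator message is checking (a sketch, assuming a Vulkan stack with a glslang-style compiler): the SPIR-V module must declare `OpCapability Float16` before it may use 16-bit float types, and the compiler only emits that capability when the shader opts in. Something along these lines should produce a module that passes the check:

```glsl
#version 450
// Opting into this extension makes glslang emit OpCapability Float16
// in the generated SPIR-V, which is what the validator is asking for.
#extension GL_EXT_shader_explicit_arithmetic_types_float16 : require

layout(location = 0) out vec4 color;

void main() {
    // 16-bit arithmetic; widened back to 32-bit for the output variable.
    float16_t a = float16_t(1.5);
    float16_t b = float16_t(0.5);
    color = vec4(float(a + b));
}
```

On the API side, the matching device feature is `shaderFloat16` from `VK_KHR_shader_float16_int8` (promoted to `VkPhysicalDeviceVulkan12Features` in Vulkan 1.2), which has to be enabled in the chain passed to `vkCreateDevice`; Turing parts do report it as supported, so in principle only the capability declaration and the feature enable are missing.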