Unity3D RenderTexture/Texture2D To OptixImage2D

This is more of a Unity and CUDA-D3D11 interop question. I don’t use either, so let’s focus on the requirements.

1.) You should check every CUDA API call for errors. A minimal error-check macro is part of the first sketch after this list.
2.) cudaGraphicsD3D11RegisterResource() does not work with all texture formats; in particular, no 3-component format is supported. See the comments in cuda_d3d11_interop.h and check whether the register call actually succeeds.
3.) A registered resource cannot change size. If the application can resize the texture, you would need to unregister and re-register the resource wherever that happens.
4.) The OptixImage2D requires a linear buffer, but you are registering a texture resource, which is not linear memory. You would need to map the resource to get a CUDA array and then copy its contents into a linear buffer; see the second sketch after this list. (At least that’s how it works with OpenGL.)
5.) The OptiX 7 denoiser requires half3/half4 or float3/float4 input formats; uchar3/uchar4 formats are not implemented.
If the Unity texture uses a different format, you would need to convert the data to a suitable format first.
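
Very roughly, points 1.) to 3.) could look like the sketch below. This is not working plugin code; the CUDA_CHECK macro, the registerUnityTexture() helper, and the pTexture argument (the ID3D11Texture2D* you would get from Unity's GetNativeTexturePtr()) are placeholder names for illustration only.

```cpp
#include <cstdio>
#include <d3d11.h>
#include <cuda_runtime.h>
#include <cuda_d3d11_interop.h>

// Point 1.): minimal error-check helper. Real code should handle the failure
// instead of only printing it.
#define CUDA_CHECK(call)                                                           \
  do {                                                                             \
    const cudaError_t err = (call);                                                \
    if (err != cudaSuccess) {                                                      \
      fprintf(stderr, "CUDA error %s at %s:%d: %s\n",                              \
              cudaGetErrorName(err), __FILE__, __LINE__, cudaGetErrorString(err)); \
    }                                                                              \
  } while (0)

// Point 2.): pTexture would be the ID3D11Texture2D* returned by Unity's
// GetNativeTexturePtr(). The register call fails for 3-component formats
// (see the comments in cuda_d3d11_interop.h), so its result must be checked.
cudaGraphicsResource* registerUnityTexture(ID3D11Texture2D* pTexture)
{
  cudaGraphicsResource* resource = nullptr;
  CUDA_CHECK(cudaGraphicsD3D11RegisterResource(&resource, pTexture,
                                               cudaGraphicsRegisterFlagsNone));
  return resource;
}

// Point 3.): if the texture can be resized, the registration becomes invalid.
// Unregister the old resource and register the newly created texture again:
//   CUDA_CHECK(cudaGraphicsUnregisterResource(resource));
//   resource = registerUnityTexture(pNewTexture);
```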
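
And a sketch for points 4.) and 5.), reusing the CUDA_CHECK macro from above. It assumes the texture data is already in a 4-component float format so that no conversion is needed; makeDenoiserInput() and d_linear are again just placeholder names.

```cpp
#include <cuda.h>
#include <cuda_runtime.h>
#include <optix.h>

// Copies the mapped texture contents into a linear device buffer and describes
// that buffer as an OptixImage2D for the denoiser. Assumes the texture data is
// already float4 (RGBA32F); other formats would need a conversion kernel first.
OptixImage2D makeDenoiserInput(cudaGraphicsResource* resource,
                               CUdeviceptr           d_linear, // width * height * sizeof(float4) bytes
                               unsigned int          width,
                               unsigned int          height,
                               cudaStream_t          stream)
{
  CUDA_CHECK(cudaGraphicsMapResources(1, &resource, stream));

  // Point 4.): the mapped texture is a CUDA array, not linear memory.
  cudaArray_t array = nullptr;
  CUDA_CHECK(cudaGraphicsSubResourceGetMappedArray(&array, resource, 0, 0));

  // Copy the array contents into the linear buffer the denoiser can read.
  const size_t rowBytes = width * sizeof(float4);
  CUDA_CHECK(cudaMemcpy2DFromArrayAsync(reinterpret_cast<void*>(d_linear), rowBytes,
                                        array, 0, 0, rowBytes, height,
                                        cudaMemcpyDeviceToDevice, stream));

  CUDA_CHECK(cudaGraphicsUnmapResources(1, &resource, stream));

  // Point 5.): describe the linear buffer in a format the denoiser accepts.
  OptixImage2D image = {};
  image.data               = d_linear;
  image.width              = width;
  image.height             = height;
  image.rowStrideInBytes   = static_cast<unsigned int>(rowBytes);
  image.pixelStrideInBytes = static_cast<unsigned int>(sizeof(float4));
  image.format             = OPTIX_PIXEL_FORMAT_FLOAT4; // half4/float4 work, uchar4 does not
  return image;
}
```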

There is a Vulkan example inside the nvpro-samples which uses the OptiX denoiser.
Maybe that helps to explain the necessary steps some more: https://github.com/nvpro-samples/vk_denoise

The first two hits when searching for “Unity cudaGraphicsD3D11RegisterResource” turned up these similar issues and Unity-specific explanations:
https://issuetracker.unity3d.com/issues/cuda-graphics-interop-fails-in-the-native-plugin-when-using-rendertexture
https://forum.unity.com/threads/getnativetextureptr-call-behavior-differs-from-rendertexture-texture2d-how-come.196911/