I want to render OptiX output into a Direct3D texture. I created an ID3D10Resource for use in OptiX, but rtBufferCreateFromD3D10Resource returns RT_ERROR_INVALID_VALUE. I don't know where the problem is, but I suspect the ID3D10Resource I obtained from my ID3D10ShaderResourceView. Can anyone please show me how to correctly render to a Direct3D texture?
Here is a snippet of my code.
// d3dDevice is my Direct3D device; by the time OptiX runs
// it is fully initialized and working
rtContextCreate( &context );
rtContextSetRayTypeCount( context, 1 );
rtContextSetEntryPointCount( context, 1 );
// pSRView is an ID3D10ShaderResourceView created from an ID3D10Texture2D;
// optixResource is the underlying resource from pSRView->GetResource()
ID3D10Resource* optixResource = NULL;
pSRView->GetResource( &optixResource );
// this call returns RT_ERROR_INVALID_VALUE
rtBufferCreateFromD3D10Resource( context, RT_BUFFER_OUTPUT, optixResource, &buffer );
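For comparison, here is a sketch of the setup order the interop API expects, based on the OptiX 2.x D3D10 interop header: the context has to be associated with the D3D10 device via rtContextSetD3D10Device before any interop buffer is created, and the shared resource must be a buffer, not a resource obtained from a texture view (texture resources go through rtTextureSamplerCreateFromD3D10Resource instead). This is an illustrative, non-compilable sketch; d3dDevice and sharedBuffer are placeholder names, and the format/size calls are my assumption by analogy with the OpenGL interop path.

```cpp
// Sketch only: needs optix_d3d10_interop.h from the OptiX SDK; not compilable standalone.
RTcontext context;
RTbuffer  buffer;

rtContextCreate( &context );
rtContextSetRayTypeCount( context, 1 );
rtContextSetEntryPointCount( context, 1 );

// 1) Associate the D3D10 device with the context BEFORE creating interop buffers.
rtContextSetD3D10Device( context, d3dDevice );

// 2) Share a D3D10 *buffer* resource, not one obtained from a texture view.
//    sharedBuffer is an ID3D10Buffer* (placeholder name).
rtBufferCreateFromD3D10Resource( context, RT_BUFFER_OUTPUT, sharedBuffer, &buffer );

// 3) Format and element count are still declared on the OptiX side.
rtBufferSetFormat( buffer, RT_FORMAT_FLOAT4 );
rtBufferSetSize2D( buffer, width, height );
```

If either step 1 is missing or the resource comes from a texture, RT_ERROR_INVALID_VALUE is the kind of error you would expect.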
I'm also trying to use the output of OptiX as a D3D texture. I've read in the OptiX Programming Guide that it's only possible to write into buffer objects (chapter on Direct3D Interop).
So I created a buffer in D3D and successfully called rtBufferCreateFromD3D10Resource. I then wanted to use it as a shader resource by creating a shader resource view and declaring it as
uniform Buffer optixBuffer;
It didn't work, and I finally found the relevant information in the guide's "D3D Buffer-Creation Flags" section. I'm new to D3D, and after changing the usage flag accordingly, the program crashed with an exception.
Now I'm confused. If the buffer can't be an input or output to any shader program, what is the use of it? And how can I use the OptiX output in D3D without going through main memory, if I have to make the buffer a staging resource (which, as far as I know, means I can't bind it as a shader resource)?
I'm using .NET and SlimDX, but it should be the same D3D underneath.
(By the way: I'm using OptiX 2.5, but the "D3D Buffer-Creation Flags" section is also in the OptiX 3.0 documentation.)
I'm not sure you gain anything from interop if the shared resource is a staging resource. It might be just as fast to create a buffer of the correct dimensions (your 'buffer' variable in the last snippet) and bind that to 'output_mixed_buffer'. Then, to copy it to the actual texture, Map() the buffer and copy it either line by line (after texture.Lock) or try UpdateSubresource.
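The line-by-line copy matters because a mapped texture usually has a row pitch larger than width * bytesPerPixel, so one memcpy of the whole buffer would interleave the padding into the pixels. A minimal, self-contained sketch of that loop (plain memory standing in for the mapped pointers; copyPitched is a made-up helper name):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Copy a tightly packed source image into a destination whose rows are
// dstPitch bytes apart (dstPitch >= rowBytes). In real code, src would come
// from Map() on the staging buffer, and dst/dstPitch from mapping the texture.
void copyPitched(const uint8_t* src, uint8_t* dst,
                 size_t rowBytes, size_t dstPitch, size_t height)
{
    for (size_t y = 0; y < height; ++y)
        std::memcpy(dst + y * dstPitch, src + y * rowBytes, rowBytes);
}
```

UpdateSubresource does the same job in one call if you pass the source's own pitch; the manual loop is the fallback when you already hold both mapped pointers.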
It makes sense that you can't write directly to the underlying data of a texture, because its memory layout is likely swizzled. It might work with a render target, but unfortunately rtBufferCreateFromD3D11Resource doesn't accept an ID3D11Resource that is really an ID3D11Texture2D.
I just need to render the OptiX output into a texture so I can work with it in Direct3D (see my topmost post, please). The OptiX Programming Guide briefly mentions rtBufferCreateFromD3D10Resource, but I keep getting RT_ERROR_INVALID_VALUE from it. Can someone give me a hint about what's wrong with my code, or show me a working example?
It's working: I render with OptiX into a staging buffer and then copy it into a default-usage buffer for use in the pixel shader. Maybe somebody will run into similar problems, so here is what mine were:
It's not possible to copy a buffer into a texture in D3D, and there is no warning or error if you try; it just doesn't work.
I was indexing with UV coordinates, but those only run from 0.0 to 1.0, so you have to multiply by width/height.
I had problems with the confusing member names in ShaderResourceViewDescription (SlimDX): it only exposes ElementWidth, but in native D3D that field is in a union with NumElements, so I had the wrong value there.
According to the OptiX Programming Guide you need CPU read access to the buffer, which as far as I know is only possible with a staging buffer. Now that I have working code I will experiment; maybe they removed this requirement but didn't update the guide. By not copying from GPU to RAM and back, I gained 4 fps, from ~9 to ~13, on my GTX 550 Ti.
Edit 3: my measurement was wrong; the gain is only ~1.2 fps, from ~9.6 to ~10.8.
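On the UV-indexing point above: a texture-style UV in [0,1] has to be scaled by the buffer dimensions and truncated to an integer texel before it can index a linear buffer. A small self-contained sketch; uvToIndex is a made-up helper name, and rounding to the nearest texel is one of several reasonable conventions:

```cpp
#include <cstddef>

// Map a UV coordinate in [0,1] x [0,1] to a linear index into a
// width x height buffer stored row by row.
size_t uvToIndex(float u, float v, size_t width, size_t height)
{
    // Scale by (dim - 1) so u == 1.0 lands exactly on the last texel,
    // then round to the nearest integer coordinate.
    size_t x = static_cast<size_t>(u * (width  - 1) + 0.5f);
    size_t y = static_cast<size_t>(v * (height - 1) + 0.5f);
    return y * width + x;
}
```

The alternative convention, x = min(u * width, width - 1), samples texel centers instead; either works as long as the write and read sides agree.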
As said in my first post in this thread, it's not possible to render with OptiX into a texture; you have to use buffers. If there is no way around a texture for you, you can bind the texture as a render target, render a screen-filling quad, and write the buffer contents into the texture that way. It would be slow, though, and I don't know a better way; maybe there is one, I'm new to D3D.
Indeed, you don't need CPU read access; at least it seems to work (using OptiX 2.5). Right now I'm rendering with OptiX directly into a buffer that is bound as a ShaderResource.
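For anyone reproducing this, here is a sketch of that final setup in native D3D10 C++ rather than SlimDX (Windows-only, not verified against the SDK, so treat it as an assumption-laden outline): a default-usage buffer with shader-resource binding, registered with OptiX, then exposed to the pixel shader through a buffer SRV. Note that the element count goes into the field SlimDX calls ElementWidth (the union partner of NumElements mentioned above).

```cpp
// Sketch only: needs d3d10.h and the OptiX D3D10 interop header.
D3D10_BUFFER_DESC bd = {};
bd.ByteWidth = width * height * sizeof(float) * 4;  // RGBA float pixels
bd.Usage     = D3D10_USAGE_DEFAULT;                 // no CPU access needed
bd.BindFlags = D3D10_BIND_SHADER_RESOURCE;
ID3D10Buffer* optixBuffer = NULL;
d3dDevice->CreateBuffer( &bd, NULL, &optixBuffer );

// Register with OptiX; the device must already be set via rtContextSetD3D10Device.
RTbuffer buffer;
rtBufferCreateFromD3D10Resource( context, RT_BUFFER_OUTPUT, optixBuffer, &buffer );
rtBufferSetFormat( buffer, RT_FORMAT_FLOAT4 );
rtBufferSetSize2D( buffer, width, height );

// View for the pixel shader: one element per pixel.
D3D10_SHADER_RESOURCE_VIEW_DESC srvd = {};
srvd.Format               = DXGI_FORMAT_R32G32B32A32_FLOAT;
srvd.ViewDimension        = D3D10_SRV_DIMENSION_BUFFER;
srvd.Buffer.ElementOffset = 0;
srvd.Buffer.ElementWidth  = width * height;  // union with NumElements
ID3D10ShaderResourceView* srv = NULL;
d3dDevice->CreateShaderResourceView( optixBuffer, &srvd, &srv );
```

The pixel shader then reads it as a Buffer<float4>, indexing with y * width + x computed from the scaled UVs.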