Basic OptiX use turning image red

Hello!

Apologies if this is a very basic question, but I am trying to use the OptiX denoiser on a noisy PNG image, and it is giving me a pretty unusable result: an image that is all red and full of checkerboard patterns.
This is the original image:


Stored as a PNG file.
And this is the output:

Details about what I am doing:

  • I am using OptiX 7.3.
  • The image format is given to OptiX as Float3.
  • There is no reordering of the buffer elements between the format conversions.
  • The initial float3 buffer is made by converting the uchar RGB PNG data to float and dividing by 255.0.
  • The output PNG buffer is made by the opposite process: multiply by 255.0, then convert to uchar.
  • I do not give OptiX a guide albedo or normal image.
  • I do not give it an HDR intensity either.
  • The model kind is LDR; choosing HDR doesn't make a difference.
  • The output pixels appear to be in LDR range, but I've also observed it producing gigantic numbers.
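A minimal sketch of the u8 ↔ f32 round trip described in the bullets above (helper names are illustrative, not from the actual wrapper):

```rust
// Hypothetical conversion helpers: uchar RGB -> f32 in [0, 1] and back.
fn u8_to_f32(data: &[u8]) -> Vec<f32> {
    data.iter().map(|&c| c as f32 / 255.0).collect()
}

fn f32_to_u8(data: &[f32]) -> Vec<u8> {
    data.iter()
        // Clamp before casting, since the denoiser output may exceed [0, 1].
        .map(|&f| (f * 255.0).round().clamp(0.0, 255.0) as u8)
        .collect()
}

fn main() {
    let rgb = vec![0u8, 128, 255];
    let floats = u8_to_f32(&rgb);
    let back = f32_to_u8(&floats);
    // The round trip is lossless for 8-bit input.
    assert_eq!(rgb, back);
    println!("round trip ok: {:?}", back);
}
```

Note the clamp on the way back: without it, the "gigantic numbers" mentioned above would wrap around when cast to u8.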

Now, I realize that this is a horrible way to run the denoiser because I give it basically no other hints, but is it expected to get such a bad result on a simple image? Or am I perhaps doing something wrong?

That’s definitely unexpected.

Could you please add some more information about your system configuration so that we can try to reproduce this internally?
OS version, installed GPU(s), VRAM amount, display driver version, OptiX version (major.minor.micro), CUDA toolkit version (major.minor) used to generate the input PTX, and host compiler version.

Did this work with an older driver?

The model kind is LDR; choosing HDR doesn't make a difference.

There is an optixDenoiser example inside the OptiX SDK 7.3 which can read EXR images and run the denoiser in HDR, temporal, or AOV mode on them.
Could you try converting your PNG image to EXR (with GIMP, for example) and then running the optixDenoiser SDK example in HDR and AOV modes on that input?

Sure, I am on an RTX 3070 with 8192 MB VRAM, driver 472.12, Windows 10, and OptiX is the latest at the time of this post; I am not quite sure where to get the micro version, but I have downloaded the newest to make sure that was not the issue. I am on CUDA 11.4. I did not try with an older driver, but I will try updating my display driver now, since I don't think this is the latest.

There is no GPU code; I am purely running the denoiser. Also, I am not using NVCC: this is my own wrapper for OptiX written in Rust, which just uses C FFI (with OptiX stubs). I can probably provide reproducible Rust code, but that might not be very helpful. Also, "proper" uses of OptiX work, i.e. something like Blender with Cycles works.

I have not tried building the examples, but I will try now and see.

@droettger OK, so it seems to be either an issue with how I convert the PNG or with how I run the denoiser. Converting my PNG to EXR and then using the sample gives me a clean image (albeit with a few artifacts, but that's expected with no guide info):

So I will need to see if I am perhaps giving OptiX some weird parameters. My wrapper makes sure the image dimensions are all consistent, so it can't be weird undefined behavior. And I doubt it is numbers being converted wrong, because the data is valid; it's just not the right color. I will need to go back and thoroughly check everything.

What order does OptiX expect the data to be in? The image library I am using puts x = 0, y = 0 at the bottom left and yields pixels from x = 0 to width, then y = 0 to height.

I think I will go ahead and close this. I tried reimplementing this using the raw bindings instead of the wrapper, and it works. So I have some sort of weird bug in my wrapper.

So for now I think this is solved. If I find that this was the result of an OptiX bug, I will file a bug report, but I expect it is just my bad programming :)

Thanks for your help!

Mhmm, yeah, it was my bad programming:

        match self {
            Half2 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_HALF2,
            Half3 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_HALF3,
            Half4 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_HALF4,
            // Bug: the next two arms map the Float variants to the HALF
            // constants, so OptiX reinterpreted the f32 buffer as f16 data.
            Float2 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_HALF2,
            Float3 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_HALF3,
            Float4 => sys::OptixPixelFormat::OPTIX_PIXEL_FORMAT_FLOAT4,
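For reference, a corrected mapping would presumably look like the sketch below. The enums here are illustrative stand-ins, not the actual generated `sys` bindings:

```rust
// Stand-in for the generated sys binding enum; names are illustrative.
#[derive(Debug, PartialEq, Clone, Copy)]
enum SysPixelFormat {
    Half2, Half3, Half4,
    Float2, Float3, Float4,
}

// Wrapper-side pixel format enum.
#[derive(Clone, Copy)]
enum PixelFormat {
    Half2, Half3, Half4,
    Float2, Float3, Float4,
}

impl PixelFormat {
    fn to_sys(self) -> SysPixelFormat {
        match self {
            PixelFormat::Half2 => SysPixelFormat::Half2,
            PixelFormat::Half3 => SysPixelFormat::Half3,
            PixelFormat::Half4 => SysPixelFormat::Half4,
            // These two arms are the ones that were wrong above:
            PixelFormat::Float2 => SysPixelFormat::Float2,
            PixelFormat::Float3 => SysPixelFormat::Float3,
            PixelFormat::Float4 => SysPixelFormat::Float4,
        }
    }
}

fn main() {
    // A trivial unit test over all variants would have caught the
    // copy-paste bug immediately.
    assert_eq!(PixelFormat::Float2.to_sys(), SysPixelFormat::Float2);
    assert_eq!(PixelFormat::Float3.to_sys(), SysPixelFormat::Float3);
    println!("mapping ok");
}
```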

Well, that makes sense. :-)

I am not quite sure where to get the micro version

That’s in the name of the OptiX download and installation folder. It has always been 0 so far, but in case that ever changes, versions always need to be precise in bug reports.

Please always read the OptiX Release Notes for each individual version before setting up a development system.
That lists which CUDA toolkit version each OptiX release was built with; e.g., OptiX 7.3 was built with CUDA 11.1.
While newer CUDA toolkits usually work as well, there have been cases where the produced PTX code wasn’t parsed correctly by OptiX. On the other hand, there have also been cases where a newer CUDA toolkit solved issues.
Just be aware that the CUDA toolkit version can have an effect on your OptiX application.
(I know that is not relevant for your current use case, unless you’re beginning to trace rays with OptiX.)

What order does OptiX expect the data to be in?

That shouldn’t matter; the denoiser doesn’t care. You just need to place the origin consistently across all input images.

Inside the OptiX examples which are rendering images, the launch index (0, 0) is usually the lower left image pixel.
That simply depends on how the camera is implemented inside the ray generation program.
A lower-left origin matches what OpenGL expects in glTexImage2D, so this is convenient when displaying the raytraced images with an OpenGL textured rectangle, which is what most OptiX examples do.
This means a lower-left origin for the image data given to the denoiser is the common case.
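If an image library and a renderer disagree on the origin, one of the buffers has to be flipped before all layers are handed to the denoiser. A minimal sketch of such a flip (a hypothetical helper, not part of any OptiX API):

```rust
// Flip an interleaved image buffer vertically in place, so that all
// layers fed to the denoiser share the same origin convention.
fn flip_rows<T: Copy>(pixels: &mut [T], width: usize, height: usize, channels: usize) {
    let row_len = width * channels;
    for y in 0..height / 2 {
        // Split so `bottom` starts exactly at the mirror row of `y`.
        let (top, bottom) = pixels.split_at_mut((height - 1 - y) * row_len);
        top[y * row_len..(y + 1) * row_len].swap_with_slice(&mut bottom[..row_len]);
    }
}

fn main() {
    // A 1x3 image with 2 channels per pixel: rows [0,0], [1,1], [2,2].
    let mut img = vec![0u8, 0, 1, 1, 2, 2];
    flip_rows(&mut img, 1, 3, 2);
    assert_eq!(img, vec![2, 2, 1, 1, 0, 0]);
    println!("{:?}", img);
}
```

The same helper works for the beauty, albedo, and normal layers, which keeps the origin consistent across all denoiser inputs.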
