[OptiX 7.2] Tiled denoiser errors in output

Hello,
I’m adding tiling to a denoiser example for a graphics engine, but the denoised output image has bars that shouldn’t be there.
This is the output with the bars:

The bars are the same width as the overlap region. In the image above (3840x2160 px), the tile size is set to 1024x1024 and the overlap to 64 px. The vertical bars appear to be copied from the content to their left, while the horizontal ones are completely messed up.

This is the code I use to tile the image:

// Size of one pixel in bytes for the denoiser pixel formats used here.
unsigned int get_pixel_size(OptixPixelFormat pixelFormat)
{
    switch (pixelFormat) {
    case OPTIX_PIXEL_FORMAT_HALF3:
        return 3 * sizeof(unsigned short);
    case OPTIX_PIXEL_FORMAT_HALF4:
        return 4 * sizeof(unsigned short);
    case OPTIX_PIXEL_FORMAT_FLOAT3:
        return 3 * sizeof(float);
    case OPTIX_PIXEL_FORMAT_FLOAT4:
        return 4 * sizeof(float);
    default:
        return 0; // unsupported format
    }
}
// Split a full-resolution image into overlapping tiles for the denoiser.
// Each tile's input rectangle is enlarged by 'overlap' pixels towards its
// neighbors, while the output rectangle covers only the tile itself.
void IDenoiser::createTilesForDenoising(
    CUdeviceptr        inputBuffer,
    CUdeviceptr        outputBuffer,
    size_t             inputWidth,
    size_t             inputHeight,
    OptixPixelFormat   pixelFormat,
    size_t             overlap,
    size_t             tileWidth,
    size_t             tileHeight,
    std::vector<Tile>& tiles)
{
    const size_t pixelSize = get_pixel_size(pixelFormat);
    const size_t rowStride = inputWidth * pixelSize;

    size_t pos_y = 0;
    do {
        // No overlap above the first row of tiles; 'overlap' pixels above all others.
        const size_t inputOffsetY = pos_y == 0 ? 0 : overlap;
        const size_t available_height = inputHeight - pos_y;
        const size_t actualInputTileHeight = std::min(available_height, overlap + tileHeight) + inputOffsetY;

        size_t pos_x = 0;
        do {
            // No overlap left of the first column of tiles; 'overlap' pixels left of all others.
            const size_t inputOffsetX = pos_x == 0 ? 0 : overlap;
            const size_t available_width = inputWidth - pos_x;
            const size_t actualInputTileWidth = std::min(available_width, overlap + tileWidth) + inputOffsetX;

            Tile tile{};
            {
                // The input rectangle starts 'inputOffset' pixels before the output rectangle.
                const size_t in_posx = pos_x - inputOffsetX;
                const size_t in_posy = pos_y - inputOffsetY;

                tile.input.data             = inputBuffer + in_posy * rowStride + in_posx * pixelSize;
                tile.input.width            = actualInputTileWidth;
                tile.input.height           = actualInputTileHeight;
                tile.input.rowStrideInBytes = rowStride;
                tile.input.format           = pixelFormat;

                tile.output.data             = outputBuffer + pos_y * rowStride + pos_x * pixelSize;
                tile.output.width            = std::min(available_width, tileWidth);
                tile.output.height           = std::min(available_height, tileHeight);
                tile.output.rowStrideInBytes = rowStride;
                tile.output.format           = pixelFormat;

                tile.inputOffsetX = inputOffsetX;
                tile.inputOffsetY = inputOffsetY;
                tiles.push_back(tile);
            }
            pos_x += tileWidth;
        } while (pos_x < inputWidth);
        pos_y += tileHeight;
    } while (pos_y < inputHeight);
}

Do you have any ideas what may be the cause, and what is the solution to getting rid of those bars?

Please always include your system configuration information when asking about OptiX issues to reduce the turnaround time and to allow potential reproducers:
OS version, installed GPU(s), VRAM amount, display driver version, OptiX version (major.minor.micro), CUDA toolkit version (major.minor) used to generate the input PTX, host compiler version.

Is your algorithm in any way different than the OptiX helper function optixUtilDenoiserSplitImage() in OptiX SDK 7.2.0\include\optix_denoiser_tiling.h?

Have you tried using that instead?
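
For reference, here is a rough sketch of how I would expect that helper to be used with the 7.2 API: split the full-resolution input and output images once, then invoke the denoiser per tile with that tile's input offsets. Everything below (function name, parameter plumbing, error handling) is placeholder code and not taken from an SDK example, so please double-check the exact helper signature and tile struct in optix_denoiser_tiling.h.

#include <cuda.h>
#include <optix.h>
#include <optix_denoiser_tiling.h> // optixUtilDenoiserSplitImage(), OptixUtilDenoiserImageTile
#include <vector>

// Sketch only: beauty-layer tiling via the SDK helper. The denoiser, params, state
// and scratch are assumed to have been created for input tiles of size
// (tileWidth + 2 * overlap) x (tileHeight + 2 * overlap).
OptixResult denoiseTiledSketch(
    OptixDenoiser denoiser, CUstream stream, const OptixDenoiserParams& params,
    CUdeviceptr state, size_t stateSize, CUdeviceptr scratch, size_t scratchSize,
    const OptixImage2D& inputBeauty,  // full-resolution noisy image
    const OptixImage2D& outputBeauty, // full-resolution denoised result
    unsigned int overlap, unsigned int tileWidth, unsigned int tileHeight)
{
    // Let the SDK helper compute the overlapping tile rectangles on the full-size images.
    std::vector<OptixUtilDenoiserImageTile> tiles;
    OptixResult res = optixUtilDenoiserSplitImage(inputBeauty, outputBeauty,
                                                  overlap, tileWidth, tileHeight, tiles);
    if (res != OPTIX_SUCCESS)
        return res;

    // Denoise tile by tile. The input offsets tell the denoiser where the valid
    // (non-overlap) region starts inside the enlarged input tile. With albedo and
    // normal layers, each extra layer would have to be split with the same tile
    // geometry and passed per tile in the inputLayers array.
    for (const OptixUtilDenoiserImageTile& tile : tiles)
    {
        res = optixDenoiserInvoke(denoiser, stream, &params,
                                  state, stateSize,
                                  &tile.input, 1, // beauty layer only in this sketch
                                  tile.inputOffsetX, tile.inputOffsetY,
                                  &tile.output,
                                  scratch, scratchSize);
        if (res != OPTIX_SUCCESS)
            return res;
    }
    return OPTIX_SUCCESS;
}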

Do you really need to tile the image for denoising? That will only slow down the denoiser.

That tiling mechanism runs on full-size input and output images. If you're trying to save memory by working on individual image tiles only, the tile data would need to look very different.

Sorry about the lack of system info; this is my first time on the NVIDIA forum.
OS: Windows 10 Pro 10.0.18363
GPU: GTX 1060 6 GB, driver 457.09
CUDA toolkit installed: 11.0.2
OptiX: 7.2.0
Host compiler: MSVC. I use the CUDA driver API rather than the runtime API; all OptiX symbols are loaded from the DLL.

Thanks for letting me know that optixUtilDenoiserSplitImage() exists; I had totally missed it. I switched to optixUtilDenoiserInvokeTiled(), which tiles the image and invokes the denoiser. This is the result:


Tile resolution and overlap size are the same as in the image in the previous post.

As for whether tiling is necessary: it is. When denoising images at 8K resolution and above, my 6 GB of VRAM is not enough.
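
Some rough back-of-the-envelope numbers on my side (estimates, not measurements): a single 7680x4320 HALF3 layer is 7680 * 4320 * 6 B ≈ 200 MB, so the three input layers plus the output already take roughly 0.8 GB before the denoiser's own state and scratch buffers, which also grow with the invocation size, come on top.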

Ok, thanks. I'll file a bug report tomorrow.

What's the exact pixel format in the failing case?

Would you be able to provide a minimal but complete reproducer in failing state which demonstrates the issue?
Source code appreciated.
If confidentiality is required, you can send that to OptiX-Help (at) nvidia.com (max. 10 MB, no *.zip extension, rename that to *.zi_, or *.7z should do). Google Drive should do as well. Other file sharing sites will not work! Or send it via a private message with attachment or link to me.

In main.cpp, around line 800, there is the code that invokes the denoiser.
The format is OPTIX_PIXEL_FORMAT_HALF3, and the input consists of three textures: RGB, albedo, and normals.
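
For context, the full-resolution layers are described roughly like this (a sketch with placeholder names, not the exact code from main.cpp):

#include <cuda.h>
#include <optix.h>

// Sketch: HALF3 layer description. d_beauty, d_albedo, d_normal, w and h stand in
// for the device buffers and resolution that the engine sets up elsewhere.
OptixImage2D makeHalf3Layer(CUdeviceptr data, unsigned int width, unsigned int height)
{
    OptixImage2D img = {};
    img.data               = data;
    img.width              = width;
    img.height             = height;
    img.pixelStrideInBytes = 3 * sizeof(unsigned short);      // HALF3 = 3 x 16-bit channels
    img.rowStrideInBytes   = width * img.pixelStrideInBytes;  // tightly packed rows
    img.format             = OPTIX_PIXEL_FORMAT_HALF3;
    return img;
}

// Usage: fill the input layer array in the order RGB (beauty), albedo, normal.
// OptixImage2D inputLayers[3] = { makeHalf3Layer(d_beauty, w, h),
//                                 makeHalf3Layer(d_albedo, w, h),
//                                 makeHalf3Layer(d_normal, w, h) };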

A bit of an update:
I have prepared a .zip with an exe that showcases this problem, as well as another version with debug info, because launching the exe with an outdated OptiX version results in crashes. If that happens, please extract the debug files to /bin and then attach the program to Visual Studio.
https://drive.google.com/file/d/1gZNwhDuxvLYp0hy4g3vdC5K88dPTy1H7/view?usp=sharing
and here the debug version
https://drive.google.com/file/d/1WZD-Xr2QNJPoGnX-FnLKaWLeffadneGc/view?usp=sharing

@droettger we've put together an example and published it on GitHub. It should be a three-click configure & generate & build in Visual Studio as long as you have CMake and OptiX 7.2 installed.

As far as we can tell we haven't messed anything up, and it clearly shows the bug.

We've changed the format from HALF3 to HALF4 for ease of inspection; this doesn't affect the reproducibility of the bug.

The example saves an RGBA_SFLOAT16 DDS file, so we recommend "Open With → RenderDoc" to view it in the texture viewer, since as far as we know neither GIMP nor Photoshop can display DDS files with exotic formats properly.

If it's our fault, please show us how to use the API properly to achieve tiled denoising.

Thanks for the additional reproducer.
I added the information to the bug report I filed in December 2020 and asked OptiX development to take a look again.

According to the developer feedback, your call to optixDenoiserSetup does not include the overlap: the width and height passed to it should be tile_width + 2 * overlap and tile_height + 2 * overlap.
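
In other words, something along these lines. This is a minimal sketch against the 7.2 API, not the reproducer's code; the name of the overlapped scratch size field should be verified in OptixDenoiserSizes in optix_types.h for your SDK version.

#include <cuda.h>
#include <optix.h>

// Minimal sketch: size and set up the denoiser for the enlarged input tile,
// i.e. the output tile size plus the overlap border on both sides.
OptixResult setupTiledDenoiser(OptixDenoiser denoiser, CUstream stream,
                               unsigned int tileWidth, unsigned int tileHeight,
                               unsigned int overlap, // OptixDenoiserSizes::overlapWindowSizeInPixels
                               CUdeviceptr* state, size_t* stateSize,
                               CUdeviceptr* scratch, size_t* scratchSize)
{
    const unsigned int inputTileWidth  = tileWidth  + 2 * overlap;
    const unsigned int inputTileHeight = tileHeight + 2 * overlap;

    OptixDenoiserSizes sizes = {};
    OptixResult res = optixDenoiserComputeMemoryResources(denoiser, inputTileWidth, inputTileHeight, &sizes);
    if (res != OPTIX_SUCCESS)
        return res;

    *stateSize   = sizes.stateSizeInBytes;
    *scratchSize = sizes.withOverlapScratchSizeInBytes; // field name from my 7.2 headers; please verify

    cuMemAlloc(state, *stateSize);
    cuMemAlloc(scratch, *scratchSize);

    // The setup dimensions must match the enlarged input tiles, NOT tileWidth x tileHeight.
    return optixDenoiserSetup(denoiser, stream,
                              inputTileWidth, inputTileHeight,
                              *state, *stateSize,
                              *scratch, *scratchSize);
}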

There will be a tiled denoiser example inside the next OptiX SDK release.

Yes, that fixes the issue.

We’ve followed the documentation to the letter.

It seems that, once again, it's a little underspecified.

You're right, and you're not the only person who has run into this particular issue. Tiled denoising is a new feature, so we are listening to your feedback and improving the documentation for the next release of OptiX.


David.