Render an image bigger than my screen size

When I try to save a denoised image using the OptiX DL denoiser (from a buffer) whose resolution is bigger than my screen, I get the following error:

Details: Function “_rtCommandListExecute” caught exception: Encountered a CUDA error: cudaDriver().CuEventSynchronize( m_event ) returned (719): Launch failed

If I set the resolution to anything within my screen range, it all works fine.
Is there a way to get past the screen size restriction?

There is no such restriction. You can have launch dimensions completely independent of whatever your display mechanism is.

However, you need to make sure that all buffers used during post-processing are sized accordingly, and since the denoiser stage dimensions cannot be changed individually, the command list needs to be recreated whenever the size changes.
Beyond that, there could be timeout issues if the workload is simply too heavy for your GPU.
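To illustrate the resize handling, here is a minimal sketch using the OptiX 6.x C++ wrapper API. The member names (m_context, m_denoiserStage, m_commandList, the buffer members) are placeholders for your own application state, not part of any SDK:

```cpp
// Sketch: resize post-processing buffers and rebuild the command list.
// Assumes OptiX 6.x host API; all m_* members are hypothetical application state.
void Application::resizeLaunch(unsigned int width, unsigned int height)
{
  // All buffers involved in rendering and denoising must match the launch size.
  m_bufferOutput->setSize(width, height);   // raw render result (denoiser input)
  m_bufferDenoised->setSize(width, height); // denoised result (denoiser output)

  // The stage dimensions are baked into the command list when it is finalized,
  // so the command list must be destroyed and recreated on a size change.
  if (m_commandList)
  {
    m_commandList->destroy();
  }
  m_commandList = m_context->createCommandList();
  m_commandList->appendLaunch(0, width, height); // entry point 0: the renderer
  m_commandList->appendPostprocessingStage(m_denoiserStage, width, height);
  m_commandList->finalize();
}
```

The key point is that nothing here depends on the window size; width and height can be any launch dimensions your GPU memory allows.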

Basically, look at all occurrences of the Application class’ m_width and m_height members in the example below.
These follow the current client window width and height, but you could change that to use an additional pair of m_widthLaunch and m_heightLaunch variables and use those for all image buffers and the launch call.
Don’t use them for the OpenGL viewport or the mouse interaction (pinhole camera setup), which are client-window relative.

Then, depending on your GPU configuration, there could still be too much memory in use. In that case, look at the “maxmem” variable and set it to a smaller limit, e.g. try below 100 MB.
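For reference, “maxmem” is an ordinary variable declared on the denoiser stage. A sketch, assuming the OptiX 6.x API; check the programming guide for your exact OptiX version for the variable’s type and unit, since the unit used below (bytes) is an assumption:

```cpp
// Sketch: create the built-in DL denoiser stage and cap its memory usage.
// m_context / m_denoiserStage are hypothetical application members.
m_denoiserStage = m_context->createBuiltinPostProcessingStage("DLDenoiser");
m_denoiserStage->declareVariable("input_buffer")->set(m_bufferOutput);
m_denoiserStage->declareVariable("output_buffer")->set(m_bufferDenoised);

// "maxmem" limits how much memory the denoiser may allocate; with a smaller
// limit it processes the image in tiles internally. Unit assumed to be bytes.
optix::Variable maxmem = m_denoiserStage->declareVariable("maxmem");
maxmem->setFloat(100.0f * 1024.0f * 1024.0f); // e.g. limit to roughly 100 MB
```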

Please always list the following system configuration information when asking about OptiX issues:
OS version, installed GPU(s), VRAM amount, display driver version, OptiX major.minor.micro version, CUDA toolkit version used to generate the input PTX, host compiler version.

Some more precise information with absolute numbers would have been helpful for developers reading this.
You said neither what your screen resolution is nor which launch sizes failed on which system configuration.


So after a long debugging session, I found the problem in my code. It had nothing to do with the screen size. I was doing something inside the GLFW render loop that was probably too heavy at large resolutions. It usually crashed right before or during the denoiser execution.
I have a controller on the C++ side that influences the optix::Camera parameters and sends information to the CUDA side of the code. My guess is that the CPU performed a heavy operation while the GPU was stuck waiting for a signal or something, and it just crashed.

Once I rewrote this segment and moved the controller entirely into CUDA, the problem was gone!
At first I had refrained from doing that because I had an issue with mapping buffers of complex structs into CUDA/OptiX… Once I got past that, the rest was pretty easy!

Sorry for the inaccuracy!