Large Screen Warp and Blend

Over the past few months I have been building a large projector screen with three projectors, image-calibrated and blended with Fly-Elise, as I am using this screen for a flight simulator. The soft blended edges are fine and seamless; however, there is too much brightness in the overlap between each projector. I have spent countless hours tweaking numbers and adjusting the blend via the gain, slope, and gamma options available to me, and I have hit a personal roadblock. When testing darker colors such as deep blues and black (the most extreme case), the blend starts to fall apart and I can clearly see the outline of each projection. My question is: does NVIDIA Warp and Blend truly stitch black together across three projectors?
I've submitted a few image links that show specifically what I am trying to explain, as I am no specialist in the science behind light projection or how NVIDIA graphics handles it. If anyone could shed some light on this for me I would be very grateful.

Hi AHeininger,

NVIDIA provides an API that allows ISVs to build warp-and-blend software. Questions about the implementation of the warp and blend are best asked of the software provider, as there are a number of options in implementing a projector-blending application which can affect the overall result.

In general terms, looking at the images provided, it doesn't look like the projectors are displaying a "true black": even if you sent a black image, you would still see a glow on the screen because the projector is still emitting light. This is a limitation of software blending systems. You can't blend "black", because the projector keeps casting light even when the GPU sends it a black image.
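To make the additive-light point concrete, here is a small arithmetic sketch. The 0.3-nit black level is a made-up number for illustration, not a measurement of any real projector:

```python
# Sketch of why "black" cannot be blended in software.
# A projector's black level is the residual light it emits for an all-black input.

def observed_luminance(signal_luminance, black_level):
    """Light on screen (nits) = residual black level + signal light."""
    return black_level + signal_luminance

# Assume each projector leaks 0.3 nits at full black (hypothetical value).
black_level = 0.3

single = observed_luminance(0.0, black_level)        # one projector covers this area
overlap = 2 * observed_luminance(0.0, black_level)   # two projectors overlap here

# Software blending scales the *signal* toward zero in the overlap,
# but it cannot scale the residual black level, so the overlap stays brighter.
print(single, overlap)  # 0.3 0.6
```

The overlap is twice as bright at black no matter what alpha ramp the GPU applies, which is exactly the outline visible in dark scenes.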

Ryan Park

Hello, @AHeininger,

My reply may come a bit late for you, but I wanted to say that your trouble with the overlap regions between projectors appearing a bit too bright is quite common.

The problem, as you may know by now, is that you need to characterize the photometric curve of your projectors: the actual observed response from black to white is almost certainly not the linear profile assumed by your alpha blends.
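As a toy illustration of that mismatch (this assumes a display gamma of 2.2; it is not the Fly-Elise or NVIDIA implementation, and real projectors rarely follow a single clean exponent):

```python
# Sketch: why linear alpha ramps misbehave on a gamma-2.2 display.
# gamma = 2.2 is an assumed display response, not a measured value.

gamma = 2.2

def light(drive):
    """Approximate screen luminance (0..1) for a pixel drive level (0..1)."""
    return drive ** gamma

a = 0.5  # midpoint of the blend overlap

# Linear ramps: each projector is driven at 0.5, but the combined *light*
# falls well short of a single projector at full drive.
linear_sum = light(a) + light(1.0 - a)

# Gamma-compensated ramps: drive = alpha ** (1/gamma), so the light adds to 1.
corrected_sum = light(a ** (1 / gamma)) + light((1.0 - a) ** (1 / gamma))

print(round(linear_sum, 3), round(corrected_sum, 3))  # 0.435 0.5 per term -> 0.435 1.0
```

With linear ramps the mid-overlap luminance sums to about 0.44 instead of 1.0, which is why the seam region looks wrong until the ramps are shaped to the projector's measured curve.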

It helps to do explicit photometric calibration, separating the R, G, and B curves. I've done that manually with thousands of SLR photos and a variety of densitometer and colorimeter devices that are designed to characterize the color they observe.
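A sketch of the kind of per-channel fit involved, assuming the channel follows a simple power law (the measurements below are fabricated; real data would come from the colorimeter):

```python
# Sketch: estimating a single channel's gamma from drive/luminance readings
# by a least-squares line fit in log-log space (L = L_max * d**gamma).
import math

def fit_gamma(drives, luminances):
    """Return the exponent gamma from a log-log least-squares fit."""
    xs = [math.log(d) for d in drives]
    ys = [math.log(l) for l in luminances]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope  # the slope in log-log space is the channel's gamma

# Fake red-channel measurements generated from an exact gamma of 2.4:
drives = [0.25, 0.5, 0.75, 1.0]
reds = [d ** 2.4 for d in drives]
print(round(fit_gamma(drives, reds), 2))  # 2.4
```

In practice each channel gets its own fit, and the inverse of that curve is baked into the blend ramps.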

I have had good luck with the open-source tool Splash, which can do very competent photometric calibration for projection environments such as yours.

Even with separate color profiles for R, G, and B, the blue sky in your flight simulator scenes happens to be particularly difficult. Any smooth, nearly constant field of color is hard to blend, and blue and white tend to be the worst offenders. I wrestled at length with a deep blue undersea scene for a cylindrical projection installation; it was very frustrating to find that there are, in fact, limits to how completely you can keep blends from standing out in nearly static fields of color.

-Kevin Cain