[BUG] Nvidia hardware does not render Sobel filter post-process material correctly

I recently purchased an Nvidia 3060 Ti and have found it is unable to properly render a Sobel filter post-process material in Unreal. The effect works correctly on the AMD hardware I have used. I’m assuming this is an Nvidia-specific problem, but I don’t have any other Nvidia GPUs to test on (at least none made in the last decade), so I can’t say for certain. Because of this, before I file a bug report with Epic I would appreciate it if anyone else could confirm that this happens on other Nvidia cards.

I have included an Unreal sample project demonstrating the error.


The project is 4.27, but the error also occurs on 5 and 5.1 (and likely on pre-4.27 versions as well, though I haven’t tested this).

This is the correct result from the RX 580 and Vega 11. (I’ve disabled anti-aliasing for these pictures.)

(continued in next post because I apparently am only allowed to include one link and picture per post)

And here is the effect on a 3060ti.

(continued in next post, but apparently I have to wait 2 minutes between replies too!)

The error is particularly egregious in motion.

(cont’d in next reply)

I have tracked down the cause to the (SceneTexture InvSize * outline size) part of the shader.
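For anyone reading along without the project open, here is a rough Python sketch of what that part of the graph computes. The function name and the eight-neighbour pattern are my paraphrase of a typical Sobel setup, not the exact material:

```python
def sobel_sample_uvs(uv, inv_size, outline_size):
    """The eight neighbour UVs a 3x3 Sobel pass samples:
    uv + InvSize * OutlineSize * direction (hypothetical paraphrase)."""
    du = inv_size[0] * outline_size
    dv = inv_size[1] * outline_size
    return [(uv[0] + dx * du, uv[1] + dy * dv)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

# at 1080p, an outline size of 0.5 steps by half a texel in UV space
neighbours = sobel_sample_uvs((0.5, 0.5), (1 / 1920, 1 / 1080), 0.5)
```

The important detail is that the whole step between samples is InvSize * OutlineSize, so an outline size below 1 means sub-texel offsets.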

If the outline size is increased to 1 or higher, the error does not occur.

The error also does not occur if the screen percentage is increased to 150%.

This effect is critical for the game I am developing, and neither of these workarounds is viable: an outline size of 1+ is far too thick for my game’s style, and forcing the render resolution to 150% is obviously not an acceptable fix either.

(that’s all. ps: please remove these obnoxious posting restrictions from my account.)

Hello @Falldog and welcome to the NVIDIA developer forums.

Posting restrictions are the default for new users; they will lift automatically after a certain amount of activity, don’t worry.

Thank you for the detailed issue report!

I tried your demo project and can confirm that it looks rather “jagged” in places. I tinkered a bit with the kernel parameters of the Sobel operator, trying for example Scharr or Prewitt kernels, but the result did not improve much and will likely not be enough for your needs.

I don’t think this issue is a hardware bug as such; I rather think it is an issue somewhere in the translation of the material graph to actual shader code for NVIDIA HW. This might be in UE or in our driver, I cannot say for certain.

I will talk to some engineers internally and check if there might be some known issue or possible workaround. But I can’t promise that there will be a quick solution available for you.

Did you look into other approaches to do cartoon shading, if that is what you want to achieve?

I didn’t mean to imply it’s a hardware fault; I just use “hardware” interchangeably with “GPU”.

Thanks :>

I’m not in any hurry. I’m still at least months away from any sort of release; I just hope it can be resolved before then so I don’t have to tell Nvidia users to run the game at 150% or deal with this:

Yeah. I’ve spent the last few years progressing a style and trying out all kinds of approaches for shading and outlines.

I use an inverted-mesh outline approach in addition to this on certain objects where the Sobel filter has difficulty: fingers, pleated skirts, anything with insufficient depth/normal contrast. It does a great job outlining the silhouette of a mesh, but it does nothing for the surface details, and I haven’t seen any other approach that comes anywhere close to catching those at this level of quality.
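For readers unfamiliar with the technique: an inverted-mesh (inverted hull) outline is just a duplicate of the mesh pushed out along its vertex normals and drawn with flipped faces, so only its silhouette shows around the original. A minimal sketch of the vertex offset in Python (the function name and data layout are mine, not Unreal’s):

```python
def inverted_hull_vertices(vertices, normals, width):
    """Offset each vertex along its normal by the outline width.
    The offset copy is rendered with inverted faces so only the rim
    around the original mesh's silhouette is visible."""
    return [
        (vx + nx * width, vy + ny * width, vz + nz * width)
        for (vx, vy, vz), (nx, ny, nz) in zip(vertices, normals)
    ]

# a single triangle pushed out by 0.1 along +Z normals
tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
nrm = [(0.0, 0.0, 1.0)] * 3
hull = inverted_hull_vertices(tri, nrm, 0.1)
```

This is exactly why it only catches silhouettes: the offset copy is hidden behind the original everywhere except at the rim, so interior surface details never generate an edge.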


Right, that last screenshot looks really bad.

I have the feeling that applying a hand-written shader using the Sobel kernel directly on the color buffer during the fragment shader pass, with proper depth buffer information, would yield better results.
Looking at the “decompiled” material graph, there are a lot of UE-specific comments regarding depth information accuracy that could easily cause this.
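To make the suggestion concrete, here is a minimal CPU-side sketch of running the Sobel kernel over a single-channel buffer in plain Python, standing in for what a hand-written fragment shader would do per pixel. None of this is Unreal API:

```python
# 3x3 Sobel kernels for horizontal / vertical gradients
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def edge_magnitude(buf):
    """Gradient magnitude of a 2D buffer (e.g. a depth buffer) via Sobel."""
    h, w = len(buf), len(buf[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0.0
            for ky in range(3):
                for kx in range(3):
                    v = buf[y + ky - 1][x + kx - 1]
                    gx += v * SOBEL_X[ky][kx]
                    gy += v * SOBEL_Y[ky][kx]
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# a vertical step edge lights up the columns on both sides of the step
img = [[0.0, 0.0, 0.0, 1.0, 1.0] for _ in range(5)]
edges = edge_magnitude(img)
```

A real shader version would fetch the neighbours with integer texel coordinates (e.g. a Load rather than a filtered Sample), which sidesteps UV rounding entirely.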

Well, speculation does not help you, and if your project does not already contain low-level C++ code to add your own shaders, then it might be a somewhat steep ramp to do so.

I cannot promise how fast I get some more eyes on this, but I find the visuals already pretty cool, so I might put some personal interest behind this :-)

Did you know by the way, that we now also have a Discord Server for Developers? Anyone is more than welcome to join!


Yeah, it stands out more in my project because I also run it on the diffuse GBuffer (I almost exclusively use solid colours on everything), in addition to the normal and depth GBuffers used in the example project.

I tossed in a cube with a DitherTemporalAA opacity mask to trigger the filter on basically every pixel and visualize the error better, and this is what happens.

Top is Nvidia, bottom is AMD.

And here are the two overlaid in my project. I blew out the levels of the error overlay to visualize it better. (I don’t know why there are vertical lines in 5.1 vs 4.27.)

I don’t really know what to make of it; the horizontal lines line up perfectly with every broken area, but the vertical ones do not. The only thing that comes to mind is some sort of precision error feeding garbage UVs into the sampler, which makes it detect an outline since it’s sampling either the wrong pixel or one that doesn’t exist. That could explain why it works correctly with an outline size of 1: after all, 1/(xy resolution) multiplied by 0.5 results in some very small numbers. But there’s a good chance I’m wrong, since graphics programming is pretty far outside my skill set.
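To sanity-check that hunch, here’s a rough Python sketch, emulating float32 with struct. The uv + InvSize * OutlineSize expression and the 1920-wide resolution are assumptions on my part, but it shows that with an outline size of 0.5 every sample lands exactly on a texel boundary, where rounding decides which texel point sampling fetches, while an outline size of 1 lands on texel centers and is robust:

```python
import math
import struct

def f32(x):
    """Round a Python float to float32, emulating GPU single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

WIDTH = 1920.0
INV_SIZE = f32(1.0 / WIDTH)  # stand-in for SceneTexture InvSize

def fetched_texel(i, outline_size):
    """Which texel point sampling fetches one step to the right of
    texel i's center: floor((uv + InvSize * OutlineSize) * WIDTH)."""
    u = f32(f32((i + 0.5) * INV_SIZE) + f32(INV_SIZE * outline_size))
    return math.floor(f32(u * WIDTH))

# outline size 1.0: the sample lands on the *center* of texel i+1 -> robust
robust = all(fetched_texel(i, 1.0) == i + 1 for i in range(1919))

# outline size 0.5: the sample lands exactly on the *boundary* between
# texels i and i+1, so float32 rounding decides which one is fetched
boundary_picks = {fetched_texel(i, 0.5) - i for i in range(1919)}
print(robust, boundary_picks)
```

If AMD and Nvidia round those boundary cases differently (or the compilers reorder the arithmetic differently), that alone would explain hardware-dependent results without either being “wrong”.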

I should probably also note that I was wrong earlier: setting the resolution to 150% doesn’t actually fix the problem, it just mitigates it enough to be much less visible. Still better than nothing, I guess.

Yeah, this is getting way beyond me. I did modify the source code and shader files to achieve my game’s style, but those were very basic changes, not really all that different from the shader logic you learn just from using the material graph.

I could rewrite this as a .usf file and insert it into the graphics pipeline along with the other post-process .usf files, but as I understand it those files are just instructions telling the engine how to generate the final shaders, so that would probably still result in the same problem.

Either way, I appreciate the effort :>, and as I said, it’s not like I’m in any massive hurry. Fortunately I just happened to change GPUs and become aware of the problem now, rather than needing to track it down at release.