I’m currently trying to write a script that automatically applies an image to a plane in an Omniverse scene. Using an extension, I was able to automatically create an MDL material and set its diffuse texture to the desired image, but the texture has an alpha channel, which was lost when the image was used as the diffuse texture. I’ve tried doing some image processing to create a separate alpha texture and adding that to the material as well, but on the Omniverse side the alpha channel seems to behave as a binary on/off rather than allowing smooth transitions from visible to invisible. Any advice on the best way to implement this would be much appreciated!
Hi @jlw387. I’m going to assume you’re using OmniPBR, but OmniSurface should work similarly. You shouldn’t need to do any image processing. You can tell the Opacity attribute to use the alpha channel of an RGBA image. Here it is working for me:
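For reference, the resulting material looks roughly like this when authored as USDA. This is a hedged sketch, not the exact output of any tool: the prim names and the `./decal.png` path are placeholders, and I’m assuming the standard OmniPBR inputs (`diffuse_texture`, `enable_opacity`, `opacity_texture`):

```
#usda 1.0

def Material "ImageMaterial"
{
    token outputs:mdl:surface.connect = </ImageMaterial/Shader.outputs:out>

    def Shader "Shader"
    {
        uniform token info:implementationSource = "sourceAsset"
        uniform asset info:mdl:sourceAsset = @OmniPBR.mdl@
        uniform token info:mdl:sourceAsset:subIdentifier = "OmniPBR"
        asset inputs:diffuse_texture = @./decal.png@
        # Turn opacity on and point it at the same RGBA file;
        # OmniPBR can then read the opacity from the alpha channel.
        bool inputs:enable_opacity = 1
        asset inputs:opacity_texture = @./decal.png@
        token outputs:out
    }
}
```

The key point is that the same RGBA asset feeds both the diffuse and opacity lookups, so no separate alpha texture is needed.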
For the Real-Time renderer, you’ll need to enable “Fractional Cutout Opacity”. For the Interactive (Path Tracing) renderer, it should just work.
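Besides toggling it in the Render Settings UI, this can also be set in a Kit app’s config. A sketch, assuming the setting path is `/rtx/raytracing/fractionalCutoutOpacity` (worth double-checking against your Omniverse version):

```
[settings]
rtx.raytracing.fractionalCutoutOpacity = true
```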
Thanks @mati-nvidia, this works pretty well! I do notice an off-color fringe along the borders in certain areas (see examples):
Note that the texture itself cuts off the line on the right side; the problem is that there are two strips of white not present in the original texture.
Adjusting the opacity threshold seems to help, but it doesn’t completely remove the white border until I set it to 1.0, which makes the opacity binary again.
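For anyone scripting this rather than using the UI: the threshold being adjusted here should correspond to an input on the OmniPBR shader. A hedged fragment, assuming the input is named `opacity_threshold` (a float where 0 keeps fractional opacity and 1 makes the cutout binary):

```
float inputs:opacity_threshold = 0.1
```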