Adding 2D image as texture to a height-field

I’ve been generating a mesh terrain from a 2D height map by converting it to vertices and triangles, and then adding them to the stage as follows:

import numpy as np
from pxr import UsdPhysics, PhysxSchema
from omni.isaac.core.prims import XFormPrim

def add_terrain_to_stage(stage, vertices, triangles, position=None, orientation=None):
    num_faces = triangles.shape[0]
    terrain_mesh = stage.DefinePrim("/World/terrain", "Mesh")
    terrain_mesh.GetAttribute("points").Set(vertices)
    terrain_mesh.GetAttribute("faceVertexIndices").Set(triangles.flatten())
    terrain_mesh.GetAttribute("faceVertexCounts").Set(np.asarray([3]*num_faces))

    terrain = XFormPrim(prim_path="/World/terrain",
                        name="terrain",
                        position=position,
                        orientation=orientation)

    UsdPhysics.CollisionAPI.Apply(terrain.prim)
    physx_collision_api = PhysxSchema.PhysxCollisionAPI.Apply(terrain.prim)
    physx_collision_api.GetContactOffsetAttr().Set(0.02)
    physx_collision_api.GetRestOffsetAttr().Set(0.00)
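For context, the height-map-to-mesh conversion feeding this function can be sketched roughly as follows. This is a minimal numpy illustration, not the original code; `horizontal_scale` and `vertical_scale` are hypothetical parameters:

```python
import numpy as np

def height_map_to_mesh(height_map, horizontal_scale=1.0, vertical_scale=1.0):
    """Convert an HxW height map to vertices (H*W x 3) and triangles (2(H-1)(W-1) x 3)."""
    H, W = height_map.shape
    xs, ys = np.meshgrid(np.arange(W), np.arange(H))
    vertices = np.stack([xs.ravel() * horizontal_scale,
                         ys.ravel() * horizontal_scale,
                         height_map.ravel() * vertical_scale], axis=-1)

    # Two triangles per grid cell, indexing into the flattened, row-major vertex array.
    triangles = []
    for i in range(H - 1):
        for j in range(W - 1):
            v0 = i * W + j          # top-left
            v1 = v0 + 1             # top-right
            v2 = v0 + W             # bottom-left
            v3 = v2 + 1             # bottom-right
            triangles.append([v0, v2, v1])
            triangles.append([v1, v2, v3])
    return vertices, np.asarray(triangles)
```

The resulting arrays can be passed straight into `add_terrain_to_stage` above.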

Now, I want to use a 2D RGB image of the same size as the height map, as texture for the mesh. How can I do that?

Hi @ahallak - To apply a texture to a mesh in USD, you need to create a material, bind it to the mesh, and connect a texture to the diffuse color input of the material’s shader. Here’s how you can do it:

from pxr import UsdShade, Sdf, UsdGeom

# Create a material.
material_path = Sdf.Path("/World/terrain_material")
material = UsdShade.Material.Define(stage, material_path)

# Create a shader for the material.
shader = UsdShade.Shader.Define(stage, material_path.AppendChild('PBRShader'))
shader.CreateIdAttr('UsdPreviewSurface')

# Create an input for the diffuse color.
diffuse_color_input = shader.CreateInput('diffuseColor', Sdf.ValueTypeNames.Color3f)

# Create a texture for the diffuse color.
texture_path = material_path.AppendChild('diffuseTexture')
texture = UsdShade.Shader.Define(stage, texture_path)
texture.CreateIdAttr('UsdUVTexture')

# Set the file path of the texture.
texture.CreateInput('file', Sdf.ValueTypeNames.Asset).Set('path/to/your/texture.png')

# Connect the texture's rgb output to the diffuse color input.
texture.CreateOutput('rgb', Sdf.ValueTypeNames.Float3)
diffuse_color_input.ConnectToSource(texture.ConnectableAPI(), 'rgb')

# Expose the shader's surface output on the material, then bind it to the mesh.
material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), 'surface')
UsdShade.MaterialBindingAPI(terrain_mesh).Bind(material)

This code creates a material with a PBR shader and a texture for the diffuse color. The texture file path is set to the ‘file’ input of the texture shader. The ‘rgb’ output of the texture shader is then connected to the ‘diffuseColor’ input of the PBR shader. Finally, the material is bound to the mesh using the UsdShade.MaterialBindingAPI.

Please replace ‘path/to/your/texture.png’ with the actual path to your texture file.
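One detail worth noting: UsdUVTexture samples the image according to the mesh’s texture coordinates, so the height-field mesh also needs a `primvars:st` primvar, with the texture’s `st` input fed by a `UsdPrimvarReader_float2` shader whose `varname` is `"st"`. A minimal sketch of computing planar per-vertex UVs for a regular H×W grid (numpy only; the USD authoring calls are indicated in comments, since stage access is assumed):

```python
import numpy as np

def planar_uvs(H, W):
    """Per-vertex UVs in [0, 1] for a height-field mesh laid out row-major on an HxW grid."""
    us, vs = np.meshgrid(np.linspace(0.0, 1.0, W), np.linspace(0.0, 1.0, H))
    return np.stack([us.ravel(), vs.ravel()], axis=-1)  # shape (H*W, 2)

# The array above would then be authored on the mesh, e.g.:
#   pv_api = UsdGeom.PrimvarsAPI(terrain_mesh)
#   st = pv_api.CreatePrimvar("st", Sdf.ValueTypeNames.TexCoord2fArray,
#                             UsdGeom.Tokens.vertex)
#   st.Set(planar_uvs(H, W))
# and a UsdPrimvarReader_float2 shader (varname = "st") connected to the
# texture's 'st' input so UsdUVTexture knows where on the image to sample.
```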

Thank you very much for your reply.
I pasted the code you sent; while it doesn’t crash, it also doesn’t seem to work.
I’ve tried replacing the PNG with a JPEG, and also tried a simple solid-blue image from online instead of my desired image. None of them shows up.
I do see in the Isaac Sim viewport that the material is created at the right path and appears to be bound.

Any idea what I’m doing wrong?
Screenshot from 2023-07-30 13-37-04

The plot thickens…
Even though your code didn’t pull through, it led me to this online source Create a UsdPreviewSurface Material — dev-guide latest documentation

The last part was very similar to what you proposed, so I added everything without fully understanding it and got the following code:

# Add texture
    from pxr import UsdShade, Sdf, UsdGeom

    # Create a material.
    material_path = Sdf.Path("/World/terrain_material")
    material = UsdShade.Material.Define(stage, material_path)

    # Create a shader for the material.
    shader = UsdShade.Shader.Define(stage, material_path.AppendChild('PBRShader'))
    shader.CreateIdAttr('UsdPreviewSurface')

    # Create an input for the diffuse color.
    diffuse_color_input = shader.CreateInput('diffuseColor', Sdf.ValueTypeNames.Color3f)

    shader.CreateInput("roughness", Sdf.ValueTypeNames.Float).Set(0.5)
    shader.CreateInput("metallic", Sdf.ValueTypeNames.Float).Set(0.0)
    # Create a texture for the diffuse color.
    # texture_path = material_path.AppendChild('diffuseTexture')
    texture_path = material_path.AppendChild('DiffuseColorTx')

    texture = UsdShade.Shader.Define(stage, texture_path)
    texture.CreateIdAttr('UsdUVTexture')

    # Set the file path of the texture.
    texture.CreateInput('file', Sdf.ValueTypeNames.Asset).Set('/home/ahallak/Downloads/rgb_map5.png')
    # texture.CreateInput('file', Sdf.ValueTypeNames.Asset).Set('/home/ahallak/Downloads/Solid_blue.png')

    # Connect the texture's rgb output to the diffuse color input.
    diffuse_color_input.ConnectToSource(texture.ConnectableAPI(), 'rgb')
    material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), "surface")
    # Bind the material to the mesh.
    UsdShade.MaterialBindingAPI(terrain_mesh).Bind(material)

Now, this code worked on an online PNG image I found, but I couldn’t get it to work with the HxWx3 numpy array I have:

  • When I try to save it to PNG with matplotlib.image.imsave, nothing changes in the rendering.
  • When I save it using cv2.imwrite, it only appears in grayscale, both for 0–1 float values saved as PNG and for 0–255 numpy.uint8 values saved as JPEG. So I couldn’t get color, and I have no idea why.

Do you understand what was missing from your original snippet? Maybe I’m merging the sources wrong?
Any idea how to save an HxWx3 numpy array as an image in a way that is compatible with the texture?
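One thing to be aware of when saving the array: cv2.imwrite expects BGR channel order and 8-bit 0–255 values, so 0–1 floats need to be scaled to uint8 first, and the channels swapped if the array is RGB. A sketch using Pillow instead, which takes RGB order directly (assuming Pillow is installed; `rgb_map` is a stand-in for your array):

```python
import numpy as np
from PIL import Image

# Hypothetical HxWx3 array with float values in [0, 1].
rgb_map = np.random.rand(64, 64, 3)

# Scale to 8-bit; Pillow's "RGB" mode takes channels in RGB order as-is.
rgb_u8 = (np.clip(rgb_map, 0.0, 1.0) * 255).astype(np.uint8)
Image.fromarray(rgb_u8, mode="RGB").save("rgb_map.png")
```

Since PNG is lossless, reloading the file should give back the same uint8 values.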

OK, new hint: by playing with additional PNGs, I’ve realized that although some don’t work, those that do result in a monochromatic rendering - either all shades of blue, or shades of red, or grayscale. I couldn’t find any clue in the snippet as to why this might happen; maybe you have an idea.

result in a monochromatic rendering - either all are shades of blue, or shades of red, or shades of grayscale

Yes, I am also experiencing monochromatic rendering. Phrased another way, it seems to be taking the RGB value of a single pixel within the image and applying it to the entire prim as the diffuse color.

The expected result was to see the full image displayed on the prim, i.e., a different RGB value at each point of the surface rather than a single color.

Also, I think this is how you would achieve the same result using the Isaac Sim UI

Create > Material > USD Preview Surface Texture

However, even when attempting this method through the Isaac Sim UI, I get the same monochromatic rendering, where the prim is always a single color despite the UVTexture input:file asset containing multiple colors.

I think I have found the reason for the issues described above:

  • why this technique appears to work on some prims and not others
  • why it appears to render a single color (monochromatic) instead of the entire texture

I think it has to do with the Geometry > Face subdivision and interpolation settings.

Video Explanation

The image renders on the cube mesh.
The image does NOT render on the custom prim with holes, because it is very complex and is breaking the interpolation.

However, I am not sure how to solve it.

Possible solutions

  1. Override the faceVertexIndices, faceVertexCounts, interpolation, etc.?
  2. Create a cube without collision that renders as a sibling to this complicated prim
    Basically, the complex prim has the physics, but the rendering comes from the simple cube.

Option 2 seems easier; however, when I try this, the cube mesh isn’t constrained to the area of the complex prim, so it doesn’t look correct. There must be a way to fix the rendering of the texture on the complex prim, as suggested in option 1.
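For what it’s worth, option 2’s sizing problem could be approached by computing the complex prim’s axis-aligned bounding box and using it to drive the render cube’s translate and scale. A rough numpy sketch of the bounds computation (hypothetical; `vertices` stands in for the complex mesh’s point array):

```python
import numpy as np

def bounds_center_and_size(vertices):
    """Axis-aligned bounding box of an (N, 3) point array -> (center, size)."""
    lo = vertices.min(axis=0)
    hi = vertices.max(axis=0)
    return (lo + hi) / 2.0, hi - lo

# The center/size pair could then be applied to the sibling cube's xform ops
# so that it covers exactly the complex prim's footprint.
```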

After some more testing, I also see that the image is being projected on the wrong axis.
From this odd camera angle you can see the image INSIDE each of the holes.

I am also not sure how to change this. Perhaps through the texture properties?