Point Instancer, material color and primvars

I have written a Python script that creates a PointInstancer prim with thousands of copies of a prototype object, which has a material bound to it. I tried to use a primvar to modify the diffuse color on the instancer prim so that each point instance gets a different color, but so far I have been unsuccessful: all the instances keep the color defined in the shader prim. Can someone tell me exactly what I'm doing wrong? Here is a short test version of the code:

import omni.usd
from pxr import Usd, UsdGeom, UsdShade, Sdf, Gf

# Get the current stage from the Kit USD context
stage = omni.usd.get_context().get_stage()

# Prototype geometry that the instancer will copy
sphere = UsdGeom.Sphere.Define(stage, "/World/PrototypeSphere")

# Simple UsdPreviewSurface material with a red diffuse color
material = UsdShade.Material.Define(stage, "/World/Materials/RedMaterial")
shader = UsdShade.Shader.Define(stage, "/World/Materials/RedMaterial/Shader")
shader.CreateIdAttr("UsdPreviewSurface")
shader.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).Set((1.0, 0.0, 0.0))
material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), "surface")
UsdShade.MaterialBindingAPI.Apply(sphere.GetPrim()).Bind(material)

# Point instancer with four copies of the prototype
instancer = UsdGeom.PointInstancer.Define(stage, "/World/Instancer")
instancer.CreatePrototypesRel().AddTarget(sphere.GetPath())

instancer.CreateProtoIndicesAttr().Set([0, 0, 0, 0])

positions = [(0, 0, 0), (2, 0, 0), (4, 0, 0), (6, 0, 0)]
colors = [(1.0, 1.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]

instancer.CreateIdsAttr().Set(list(range(len(positions))))
instancer.CreatePositionsAttr().Set(positions)

# Attempt to override the shader's diffuseColor per instance via a primvar
primvar = UsdGeom.PrimvarsAPI(instancer).CreatePrimvar("input:diffuseColor", Sdf.ValueTypeNames.Color3fArray)
primvar.Set(colors)

Hi, I believe this came up a few months ago. I think there's a bug in the code that prevents the color change. Let me find out and get back to you.

Thank you for looking into this! Just to clarify: I can change the color using the "displayColor" attribute of the instancer, passing the array via a primvar. What I was looking for is to change the color of the material bound to the prototype by overriding a shader input per instanced mesh. In the end, my goal is to change the emissive intensity of each instanced sphere.
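
For reference, the displayColor approach mentioned above (which does recolor instances, per this thread, in Kit 106) can be authored roughly like this. This is only a minimal sketch that reuses the instancer and colors variables from the first post; the vertex interpolation is an assumption about how the renderer maps the array to instances:

# Sketch: per-instance displayColor primvar on the instancer.
# Assumes `instancer` and `colors` from the snippet in the first post;
# the "vertex" interpolation here is an assumption, not confirmed in the thread.
from pxr import UsdGeom, Sdf

dc = UsdGeom.PrimvarsAPI(instancer).CreatePrimvar(
    "displayColor", Sdf.ValueTypeNames.Color3fArray, UsdGeom.Tokens.vertex)
dc.Set(colors)  # one color per point instance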

Well, that is different. You should be able to change the shader properties like any normal shader in the Properties panel.

That changes all the point instances identically, but I'm trying to give each instance (100K of them) its own color (either diffuse or emissive), so I set a primvar with 100K color entries and pass it to the instancer: one shader and 100K diffuse values, one for each instance. Does that make sense?
I should also mention that when I said "I can change the color by using displayColor..." that is true with Kit 106. It doesn't work with Kit 105.
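
For what it's worth, the usual USD way to feed a per-instance primvar into a single UsdPreviewSurface is to read it through a UsdPrimvarReader_float3 node rather than naming the primvar after the shader input. Whether Kit 105/106 honors this per instance on a PointInstancer isn't confirmed in this thread, but the authoring side would look roughly like this (a sketch that assumes the stage, shader, instancer and colors from the first post; the primvar name "instanceColor" is just an example):

# Sketch: route a per-instance primvar into the shader via a UsdPrimvarReader_float3.
# "instanceColor" is an illustrative primvar name; renderer support per instance
# on a PointInstancer is not confirmed in this thread.
from pxr import UsdGeom, UsdShade, Sdf

reader = UsdShade.Shader.Define(stage, "/World/Materials/RedMaterial/ColorReader")
reader.CreateIdAttr("UsdPrimvarReader_float3")
reader.CreateInput("varname", Sdf.ValueTypeNames.Token).Set("instanceColor")
reader.CreateInput("fallback", Sdf.ValueTypeNames.Float3).Set((1.0, 0.0, 0.0))
reader_out = reader.CreateOutput("result", Sdf.ValueTypeNames.Float3)

# Drive the surface shader's diffuseColor from the primvar reader
shader.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).ConnectToSource(reader_out)

# Author the primvar on the instancer, one entry per instance
pv = UsdGeom.PrimvarsAPI(instancer).CreatePrimvar(
    "instanceColor", Sdf.ValueTypeNames.Color3fArray, UsdGeom.Tokens.vertex)
pv.Set(colors)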

Ok. And can you explain what it is you are trying to actually do with this scene? You need 100,000 copies of an object? Why, may I ask?

You are saying that this workflow DOES work with kit 106, but not with 105? Well that is good news for 106, I guess. Stick with that.

Well, in this particular case I was trying to create a star field using glowing spheres. It works fine the way I currently implement it: 5 prototype spheres, each with a material of a different emissive intensity, and 5 separate sets of the stars.
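
The multi-prototype workaround described above could be authored along these lines. This is only a sketch, not the poster's actual script: the paths and intensity values are placeholders, and UsdPreviewSurface's emissiveColor is used here as a stand-in for an emissive intensity control:

# Sketch of the 5-prototype workaround: one material per emissive intensity level.
# Paths and intensity values are illustrative only; assumes `stage` from the first post.
from pxr import UsdGeom, UsdShade, Sdf

prototypes = []
for i, intensity in enumerate([1.0, 2.0, 4.0, 8.0, 16.0]):
    proto = UsdGeom.Sphere.Define(stage, f"/World/Protos/Star_{i}")
    mat = UsdShade.Material.Define(stage, f"/World/Materials/Star_{i}")
    shd = UsdShade.Shader.Define(stage, f"/World/Materials/Star_{i}/Shader")
    shd.CreateIdAttr("UsdPreviewSurface")
    shd.CreateInput("emissiveColor", Sdf.ValueTypeNames.Color3f).Set(
        (intensity, intensity, intensity))
    mat.CreateSurfaceOutput().ConnectToSource(shd.ConnectableAPI(), "surface")
    UsdShade.MaterialBindingAPI.Apply(proto.GetPrim()).Bind(mat)
    prototypes.append(proto)

instancer = UsdGeom.PointInstancer.Define(stage, "/World/StarField")
for proto in prototypes:
    instancer.CreatePrototypesRel().AddTarget(proto.GetPath())
# protoIndices then pick which of the 5 intensity levels each star uses,
# e.g. [0, 3, 1, ...] with one entry per star position.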

What I am trying to do is create a single material/shader and modify only selected inputs, for example the "input:diffuseColor" I tested, and connect it to each point instance ("particle"?), so that each particle gets its own diffuseColor via the primvars. The displayColor that works with 106 doesn't help me because it only affects the vertex color.

Maybe what I'm trying to do is impossible in Omniverse. I was able to do it in Maya for another project, where thousands of spheres were placed in space and each sphere's color, intensity, and location had a specific meaning. Using VR we immersed ourselves in this "cloud" and had an immediate sense of how things were partitioned; we could then select a sphere and get detailed numerical information.

Yes, it sounds very creative, but this is really not our workflow at all. We are focused on physically accurate Digital Twins and Simulations; we are an industrial platform for Enterprise Physical AI and Robotics. This sounds more like VFX work. It may be possible, but there are much easier ways to do this. Why not just download a really high-quality star field as an HDRI? It's very easy to do, with amazing results.
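
If that HDRI route is of interest, the usual way to bring one in through USD is a dome light with a texture file; a minimal sketch, where the texture path is a placeholder:

# Sketch: star-field HDRI on a dome light. The texture path is a placeholder.
# Assumes `stage` from the first post.
from pxr import UsdLux

dome = UsdLux.DomeLight.Define(stage, "/World/StarfieldDome")
dome.CreateTextureFileAttr().Set("starfield.hdr")
dome.CreateIntensityAttr(1.0)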


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.