Hi! I am trying to generate synthetic data of a pipe that should have different diameters at both ends using Replicator. The problem I am having is that I can only control the scale of the whole pipe prim. My dataset should contain images of a proper circular pipe with uniform scaling at both ends, as well as an irregular pipe with irregular scaling at one end and uniform scaling at the other. How can I control the scale of the two ends independently?
Hi there,
For such changes, scaling will not work since it applies the change to the whole mesh; you would need a modeling operation such as an extrude instead. You could create several mesh variants with these changes in a DCC tool and then use them in a randomized fashion in Replicator.
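As a rough sketch (the USD paths here are placeholders, and the exact Replicator API can differ slightly between versions), randomly instancing one of several pre-built pipe variants could look like this:

```python
import omni.replicator.core as rep

# Placeholder paths to pipe variants exported from a DCC tool (assumption)
PIPE_VARIANTS = [
    "omniverse://localhost/Projects/pipes/pipe_uniform.usd",
    "omniverse://localhost/Projects/pipes/pipe_tapered.usd",
    "omniverse://localhost/Projects/pipes/pipe_irregular.usd",
]

with rep.trigger.on_frame():
    # Spawn one randomly chosen pipe variant per frame
    pipes = rep.randomizer.instantiate(PIPE_VARIANTS, size=1)
    with pipes:
        # Randomize the pose so each capture looks different
        rep.modify.pose(
            position=rep.distribution.uniform((-1.0, 0.0, -1.0), (1.0, 0.0, 1.0)),
            rotation=rep.distribution.uniform((0.0, -180.0, 0.0), (0.0, 180.0, 0.0)),
        )
```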
Best,
Andrei
Ok, there is another issue I am facing. I made a pipe in Blender and exported it in FBX format. After importing the pipe model into Isaac Sim, I tried to scan its inner surface using a lidar, but the lidar only returns a linear depth value equal to the minRange I set. It is as if Isaac Sim were treating my pipe as a solid pipe rather than a hollow one.
Are you using the RTX lidar (which uses the rendering pipeline to compute the ray hits) or the PhysX lidar (which casts rays in the physics engine against the object's collision geometry)?
If the latter, make sure your collisions are set up so the pipe stays hollow. In general the RTX lidar is recommended for its performance and precision, since the visible mesh is higher resolution than simplified collider primitives.
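For reference, a minimal sketch of a collider setup that keeps the pipe hollow (the prim path is a placeholder; without a rigid body the full triangle mesh can be used as the collision shape):

```python
from pxr import UsdPhysics
import omni.usd

stage = omni.usd.get_context().get_stage()
pipe_mesh = stage.GetPrimAtPath("/World/Pipe/PipeMesh")  # placeholder path

# Apply a collider and keep the triangle mesh as the collision shape
# ("none" = no approximation), so the interior of the pipe is not filled in
# by a convex hull. This works for static (non rigid-body) colliders.
UsdPhysics.CollisionAPI.Apply(pipe_mesh)
mesh_collision = UsdPhysics.MeshCollisionAPI.Apply(pipe_mesh)
mesh_collision.CreateApproximationAttr().Set("none")
```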
I am using the PhysX rotating lidar. I observed that the issue starts when I add a Rigid Body component to my pipe; otherwise it works fine with only a collider. How can I make it work with the rigid body enabled?
Will the RTX lidar work with a rigid body enabled?
The PhysX lidar uses the colliders and ignores the visuals (see Rigid-Body Simulation — Omniverse Extensions documentation).
The RTX lidar uses the visible meshes and ignores the colliders.
Yeah, as long as I use only a collider and manually place the lidar inside the pipe, I get proper data. But when I add a rigid body to the pipe to enable physics, the lidar stops working, i.e. it returns the minimum possible range value. If the RTX lidar only uses the visuals, then maybe that should work even with a rigid body.
Yes, after adding the rigid body you should visualize the colliders and make sure the pipe stays hollow, otherwise the rays will not hit the inner pipe surface. Using the RTX lidar avoids this scenario altogether.
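A rough sketch of what that could look like (paths are placeholders; a dynamic rigid body in PhysX cannot use a raw triangle-mesh collider, which is why the default convex hull "fills in" the pipe, so an approximation such as convex decomposition is needed to keep the interior hollow):

```python
from pxr import UsdPhysics
import omni.usd

stage = omni.usd.get_context().get_stage()
pipe = stage.GetPrimAtPath("/World/Pipe")                # placeholder rigid-body root
pipe_mesh = stage.GetPrimAtPath("/World/Pipe/PipeMesh")  # placeholder mesh prim

# Make the pipe a dynamic rigid body
UsdPhysics.RigidBodyAPI.Apply(pipe)

# Use a collision approximation that preserves the hollow interior instead of
# the single convex hull that dynamic bodies fall back to by default.
UsdPhysics.CollisionAPI.Apply(pipe_mesh)
mesh_collision = UsdPhysics.MeshCollisionAPI.Apply(pipe_mesh)
mesh_collision.CreateApproximationAttr().Set("convexDecomposition")
# If your PhysX/Isaac Sim version supports it, an SDF mesh collider also keeps
# the interior hollow:
# mesh_collision.CreateApproximationAttr().Set("sdf")
```

You can then verify the result in the viewport via the eye icon > Show By Type > Physics > Colliders.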
Using the RTX lidar is a little more complicated than the PhysX lidar. I have tried adding an RTX lidar, but I am not able to visualize or access its data through the RTX lidar point cloud node. Can you guide me through setting that up?
Here is the documentation for the RTX lidar with various usage examples:
Let us know if you run into any issues.
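In the meantime, here is a rough sketch of the usual pattern from those examples (the sensor config name, prim path, and the writer/annotator names are assumptions and can differ between Isaac Sim versions):

```python
import omni.kit.commands
import omni.replicator.core as rep

# Create an RTX lidar sensor prim (config name is an assumption; use one of the
# lidar configs shipped with your Isaac Sim version)
_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/World/RtxLidar",
    parent=None,
    config="Example_Rotary",
    translation=(0.0, 0.0, 0.5),
)

# Hook the sensor into the RTX pipeline through a render product
render_product = rep.create.render_product(sensor.GetPath(), resolution=(1, 1))

# Debug-draw writer to visualize the point cloud in the viewport
writer = rep.writers.get("RtxLidarDebugDrawPointCloud")
writer.attach([render_product])

# Annotator to read the lidar buffer (points, distances, ...) from Python
annotator = rep.AnnotatorRegistry.get_annotator("RtxSensorCpuIsaacCreateRTXLidarScanBuffer")
annotator.attach([render_product])
# After stepping the simulation/app for a few frames:
# data = annotator.get_data()
```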
Yes, I tried following this, but I ran into a problem while trying to draw the point cloud and, more importantly, get linear depth data. While trying to build this graph myself, I found that some of the nodes have different outputs, and some nodes, like Isaac RenderVar to CPU Pointer, are not available in OmniGraph.
I followed the image above.
My question basically boils down to this: after using the Script Editor to generate an RTX lidar rendering pipeline, how can I use the data I am getting from the SDG preprocessing graph in my own action graphs?
I believe you can edit the post-render graph after it is built. You could do this with Python scripts, adding and hooking up your own nodes. You could also make your own writer that uses the nodes you want and attaches to the point cloud node as input.
If you can build the graph you want by hand, you can use the Commands window to see which commands were used to create that graph and copy them into your own script. Run that script after the writer is attached to your render product, so that the dynamically built graph exists.
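As a very rough sketch of the custom-writer route (the annotator name and the keys of the data dictionary are assumptions and depend on your Isaac Sim version, so check them against the RTX lidar annotator you actually use):

```python
import os
import numpy as np
import omni.replicator.core as rep
from omni.replicator.core import AnnotatorRegistry, Writer

class PipeLidarWriter(Writer):
    """Custom writer that dumps the RTX lidar buffer to .npy files."""

    def __init__(self, output_dir="/tmp/pipe_lidar"):
        os.makedirs(output_dir, exist_ok=True)
        self._output_dir = output_dir
        self._frame_id = 0
        # Request the RTX lidar point cloud annotator (name is an assumption)
        self.annotators = [
            AnnotatorRegistry.get_annotator("RtxSensorCpuIsaacCreateRTXLidarScanBuffer")
        ]

    def write(self, data):
        # 'data' is keyed by annotator name; the inner field names (e.g. "data",
        # "distance") depend on the annotator and version you use
        buf = data["RtxSensorCpuIsaacCreateRTXLidarScanBuffer"]
        np.save(os.path.join(self._output_dir, f"points_{self._frame_id:05d}.npy"), buf["data"])
        self._frame_id += 1

# Register the writer and attach it to the lidar's render product
rep.WriterRegistry.register(PipeLidarWriter)
writer = rep.WriterRegistry.get("PipeLidarWriter")
writer.initialize(output_dir="/tmp/pipe_lidar")
writer.attach([render_product])  # render_product from the RTX lidar setup above
```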