Hey All,
I’ve been working on getting the RTX Lidar to work in my scene.
I followed Adam's video on the RTX lidar, specifically around the 20-minute mark.
I'm using the code below, which differs from the rtx_lidar.py shown in the video.
# Create the RTX lidar sensor
import omni
from pxr import Gf

_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/sensor",
    parent=None,
    config="Example_Rotary",
    translation=(0, 0, 1.0),
    orientation=Gf.Quatd(0.5, 0.5, -0.5, -0.5),  # Gf.Quatd is w, i, j, k
)
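For reference, this is the kind of quick check I can run in the script editor right after the command, to see whether the prim actually got made (just a minimal sketch, assuming the sensor should land at the default /sensor path on the currently open stage):

import omni.usd

# Look up the prim the command should have created on the open stage
stage = omni.usd.get_context().get_stage()
prim = stage.GetPrimAtPath("/sensor")
print("prim valid:", prim.IsValid())
print("command returned:", sensor)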
# Create a hydra texture to store the render product
from omni.isaac.core.utils.render_product import create_hydra_texture

_, render_product_path = create_hydra_texture([1, 1], sensor.GetPath().pathString)
# Create the debug draw pipeline in the post-process graph
import omni.replicator.core as rep

writer = rep.writers.get("RtxLidar" + "DebugDrawPointCloud")
writer.attach([render_product_path])
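From what I understand, the debug point cloud only shows up once frames are actually being rendered, so after attaching the writer I start the timeline (a small sketch; I'm assuming the standard timeline interface is the right way to do this from the script editor):

import omni.timeline

# Start the timeline so the lidar renders frames and the debug points can draw
omni.timeline.get_timeline_interface().play()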
When I run the initial part of the code in the script editor to create the sensor, it runs with no errors but doesn't create the sensor head in the stage. I can't understand why the sensor head isn't being created. Also, when I go to the Action Graph it doesn't give me the SDG option, so it isn't loaded in the Action Graph setup either.
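I'm not sure if it's related, but this is the check I can run to see whether the relevant extensions are enabled (I'm assuming omni.isaac.sensor and omni.replicator.core are the ones the RTX lidar and the SDG nodes need):

import omni.kit.app

ext_manager = omni.kit.app.get_app().get_extension_manager()
for ext in ["omni.isaac.sensor", "omni.replicator.core"]:
    # Report whether each extension is currently enabled
    print(ext, ext_manager.is_extension_enabled(ext))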
Any help here would be awesome.
Thanks