2021-01-06 05:10:59 [6,579ms] [Warning] [omni.usd] Warning: in SdfPath at line 97 of /opt/buildagent-share/work/da639afa0455b478/USD/pxr/usd/lib/sdf/path.cpp -- Ill-formed SdfPath </0_Camera>: syntax error
2021-01-06 05:10:59 [6,579ms] [Warning] [omni.client.plugin] Main: usd_plugin: Ill-formed SdfPath </0_Camera>: syntax error
2021-01-06 05:10:59 [6,579ms] [Error] [omni.usd.python] ErrorException:
Error in 'pxrInternal_v0_19__pxrReserved__::UsdStage::_IsValidPathForCreatingPrim' at line 3023 in file /opt/buildagent-share/work/da639afa0455b478/USD/pxr/usd/lib/usd/stage.cpp : 'Path must be an absolute path: <>'
(2) As seen here, I tried to make the Sim wait at line 293,
but it still failed to catch the events in on_stage_event() at line 119.
Thank you for your answer. Following your steps, I can run the Python code from the Script Editor in Kit as in #2. But when I run it directly as a Python script, it gives me this error:
Traceback (most recent call last):
  File "test/test_save_img.py", line 5, in <module>
    import omni.usd
  File "/home/dellstation/isaac-sim-2020.2.2007-linux-x86_64-release/_build/target-deps/kit_sdk_release/_build/linux-x86_64/release/plugins/bindings-python/omni/usd/__init__.py", line 1, in <module>
    from ._usd import *
ImportError: /home/dellstation/isaac-sim-2020.2.2007-linux-x86_64-release/_build/target-deps/kit_sdk_release/_build/linux-x86_64/release/plugins/bindings-python/omni/usd/…/…/…/./libusdVol.so: undefined symbol: _ZTIN32pxrInternal_v0_19__pxrReserved__12UsdGeomGprimE
Do you have any idea about this issue? I have already run python setup.py install and sourced setenv.sh.
To run the code I described directly as a Python script, I think it is necessary to integrate it with an OmniKitHelper object (OmniKitHelper takes care of launching Kit and gives us control over an update() function we can call whenever we would like to render a new frame). You can check the Basic Time Stepping Example to see how to create and call OmniKitHelper.
First, with respect to the name of the camera primitive: it seems it cannot start with a number. The solution is easy: start the name with a letter or an underscore, e.g. _0_Camera.
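As a quick sanity check outside Kit, prim names follow identifier-like rules (first character a letter or underscore, then letters, digits, or underscores). A minimal sketch of checking this with my own helper (not a USD API):

```python
import re

# Identifier-like rule for USD prim names: letter or underscore first,
# then letters, digits, or underscores. This helper is my own sketch,
# not part of the USD or Kit API.
_PRIM_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def is_valid_prim_name(name: str) -> bool:
    return bool(_PRIM_NAME.match(name))

print(is_valid_prim_name("0_Camera"))   # → False (starts with a digit)
print(is_valid_prim_name("_0_Camera"))  # → True
```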
On the other hand, I tested your code, and after some tries I found how to trigger on_stage_event(); however, this method was never able to catch the omni.usd.StageEventType.OPENED event.
Then I edited (simplified) the sample /isaac-sim/python_samples/syntheticdata/offline_dataset/record_scenario.py, included the event subscription, and it works. I loaded the same scenario, but directly from the Nucleus server. You can also play with the scenario_path parameter of the class constructor to provide a full path to local assets.
Here is the code. I recommend you merge the other parts of your code (creating the camera and accessing the synthetic data) into this class and work with it.
import asyncio
import os
import signal

import carb
import omni
from omni.isaac.synthetic_utils import OmniKitHelper, SyntheticDataHelper
from omni.isaac.utils.scripts.nucleus_utils import find_nucleus_server

# Default rendering parameters
RENDER_CONFIG = {
    "width": 600,
    "height": 600,
    "renderer": "PathTracing",
    "samples_per_pixel_per_frame": 12,
    "max_bounces": 10,
    "max_specular_transmission_bounces": 6,
    "max_volume_bounces": 4,
    "subdiv_refinement_level": 2,
    "headless": True,
    "experience": f'{os.environ["EXP_PATH"]}/isaac-sim-python.json',
}


class Scenario:
    def __init__(self, scenario_path=None):
        self.kit = OmniKitHelper(config=RENDER_CONFIG)
        self.sd_helper = SyntheticDataHelper()
        self.stage = self.kit.get_stage()
        self.result = True
        if scenario_path is None:
            self.result, nucleus_server = find_nucleus_server()
            if self.result is False:
                carb.log_error("Could not find nucleus server with /Isaac folder")
                return
            self.asset_path = nucleus_server + "/Isaac"
            scenario_path = self.asset_path + "/Environments/Simple_Warehouse/warehouse.usd"
        self.scenario_path = scenario_path
        current_context = omni.usd.get_context()
        # Subscribe to stage events (OPENED, CLOSED, ASSETS_LOADED, ...)
        self.stage_event = current_context.get_stage_event_stream().create_subscription_to_pop(
            self.on_stage_event
        )
        self._setup_world(scenario_path)
        self.exiting = False
        signal.signal(signal.SIGINT, self._handle_exit)

    def _handle_exit(self, *args, **kwargs):
        print("exiting dataset generation...")
        self.exiting = True

    def on_stage_event(self, in_event):
        if int(omni.usd.StageEventType.OPENED) == in_event.type:
            print("STAGE OPENED")
        elif int(omni.usd.StageEventType.CLOSED) == in_event.type:
            print("STAGE CLOSED")
        elif int(omni.usd.StageEventType.OPEN_FAILED) == in_event.type:
            print("Failed opening stage!")
        elif int(omni.usd.StageEventType.ASSETS_LOADED) == in_event.type:
            print("Stage's assets have all been loaded!")
        elif int(omni.usd.StageEventType.ASSETS_LOAD_ABORTED) == in_event.type:
            print("Stage's assets loading has been aborted!")

    async def load_stage(self, path):
        await omni.kit.asyncapi.open_stage(path)

    def _setup_world(self, scenario_path):
        # Load scenario, pumping the Kit update loop until the task completes
        setup_task = asyncio.ensure_future(self.load_stage(scenario_path))
        while not setup_task.done():
            self.kit.update()
        self.kit.update()
        print("stage loaded")

    def run(self):
        for i in range(1000):
            # step once and then wait for materials to load
            self.kit.update()
            while self.kit.is_loading():
                self.kit.update()
        print("END")


if __name__ == "__main__":
    s = Scenario()
    s.run()
Thank you for the pointer. So it is all about using the async omni.kit.asyncapi.open_stage() instead of omni.usd.get_context().open_stage(), I think.
Besides, just a note: I observed that the omni.usd.StageEventType.OPENED event seems to be triggered before the stage-loading task is done. So creating my camera and then activating it after the waiting loop also works.
And actually, it could also be put into a callback using setup_task.add_done_callback().
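The callback idea can be sketched with plain asyncio, outside Kit. Note that load_stage below is a stand-in that just sleeps, not the real omni.kit.asyncapi.open_stage, and the path is a placeholder:

```python
import asyncio

async def load_stage(path):
    # Stand-in for `await omni.kit.asyncapi.open_stage(path)`:
    # here we only simulate the asynchronous load.
    await asyncio.sleep(0.01)
    return path

def on_stage_loaded(task):
    # Runs once the loading task finishes; in the real script this is
    # where the camera could be created and activated.
    print("stage loaded:", task.result())

async def main():
    setup_task = asyncio.ensure_future(load_stage("/Isaac/Environments/warehouse.usd"))
    setup_task.add_done_callback(on_stage_loaded)
    await setup_task

asyncio.run(main())
```

Because the callback is attached before the task is awaited, it fires as soon as the task completes, which avoids polling setup_task.done() in a loop.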
Thanks a lot again for all of your continued support!
Sorry for bugging you again… (1) So in ISAAC_SIM/_build/linux-x86_64/release/exts/omni.isaac.synthetic_utils/omni/isaac/synthetic_utils/scripts/syntheticdata.py, I tried using the function _get_sensor_cuda_tensor(), just out of curiosity, by enabling the line:
mode = 'cuda' if use_torch else 'numpy'
even though the comment clearly notes that "Currently only numpy output is supported".
I hit a segmentation fault when printing out tensor_data.data_ptr with dtype uint8.
Would you mind giving a confirmation?
Besides, there is an error on:
return torch_wrap.wrap_tensor(tensor_data)
due to omni.syntheticdata._syntheticdata.PyTorchTensorByte not matching the type defined in ISAAC_SIM/python_samples/torch_wrap/Py_WrapTensor.h, but I managed to resolve it with some extra Python binding.
(2) The reason I wanted to try that cuda method is that I happened to measure SyntheticDataHelper's get_groundtruth() (using time.perf_counter()), which normally takes over 0.5 seconds depending on how complicated the scene is.
That is pretty long, and I guess it is attributable to using numpy at the moment. Could you tell how much faster the ground-truth extraction would be if the _get_sensor_cuda_tensor() function were used?
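For reference, the kind of timing I mean can be done with a small helper like this. The helper is my own sketch; sd_helper, the sensor list, and viewport in the commented usage are placeholders for the real SyntheticDataHelper call inside a running Isaac Sim session:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    # Measure wall-clock time of the enclosed block with a monotonic clock.
    start = time.perf_counter()
    yield
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.3f} s")

# Hypothetical usage inside a running Isaac Sim session:
# with timed("get_groundtruth"):
#     gt = sd_helper.get_groundtruth(["rgb", "depth"], viewport)
```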
Unfortunately, I cannot provide you with any confirmation or revelatory info about that 🙈.
I am just an end-user of NVIDIA Omniverse Isaac Sim. However, don’t be shy about asking me. I will be happy to try to find a solution whenever I can.
Hi @Tadinu, I wanted to confirm that for the 2020.2 release we only support the numpy format, and we observe a similar error when trying to enable cuda.
I agree that switching to cuda tensors will make the ground-truth extraction faster. This is on our roadmap, and hopefully we will have this support in the upcoming release.
Hi @ltorabi @sdebnath,
So I have searched the whole source thoroughly, and it seems omni.syntheticdata._syntheticdata.acquire_syntheticdata_interface() is the only API through which sensor data is fetched, as demoed in ISAAC_SIM/_build/linux-x86_64/release/exts/omni.isaac.synthetic_utils/omni/isaac/synthetic_utils/scripts/syntheticdata.py, right?
Hi @Tadinu
Yes, that is correct. And although that interface returns both numpy and cuda tensors, for the 2020.2 release we only support the numpy format.
Hey, I am new to Isaac Sim.
I do not have a syntheticdata file at /isaac-sim/_build/linux-x86_64/release/exts/omni.isaac.samples/omni/isaac/samples/scripts/.
I downloaded the latest version, so I am not sure what I might have done wrong.
Only Isaac Preview Samples are located at /isaac-sim/_build/linux-x86_64/release/exts/omni.isaac.samples/omni/isaac/samples/scripts/
However, you can find the code related to Synthetic Data here:
Isaac Synthetic Data Utilities: /isaac-sim/_build/linux-x86_64/release/exts/omni.isaac.synthetic_utils/omni/isaac/synthetic_utils/scripts
Isaac Utilities (Visualize Synthetic Data, Synthetic Data Recorder): /isaac-sim/_build/linux-x86_64/release/exts/omni.isaac.utils/omni/isaac/utils/scripts