Out of GPU memory, but only when adding semantic labels

Hello, I have a reasonably complex scene in a USD file. I can load it fine in the Isaac Sim GUI, and only about 8 GB of the RTX 3090's 24 GB of VRAM is used.

If I load it in a Python script with omni.usd.get_context().open_stage(), then use add_update_semantics() to add labels to each object, and then try to load the world with World(), I get out-of-memory errors:
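The labeling step in my script looks roughly like this (a minimal sketch; the path and label are placeholders, and add_update_semantics is the helper from omni.isaac.core.utils.semantics):

import omni.usd
from omni.isaac.core.utils.semantics import add_update_semantics

# Placeholder path and label; the real scene is a larger USD file.
omni.usd.get_context().open_stage("/path/to/scene.usdc")
stage = omni.usd.get_context().get_stage()

for prim in stage.Traverse():
    add_update_semantics(prim, semantic_label="object", type_label="class")

# ...then the script creates World(), which is when the errors appear.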

2024-10-02 00:41:58 [22,318ms] [Error] [carb.graphics-vulkan.plugin] Out of GPU memory allocating resource 'MemoryManager-allocate' [size: unknown]
2024-10-02 00:41:58 [22,319ms] [Error] [carb.graphics-vulkan.plugin] Failure injector rule to repro:
{
    debugName="MemoryManager-allocate",
}
2024-10-02 00:41:58 [22,319ms] [Error] [carb.graphics-vulkan.plugin] VkResult: ERROR_OUT_OF_DEVICE_MEMORY
2024-10-02 00:41:58 [22,319ms] [Error] [carb.graphics-vulkan.plugin] vkAllocateMemory failed for flags: 0.
2024-10-02 00:41:58 [22,319ms] [Error] [gpu.foundation.plugin] Unable to allocate buffer
2024-10-02 00:41:58 [22,319ms] [Error] [gpu.foundation.plugin] subAllocate() failed for device 0, Buffer size: 20908068
2024-10-02 00:41:58 [22,319ms] [Error] [rtx.scenedb.plugin] Out of device memory while allocating geometry buffers
2024-10-02 00:41:58 [22,320ms] [Error] [carb.graphics-vulkan.plugin] Out of GPU memory allocating resource 'MemoryManager-allocate' [size: unknown]
2024-10-02 00:41:58 [22,320ms] [Error] [carb.graphics-vulkan.plugin] Failure injector rule to repro:
{
    debugName="MemoryManager-allocate",
}

However, I noticed that if I do not add semantic labels, the world loads and the script runs without any issues.

Any idea why adding semantic labels would trigger memory issues?

Thanks

I’m not able to repro the issue on my end. Using an Isaac Sim 4.2 beta build, I opened the sample full warehouse stage and applied semantics to every prim in the stage with:

from pxr import Semantics
import omni.usd

stage = omni.usd.get_context().get_stage()

# Apply the SemanticsAPI schema to every prim and label it as class "Prim".
for prim in stage.Traverse():
    sem = Semantics.SemanticsAPI.Apply(prim, "Semantics")
    sem.CreateSemanticTypeAttr()
    sem.CreateSemanticDataAttr()
    typeAttr = sem.GetSemanticTypeAttr()
    dataAttr = sem.GetSemanticDataAttr()
    typeAttr.Set("class")
    dataAttr.Set("Prim")

and didn’t see a change in VRAM usage or a crash.
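In case it's useful, I watched VRAM with a quick check along these lines (a sketch; it assumes the pynvml package is importable in the script's environment, and watching nvidia-smi in another terminal works just as well):

import pynvml

# Query device 0's memory usage; call before and after applying semantics.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {info.used / 1024**3:.2f} / {info.total / 1024**3:.2f} GiB")
pynvml.nvmlShutdown()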

Could you provide some more info on how many prims are in the stage you are using? And which version of Isaac Sim are you running?

Hello, thanks a lot for looking into this.

There are 227 prims in the scene.

I’m running into the same issue when testing with your snippet:

from isaacsim import SimulationApp
import omni
import argparse

INPUT = "/path/to/scene.usdc"

CONFIG = {"width": 1280, "height": 720, "sync_loads": True, "headless": False, "renderer": "RayTracedLighting"}

parser = argparse.ArgumentParser("Usd Load sample")
parser.add_argument("--headless", default=False, action="store_true", help="Run stage headless")
args, unknown = parser.parse_known_args()
CONFIG["headless"] = args.headless
simulation_app = SimulationApp(launch_config=CONFIG)

# Isaac Sim modules must be imported after SimulationApp is created
from omni.isaac.core import World

from pxr import Semantics

# open stage
omni.usd.get_context().open_stage(INPUT)

# wait two frames for stage to start loading
simulation_app.update()
simulation_app.update()

from omni.isaac.core.utils.stage import is_stage_loading

while is_stage_loading():
    simulation_app.update()

stage = omni.usd.get_context().get_stage()

# Apply the same SemanticsAPI snippet to every prim in the stage.
for prim in stage.Traverse():
    sem = Semantics.SemanticsAPI.Apply(prim, "Semantics")
    sem.CreateSemanticTypeAttr()
    sem.CreateSemanticDataAttr()
    typeAttr = sem.GetSemanticTypeAttr()
    dataAttr = sem.GetSemanticDataAttr()
    typeAttr.Set("class")
    dataAttr.Set("Prim")

world = World(stage_units_in_meters=1.0)  # this is where the out-of-memory errors occur

The out-of-memory errors appear when the World is created at the end of the script, and they do not appear if I skip adding the semantics.
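If it helps narrow things down, I can also try labeling only the mesh prims instead of every prim, e.g. by swapping the labeling loop above for something like this sketch:

from pxr import UsdGeom

# Only label renderable geometry, skipping Xforms, lights, materials, etc.
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        sem = Semantics.SemanticsAPI.Apply(prim, "Semantics")
        sem.CreateSemanticTypeAttr()
        sem.CreateSemanticDataAttr()
        sem.GetSemanticTypeAttr().Set("class")
        sem.GetSemanticDataAttr().Set("Prim")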

I’m on Isaac Sim 4.2.0, Ubuntu 24.04 LTS, with an RTX 3090. I can look into making the scene available somewhere if needed.

Thanks a lot!