I have successfully run the first eight steps of the link above. When running the Isaac Sim file cartpole_play.py on a cloud server without a GUI, the error "Failed to acquire IWindow interface" is reported.
How can I use the Omniverse Streaming Client to stream the running results from the cloud server to my local machine, instead of relying on a window interface on the cloud server?
The way I have found to run the /isaac-sim/standalone_examples/api/omni.isaac.gym/cartpole_play.py example in a remote container, while still being able to see the simulation with the remote client, is to edit the VecEnvBase class (in /isaac-sim/exts/omni.isaac.gym/omni/isaac/gym/vec_env/vec_env_base.py) to enable native streaming, as follows:
# (These imports are already present at the top of vec_env_base.py.)
import os

import carb
import gym
from omni.isaac.kit import SimulationApp


class VecEnvBase(gym.Env):
    """This class provides a base interface for connecting RL policies with task implementations.
    APIs provided in this interface follow the interface in gym.Env.
    This class also provides utilities for initializing simulation apps, creating the World,
    and registering a task.
    """

    def __init__(self, headless: bool, sim_device: int = 0, livestream: bool = False) -> None:
        """Initializes RL and task parameters.

        Args:
            headless (bool): Whether to run training headless.
            sim_device (int): GPU device ID for running physics simulation. Defaults to 0.
            livestream (bool): Whether to enable a livestream server to connect to when running headless.
        """
        experience = ""
        if headless:
            experience = f'{os.environ["EXP_PATH"]}/omni.isaac.sim.python.gym.headless.kit'
        if livestream:
            # Livestreaming requires the full experience file instead of the headless gym one
            experience = f'{os.environ["EXP_PATH"]}/omni.isaac.sim.python.kit'
            headless = True

        self._simulation_app = SimulationApp(
            {"headless": headless, "physics_device": sim_device}, experience=experience
        )
        carb.settings.get_settings().set("/persistent/omnihydra/useSceneGraphInstancing", True)
        self.sim_frame_count = 0

        if livestream:
            from omni.isaac.core.utils.extensions import enable_extension

            # Default Livestream settings
            self._simulation_app.set_setting("/app/window/drawMouse", True)
            self._simulation_app.set_setting("/app/livestream/proto", "ws")
            self._simulation_app.set_setting("/app/livestream/websocket/framerate_limit", 120)
            self._simulation_app.set_setting("/ngx/enabled", False)
            # Note: Only one livestream extension can be enabled at a time
            # Enable the Native Livestream extension
            # Default App: Streaming Client from the Omniverse Launcher
            enable_extension("omni.kit.livestream.native")
            self._render = True
        else:
            self._render = not headless
The new constructor integrates the code from /isaac-sim/standalone_examples/api/omni.isaac.kit/livestream.py to start a livestream server that can be connected to when running headless. Note that in livestream mode the experience file is changed from omni.isaac.sim.python.gym.headless.kit to omni.isaac.sim.python.kit, so some of the headless kit's adjustments (e.g. physics determinism improvements, among others) may be affected.
After that, you only need to instantiate VecEnvBase in the cartpole_play.py file with livestream enabled, as follows.
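A minimal sketch of what that instantiation could look like (assuming the modified constructor above; the rest of cartpole_play.py, such as the task registration and policy loading, stays unchanged):

from omni.isaac.gym.vec_env import VecEnvBase

# headless=True suppresses any window on the server;
# livestream=True starts the native livestream server instead
env = VecEnvBase(headless=True, livestream=True)

You can then run the example on the server with /isaac-sim/python.sh and connect the Omniverse Streaming Client (installed from the Omniverse Launcher on your local machine) to the server's IP address to view the simulation.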
Thank you for your reply. It shows that the simulation environment has been started, but there is an error as follows. Should I install stable_baselines3 in the Docker environment on the cloud server?
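If the error is that the stable_baselines3 module cannot be found, then yes: cartpole_play.py uses stable_baselines3, which is typically not pre-installed in the Isaac Sim container, so it has to be installed into Isaac Sim's own Python environment inside the container, e.g. with /isaac-sim/python.sh -m pip install stable-baselines3 (assuming the standard PyPI package).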
Thanks for your reply! Yes, I have the same problem running /isaac-sim/python.sh livestream.py in the /isaac-sim/standalone_examples/api/omni.isaac.kit folder. Here is the log file: typescript (52.0 KB)