Blueprint web page shows a blank screen instead of the demo content on localhost

I’m trying to run the Digital Twins for Fluid Simulation blueprint, both with Docker Compose and directly in my own environment. The demo fails in both cases: the browser reports a CORS error, the kit application log shows a send failure, and the web page is just a blank white screen. My guess is that the web frontend and the kit application are not communicating with each other. I checked the service network, and the kit application and AeroNIM are both fine.
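For anyone debugging the CORS side of this: before the frontend can read responses from another origin, the browser performs a preflight `OPTIONS` request and checks the `Access-Control-Allow-Origin` header. The sketch below is not the blueprint's actual code; it spins up a throwaway local server (the origin `http://localhost:5173` is an assumed frontend address) and performs the same check the browser does, which can help confirm whether a given endpoint is returning the header at all.

```python
# Minimal sketch of a CORS preflight exchange (NOT the blueprint's code).
# A tiny local server answers OPTIONS with Access-Control-Allow-Origin,
# and we verify the header the same way a browser's preflight would.
import http.server
import threading
import urllib.request

ALLOWED_ORIGIN = "http://localhost:5173"  # assumed frontend origin

class CORSHandler(http.server.BaseHTTPRequestHandler):
    def do_OPTIONS(self):
        # Respond to the preflight with the CORS headers the browser expects.
        self.send_response(204)
        self.send_header("Access-Control-Allow-Origin", ALLOWED_ORIGIN)
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to an ephemeral port and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), CORSHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Send the same preflight a browser would, declaring our Origin.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    method="OPTIONS",
    headers={"Origin": ALLOWED_ORIGIN},
)
with urllib.request.urlopen(req) as resp:
    allow = resp.headers.get("Access-Control-Allow-Origin")

# The browser only lets the page read the response when this header
# matches the page's origin (or is "*"); otherwise you get a CORS error
# in the console and, typically, a blank page.
print("allowed origin:", allow)
server.shutdown()
```

If the real endpoint returns no `Access-Control-Allow-Origin` header, or one that doesn't match the address you open in the browser (e.g. `localhost` vs. a LAN IP), the frontend will fail exactly as described above.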


####
kit container log
####
2025-05-29 07:26:46 [70,346ms] [Info] [omni.kit.app._impl] [py stdout]: Running VDB based advection
2025-05-29 07:26:46 [70,347ms] [Info] [omni.kit.app._impl] [py stdout]: GridInfo= GridInfo(name='Flow', size_in_bytes=989624992, grid_index=0, grid_count=1, type_str='Vec4f', translation=<warp.types.vec3f object at 0x7ea38c7b40c0>, transform_matrix=<warp.types.mat33f object at 0x7ea38c7b43c0>)
2025-05-29 07:26:46 [70,359ms] [Info] [omni.kit.app._impl] [py stdout]: points X = [0.0, 0.5, 1.0], colors=[(0.0076, 0.035, 0.98, 0.29), (0.993, 0.93304133, 0.026999995, 0.88), (0.625921, 0, 0.9655172, 0.6), (0.49508986, 0.31594005, 0.51831853, 0.8377502)]
2025-05-29 07:26:46 [70,362ms] [Info] [omni.kit.app._impl] [py stdout]: [UI] Sending message: inference_complete
2025-05-29 07:26:46 [70,362ms] [Info] [ov.cgns_ui.extension] streamline stage_update callback on coordinates/velocity/pressure 1748503593447368
2025-05-29 07:26:46 [70,362ms] [Info] [omni.kit.app._impl] [py stdout]: [UI] Sending message: inference_complete
2025-05-29 07:26:46 [70,372ms] [Info] [omni.kit.app._impl] [py stdout]: signal triggered
2025-05-29 07:26:46 [70,372ms] [Info] [omni.kit.app._impl] [py stdout]: Inference Complete: {'message': 'inference_complete'}
2025-05-29 07:26:46 [70,373ms] [Info] [omni.kit.app._impl] [py stdout]: signal triggered
2025-05-29 07:26:46 [70,373ms] [Info] [omni.kit.app._impl] [py stdout]: Inference Complete: {'message': 'inference_complete'}

2025-05-29 07:26:46 [70,373ms] [Warning] [carb.livestream-rtc.plugin] sendCustomMessageToClient: send failure: 0x800b1000 ({"event_type": "inference_complete_signal", "payload": {"signal": "inference_complete"}})
2025-05-29 07:26:46 [70,373ms] [Warning] [carb.livestream-rtc.plugin] sendCustomMessageToClient: send failure: 0x800b1000 ({"event_type": "inference_complete_signal", "payload": {"signal": "inference_complete"}})
2025-05-29 07:26:48 [72,219ms] [Info] [omni.rtwt.api.api.rtwt] 5. Enable visualizations
2025-05-29 07:26:48 [72,219ms] [Info] [omni.rtwt.api.api.rtwt] Disabling streamline mode
2025-05-29 07:26:48 [72,220ms] [Info] [omni.rtwt.api.api.rtwt] Disabling IndeX volume mode
2025-05-29 07:26:48 [72,220ms] [Info] [omni.rtwt.api.api.rtwt] Disabling IndeX slice mode
2025-05-29 07:26:48 [72,220ms] [Info] [omni.rtwt.api.api.rtwt] Waiting for 10 frames...
2025-05-29 07:26:48 [72,220ms] [Info] [omni.rtwt.api.api.rtwt] Disabling smokeprobe mode
2025-05-29 07:26:48 [72,220ms] [Info] [omni.rtwt.api.api.rtwt] Waiting for 5 frames...
2025-05-29 07:26:49 [72,452ms] [Info] [omni.rtwt.api.api.rtwt] Enabling smokeprobe mode
2025-05-29 07:26:49 [72,486ms] [Warning] [rtx.flow.plugin] FlowContext successfully imported interop handle(355) size(1073741824)

Hi @kth7186,

We have had a very similar issue posted on GitHub, and the team is helping there: [BUG]: Blank webpage with CORS error and send failure · Issue #8 · NVIDIA-Omniverse-blueprints/digital-twins-for-fluid-simulation · GitHub.

Are you also the GitHub poster?

Best,

Sophie

Hi @sophwats ,

Yes, I posted in both places.
I thought they were different teams, so I asked each of them for help.

Best,

Kim

Hey @kth7186,

That completely makes sense - just wanted to make sure you were being supported.

Thanks for letting us know it’s you in both places! The team will continue to support you through the GitHub issue.

Have a great day,

Sophie