How to run IsaacLab on one specific GPU

Hi there, I would like to ask whether there is a way to run IsaacLab on one specific GPU.

I have tried setting the parameter device=cuda:4 for omni.isaac.lab.app.AppLauncher, and according to the source code, this should set active_gpu and physics_gpu correctly for SimulationApp.
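For reference, my launcher code follows the standard AppLauncher argparse pattern, roughly like this minimal sketch (exact flags may vary between IsaacLab versions):

import argparse
from omni.isaac.lab.app import AppLauncher

parser = argparse.ArgumentParser(description="Run IsaacLab on a single GPU")
AppLauncher.add_app_launcher_args(parser)   # adds --device, --headless, etc.
args_cli = parser.parse_args()              # launched with: --device cuda:4 --headless

app_launcher = AppLauncher(args_cli)        # expected to map device -> active_gpu / physics_gpu
simulation_app = app_launcher.app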

However, when I start running IsaacLab, there is still memory allocated on other GPUs, especially cuda:0. Here is the output of nvidia-smi:

Wed Nov 13 09:41:28 2024       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 555.42.02              Driver Version: 555.42.02      CUDA Version: 12.5     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 3090        Off |   00000000:1B:00.0 Off |                  N/A |
| 30%   41C    P0            130W /  350W |    3529MiB /  24576MiB |     17%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   1  NVIDIA GeForce RTX 3090        Off |   00000000:1C:00.0 Off |                  N/A |
|  0%   35C    P0            108W /  370W |     174MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   2  NVIDIA GeForce RTX 3090        Off |   00000000:1D:00.0 Off |                  N/A |
| 30%   40C    P0            107W /  350W |     142MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   3  NVIDIA GeForce RTX 3090        Off |   00000000:1E:00.0 Off |                  N/A |
| 30%   41C    P0            107W /  350W |     142MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   4  NVIDIA GeForce RTX 3090        Off |   00000000:3D:00.0 Off |                  N/A |
| 30%   34C    P0            112W /  350W |    2314MiB /  24576MiB |      1%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   5  NVIDIA GeForce RTX 3090        Off |   00000000:44:00.0 Off |                  N/A |
| 30%   35C    P0            110W /  350W |     142MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   6  NVIDIA GeForce RTX 3090        Off |   00000000:45:00.0 Off |                  N/A |
| 30%   37C    P5            116W /  350W |     142MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   7  NVIDIA GeForce RTX 3090        Off |   00000000:46:00.0 Off |                  N/A |
| 30%   42C    P0            115W /  350W |     142MiB /  24576MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI        PID   Type   Process name                              GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
+-----------------------------------------------------------------------------------------+
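As a sanity check, I can restrict CUDA visibility at the process level before anything initializes. This is only a sketch, assuming the extra allocations come from CUDA contexts rather than the Vulkan renderer (note that with visibility restricted, the remaining GPU is enumerated as cuda:0 inside the process):

import os

# Hide all but GPU 4 from CUDA before any CUDA context is created.
os.environ["CUDA_VISIBLE_DEVICES"] = "4"

import torch  # imported after setting the env var

print(torch.cuda.device_count())      # expected: 1
print(torch.cuda.get_device_name(0))  # expected: NVIDIA GeForce RTX 3090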

And if cuda:0 runs out of memory, it shows error messages related to texture creation:

2024-11-13 06:51:10 [19,366ms] [Error] [carb.graphics-vulkan.plugin] Out of GPU memory allocating resource 'MemoryManager chunk' [size: unknown]
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] Failure injector rule to repro:
{
    debugName="MemoryManager chunk",
}
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] VkResult: ERROR_OUT_OF_DEVICE_MEMORY
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] vkAllocateMemory failed for flags: 0.
2024-11-13 06:51:10 [19,367ms] [Error] [gpu.foundation.plugin] Texture creation failed for the device: 0.
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] Out of GPU memory allocating resource 'MemoryManager chunk' [size: unknown]
2024-11-13 06:51:10 [19,367ms] [Error] [gpu.foundation.plugin] TextureAsset - Failed to create source texture for /home/xinyili/miniconda3/envs/isaaclab/lib/python3.10/site-packages/isaacsim/extsPhysics/omni.usdphysics.ui/icons/physicsJoint/JJoint.png
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] Failure injector rule to repro:
{
    debugName="MemoryManager chunk",
}
2024-11-13 06:51:10 [19,367ms] [Error] [carb.graphics-vulkan.plugin] VkResult: ERROR_OUT_OF_DEVICE_MEMORY
2024-11-13 06:51:10 [19,368ms] [Error] [carb.graphics-vulkan.plugin] vkAllocateMemory failed for flags: 0.
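To rule out AppLauncher itself, I could also launch SimulationApp directly with the GPU settings spelled out. A minimal sketch, assuming the pip-installed isaacsim package (as in the path above) and the same config keys AppLauncher is expected to fill in from --device:

from isaacsim import SimulationApp

# Direct launch that pins both the renderer and PhysX to GPU 4.
simulation_app = SimulationApp({
    "headless": True,
    "active_gpu": 4,   # GPU used by the RTX/Vulkan renderer
    "physics_gpu": 4,  # GPU used by PhysX
})

# ... set up and run the scene here ...

simulation_app.close()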

Could you check if this issue occurs with any example? If so, please share the command used to run it.