I have a question about the minimum GPU specification for NVIDIA AODT.
Currently, I have 2 x RTX 3090 (24 GB VRAM). Is there any way to run a simple configuration of AODT in my system?
With a glimmer of hope, I tried this on my system: I successfully installed AODT on my workstation; however, attaching a worker is not possible.
We have deployed one GPU for the frontend and another for the backend. It seems that the backend is experiencing insufficient VRAM, as it requires more than 40 GB of VRAM.
We also plan to install AODT on a server equipped with 2 x RTX 3090 GPUs (24 GB VRAM each) by tomorrow.
By the way, could you suggest any alternative methods to address this issue?
We have attempted to remove both containers and restart one of them, but unfortunately the backend Docker container shuts down after only a few seconds of running.
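When a container exits shortly after starting, its logs usually show the reason (for example, a CUDA out-of-memory error). A minimal sketch of how to capture them; the container name here is a placeholder, so substitute the actual name reported by `docker ps -a`:

```shell
# List all containers, including ones that have already exited,
# to find the backend container's name or ID.
docker ps -a

# Print the last lines of that container's output; the exit reason
# (e.g. insufficient VRAM) is typically near the end.
docker logs --tail 100 <backend-container-name>
```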
@sjh1753 The backend needs an A100, A10, or L40. We are unable to debug backend issues on an RTX 3090. Can you set up the backend on a qualified system and retry?
Maybe you could try running the following command in the directory containing backend_bundle/docker-compose.yml:
docker compose down
Then create a file docker-compose-sm86.yml as a copy of docker-compose.yml, changing the
connector's image from: nvcr.io/esee5uzbruax/aodt-sim:1.1.0_runtime_$GEN_CODE
to: nvcr.io/esee5uzbruax/aodt-sim:1.1.0_runtime_SM86
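The steps above could be scripted roughly as follows. This is a sketch, not a verified procedure: it assumes you are inside the backend_bundle directory, that the image reference appears exactly as shown in docker-compose.yml, and that the SM86 tag exists in the registry.

```shell
# Stop the currently running stack first.
docker compose down

# Copy the original compose file, then swap the connector image's
# architecture suffix from $GEN_CODE to SM86 (for RTX 3090, compute
# capability 8.6).
cp docker-compose.yml docker-compose-sm86.yml
sed -i 's|aodt-sim:1.1.0_runtime_\$GEN_CODE|aodt-sim:1.1.0_runtime_SM86|' docker-compose-sm86.yml

# Bring the stack back up using the modified compose file.
docker compose -f docker-compose-sm86.yml up -d
```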
Thanks to your guidance, we successfully resolved the issue. We hadn't realized that the backend container configuration was in the backend_bundle folder, and we had also been managing the Docker containers individually instead of through docker compose.