Understanding GPU Compatibility and Cost for AI Data Flywheel Workloads

Minimum GPU requirements to run an AI data flywheel solution:

  1. You will need a minimum of 2 NVIDIA GPUs for the Data Flywheel Blueprint: at least one for training, and another for running inference and evaluation.
  2. If you intend to deploy the LLM-as-a-judge model locally (instead of pointing to a remote managed deployment), you must provision additional hardware for it. For example, locally deploying a Llama-3.3-70B-Instruct model as a judge requires 4 additional GPUs.
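The sizing rules above can be sketched as a small helper. This is an illustrative calculation only (the function name and parameters are our own, not part of the Blueprint), assuming the baseline of 2 GPUs and 4 extra GPUs for a locally deployed Llama-3.3-70B-Instruct judge:

```python
# Illustrative sketch, not part of the Blueprint API: estimate the minimum
# GPU count for a Data Flywheel deployment using the rules above.

def minimum_gpus(local_judge: bool = False, judge_gpus: int = 4) -> int:
    """Return the minimum number of GPUs required.

    local_judge: True if the LLM-as-a-judge model runs locally rather
                 than on a remote managed deployment.
    judge_gpus:  GPUs consumed by the judge model (4 for a locally
                 deployed Llama-3.3-70B-Instruct).
    """
    base = 2  # one GPU for training, one for inference and evaluation
    return base + (judge_gpus if local_judge else 0)

print(minimum_gpus())                  # remote judge: 2 GPUs
print(minimum_gpus(local_judge=True))  # local 70B judge: 6 GPUs
```

Pointing the judge at a remote managed endpoint keeps the footprint at the 2-GPU baseline.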

As of July 2025, we officially support H100 and A100 GPUs. Other cards, such as the RTX PRO 6000 Blackwell Max-Q or the RTX 5090, are not officially supported.
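One way to sanity-check your hardware against this list is to compare the GPU name reported by `nvidia-smi --query-gpu=name --format=csv,noheader` with the supported models. The helper below is a hypothetical sketch (the function and the matching strategy are our own, not an official tool):

```python
# Hypothetical helper: check a GPU name string, as reported by nvidia-smi,
# against the officially supported models (H100 and A100 as of July 2025).

SUPPORTED_GPU_MODELS = ("H100", "A100")

def is_supported(gpu_name: str) -> bool:
    """Return True if the reported GPU name contains a supported model."""
    return any(model in gpu_name for model in SUPPORTED_GPU_MODELS)

print(is_supported("NVIDIA H100 80GB HBM3"))    # True
print(is_supported("NVIDIA GeForce RTX 5090"))  # False
```

Simple substring matching like this works because driver-reported names embed the model designation, but it would need adjusting if new supported SKUs are announced.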

For questions about hardware requirements, connect with our developers on the NeMo microservices forum.