I spent over $5,000 on a laptop with an RTX 5090 Blackwell GPU, 64GB of RAM, a 4TB SSD, and a Core Ultra 9 275HX processor specifically so I could run AI locally rather than through a cloud-based server.
My system is capable of running any number of LLMs locally, yet it still isn’t compatible with ChatRTX. Why? Is there any intention of releasing support for this hardware? I would have saved thousands if I’d known about this ahead of time.