Best I can tell, Puget Systems did not use compute workloads for their quad RTX 3090 system, so I do not think the power consumption they observed adequately reflects what one would see with various deep-learning codes, for example.
While a standard residential circuit in the US (15A, 120V) can theoretically supply 1800W, a 15A breaker will likely trip after a few minutes of carrying that maximum load continuously. In addition, if I understand the US electrical code correctly, a single plug-connected device shall not draw more than 80% of the circuit's maximum current. That would be 12A at 120V, or 1440W. Based on that, running a 1720W load from a standard electrical outlet does not look like a good idea. Moreover, even a 1600W 80PLUS Titanium rated PSU would be right at its specification limit under those circumstances.
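To make the arithmetic explicit, here is a quick back-of-the-envelope sketch in Python. The ~90% full-load efficiency figure is my assumption for an 80PLUS Titanium unit on a 115V input; the other numbers are the ones discussed above.

```python
# Can a quad-GPU box live on a standard 15A/120V branch circuit?
VOLTAGE_V = 120.0
BREAKER_A = 15.0
CONTINUOUS_DERATE = 0.80  # 80% rule for continuous loads

circuit_max_w = VOLTAGE_V * BREAKER_A                    # 1800 W theoretical
continuous_budget_w = circuit_max_w * CONTINUOUS_DERATE  # 1440 W sustained

# Assumption: ~90% efficiency at 100% load for 80PLUS Titanium at 115V
psu_dc_rating_w = 1600.0
titanium_eff_full_load = 0.90
wall_draw_at_full_psu_w = psu_dc_rating_w / titanium_eff_full_load  # ~1778 W

measured_load_w = 1720.0  # observed wall draw under load

print(f"continuous circuit budget: {continuous_budget_w:.0f} W")
print(f"wall draw with PSU at full DC load: {wall_draw_at_full_psu_w:.0f} W")
print(f"1720 W exceeds the budget by {measured_load_w - continuous_budget_w:.0f} W")
```

By either measure, the load lands well past what the circuit should carry continuously.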
It is one thing to find that something happens to work for a limited amount of time using a few selected workloads with brand-new hardware. It is a different thing to achieve long-term reliable operation across a wider universe of computational loads, under varying environmental conditions, for aging hardware.
BTW, I forgot to mention that when projecting the nominal power budget for the system, a good rule of thumb for DDR4 memory is to assume 0.4W per GB.
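To put a number on that rule of thumb, here is a trivial helper in the same vein as the sketch above; the 256 GB capacity is just an illustrative figure, not from the original system.

```python
# DDR4 power rule of thumb: ~0.4 W per GB (nominal budget, not peak)
def ddr4_power_w(capacity_gb: float, w_per_gb: float = 0.4) -> float:
    return capacity_gb * w_per_gb

# e.g., 256 GB of DDR4 adds roughly 102 W to the nominal power budget
print(f"{ddr4_power_w(256):.0f} W")
```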