Does DLSS render at a lower resolution and then upscale?

The Turing whitepaper says DLSS renders “at a lower input sample count” than super-sampled TAA uses. That makes it sound like DLSS still renders at the game’s resolution setting (say, 4K) but takes fewer samples per pixel than TAA when performing the anti-aliasing.

Or is DLSS actually rendering at a lower resolution (say 1080p) and then using the AI-UpRes/SuperRes NGX feature to upscale it to 4K? But the whitepaper describes AI-UpRes as good for only around 30 fps of 1080p-to-4K upscaling, i.e. roughly 33 ms per upscaled frame, which is already more than the ~16.7 ms budget of a 60 fps frame. That doesn’t seem to match the 60+ fps that DLSS achieved in the UE4 “Infiltrator” demo.

So which process does DLSS use: render at the game’s resolution and then take fewer sub-pixel samples for its AA, or render at a lower resolution and then upscale to the game’s resolution?
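
To make the two interpretations concrete, here is a rough sketch of how I currently picture them. None of these types or function names come from the real NGX/DLSS API; they are placeholders I made up for this question.

```cpp
#include <cstdio>

// Placeholder type, not the real NGX/DLSS API.
struct Image { int width, height; };

// Interpretation A: the game renders at its output resolution (e.g. 4K)
// with ~1 sample per pixel, and DLSS acts purely as the anti-aliasing
// pass, standing in for a heavily super-sampled TAA resolve.
Image pipelineA() {
    Image rendered{3840, 2160};    // rendered at the game's 4K setting
    Image antialiased = rendered;  // hypothetical DLSS AA pass, same resolution
    return antialiased;
}

// Interpretation B: the game renders at a lower resolution (e.g. 1080p)
// and DLSS both anti-aliases and upscales to the output resolution,
// like the AI-UpRes / SuperRes feature described in the whitepaper.
Image pipelineB() {
    Image rendered{1920, 1080};    // rendered at 1080p
    Image upscaled{3840, 2160};    // hypothetical DLSS upscale to 4K
    return upscaled;
}

int main() {
    Image a = pipelineA();
    Image b = pipelineB();
    printf("A outputs %dx%d, B outputs %dx%d\n",
           a.width, a.height, b.width, b.height);
    return 0;
}
```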