Which GPUs support interlaced encoding

You are oversimplifying a bit. :)

You say “Converting progressive to interlaced is just a matter of separating fields”. Tell me how you would convert 1080p@30Fps to 1080i@59.94fps (notice F vs f, btw) with no obvious visual artifacts by simply “separating fields”. It’s not so simple, because the two rates don’t share an even multiple in the temporal domain. Not to mention that half of the temporal resolution has already been discarded by the 30p source.
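To make the rate mismatch concrete, here is a small sketch (my own illustration, not anything from the thread) using exact fractions: “59.94” is really 60000/1001, so a 30Fps source overruns the 29.97Fps frame cadence of 1080i@59.94fps and frames have to be dropped or blended periodically.

```python
from fractions import Fraction

# Exact broadcast rates: "59.94" fields/s is really 60000/1001.
field_rate = Fraction(60000, 1001)        # 1080i field rate (fps, fields)
source_rate = Fraction(30, 1)             # 1080p source frame rate (Fps)
target_frame_rate = field_rate / 2        # 30000/1001 ~ 29.97 frames/s

# Excess frames per second that must be dropped (or blended away):
excess = source_rate - target_frame_rate  # 30/1001 frames/s

# Roughly how often one source frame has to disappear:
seconds_per_drop = 1 / excess             # 1001/30 ~ 33.37 s

print(float(target_frame_rate), float(seconds_per_drop))
```

So even before you touch the fields spatially, a frame must vanish about every 33 seconds, which is exactly the kind of cadence hiccup that shows up as a visible stutter on motion.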

You also say that “Converting interlaced to progressive always involves restoring information that is no longer there. It is always worse than just having progressive material.” That’s also not quite true. On a typical North American HDTV, you receive mostly interlaced content (except for ABC and FOX). Displaying 1080p@29.97Fps on a screen scanning at 1080i@59.94fps (again, watch F vs f), you would lose half of the temporal clarity, and you’d end up interpolating half of the spatial data. Again, it’s not that simple.

Not trying to troll you buddy. I promise.

Only a few “near perfect” conversions exist. Some include (again, watch F vs f):
1080p@59.94Fps to/from 1080i@59.94fps (aka 1080i@29.97Fps)
720p@59.94Fps to/from 1080i@59.94fps (aka 1080i@29.97Fps)
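For the clean 1080p@59.94Fps direction, a sketch (my own, purely illustrative) of why no temporal resampling is needed: each progressive frame contributes exactly one field, alternating top/bottom parity, so the field cadence maps 1:1 onto the frame cadence.

```python
# Sketch of the clean p59.94 -> i59.94 direction: each progressive
# frame contributes exactly one field, alternating top/bottom parity,
# so the field rate equals the source frame rate with no resampling.

def progressive_to_fields(frames):
    """frames: list of frames, each a list of scan lines.
    Returns one field per frame: even lines, then odd lines, alternating."""
    fields = []
    for n, frame in enumerate(frames):
        parity = n % 2                   # 0 = top field, 1 = bottom field
        fields.append(frame[parity::2])  # keep every other scan line
    return fields

# Two tiny 4-line frames -> a top field from the first, bottom from the second:
frames = [[[n, y] for y in range(4)] for n in range(2)]
print(progressive_to_fields(frames))
```

The imperfect cases (like 30Fps to 59.94fps above) fail precisely because this 1:1 frame-to-field mapping doesn’t exist, and interpolation has to fill the gap.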

Others are not so perfect and require interpolation both in the spatial and temporal domains.

As mentioned earlier, I’m happy to help anyone who has questions about this topic, but if you want to keep sparring, I’ll have to go find something else to do. Simply because I’m not wrong. I work with millions of dollars of extremely high-end gear that would show even the slightest conversion error about as obviously as a t*rd in a punchbowl. ;)