I’m trying to get 4K@30 Hz on a Blaupunkt TV (BLA39 - a European clone of Seiki), connected to a GTX 760 via HDMI cable. The same system works flawlessly in Windows. In Linux, the 4K mode is detected and activated, but the image is blurry and text is unreadable - close examination of the screen shows pixels grouped in 2x2 blocks of the same color, indicating that the image is being down-scaled to HD and then up-scaled to fill the screen.
The TV’s menu has a ‘scaling’ option (point-to-point/zoom/4:3/etc.), which becomes disabled when in 4K mode under Windows. Under Linux it always stays enabled.
My first guess would be that the data the TV receives is actually HD and the TV scales it to fill the screen, but the mode printed in the X log and by xrandr is definitely 3840x2160. So... Is it possible that the graphics card somehow scales the output down to HD behind the scenes while telling X that it is transmitting a 4K image? Is there a deeper way of querying the card about its actual output? Can you think of any kind of signal that could convince a TV to scale its input down and then back up again? Both explanations seem too far-fetched to assume.
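Not an answer, but one way to dig a bit deeper than xrandr is to read the raw EDID the kernel exposes under /sys/class/drm and check that it is intact. 4K@30 over HDMI 1.4 is advertised through the CEA-861 extension block, so a truncated or corrupt EDID could plausibly lead to odd mode handling. A minimal sketch (the connector name in the path is an assumption - adjust it for your setup):

```python
# Sketch: sanity-check an EDID base block and count its extension blocks.
# The sysfs path in the usage example is an assumption.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def inspect_edid(edid: bytes) -> dict:
    """Return basic sanity info for a 128-byte EDID base block."""
    if len(edid) < 128:
        raise ValueError("EDID base block must be at least 128 bytes")
    base = edid[:128]
    return {
        "header_ok": base[:8] == EDID_HEADER,
        "checksum_ok": sum(base) % 256 == 0,   # all 128 bytes must sum to 0 mod 256
        "extension_blocks": base[126],         # CEA-861 block carries the HDMI 4K modes
    }

# Usage (connector name is an assumption):
# with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
#     print(inspect_edid(f.read()))
```

If the checksum fails or the extension count is 0 while the TV is supposed to advertise 4K, that would be worth chasing before suspecting the card itself.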
I have also tried defining modes manually, disabling all mode validation, EDID checks, etc., but the results were no different. I have no idea what to try next and would appreciate any ideas on how to debug this further.
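When defining modes by hand it is easy to end up with a modeline whose pixel clock does not actually match the intended refresh rate, and a sink may then fall back to something else. A quick arithmetic check against the standard CEA-861 timing for 3840x2160@30 (297 MHz pixel clock, 4400 total horizontal pixels, 2250 total vertical lines) - the helper name here is mine, just a sketch:

```python
def mode_refresh_hz(pixel_clock_mhz: float, htotal: int, vtotal: int) -> float:
    """Refresh rate implied by a modeline: pixel clock / (htotal * vtotal)."""
    return pixel_clock_mhz * 1e6 / (htotal * vtotal)

# Standard CEA-861 (VIC 95) timing for 3840x2160@30 over HDMI:
# 297 MHz pixel clock, 4400 x 2250 total raster.
print(mode_refresh_hz(297.0, 4400, 2250))  # -> 30.0
```

Comparing this against the totals that `xrandr --verbose` prints for the active mode would at least confirm the modeline itself is self-consistent.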
nvidia-bug-report.log.old.gz (200 KB)