I got spam this morning from a well-known PC manufacturer discussing their awesome new line of laptops. Their industrial design and technical specifications are on point. Then I checked the screen resolutions: FHD on the 14-inch, and 768p on the 13.3-inch. I couldn’t believe it.
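For context, pixel density is just the diagonal pixel count divided by the diagonal size. Here’s a quick back-of-the-envelope comparison, assuming the 768p panel is the usual 1366×768, FHD is 1920×1080, and using a 13.3-inch Retina MacBook panel (2560×1600) as the reference point:

    # Back-of-the-envelope pixel density: diagonal pixels / diagonal inches.
    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch for a panel of the given resolution and size."""
        return hypot(width_px, height_px) / diagonal_in

    print(f"14-inch FHD:      {ppi(1920, 1080, 14.0):.0f} PPI")   # ~157 PPI
    print(f"13.3-inch 768p:   {ppi(1366,  768, 13.3):.0f} PPI")   # ~118 PPI
    print(f"13.3-inch Retina: {ppi(2560, 1600, 13.3):.0f} PPI")   # ~227 PPI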

PC users of either Windows or *nix persuasions like to make fun of Macs, and they have some fair points. To use a car analogy, nobody needs more than a Toyota Corolla to get from point A to B, so anything else is frivolous and expensive. Okay, I’m taking the piss a bit here, but there’s a grain of truth to it when one removes experience from the equation.

But once you’ve used a 2× HiDPI Retina screen, going back to standard resolutions is awful. Photos look grainy. Colours are washed out. Fonts are blurry and ill-defined. And something I’ve really come to appreciate: you can’t crank the resolution up in a pinch to check out a ton of logs at once. The 1.5× default on Windows is a nasty hack: not high enough to cleanly double everything, yet too low to deliver most of the visual benefit. And most *nix desktops copy it.
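A quick sketch of why: at an integer factor like 2×, the edges of every logical pixel land exactly on device-pixel boundaries, while at 1.5× they routinely fall in between, so the renderer has to blend or snap. This illustrates the general problem rather than how any particular compositor implements scaling:

    # Where does a 1-logical-pixel-wide line land in device pixels?
    # At an integer factor both edges fall on pixel boundaries; at 1.5x they
    # often don't, so the result has to be anti-aliased or snapped.
    def physical_edges(logical_x, scale):
        """Left and right device-pixel edges of a 1-logical-pixel line."""
        return logical_x * scale, (logical_x + 1) * scale

    for scale in (1.0, 1.5, 2.0):
        left, right = physical_edges(3, scale)
        clean = left.is_integer() and right.is_integer()
        print(f"{scale}x: device pixels {left} to {right} "
              f"({'clean' if clean else 'needs blending'})")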

Not everyone feels like they’re missing out with low-resolution screens, just as I’m sure I’m not missing out by getting an affordable, reliable car. That’s fine. The irony also isn’t lost on me that half my computing life is spent looking at Retina screens powered by Macs with laughably-poor GPUs. But screens are one of the flags I plant for computing: they’re either 2× HiDPI, or they’re not worth buying. I look at the damned things for most of my life now, so I want them to be nice!

It deserves its own post, but it’s why I still think the LG UltraFine 4K is the best screen I’ve ever owned. I don’t need 5K, or a certain expensive stand. It has high pixel density, colours look fantastic, and it’s the perfect size, regardless of the computer powering it.

But for whatever reason, the majority of PC laptops and external screens sold today are firmly stuck in the mid-2000s. The last innovation was widescreens, the merits of which are still being debated. I’m not sure why things are so static, but I have theories:

  • Part of this has to be the fault of OS and application developers not making a compelling case for high-density screens. Apple make a big deal about how detailed their screens are.

  • GPUs on lower-end PCs are even worse than what Apple ship, which limits their ability to drive 2× displays. There are some theories that this is Intel’s fault for refusing to let OEMs ship discrete GPUs on certain boards. Either way, our mobile phones have better graphics than most laptops.

  • Windows defaults to 1.5×, and its scaling support is still incomplete. If this is people’s first experience with HiDPI screens, I can understand why being underwhelmed is the prevailing attitude.

  • Gamers also prioritise refresh rates and lower image retention over resolution and colour accuracy, and they’re about the only people still building their own computers. I’m glad the DIY spirit is still alive out there at all, but that’s also for another post.

But I think the biggest reason is inertia: people are used to looking at crappy screens with spindly ClearType text, and either don’t know any different or don’t care. Some people still press the Search button on search engines too, rather than hitting Return.

My FreeBSD Panasonic laptop has a great screen, and while things are a little snug running at 2× HiDPI, it has sufficient resolution to enable it. And as a result, everything looks gorgeous! I hope this will be the norm one day, not just on machines one has to carefully check before buying.
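For the curious, enabling 2× on a plain X11 setup is roughly this shape, assuming a mix of GTK 3 and Qt 5 applications; the exact knobs vary by desktop environment, and this isn’t necessarily what my Panasonic runs:

    # ~/.Xresources — tell Xft-based applications the display is twice the
    # traditional 96 DPI
    Xft.dpi: 192

    # Shell profile — per-toolkit scale factors for GTK 3 and Qt 5 applications
    export GDK_SCALE=2
    export QT_SCALE_FACTOR=2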