“Bang for the buck” when picking a graphics card
Read or watch any advice about choosing a GPU, and you’ll likely encounter one of the following:
- AI-written spam with a vague comparison table, concluded with “choose the one that’s right for you”. People don’t know what’s right for them; that’s why they landed on your article!
- Those who say you should always buy the best graphics card you can for your budget. It’s the part of any build that will make the most difference to gameplay or graphics workstation tasks.
- Those who say it’s important to find the best bang for the buck so you aren’t gouged on price/performance. Diminishing returns eventually kick in, making the added price unjustifiable for the added performance. Conversely, spending just a little more can sometimes deliver a huge leap in performance.
It’s the latter I find most interesting, because it has the allure of being rational. Frames per second per dollar is quantifiable, testable, and devoid of marketing or other influences, so it’s the one you should go with, right?
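The metric itself is trivial to compute. A minimal sketch in Python, using entirely made-up prices and benchmark averages for two hypothetical cards:

```python
# Hypothetical cards with made-up prices (USD) and benchmark averages.
cards = {
    "Card A": {"price": 400, "avg_fps": 90},
    "Card B": {"price": 550, "avg_fps": 110},
}

for name, c in cards.items():
    value = c["avg_fps"] / c["price"]  # frames per second per dollar
    print(f"{name}: {value:.3f} fps per dollar")
```

With these numbers the cheaper card “wins” (0.225 versus 0.200 fps per dollar), which is exactly the kind of result those charts surface.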
Hardware Unboxed’s podcast, and benchmark misconceptions
Well, not so fast. As I wrote in my recent post about benchmarking, these comparisons come with their own implicit assumptions that may not hold for your use case. Lies, damned lies, and statistics! For example, what about:
- Power use? If electricity is expensive in your area, any money saved by picking the more power-hungry card with better frames per dollar will be quickly eroded by your power bill. This basic fact completely eludes certain US reviewers, and it baffles me!
- Heat? If you’re building a mini-ITX computer, the card that delivers the most frames per dollar may be a thermal dead end for you. A card with a blow-through or impeller cooler design might also be necessary depending on the configuration of your case, which might not be available on the most cost-effective card.
- Physical size? The best card for frames per second per dollar might not fit in your case, which, unless you’re handy with an angle grinder or nibbling tool, is a clear deal breaker.
- Operating environment? I use a 4K panel for working at home, and I don’t have space for another. The 3060 Ti was touted as the best bang for the buck in the previous generation, but the 3070 delivered just enough of a performance improvement to make games viable on that 4K panel.
- Features? If your rendering workload depends on an AMD or Nvidia API, claiming an Arc GPU has the best performance per dollar is meaningless. It’s the same as open-source people telling Mac users who need Illustrator and Office to switch to a Linux desktop.
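The power-use point above is easy to put numbers on. A back-of-the-envelope sketch, assuming made-up figures: a 100 W difference in draw between two cards, four hours of gaming a day, and an electricity price of $0.40/kWh:

```python
# All figures are illustrative assumptions, not measurements.
extra_watts = 100       # extra power draw of the "better value" card
hours_per_day = 4       # daily gaming time
price_per_kwh = 0.40    # electricity price in an expensive market
years = 3               # how long you expect to keep the card

extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
extra_cost = extra_kwh * price_per_kwh
print(f"Extra electricity over {years} years: ${extra_cost:.2f}")
# → Extra electricity over 3 years: $175.20
```

A $175 difference in running costs is enough to flip the frames-per-dollar ranking between two mid-range cards, yet it never appears on the chart.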
My advice for anyone looking to buy a GPU is to use bang-for-the-buck charts as a starting point, but not to get hung up on them. They’re but one metric in a sea of others.