Nvidia’s RTX 3090 Ti

Hardware

Nvidia have launched their latest 3090 Ti graphics card for the eye-watering price of a kidney donation or house deposit. Here I was worrying about whether to get a 3060 Ti, 3070, or 6800 XT when I refurbished my FreeBSD/Linux desktop last month, and now a card exists that makes even the 3090 seem ridiculous rather than unattainable.

Secure in the knowledge that I’ll never own one of these space heaters, I’m therefore more interested in the power draw, and what it represents.

This lone graphics card has a TDP of 450 watts, fed to it through three—count them, three—power connectors. That’s one more connector than I have arms. That’s a nasty pun, but it will only make sense in a moment.

For comparison, the MacBook Pro on my table has a 90 watt power supply, and the first four PC clones and iMacs I owned had power supplies rated under 250 watts each. Granted, it’s not an apples to apples comparison, because some of them were PCs. Thank you, I’m here all week… unfortunately.
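For a rough sense of scale, here’s some back-of-the-envelope arithmetic using the figures above. Note the 250 watts is the stated upper bound for those older machines, not a measured draw:

```python
# Rough power comparison using the figures from this post.
gpu_tdp_watts = 450      # RTX 3090 Ti TDP, the card alone
macbook_psu_watts = 90   # MacBook Pro power supply
old_pc_watts = 250       # upper bound for early PC clones and iMacs

print(f"3090 Ti vs MacBook PSU: {gpu_tdp_watts / macbook_psu_watts:.0f}x")
print(f"3090 Ti vs old PC:      {gpu_tdp_watts / old_pc_watts:.1f}x")
# One GPU's TDP is 5x an entire laptop's power supply, and
# nearly double what a whole older desktop could draw.
```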

I suppose one could draw parallels (using their graphics card) to needing four connectors across two SLI cards, and that they’d draw similar power to, if not more than, this single 3090 Ti. It still seems ridiculous though.

Even calling such a thing a graphics card feels like an anachronism. Nobody would be buying this card for gaming, so it’s almost certainly for parallel workloads and rendering, what people now euphemistically refer to as content creation. It’s basically a standalone computer with a PCIe slot to connect to a CPU, which at this stage is a glorified bridge to your peripherals and IO.

This is a big deal not for the card itself, but what it represents for the industry. Rumours abound that Nvidia are testing the waters with this release, pending their next-generation cards that will all ship with similar performance envelopes and power requirements. That scares me a bit, I’ll admit.

Remember when Windows 11 came out and everyone scrambled to get motherboards with TPMs to satisfy its arcane requirements? I suspect people are soon going to realise that all but the most expensive power supplies won’t be able to deliver the required power, consistently, for these new GPUs as well. Efficiency and ATX/SFX sizing aside, I think we’ve all got comfortable with the fact that we can pick up any power supply we want and it’ll just work. This may no longer be the case… or in the case, which is the box where you’d be putting such a PSU.

We’re really starting to see a divergence here between ultra power efficient components like Apple’s M1 Max desktop SoCs, and these behemoths that throw caution and electricity to the wind in the pursuit of ultimate performance. Which leads me to wonder where the reasonable midrange will end up.

Author bio and support

Me!

Ruben Schade is a technical writer and infrastructure architect in Sydney, Australia who refers to himself in the third person. Hi!

The site is powered by Hugo, FreeBSD, and OpenZFS on OrionVM, everyone’s favourite bespoke cloud infrastructure provider.

If you found this post helpful or entertaining, you can shout me a coffee or send a comment. Thanks ☺️.