this post was submitted on 02 Aug 2025
77 points (90.5% liked)


Despite the rapid pace of GPU evolution and the hype around AI hardware, Linus Torvalds, the father of Linux, is still using a 2017-era AMD Radeon RX 580 as his main desktop GPU here in 2025. The Polaris-based card may be almost a decade old, but it has aged remarkably well in Linux circles thanks to robust, mature open-source driver support. Torvalds' continued use of the RX 580, therefore, isn't just boomer nostalgia. It's a statement of practicality, long-term support, and his disdain for unnecessary complexity.

As spotted by Phoronix, the revelation came in a bug report about AMD's Display Stream Compression (DSC), which was causing black-screen issues in Linux 6.17. Torvalds bisected the regression himself and eventually reverted the offending patch to keep kernel development moving. Ironically, DSC is what allows his Radeon RX 580 to comfortably drive his modern 5K ASUS ProArt monitor, a testament to how far open-source drivers have come.
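For context on the workflow described above, here is a minimal sketch of a typical kernel bisect-and-revert session. The kernel tags and the offending commit hash are placeholders for illustration, not the actual commits from this regression.

```sh
# Tell git the known-bad and known-good endpoints; it then checks out
# the midpoint commit of that range for testing. (Tags are placeholders.)
git bisect start
git bisect bad v6.17-rc1     # placeholder: first kernel showing the black screen
git bisect good v6.16        # placeholder: last kernel known to work

# Build and boot each candidate kernel, then report the result; git
# halves the remaining range each time until one commit is left.
git bisect good              # or: git bisect bad

# End the bisect, then revert the offending commit so development
# can continue while a proper fix is worked out.
git bisect reset
git revert <offending-commit-hash>   # placeholder
```

Because each step halves the search range, a bisect over roughly 2^n commits needs only about n test builds, which is why it is the standard way to hunt kernel regressions.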

“... same old boring Radeon RX 580,” Torvalds wrote in his email to the Linux Kernel Mailing List (LKML) announcing the revert, which keeps development moving uninterrupted for now. That one line from the man himself speaks volumes about his preference for stability over novelty.

[–] BananaTrifleViolin@lemmy.world 32 points 11 hours ago (last edited 11 hours ago)

This is tech writers thinking everyone lives like them. An 8-year-old graphics card is fine if you're not doing high-end gaming or video editing. That card will still run a 4K desktop, and probably multi-screen 4K desktops, without any issue.

For most users, graphics cards have long been at a level where they don't need upgrading. A mid-range graphics card from even 10 years ago is more than powerful enough to watch video or run desktop programs, and is even fine for a wide range of games.

It's only if you want high-end 3D gaming that upgrading is needed, and arguably even that has passed the point of diminishing returns in the last five years for the majority of users and titles.

I do game a fair bit, and my RTX 3070, which is five years old, really doesn't need upgrading. Admittedly it was higher end when it launched, but it still plays a game like Cyberpunk 2077 at high-end settings. It's arguable how many users would even notice the difference with the "ultra" settings in most games, let alone actually need them. New cards are certainly very powerful, but the fidelity jump for the price and power just isn't there in the way it would have been when upgrading a card even 10 years ago.

[–] iAmTheTot@sh.itjust.works 2 points 3 hours ago

I do game a fair bit, and my RTX 3070, which is five years old, really doesn't need upgrading.

Resolution and frame rate? Because at 4K mine was struuuuuggling, and I had to get more VRAM.