A lot of millennial gamers are going this route since endless upgrading does not yield much improvement and of course, fuck nvidia.
Hardware should be used until it either does not do the job required or it breaks outright.
Me using onboard graphics....
Linus doesn't game?!!!??!?!!!!? Holy fuck, let's get channel 12 up in here to figure out what's going on in Linus House.
This is tech writers thinking everyone lives like them. If you're not doing high-end gaming or video editing, an 8-year-old graphics card is fine. That card will still run a 4K desktop, and probably multi-screen 4K desktops, without any issue.
For most users, graphics cards have long been at a level where they don't need upgrading. A mid-range graphics card from even 10 years ago is more than powerful enough to watch video or use desktop programs, and it's even fine for a wide range of games.
It's only if you want high-end 3D gaming that upgrading is needed, and arguably even that has passed the point of diminishing returns in the last 5 years for the majority of users and titles.
I do game a fair bit and my RTX 3070, which is 5 years old, really doesn't need upgrading. Admittedly it was higher end when it launched, but it still plays a game like Cyberpunk 2077 at high-end settings. It's arguable whether most users would even notice the difference with the "ultra" settings on most games, let alone actually need them. New cards are certainly very powerful, but the fidelity jump for the price and power just isn't there in the way it would have been when upgrading a card even 10 years ago.
I do game a fair bit and my RTX 3070, which is 5 years old, really doesn't need upgrading.
Resolution and frame rate? Because at 4k mine was struuuuuggling, I had to get more VRAM.
The reasons to upgrade from this GPU are to run AAA games from the last three years, AI/ML, creative tools (3D, video...), or to save a few watt-hours. If you don't care about any of that, upgrading is just wasteful.
I remember watching an LTT video about Torvalds' setup and they mentioned he said something like "I don't game, this [the 580] is overkill"
And it probably still is overkill.
I'm going to hazard a guess that he's not doing a lot of high resolution, high refresh gaming on it.
"news"
Nice, I don't feel so bad about my RX 560 now
Good for him. My missus had one and it died pretty quickly due to RAM failure.
One of many reasons I'd never buy a laptop with soldered RAM.
I have the same GPU and Blender still doesn't give a fuck.
That's pretty much my experience with Blender: the Blender release cycle seems to be hell-bent on shutting out everybody who doesn't have the latest GPU-du-jour. I.e., if you don't have infinite resources to throw at the latest compute-cum-space-heater device, you're permanently stuck on a late 2.x or 3.x release.
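For anyone who wants to see what Cycles actually detects on an older card before blaming the card: here's a rough sketch you can paste into Blender's Scripting workspace. It just prints the compute backend and device list; the exact preferences calls (get_devices() vs refresh_devices()) have shifted between Blender versions, so adjust for yours. The usual gotcha with the 580 is that 3.0 dropped the OpenCL backend and HIP doesn't cover Polaris cards, so it quietly falls back to CPU rendering.

```python
# Rough sketch: list the render devices Cycles can actually see.
# Run in Blender's Scripting workspace / Python console; the preferences
# API has changed across releases, so this is approximate.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.get_devices()  # populates the device list (refresh_devices() on newer builds)

print("Compute backend:", prefs.compute_device_type)
for dev in prefs.devices:
    print(f"  {dev.name} [{dev.type}] enabled={dev.use}")
```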
You: Please, draw a triangle.
Blender: I don't give a fuck what you want -- I won't draw anything with this GPU!
(Safety ":)" here, before someone starts seriously explaining that the 580 is totally enough for Blender.)
isn’t just boomer nostalgia.
Of course it isn't, Mr. Nasir. Linus is not a boomer.
“boomer” now just means “no-longer-young person”, and we’re a decade or two away from millennials being boomers.
"Boomer" has become a agist term used indiscriminately by younger generations to refer to people they perceive as old. My generation said "Pop" or "Grandpa",
Just like "Hacker" used to be something to be proud of and now means anyone with or without skills up to no good with computers, and just like "Beg the question" has nothing to do with supplication, this is simply the English language shifting in real time right in front of your eyes.
That's pretty RAD-eon.