this post was submitted on 03 Sep 2025
112 points (92.4% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki


[–] Omega_Jimes@lemmy.ca 16 points 2 days ago (1 children)

I know it's not indicative of the industry as a whole, but the Steam hardware survey has Nvidia at 75%. So while they're still selling strong, as others have indicated, I'm not confident they're getting used for gaming.

[–] WolfLink@sh.itjust.works 1 points 1 day ago

Everyone and their manager wants to play with LLMs, and AMD and Intel still don't have a real alternative to CUDA, so they're much less popular for compute applications.

[–] ABetterTomorrow@sh.itjust.works 6 points 1 day ago (1 children)

GTX 1080 Ti strong here lol

[–] overload@sopuli.xyz 2 points 1 day ago

💪💎💪

AMD needs to fix their software.

I had an AMD GPU last year for a couple weeks, but their software barely works. The overlay didn't scale properly on a 4K screen and cut off half the info, often wouldn't show up at all, and 'ReLive' with instant replay enabled caused a performance hit with stuttering in high-FPS games...

Maybe they have it now, but I also couldn't find a way to enable HDR on older games like Nvidia has.

[–] Bakkoda@sh.itjust.works 18 points 2 days ago (1 children)

TIL there's a lot of people who don't know what a dGPU is in here

[–] FlembleFabber@sh.itjust.works 11 points 1 day ago

Thanks for explaining

[–] 9tr6gyp3@lemmy.world 83 points 3 days ago (19 children)

Who the hell keeps buying nvidia? Stop it.

[–] zqwzzle@lemmy.ca 68 points 3 days ago (3 children)

It’s the fucking AI tech bros

[–] Tinidril@midwest.social 38 points 3 days ago (1 children)

Don't forget the crypto scammers.

[–] 9488fcea02a9@sh.itjust.works 5 points 2 days ago (4 children)

GPU mining hasn't been profitable for many years now.

People keep parroting anti-crypto talking points without actually knowing what's going on.

To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

[–] MystikIncarnate@lemmy.ca 2 points 2 days ago

Microsoft.

Microsoft is buying them for AI.

From what I understand, ChatGPT is running on Azure servers.

[–] brucethemoose@lemmy.world 11 points 3 days ago* (last edited 3 days ago) (2 children)

Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM, with respect to their price.

4x3090 or 3060 homelabs are the standard, heh.

[–] zqwzzle@lemmy.ca 22 points 3 days ago (3 children)

Their data centre division is pulling in $41 billion in revenue vs. $4 billion from the consumer market.

https://finance.yahoo.com/news/nvidia-q2-profit-soars-59-021402431.html

[–] MystikIncarnate@lemmy.ca 1 points 2 days ago (1 children)

Who the fuck buys a consumer GPU for AI?

If you're not doing it in a home lab, you'll need more juice than anything an RTX 3000/4000/5000/whatever-000 series could offer.

[–] brucethemoose@lemmy.world 1 points 2 days ago* (last edited 2 days ago) (1 children)

Who the fuck buys a consumer GPU for AI?

Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

I can (just barely) run GLM-4.5 on a single 3090 desktop.
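To put rough numbers on the offload point, here's a back-of-envelope sketch. The model size and quantization figures are illustrative assumptions, not measurements:

```python
# Rough VRAM estimate for running a quantized MoE model with most
# expert weights offloaded to system RAM, keeping only a fraction
# of the weights on the GPU. All figures are assumptions.

def vram_needed_gb(total_params_b, frac_on_gpu, bytes_per_param=0.55):
    """Approximate VRAM (GB) for the fraction of weights kept on the GPU.

    bytes_per_param of ~0.55 corresponds to a ~4.4-bit quantization.
    """
    return total_params_b * bytes_per_param * frac_on_gpu

# A ~355B-parameter MoE (GLM-4.5-class) at ~4.4-bit quant:
total_b = 355
full = vram_needed_gb(total_b, 1.0)        # everything on the GPU
offloaded = vram_needed_gb(total_b, 0.10)  # ~90% of weights in system RAM

print(f"all on GPU: {full:.0f} GB")
print(f"10% on GPU: {offloaded:.1f} GB")  # under a 3090's 24 GB
```

The point being: with aggressive offload, the GPU only needs to hold a slice of the model, which is why a single 24 GB consumer card can limp through models this large.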

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

.... Yeah, for yourself.

I'm referring to anyone running an LLM for commercial purposes.

Y'know, 80% of Nvidia's business?

[–] brucethemoose@lemmy.world 1 points 1 day ago (1 children)

I've kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

I guess my original point was agreement: the 5000 series is not great for 'AI', not like everyone makes it out to be, to the point where folks who can't drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

The original post is about Nvidia's domination of discrete GPUs, not consumer GPUs.

So I'm not limiting myself to people running an LLM on their personal desktop.

That's what I was trying to get across.

And it's right on point for the original material.

[–] brucethemoose@lemmy.world 1 points 1 day ago (1 children)

I'm not sure the bulk of datacenter cards count as 'discrete GPUs' anymore, and they aren't counted in that survey. They're generally sold socketed into 8P servers with crazy interconnects, hyper specialized to what they do. Nvidia does sell some repurposed gaming silicon as a 'low end' PCIe server card, but these don't get a ton of use compared to the big silicon sales.

[–] MystikIncarnate@lemmy.ca 1 points 1 day ago (1 children)

I wouldn't be surprised in the slightest if they are included in the list. I dunno, I'm not the statistician who crunched the numbers here. I didn't collect the data, and that source material is not available for me to examine.

What I can say is that the article specifies "discrete" GPUs rather than just "GPUs" to exclude iGPUs. Intel and AMD dominate that space, because it's hard to make an iGPU when you don't make CPUs, and the two largest CPU manufacturers ship their own.

The overall landscape of the GPU market is very different than what this data implies.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Well, it’s no mystery:

https://www.jonpeddie.com/news/q225-pc-graphics-add-in-board-shipments-increased-27-0-from-last-quarter/

It’s specifically desktop addin boards:

AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.

It is including workstation cards like the Blackwell Pro. But this is clearly not including server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned updates. They are not AIBs.

I hate to obsess over such a distinction, but it’s important: server sales are not skewing this data, and workstation sales volumes are pretty low. It’s probably an accurate chart for gaming GPUs.

[–] GaMEChld@lemmy.world 2 points 1 day ago

The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the box that says GeForce that has the biggest number within their budget. LTT even did a breakdown at some point that showed how even their most watched reviews have little to no impact on sales numbers. Nvidia has the mind share. In a lot of people's minds GeForce = Graphics. And I say all that as someone who is currently on a Radeon 7900XTX. I'd be sad to see AMD and Intel quit the dGPU space, but I wouldn't be surprised.

[–] tidderuuf@lemmy.world 18 points 3 days ago

The same people buying Intel and Microsoft.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 11 points 3 days ago (5 children)

Nvidia is the only real option for AI work. Before Trump lifted the really restrictive ban on GPU exports to China, they had to smuggle in GPUs from the US, and if you're Joe Schmo the only GPUs you can really buy are gaming ones. That's why the 5090 has been selling so well despite being $2K and not all that much better than the 4090 in gaming.

Also AMD has no high end GPUs, and Intel barely has a mid range GPU.

[–] brucethemoose@lemmy.world 13 points 3 days ago (2 children)

To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.

[–] GaMEChld@lemmy.world 1 points 1 day ago (1 children)

What's the VRAM cartel story? Think I missed that.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high VRAM (especially 48GB+) cards are in high demand.

But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this, which is almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

When the bill of materials is really only like $100-$200 more, at most. Nvidia can get away with this because everyone is clamoring for their top-end cards.


AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one's buying them for CAD. No one's buying them for compute, for sure... And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say "you're allowed to sell as much VRAM on a 7900 or 9000 series as you can make fit." They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

...Yet they don't.

It makes no sense. They're maintaining an anticompetitive VRAM 'cartel' with Nvidia instead of trying to compete.

Intel has more of an excuse here, as they literally don't manufacture a GPU that can take more than 24GB VRAM, but AMD literally has none I can think of.
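The arithmetic behind that complaint, with assumed round-number prices (the per-GB VRAM cost and the markup figure are guesses for illustration):

```python
# Compare the added bill-of-materials cost of doubling a card's VRAM
# against the price gap charged for the workstation variant.
# Both input figures are assumptions, not quoted prices.

gddr6_cost_per_gb = 3.0   # assumed spot price, USD per GB
extra_vram_gb = 24        # e.g. going from a 24 GB card to 48 GB

bom_delta = gddr6_cost_per_gb * extra_vram_gb
workstation_markup = 4000  # assumed gap vs. the equivalent gaming card

print(f"added BOM cost: ~${bom_delta:.0f}")
print(f"charged markup: ~${workstation_markup}")
print(f"multiple:       ~{workstation_markup / bom_delta:.0f}x")
```

Even if the memory cost estimate is off by several times, the markup is still an order of magnitude above the BOM delta, which is the commenter's point.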

[–] Brotha_Jaufrey@lemmy.world 57 points 3 days ago (5 children)

IMO there’s zero reason to buy an nvidia gpu if there’s a similarly performing amd card because the price will just be better.

[–] Garry@lemmy.dbzer0.com 19 points 3 days ago (3 children)

AMD promised an MSRP of $600 for the 9070 XT; it rarely goes below $750. All AMD had to do was stick to their prices and have ample stock. AMD is satisfied with second place.

[–] ArchmageAzor@lemmy.world 1 points 1 day ago

I think Nvidia has better marketing. I never really hear anything about AMD cards, whereas I hear about Nvidia all the time.

[–] thisNotMyName@lemmy.world 23 points 3 days ago

Intel cards are awesome in a homeserver for media transcoding. Super cheap, super capable, power saving compared to other cards with the features. And although Intel has become a shitty company, I'd really like to see more competition on the gpu market
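As a concrete sketch of that transcoding use case: a typical ffmpeg invocation using Intel's Quick Sync (QSV) hardware encoder. Filenames are placeholders, and it assumes an Intel GPU with working drivers:

```shell
# Decode H.264 and re-encode to HEVC entirely on the Intel GPU,
# avoiding the CPU for both steps. input.mp4/out.mp4 are placeholders.
ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mp4 \
       -c:v hevc_qsv -global_quality 24 out.mp4
```

Media servers like Jellyfin wire up the same QSV path under the hood, which is why a cheap Arc card handles many simultaneous transcodes at low power.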

[–] brucethemoose@lemmy.world 17 points 3 days ago* (last edited 3 days ago) (3 children)

I don’t get this.

Well, if this includes laptops, I get that. Just try to find a dGPU laptop with AMD or Arc these days.


…But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why? For gaming specifically, they seem pretty great in their price brackets.

Maybe prebuilts are overshadowing that too?

[–] empireOfLove2@lemmy.dbzer0.com 14 points 3 days ago (3 children)

But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why?

It's always been this way: they want AMD and Intel to compete so Nvidia gets cheaper, not because they will ever buy AMD or Intel. Gamers seem to be the laziest, most easily influenced consumer sector ever.

[–] notthebees@reddthat.com 3 points 2 days ago

People who say buy Intel and amd probably either did or will when they upgrade, which is probably not anytime soon with the way everything seems to be going.

[–] pycorax@sh.itjust.works 7 points 3 days ago (1 children)

We've come to a point where PC gaming is so mainstream that the average PC gamer likely doesn't even know that AMD makes GPUs. They'll just complain about the prices and then pay for Nvidia directly or indirectly via prebuilts.
