Maybe some Chinese manufacturer will find a way to fill the gap in the market
i think the latest is that china has managed to create a GPU that’s ~7 years behind. i’m not sure whether that means “a GPU equivalent to one from 7 years ago” or “it’ll take them 7 years to catch up, and probably less since there’s a known path to follow”
AFAIK they’ll have to figure out EUV or some other method of lithography at that scale. They’re trying really hard at it, but it’s one heck of a difficult thing to do, which is why ASML is still the only company that can build EUV machines and only a few leading fabs like TSMC have it running in production.
Their current GPU is roughly equal to a 4060, which isn't that bad when you consider how far behind they are in terms of time.
iirc that was the claim, but it did significantly worse in actual tests. I wish 'em luck though, since we can always use more competition in the market.
Here's hoping
Careful what you wish for.
🇨🇳 🚣 🇹🇼
Oh I wasn't wishing for anything, just pointing out the possibility. There are some Chinese companies gearing up to fill the gap in the memory market. GPUs would be much harder, but maybe very profitable.
What could a GPU cost? $5000?

i really hope nvidia collapses when the AI bubble pops. They've done more harm than good for consumers for too long.
It won't collapse. It'll lose a huge chunk of its stock price, but it both has other business to fall back on and its chips will still likely be used in whatever the next tech trend is - probably neural network AI or something.
I am not sure. They have other businesses, but I'm not sure those businesses can sustain the obligations nVidia has committed to in this round. They are juggling money well beyond their pre-AI-boom market cap, so if the bubble pops, it's unclear how big a bag nVidia will be left holding and whether the rest of their business can survive it. I guess they might go bankrupt and eventually come out of it to continue business as usual, after having their financial obligations wiped away.
Also, they have somewhat tarnished their reputation by going all in on datacenter equipment, seemingly abandoning the consumer market to free up capacity for the datacenters. So if AMD ever had an opportunity to cash in, this might be it... Except they also dream of being a big datacenter player, and weaker demand may leave them with leftover capacity.
Never underestimate AMD's ability to miss good opportunities.
never underestimate AMD's ability to shoot itself in the foot when it's not under immediate threat of collapse/bankruptcy.
We're running straight into a future where consumers' only option for computers is a cloud solution like MS 365.
The only future is one where billionaires aren't in it.
Brother, we're up to trillionaires now and they don't seem like they're going anywhere.
Didn't like 1% of them die from accidents recently? That sub accident, that guy whose penis surgery went wrong.
Pushing constantly towards a subscription economy.
That "economy" is already falling apart. Subscriptions are down, services on "the cloud" are becoming less reliable, piracy is way up again, and major nations and companies are moving to alternatives.
Hell, DDR3 is making a comeback. All it takes is one manufacturer to start making 15-year-old tech again and bam, the house of cards falls.
I have at least 80 DIMMs of DDR3. Upgrading an old tower this morning.
If you want to do work with the GPU you're still buying NVIDIA. Particularly 3D animation, video/film editing, and creative tools. Even FOSS tools like GIMP and Krita prefer NVIDIA for GPU accelerated functions.
As someone not looking to spend a ton of money on new hardware any time soon: good. The longer it takes to release faster hardware, the longer current hardware stays viable. Games aren't going to get more fun by slightly improving graphics anyway. The tech we have now is good enough.
People don't just use computers for gaming. If this continues, people will struggle to do any meaningful work on their personal computers, which is definitely not good. And I'm not talking about browsing Facebook but about coding, doing research, editing videos and other useful shit.
But wait! They can pay for remote computing time for a fraction of the cost! Each month. Forever.
I fully expect personal computers to be phased out in favor of a remote-access, subscription model. The AI bubble popping would leave these big data centers with massive computational power available for use, plus it's the easiest way to track literally everything you do on your system.
> easiest way to track literally everything you do on your system.
And ban undesired activities. "We see you're building an app to track ICE agents. That's illegal. Your account has been banned and all your data removed."
"Remain in your cube - The Freedom Force is en route to administer freedom reeducation. Please be sure to provide proof of medical insurance prior to forced compliance."
Remote computing is very expensive. It's just the gated (company-owned) LLMs that are cheap for the end consumer. Training even a 2B-parameter LLM on rented compute will cost thousands of dollars if you try it.
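A quick back-of-envelope sketch of why that's plausible, using the common ~6·N·D training-FLOPs rule of thumb. The token count, GPU utilization, and ~$2/GPU-hour rental rate are all assumptions for illustration, not any provider's actual pricing:

```python
# Rough estimate of the cost of renting GPUs to train a 2B-parameter LLM.
# Assumptions: Chinchilla-style ~20 tokens per parameter, ~40% utilization
# on an A100-class card, and ~$2 per GPU-hour rental.

params = 2e9                       # 2B parameters
tokens = 20 * params               # ~40B training tokens (heuristic)
train_flops = 6 * params * tokens  # ~6*N*D FLOPs for one training run

peak_flops = 312e12                # A100 bf16 peak, FLOP/s
utilization = 0.40                 # fraction of peak realistically achieved
effective_flops = peak_flops * utilization

gpu_hours = train_flops / effective_flops / 3600
cost = gpu_hours * 2.0             # assumed $2 per GPU-hour

print(f"{gpu_hours:,.0f} GPU-hours, roughly ${cost:,.0f}")
# -> ~1,100 GPU-hours, roughly $2,100 — before storage, egress, or failed runs
```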
Scientific modeling and simulations
I know Radeons don't really have the performance crown, but as a lifelong Nvidia GPU and Linux user, the PITA drivers are not a problem when you use an AMD Radeon card.
They’re AI only now.