this post was submitted on 27 Jan 2026
17 points (74.3% liked)

No Stupid Questions


I recently learned that Britain is spending £36 million to upgrade a supercomputer:

https://www.bbc.com/news/articles/c79rjg3yqn3o

Can't you buy a very powerful gaming computer for only $6000?

CPU: AMD R9 9950X3D

Graphics: Nvidia RTX 5080 16GB

RAM: 64GB DDR5 6000MHZ RGB

https://skytechgaming.com/product/legacy-4-amd-r9-9950x3d-nvidia-rtx-5090-32gb-64gb-ram-3

This is how this CPU is described by hardware reviewers:

AMD has reinforced its dominance in the CPU market with the 9950X3D, and it appears that no competitor will be able to challenge that position in the near future.

https://www.techpowerup.com/review/amd-ryzen-9-9950x3d/29.html

If you want to add some brutal CPU horsepower to your PC, then this 16-core behemoth will certainly get the job done. It is an excellent processor on all fronts, and it has been a while since we have been able to say that in a processor review.

https://www.guru3d.com/review/ryzen-9-9950x3d-review-a-new-level-of-zen-for-gaming-pcs/page-29/

This is the best high-end CPU on the market.

Why would you spend millions on a supercomputer? Have you guys ever used a supercomputer? What for?

all 23 comments
[–] litchralee@sh.itjust.works 16 points 1 day ago (1 children)

An indisputable use-case for supercomputers is the computation of next-day and next-week weather models. By definition, a next-day weather prediction is utterly useless if it takes longer than a day to compute, and it is progressively more useful if it can be computed even an hour faster, since that's more time to warn motorists to stay off the road, more time to plan evacuation routes, more time for farmers to adjust crop management, more time for everything. NOAA in the USA draws in sensor data from all of North America, and since weather is locally affecting but globally influenced, this still isn't enough for a perfect weather model. Even today, there is more data that could be consumed by the models, but isn't, because it would make the predictions take longer. The only solution is to raise the bar yet again by expanding the supercomputers used.

Supercomputers are not super because they're bigger. They are super because they can do gargantuan tasks within the required deadlines.

[–] Randomgal@lemmy.ca 5 points 1 day ago

Also space models and quantum models.

[–] truthfultemporarily@feddit.org 21 points 1 day ago (1 children)

If your gaming computer can do x computations every month, and you need to run a simulation that requires 1000x computations, you can wait 1000 months, or have 1000 computers work on it in parallel and have it done in one month.
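That back-of-the-envelope arithmetic can be sketched in a few lines (Python here is just for illustration; the 1000x figures are the commenter's):

```python
# Toy model of ideal (linear) parallel scaling: wall time shrinks
# in proportion to the number of machines working on the job.

def wall_time_months(total_work_units: float, machines: int) -> float:
    return total_work_units / machines

print(wall_time_months(1000, 1))     # 1000.0 months on one gaming PC
print(wall_time_months(1000, 1000))  # 1.0 month on a 1000-node cluster
```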

[–] Tanoh@lemmy.world 8 points 1 day ago

Keep in mind that not all workloads scale perfectly. You might have to add 1,100 computers due to overhead and other scaling issues. It is still pretty good, though, and most of those clusters work on highly parallelised tasks, as they are very well suited to them.

There are other workloads that do not scale at all. As the old joke in programming goes: "A project manager is someone who thinks that 9 women can have a child in one month."
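The overhead point has a classic formalisation in Amdahl's law: the serial portion of a job caps the achievable speedup no matter how many machines you add. A minimal sketch (the 99%-parallel fraction is a made-up example):

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's law: speedup is limited by the serial part of the job."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# Even a 99%-parallel job tops out at 100x, regardless of cluster size.
print(round(amdahl_speedup(0.99, 1000), 1))   # ~91x with 1000 workers
print(round(amdahl_speedup(0.99, 10**9), 1))  # ~100x asymptotically
```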

[–] adespoton@lemmy.ca 10 points 1 day ago

Here’s a vehicle analogy:

If I can get a hypercar for $3 million, why does a freight train cost $32 million? It’s not like it can go faster, and it’s more limited in what it can do.

[–] notsosure@sh.itjust.works 18 points 1 day ago* (last edited 1 day ago) (1 children)

In molecular biology, they are used, for instance, to calculate/predict protein folding. This in turn is used to create new drugs.

[–] blackbelt352@lemmy.world 11 points 1 day ago

A supercomputer isn't just a single computer; it's a lot of them networked together to greatly expand the calculation scaling. Imagine a huge data center with thousands of racks of hardware (CPUs, GPUs and RAM) all dedicated to managing network traffic for major websites. A supercomputer is very similar to that, but instead of being built to handle all the ins and outs and complexities of managing network traffic, it's purely dedicated to doing as many calculations as possible for a specific task, such as the protein folding someone else mentioned, or something like Pixar's render farm, which is hundreds of GPUs networked together dedicated solely to rendering frames.

Given how big and complex the 3D scenes in any given Pixar film are, one single GPU might take 10 hours to calculate the light bounces needed to render a single frame. Assuming a 90-minute run time, that's ~130,000 frames, which is potentially 1,300,000 hours (or about 150 years) to complete just one full movie render on a single GPU. If you have 2 GPUs working on rendering frames, you've now cut that time down to 650,000 hours. Throw 100 GPUs at the render and we've cut the time to 13,000 hours, or about a year and a half. Pixar is pretty quiet about their numbers, but according to the Science Behind Pixar traveling exhibit, at the time of Monsters University in 2013 their render farm had about 2,000 machines with 24,000 processing cores, and it took 2 years' worth of rendering time to render that movie. I can only imagine how much bigger their render farm has gotten since then.

Source: https://sciencebehindpixar.org/pipeline/rendering
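The frame arithmetic above can be checked in a few lines (24 fps is the standard film frame rate, an assumption not stated in the comment; the 10 hours/frame and 90-minute run time are the commenter's figures):

```python
# Back-of-the-envelope render-farm math from the comment above.
FPS = 24                 # standard film frame rate (assumed)
RUNTIME_MIN = 90         # assumed movie run time, minutes
HOURS_PER_FRAME = 10     # assumed single-GPU render time per frame

frames = FPS * 60 * RUNTIME_MIN              # 129,600 frames, i.e. ~130,000
total_gpu_hours = frames * HOURS_PER_FRAME   # 1,296,000 GPU-hours

for gpus in (1, 2, 100):
    hours = total_gpu_hours / gpus
    print(f"{gpus:>3} GPU(s): {hours:>12,.0f} hours (~{hours / 8766:.1f} years)")
```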

You're not building a super computer to be able to play Crysis, you're building a super computer to do lots and lots and lots of math that might take centuries of calculation to do on a single 16 core machine.

[–] bingrazer@lemmy.world 12 points 1 day ago* (last edited 1 day ago)

I'm a PhD student and several of my classmates use computing clusters in their work. These types of computers typically have a lot of CPUs, GPUs, or both. The types of simulations they do are essentially putting a bunch of atoms or molecules in a box and seeing what happens in order to get information which is impossible to obtain experimentally. Simulating beyond a few nanoseconds in a reasonable amount of time is extremely difficult and requires a lot of compute time. However, there are plenty of other uses.
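To give a feel for why even "a few nanoseconds" is expensive: molecular dynamics advances simulated time in femtosecond-scale steps, so the step counts explode (the 2 fs timestep below is a typical textbook value, not something from the comment):

```python
# Molecular-dynamics back-of-the-envelope: simulated time advances in
# femtosecond timesteps, so nanoseconds mean tens of millions of steps.
timestep_fs = 2.0   # typical MD timestep (assumed)
target_ns = 100     # hypothetical simulation length

steps = target_ns * 1e6 / timestep_fs   # 1 ns = 1,000,000 fs
print(f"{target_ns} ns at {timestep_fs} fs/step = {steps:,.0f} timesteps")
```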

The clusters we have contain dozens of these CPUs or GPUs, and users submit jobs to them which run simultaneously. AMD CPUs have better performance than Intel's, and Nvidia GPUs have CUDA, which is incorporated into a lot of the software people use for this.

I've personally never used anything more than a desktop, though I might apply for some time soon because I've got some datasets where certain fits take up to two days each. I don't want to sit around for a month waiting for these to finish.

[–] LordMayor@piefed.social 6 points 1 day ago

Galaxy collisions, protein folding, mechanical design and much more. Big simulations of real world physics and chemistry that require massively parallel computation, problems that require insane numbers of calculations running on multiple machines that can pass data to each other very quickly.

Every supercomputing center has a website where you can read about the research being done on their machines.

No, these can’t be done at similar scale on your desktop. Your PC can’t do that many calculations in a reasonable time. Even on supercomputers, they can take weeks.

[–] slazer2au@lemmy.world 9 points 1 day ago

Run Linpack and see how many flops you get.
https://github.com/icl-utk-edu/hpl/

Then compare it to the Top 500 list.
https://www.top500.org/lists/top500/

I bet you are at least 3 orders of magnitude away from the bottom.
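For a rough sense of that gap, with made-up but plausible numbers (a strong desktop might score a couple of TFLOP/s in Linpack, while even the bottom of the Top500 list is measured in PFLOP/s):

```python
import math

desktop_tflops = 2.0          # hypothetical desktop Linpack result, TFLOP/s
bottom_top500_pflops = 2.0    # hypothetical bottom-of-list Rmax, PFLOP/s

ratio = (bottom_top500_pflops * 1e15) / (desktop_tflops * 1e12)
print(f"gap: {ratio:,.0f}x, i.e. ~{math.log10(ratio):.0f} orders of magnitude")
```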

[–] remon@ani.social 10 points 1 day ago* (last edited 1 day ago) (1 children)

They are used to solve problems that would take years on a normal gaming PC.

[–] Rekorse@sh.itjust.works 1 points 1 day ago

That's not the best on the market. I'm not sure who sells what else, but the Threadripper series is far more powerful, and more expensive.

What are supercomputers used for?

They're used to build a giant bot network that makes brand-new accounts to ask questions in various forums.

[–] Ziggurat@jlai.lu 4 points 1 day ago* (last edited 1 day ago)

A huge factor is how much data you can process at a given time. Often, in the end, it's not that complicated per sample of data. But when you need to run on terabytes of data (say, wide-angle telescopes or CERN-style experiments), you need huge computers to simulate your system accurately (how does the glue layer size impact the data?) and to process the mountain of data coming from it.

Nowadays, practically speaking, it's just a building full of standard computers and software that dispatches the load between the machines (which isn't trivial, especially when you do massively parallel processing with shared memory).

[–] Zwuzelmaus@feddit.org 3 points 1 day ago

As you might know or not, computers can only count from 0 to 1.

But super-computers can do that in the best possible way.

/s

[–] SuiXi3D@fedia.io 2 points 1 day ago

I’ll toss in my two cents.

It’s mainly about handling and processing vast amounts of data. Many times more than you or I may deal with on a day to day basis. First, you have to have somewhere to put it all. Then, you’ve got to load whatever you’re working with into memory. So you need terabytes of RAM. When you’re dealing with that much data, you need beefy CPUs with crazy fast connections with a ton of bandwidth to move it all around at any kind of reasonable pace.
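A quick back-of-the-envelope for the bandwidth point (both numbers below are invented for illustration):

```python
dataset_tb = 10          # hypothetical in-memory working set, terabytes
bandwidth_gb_s = 64      # hypothetical per-node memory bandwidth, GB/s

# Just streaming the data through one node once, ignoring all computation:
seconds_one_pass = dataset_tb * 1000 / bandwidth_gb_s
print(f"one pass over {dataset_tb} TB at {bandwidth_gb_s} GB/s takes "
      f"{seconds_one_pass:.0f} seconds per node")
```

Even a single read-through of a large dataset is minutes of pure memory traffic on one machine, which is why the data gets spread across many nodes with fast interconnects.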

Imagine opening a million chrome tabs, having all of them actively playing a video, and needing to make sense of the cacophony of sound. Only instead of sound, it’s all text, and you have to read all of it all at once to do anything meaningful with it.

If you make a change to any of that data, how does it affect the output? What about a million changes? All that’s gotta be processed by those beefy CPUs or GPUs.

Part of the reason AI data centers need so much memory is that they've got to load increasingly large amounts of training data all at once, and then somehow have it be accessible by thousands of people at once.

But if you want to understand every permutation of whatever data you’re working with, it’s gonna take a ton of time to essentially sift through it all.

And all of that takes hardware. You have to make doubly sure that the results you get are accurate, so redundancies are built in. Extremely precise engineering of parts, how they're assembled, and how they're ultimately used is a lot of what makes supercomputers what they are. Special CPUs, RAM with error correction, redundant connections, backups… it all takes a lot of time, space, and money to operate.

[–] bufalo1973@piefed.social 2 points 1 day ago

Think of a gaming PC from the 90s. And now imagine asking the same then.

Furry AI titties.

[–] littleomid@feddit.org -1 points 1 day ago

Obvious troll is obvious.