this post was submitted on 03 Feb 2026
128 points (94.4% liked)

Technology

[–] RedWeasel@lemmy.world 52 points 1 week ago* (last edited 1 week ago) (3 children)

So, around 1947. It took about 14 years to get from the transistor to putting them into chips. So another decade and a half?

Edit: and another 15 to 25 years after that for it to be in consumer households?

[–] photonic_sorcerer@lemmy.dbzer0.com 35 points 1 week ago (1 children)

From the byline:

Quantum tech is at its transistor moment—promising, real, and powerful, but still years of hard work away from changing the world

So pretty much, yeah.

[–] Corkyskog@sh.itjust.works 3 points 1 week ago (1 children)

Well, "years" could be 3 years or 300 years, so that doesn't really confirm OP's guess.

[–] sorghum@sh.itjust.works 12 points 1 week ago

In this case it's probably both until observed.

[–] funkajunk@lemmy.world 11 points 1 week ago (1 children)

Seeing as we now have a multitude of tools available to us that we didn't have in 1947, I imagine it would be faster.

[–] Gsus4@mander.xyz 4 points 1 week ago* (last edited 1 week ago)

And an already existing consumer base with expectations that before existed only among hobbyists... maybe that's a bad thing, because it will constrain QC to fit modern use cases (or worse: MBA-driven hype) rather than evolve in directions that would be better to explore.

[–] kutt@lemmy.world 9 points 1 week ago (2 children)

I don't think it will ever reach consumer households, since it requires extremely complex and expensive materials, tools, and physical conditions. A major breakthrough could change that, but it seems highly unlikely.

Also, we don't really have a use for them, at least not for regular users. They won't replace classical computers.

But you can already access some QCs online; IBM offers a paid remote API, for instance.
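
For anyone curious, here's a minimal sketch of that remote access using Qiskit. This assumes the qiskit and qiskit-ibm-runtime packages and a previously saved IBM Quantum API token, and the exact calls shift between versions:

```python
# Minimal sketch: run a 2-qubit Bell circuit on a real IBM backend.
# Assumes qiskit + qiskit-ibm-runtime are installed and an API token was
# saved earlier with QiskitRuntimeService.save_account().
from qiskit import QuantumCircuit
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2

service = QiskitRuntimeService()  # loads the saved credentials
backend = service.least_busy(operational=True, simulator=False)

qc = QuantumCircuit(2)
qc.h(0)          # superposition on qubit 0
qc.cx(0, 1)      # entangle qubit 1 with qubit 0
qc.measure_all()

# Rewrite the circuit into the backend's native gate set, then submit.
isa = generate_preset_pass_manager(backend=backend, optimization_level=1).run(qc)
result = SamplerV2(mode=backend).run([isa]).result()
print(result[0].data.meas.get_counts())  # mostly '00' and '11', plus noise
```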

[–] baggachipz@sh.itjust.works 8 points 1 week ago (1 children)

requires extremely complex and expensive materials, tools and physical conditions.

Counterpoint: they said the same thing when a computer was made of vacuum tubes and took up an entire room to add two digits.

[–] kutt@lemmy.world 5 points 1 week ago (1 children)

Yeah, but you have to consider one other thing. Before building classical computers, we had already theorized them; we had algorithms, etc. We knew why we were creating them.

For QC, the pace of hardware development is outpacing our ability to create algorithms. It's very similar to what's happening with the current AI bubble: we're investing heavily in a new technology because it looks cool to investors, but we don't even have enough algorithms to run on it. It's just a shit ton of marketing...

[–] baggachipz@sh.itjust.works 1 points 1 week ago (1 children)

Yeah, understood. I was just saying that just because it doesn't seem technically feasible now, don't discount that it could be in the future. Whether it would be useful is another debate; I have a hard time believing it has practical uses. If it does, though, the innovation will be rapid, like the shift to silicon transistors (assuming it's even possible).

[–] kutt@lemmy.world 4 points 1 week ago

Oh, I'm not saying it's technically impossible. It's the opposite, actually: it's developing extremely fast. But usefulness and having QCs in our homes are two different things, to be honest. Why would John Doe have a QC at home if he's not trying to create a new medication or simulate a molecule? Probably for the same reasons he doesn't have an MRI machine in his living room :)

[–] RedWeasel@lemmy.world 1 points 1 week ago (2 children)

I can only see them being used as accelerators of some type right now. They could potentially be used for GPUs, but I suspect some form of general compute first. GenAI, anyone? SkyNet? But that's only if they can be made portable for laptops or phones, which is a major issue still needing to be addressed.

I don't expect them to replace traditional chips in my lifetime, if ever.

Could see them used potentially for GPU

Used as GPUs, or like GPUs? The latter, certainly; the former, not so much. They aren't a replacement for current tech; they accelerate completely different things (and they currently do nothing that your average consumer would be interested in anyway).

[–] kutt@lemmy.world 2 points 1 week ago

Yes, they will probably never replace them, because they're actually slower than classical computers at simple calculations.

Quantum ML is actively being researched, though I'm not well informed about advances in that field specifically.

But the good news is that it doesn't need to be portable; we can use them just as we do right now, with remote access!

[–] Telorand@reddthat.com 33 points 1 week ago (1 children)

Wake me when they make the contemporary analog to the Apple IIe. Otherwise, this just sounds like a bunch of giant corporations peacocking around in an effort to get VC money.

I applaud the scientists, however, who do this kind of stuff for the love of discovery. Good luck to all of them.

[–] certified_expert@lemmy.world 4 points 1 week ago

With the "vision" of current corporations... that won't happen.

[–] user28282912@piefed.social 31 points 1 week ago (4 children)

So the thing with useful quantum computers is that if they ever do make them actually work and manage to scale them up, the first thing they will do is render most modern encryption obsolete overnight. My guess is that Bluffdale has a mountain of encrypted data they'd start cracking immediately.
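
(For context: the threat comes from Shor's algorithm. Finding the period of a^x mod N lets you factor N, and factoring breaks RSA. Here's a toy classical simulation of that idea, workable only for tiny numbers; the quantum speedup is in doing the period-finding for 2048-bit moduli.)

```python
# Toy sketch of Shor's idea (classical simulation; only feasible for tiny N).
# A quantum computer finds the period r exponentially faster, which is the
# whole threat to RSA-sized moduli.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 such that a**r % n == 1 (the step a QC speeds up)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    """Split n into two factors using the period of a**x % n."""
    assert gcd(a, n) == 1
    r = find_period(a, n)
    assert r % 2 == 0, "odd period: retry with a different a"
    y = pow(a, r // 2, n)
    assert y != n - 1, "unlucky a: retry with a different a"
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))  # -> (3, 5)
```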

My cynicism won't allow me to think that we'd hear about it until years after that backlog is cleared and the NSA (and by extension Israel and Russia) have backdoored every network of interest ten times over.

The far more likely scenario is that this, like stable/cold-ish fusion, practical graphene, and CRISPR miracle cures, is still way more theory than drivable car at this point, and will be for the next several years at least. These folks just want more money and have to keep claiming they're close in order to get it.

[–] LastYearsIrritant@sopuli.xyz 19 points 1 week ago

https://signal.org/blog/pqxdh/

Many companies have already transitioned to quantum-resistant encryption.

Sure, some old stuff will be vulnerable, but we've known about the risk for a while and have already started preparing.

[–] FE80@lemmy.world 16 points 1 week ago* (last edited 1 week ago) (1 children)

Post quantum cryptography is already standardized and is being actively rolled out.

https://csrc.nist.gov/projects/post-quantum-cryptography

https://www.openssh.org/pq.html
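
For example, recent OpenSSH releases can negotiate hybrid post-quantum key exchanges. A sketch of opting in explicitly via ssh_config (algorithm availability depends on your OpenSSH version; you can check yours with `ssh -Q kex`):

```
# ~/.ssh/config: prefer hybrid post-quantum key exchanges when the server
# supports them (names as shipped in recent OpenSSH releases).
Host *
    KexAlgorithms mlkem768x25519-sha256,sntrup761x25519-sha512@openssh.com
```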

[–] pinball_wizard@lemmy.zip 7 points 1 week ago

Post quantum cryptography is already standardized and is being actively rolled out.

Yes. Which is good, because it's easy to imagine that every intelligence agency is either over- or under-reporting how much quantum decryption capability it has.

[–] neonix@reddthat.com 8 points 1 week ago

There's an area of research producing "quantum-ready encryption," which uses algorithms that are believed to be secure against quantum attacks. There's been no wholesale migration to it yet, and the protection remains hypothetical until the attacks actually arrive.

At least the NSA is not Google.

[–] bibbasa@piefed.social 9 points 1 week ago (1 children)

science is being slopified

[–] ThrowawayPermanente@sh.itjust.works 7 points 1 week ago (1 children)

"To compare how far each platform has advanced across computing, simulation, networking, and sensing, the researchers used large language AI models such as ChatGPT and Gemini to estimate technology-readiness levels (TRL)."

Mother fucker

[–] reptar@lemmy.world 3 points 1 week ago

Oh that's funny

Actually, maybe it's a perfect fit

[–] melsaskca@lemmy.ca 7 points 1 week ago

The intellectual elite always dangling that carrot (of hope) from a stick (of almost there).

[–] phoenixz@lemmy.ca 7 points 1 week ago (1 children)

Any day now!

Any day now, for the past 30 years or so.

Well? We're waiting!

[–] SuspciousCarrot78@lemmy.world 10 points 1 week ago

It already happened. And didn't happen. At the same time.

[–] assassinatedbyCIA@lemmy.world 5 points 1 week ago

I'll believe it when I see it.

[–] leastaction@lemmy.ca 2 points 1 week ago

To compare how far each platform has advanced across computing, simulation, networking, and sensing, the researchers used large language AI models such as ChatGPT and Gemini to estimate technology-readiness levels (TRL).

They asked ChatGPT? That's the paper?