this post was submitted on 29 Sep 2025
154 points (84.7% liked)

Technology

75625 readers
1914 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

...In Geekbench 6.5 single-core, the X2 Elite Extreme posts a score of 4,080, edging out Apple’s M4 (3,872) and leaving AMD’s Ryzen AI 9 HX 370 (2,881) and Intel’s Core Ultra 9 288V (2,919) far behind...

...The multi-core story is even more dramatic. With a Geekbench 6.5 multi-core score of 23,491, the X2 Elite Extreme nearly doubles the Intel Core Ultra 9 185H (11,386) and comfortably outpaces Apple’s M4 (15,146) and AMD’s Ryzen AI 9 370 (15,443)...
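The gaps implied by the quoted scores can be sanity-checked with quick arithmetic (scores taken from the article; the helper function is my own, not benchmark tooling):

```python
# Geekbench 6.5 scores as quoted in the article
single_core = {"X2 Elite Extreme": 4080, "Apple M4": 3872,
               "Ryzen AI 9 HX 370": 2881, "Core Ultra 9 288V": 2919}
multi_core = {"X2 Elite Extreme": 23491, "Core Ultra 9 185H": 11386,
              "Apple M4": 15146, "Ryzen AI 9 370": 15443}

def lead_over(scores, chip, baseline):
    """Percentage lead of `chip` over `baseline`."""
    return (scores[chip] / scores[baseline] - 1) * 100

# Single-core lead over the M4 is modest (~5%); multi-core is ~55%
print(f"ST lead over M4: {lead_over(single_core, 'X2 Elite Extreme', 'Apple M4'):.1f}%")
print(f"MT lead over M4: {lead_over(multi_core, 'X2 Elite Extreme', 'Apple M4'):.1f}%")
```

Note that "nearly doubles" refers to the Core Ultra 9 185H multi-core figure; against the base M4 the multi-core lead is roughly 55%.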

...This isn’t just a speed play — Qualcomm is betting that its ARM-based design can deliver desktop-class performance at mobile-class power draw, enabling thin, fanless designs or ultra-light laptops with battery life measured in days, not hours.

One of the more intriguing aspects of the Snapdragon X2 Elite Extreme is its memory‑in‑package design, a departure from the off‑package RAM used in other X2 Elite variants. Qualcomm is using a System‑in‑Package (SiP) approach here, integrating the RAM directly alongside the CPU, GPU, and NPU on the same substrate.

This proximity slashes latency and boosts bandwidth — up to 228 GB/s compared to 152 GB/s on the off‑package models — while also enabling a unified memory architecture similar in concept to Apple’s M‑series chips, where CPU and GPU share the same pool for faster, more efficient data access...
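Using the bandwidth figures quoted above, the in-package memory works out to exactly a 50% uplift over the off-package variants:

```python
# Peak memory bandwidth as quoted in the article (GB/s)
in_package = 228   # Snapdragon X2 Elite Extreme (memory-in-package)
off_package = 152  # other X2 Elite variants (off-package RAM)

uplift = (in_package / off_package - 1) * 100
print(f"{uplift:.0f}% more bandwidth")  # 50% more bandwidth
```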

... the company notes the "first half" of 2026 for the new Snapdragon X2 Elite and Snapdragon X2 Elite Extreme...

all 48 comments
[–] malwieder@feddit.org 13 points 12 hours ago

X2 "Elite Extreme" probably in ideal conditions vs. the base M4 chip in a real-world device. Sure, nice single-core results, but Apple will likely counter with the M5 (the A19 Pro already reaches around 4,000, and the M chips can probably clock a bit higher). And the M4 Pro and Max already score as high or higher in multi-core, in the real world, in a 14-inch laptop.

It doesn't "crush" the M4 series at all and we'll see how it'll perform in a comparable power/thermal envelope.

I don't hate what Qualcomm is doing here, but these chips only work properly under Windows and the Windows app ecosystem still hasn't embraced ARM all that much, and from what I've heard Windows' x64 to ARM translation layer is not as good as Rosetta 2. Linux support is pretty horrible, especially at launch.

[–] TheGrandNagus@lemmy.world 12 points 13 hours ago

The X1 Elite never lived up to its geekbench scores, and the drivers are absolute dogshit.

The X2 Elite won't match Apple or AMD in real world scenarios either, I'd wager.

[–] fittedsyllabi@lemmy.world 5 points 13 hours ago

Then Apple releases M5.

[–] JigglySackles@lemmy.world 13 points 17 hours ago

I am simple person. I see geekbench, I ignore claims and rest of article.

[–] verdi@feddit.org 9 points 17 hours ago

*X Elite opens browser windows faster under desktop cooling.

FTFY

[–] Alphane_Moon@lemmy.world 188 points 1 day ago* (last edited 1 day ago) (5 children)

Keep in mind the original X Elite benchmarks were never replicated in real world devices (not even close).

They used a desktop-style device (with intense cooling that isn't possible in laptops) and a "developed solely for benchmarking" version of Linux (to this day the X Elite runs like shit on Linux).

This is almost certainly a premeditated attempt at "legal false advertising".

Mark my words, you'll never see 4,000 points in GB6 ST on any real products.

[–] boonhet@sopuli.xyz 63 points 1 day ago* (last edited 1 day ago) (3 children)

They also used the base M4, not M4 Pro or Max

[–] Reverendender@sh.itjust.works 3 points 13 hours ago

Now this all makes sense

[–] Ugurcan@lemmy.world 5 points 16 hours ago (1 children)
[–] boonhet@sopuli.xyz 3 points 15 hours ago

lol that’s just the cherry on the whole apple pie.

[–] CmdrShepard49@sh.itjust.works 34 points 1 day ago (1 children)

Seems like they're also using two different Intel chips in their testing for some reason.

[–] circuitfarmer@lemmy.sdf.org 23 points 1 day ago

I'll take cherrypicking for $500, Alex

[–] Zak@lemmy.world 21 points 1 day ago (1 children)

I imagine things would be much closer if they put a giant heatsink on that Ryzen 370 they're comparing against and ran it at its 54W configurable TDP instead of the default 28W.

[–] pycorax@sh.itjust.works 6 points 22 hours ago

Shouldn't they also be comparing it to Strix Halo instead?

[–] itztalal@lemmings.world 5 points 21 hours ago

desktop-class performance at mobile-class power draw

This made my bullshit detector go haywire.

[–] tal@olio.cafe 12 points 1 day ago

Ah. Thanks for the context.

Well, after they have product out, third parties will benchmark them, and we'll see how they actually stack up.

[–] SharkAttak@kbin.melroy.org 6 points 1 day ago

I saw someone liquid cool an Arduino to push it to the max, but you couldn't declare it to be a regular benchmark...

[–] Buffalox@lemmy.world 67 points 1 day ago (2 children)

Snapdragon X2 Elite Extreme

That doesn't sound very high end, I think I'll wait for the Pro version, preferably Pro Plus.

[–] zaphod@sopuli.xyz 6 points 15 hours ago

Elite Extreme

Sounds like it focuses more on shiny RGB than performance.

[–] PalmTreeIsBestTree@lemmy.world 17 points 1 day ago (1 children)

It sounds like an advertisement for a condom or dildo

[–] mannycalavera@feddit.uk 7 points 1 day ago

Don't you want to put on some of this thermal paste?

Where this is going, baby, you don't need no thermal paste!

faints on floor

[–] Valmond@lemmy.world 3 points 14 hours ago (1 children)

And here I am with my cheap old quad core doing my stuff.

Except for the theoretical interest, what are we supposed to do with stuff like that? Is it just more data centers? Do I sound like the "640KB is enough" guy?

[–] friend_of_satan@lemmy.world 1 points 12 hours ago (1 children)

As an example, you could replace on-disk caching of resized images in photoprism with on-the-fly resized images, effectively trading large disks for faster CPU while retaining equivalent application performance.
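A minimal sketch of that trade-off: resize per request instead of serving a pre-rendered cache. The toy nearest-neighbour `resize_nearest` below is a hypothetical stand-in, not PhotoPrism's actual code (its real thumbnail pipeline is Go), and a 2D list stands in for real pixel data:

```python
def resize_nearest(pixels, new_w, new_h):
    """Nearest-neighbour downscale of a 2D pixel grid.

    Toy stand-in for a real thumbnail library: the point is that the
    thumbnail is recomputed on request (CPU cost) rather than stored
    on disk (storage cost).
    """
    old_h, old_w = len(pixels), len(pixels[0])
    return [[pixels[y * old_h // new_h][x * old_w // new_w]
             for x in range(new_w)]
            for y in range(new_h)]

# "On-the-fly" path: nothing is cached; each request pays the resize.
image = [[(x + y) % 256 for x in range(8)] for y in range(8)]
thumb = resize_nearest(image, 4, 4)
print(len(thumb), len(thumb[0]))  # 4 4
```

With enough spare CPU, the per-request resize becomes cheap enough that the on-disk thumbnail cache stops paying for itself.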

[–] Valmond@lemmy.world 1 points 11 hours ago (1 children)

Ah, a not at all theoretical example but a real life one 😁 /s

[–] friend_of_satan@lemmy.world 1 points 10 hours ago

It's a real life example for me. I have too many photos for my cache drive to handle so I have to limit which photos I put into photoprism.

[–] flemtone@lemmy.world 4 points 16 hours ago

When the Snapdragon GPU performance is on par with AMD's 780m or above then we can talk.

[–] a_fancy_kiwi@lemmy.world 65 points 1 day ago* (last edited 1 day ago) (1 children)

Let me know when these X elite chips have full Linux compatibility and then I’ll be interested. Until then, I’ll stick with Mac, it has the better hardware.

[–] just_another_person@lemmy.world 49 points 1 day ago* (last edited 1 day ago) (1 children)

I'm going to call semi-bullshit here, or there's a major caveat or catch. If this were true, they'd be STUPID not to be working fast as hell to get full, unlocked Linux support upstreamed and start selling this as a datacenter competitor to what Amazon and Microsoft are offering, because it would be an entirely new class of performance. It could also dig into Nvidia's and AMD's datacenter sales at scale if it's this efficient.

[–] boonhet@sopuli.xyz 22 points 1 day ago* (last edited 1 day ago)

They put desktop cooling on the testbench apparently.

They’re also comparing to only the base M4 chip, not the Pro.

Also, the M5 could still come out this year. But it also might not, so it's still a fair comparison till then.

Anyway if you’re looking for a Windows laptop specifically and don’t need anything that doesn’t run on ARM, it might be pretty damn good. I’d still wait for independent benchmarks.

[–] the_q@lemmy.zip 41 points 1 day ago (1 children)

Yeah I'll wait for independent benchmarks, thanks.

[–] Damage@feddit.it 17 points 1 day ago

With actual devices

[–] artyom@piefed.social 31 points 1 day ago

This will be super cool when we actually have OSs that can run on them!

[–] itztalal@lemmings.world 4 points 21 hours ago

desktop-class performance at mobile-class power draw

checks source

windowcentral.com

Nothing to see here, folks.

[–] YurkshireLad@lemmy.ca 16 points 1 day ago

Windows 11 will turn this into a 486.

[–] MuskyMelon@lemmy.world 2 points 19 hours ago

In my experience, arm64 is nowhere close to x64 under heavy multi-processing/threading loads.

[–] VeloRama@feddit.org 5 points 1 day ago

Can't wait for Linux to support it and Tuxedo creating a laptop with it.

[–] KiwiTB@lemmy.world 10 points 1 day ago

I highly doubt this is accurate. Would be nice, but I doubt it.

[–] commander@lemmy.world 8 points 1 day ago (3 children)

How's the GPU drivers though? Especially to me for Linux. These should be used in PC gaming handhelds but Qualcomm support is mediocre

[–] squaresinger@lemmy.world 2 points 16 hours ago

How's the GPU drivers though? Especially to me for Linux.

Not. The answer is not.

[–] humanspiral@lemmy.ca 2 points 1 day ago (1 children)

Linux on ARM is not mature. On Windows, x86 emulation is typically used. They'll also need to support all of the GPU libraries for gaming.

[–] vaionko@sopuli.xyz 3 points 17 hours ago (1 children)

Desktop Linux on ARM*. The kernel itself has been running on embedded ARM devices for 25 years and on a large portion of phones for 15.

[–] squaresinger@lemmy.world 2 points 16 hours ago (1 children)

The question was about GPU drivers, and GPU drivers for ARM-based SoCs aren't even mature on Android. They are going to suck on Linux.

Compared to the drivers for Mali, Adreno and consorts, Nvidia is a bunch of saints, and we know how much Nvidia drivers suck under Linux.

[–] humanspiral@lemmy.ca 1 points 9 hours ago

Asahi Linux is perhaps the only distro trying to support "desktop ARM". It's not just the GPU: it doesn't even boot on the M3/M4 chips. Qualcomm doesn't have an OS protection racket and so could be more helpful to the project, but phone support (limited/tailored to each chip generation, it seems) doesn't mean all future ARM chips are automagically supported.

If it's anything like their Windows driver support, then it's also awful. Maybe things have improved in the last year or so, but has Qualcomm ever put real effort into making ARM Windows laptops good?

Oh no, each new chip is going to be better at something than another chip and vice versa. Anyway, what did people have for lunch?