this post was submitted on 29 Sep 2025
164 points (84.7% liked)

Technology


...In Geekbench 6.5 single-core, the X2 Elite Extreme posts a score of 4,080, edging out Apple’s M4 (3,872) and leaving AMD’s Ryzen AI 9 HX 370 (2,881) and Intel’s Core Ultra 9 288V (2,919) far behind...

...The multi-core story is even more dramatic. With a Geekbench 6.5 multi-core score of 23,491, the X2 Elite Extreme nearly doubles the Intel Core Ultra 9 185H (11,386) and comfortably outpaces Apple’s M4 (15,146) and AMD’s Ryzen AI 9 370 (15,443)...
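The claimed gaps are easy to sanity-check from the quoted numbers. A quick sketch (scores taken from the article; the ratios are my arithmetic):

```python
# Geekbench 6.5 multi-core scores as quoted in the article
scores_mc = {
    "Snapdragon X2 Elite Extreme": 23491,
    "Intel Core Ultra 9 185H": 11386,
    "Apple M4": 15146,
    "AMD Ryzen AI 9 370": 15443,
}

x2 = scores_mc["Snapdragon X2 Elite Extreme"]
for chip, score in scores_mc.items():
    if chip.startswith("Snapdragon"):
        continue
    # e.g. "X2 Elite Extreme vs Intel Core Ultra 9 185H: 2.06x"
    print(f"X2 Elite Extreme vs {chip}: {x2 / score:.2f}x")
```

That works out to roughly 2.06x the Intel 185H and about 1.5x the M4 and Ryzen AI 9 370, consistent with "nearly doubles" and "comfortably outpaces."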

...This isn’t just a speed play — Qualcomm is betting that its ARM-based design can deliver desktop-class performance at mobile-class power draw, enabling thin, fanless designs or ultra-light laptops with battery life measured in days, not hours.

One of the more intriguing aspects of the Snapdragon X2 Elite Extreme is its memory‑in‑package design, a departure from the off‑package RAM used in other X2 Elite variants. Qualcomm is using a System‑in‑Package (SiP) approach here, integrating the RAM directly alongside the CPU, GPU, and NPU on the same substrate.

This proximity slashes latency and boosts bandwidth — up to 228 GB/s compared to 152 GB/s on the off‑package models — while also enabling a unified memory architecture similar in concept to Apple’s M‑series chips, where CPU and GPU share the same pool for faster, more efficient data access...
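The two bandwidth figures quoted above work out to an even uplift, which is straightforward to verify:

```python
# Memory bandwidth figures quoted in the article (GB/s)
on_package_gbps = 228   # memory-in-package (SiP) X2 Elite Extreme
off_package_gbps = 152  # off-package RAM on other X2 Elite variants

uplift = on_package_gbps / off_package_gbps - 1
print(f"Bandwidth uplift: {uplift:.0%}")  # → 50%
```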

... the company notes the "first half" of 2026 for the new Snapdragon X2 Elite and Snapdragon X2 Elite Extreme...

[–] Alphane_Moon@lemmy.world 197 points 5 days ago* (last edited 5 days ago) (5 children)

Keep in mind the original X Elite benchmarks were never replicated in real world devices (not even close).

They used a desktop-style device (with intense cooling that isn't possible in laptops) and a "developed solely for benchmarking" version of Linux (to this day the X Elite runs like shit on Linux).

This is almost certainly a premeditated attempt at "legal false advertising".

Mark my words, you'll never see 4,000 points in GB6 ST on any real products.

[–] boonhet@sopuli.xyz 65 points 5 days ago* (last edited 5 days ago) (3 children)

They also used the base M4, not M4 Pro or Max

[–] CmdrShepard49@sh.itjust.works 36 points 5 days ago (1 children)

Seems like they're also using two different Intel chips in their testing for some reason.

[–] circuitfarmer@lemmy.sdf.org 24 points 5 days ago

I'll take cherrypicking for $500, Alex

[–] Ugurcan@lemmy.world 5 points 4 days ago (1 children)
[–] boonhet@sopuli.xyz 3 points 4 days ago

lol that’s just the cherry on the whole apple pie.

[–] Reverendender@sh.itjust.works 3 points 4 days ago

Now this all makes sense

[–] Zak@lemmy.world 21 points 5 days ago (1 children)

I imagine things would be much closer if they put a giant heatsink on that Ryzen 370 they're comparing against and ran it at its 54W configurable TDP instead of the default 28W.

[–] pycorax@sh.itjust.works 6 points 4 days ago

Shouldn't they also be comparing it to Strix Halo instead?

[–] tal@olio.cafe 12 points 5 days ago

Ah. Thanks for the context.

Well, after they have product out, third parties will benchmark them, and we'll see how they actually stack up.

[–] itztalal@lemmings.world 5 points 4 days ago

desktop-class performance at mobile-class power draw

This made my bullshit detector go haywire.

[–] SharkAttak@kbin.melroy.org 6 points 5 days ago

I saw someone liquid cool an Arduino to push it to the max, but you couldn't declare it to be a regular benchmark...

[–] Buffalox@lemmy.world 71 points 5 days ago (3 children)

Snapdragon X2 Elite Extreme

That doesn't sound very high end, I think I'll wait for the Pro version, preferably Pro Plus.

[–] prettybunnys@sh.itjust.works 2 points 2 days ago

The ultra absorbent one is the one to get

[–] PalmTreeIsBestTree@lemmy.world 17 points 5 days ago (1 children)

It sounds like an advertisement for a condom or dildo

[–] mannycalavera@feddit.uk 7 points 5 days ago

Don't you want to put on some of this thermal paste?

Where this is going, baby, you don't need no thermal paste!

faints on floor

[–] zaphod@sopuli.xyz 6 points 4 days ago

Elite Extreme

Sounds like it focuses more on shiny RGB than performance.

[–] a_fancy_kiwi@lemmy.world 66 points 5 days ago* (last edited 5 days ago) (1 children)

Let me know when these X elite chips have full Linux compatibility and then I’ll be interested. Until then, I’ll stick with Mac, it has the better hardware.

[–] just_another_person@lemmy.world 50 points 5 days ago* (last edited 5 days ago) (1 children)

I'm going to call semi-bullshit here, or there's a major revision or catch coming. If this were true, they'd be STUPID not to be working fast as hell to get full, unlocked Linux support upstreamed and to start selling this as a datacenter competitor to what Amazon, Microsoft, and Google are offering, because it would be an entirely new class of performance. It could also dig into Nvidia's and AMD's datacenter sales at scale if it's this efficient.

[–] boonhet@sopuli.xyz 22 points 5 days ago* (last edited 5 days ago)

They put desktop cooling on the testbench apparently.

They’re also comparing to only the base M4 chip, not the Pro.

Also, the M5 could still come out this year. But it also might not, so it's still a fair comparison till then.

Anyway if you’re looking for a Windows laptop specifically and don’t need anything that doesn’t run on ARM, it might be pretty damn good. I’d still wait for independent benchmarks.

[–] the_q@lemmy.zip 42 points 5 days ago (1 children)

Yeah I'll wait for independent benchmarks, thanks.

[–] Damage@feddit.it 18 points 5 days ago

With actual devices

[–] TheGrandNagus@lemmy.world 16 points 4 days ago

The X1 Elite never lived up to its geekbench scores, and the drivers are absolute dogshit.

The X2 Elite won't match Apple or AMD in real-world scenarios either, I'd wager.

[–] malwieder@feddit.org 15 points 4 days ago

X2 "Elite Extreme" probably in ideal conditions vs. the base M4 chip in a real-world device. Sure, nice single-core results, but Apple will likely counter with the M5 (the A19 Pro already reaches around 4,000, and the M chips can probably clock a bit higher). And the M4 Pro and Max already score as high or higher in multi-core, in the real world, in a 14-inch laptop.

It doesn't "crush" the M4 series at all and we'll see how it'll perform in a comparable power/thermal envelope.

I don't hate what Qualcomm is doing here, but these chips only work properly under Windows and the Windows app ecosystem still hasn't embraced ARM all that much, and from what I've heard Windows' x64 to ARM translation layer is not as good as Rosetta 2. Linux support is pretty horrible, especially at launch.

[–] artyom@piefed.social 31 points 5 days ago

This will be super cool when we actually have OSs that can run on them!

[–] JigglySackles@lemmy.world 15 points 4 days ago

I am simple person. I see geekbench, I ignore claims and rest of article.

[–] YurkshireLad@lemmy.ca 16 points 5 days ago

Windows 11 will turn this into a 486.

[–] verdi@feddit.org 9 points 4 days ago

*X Elite opens browser windows faster under desktop cooling.

FTFY

[–] fittedsyllabi@lemmy.world 5 points 4 days ago

Then Apple releases M5.

[–] KiwiTB@lemmy.world 10 points 5 days ago

I highly doubt this is accurate. It'd be nice, but I doubt it.

[–] commander@lemmy.world 8 points 5 days ago (3 children)

How are the GPU drivers, though? Especially for Linux. These should be used in PC gaming handhelds, but Qualcomm support is mediocre.

[–] squaresinger@lemmy.world 2 points 4 days ago

How are the GPU drivers, though? Especially for Linux.

Not. The answer is not.

[–] humanspiral@lemmy.ca 2 points 5 days ago (1 children)

Linux on ARM is not mature. On Windows, emulation of x86 is typically used. They'll also need to support all of the GPU libraries for gaming.

[–] vaionko@sopuli.xyz 3 points 4 days ago (1 children)

Desktop Linux on ARM*. The kernel itself has been running on embedded ARM devices for 25 years and on a large portion of phones for 15.

[–] squaresinger@lemmy.world 2 points 4 days ago (1 children)

The question was about GPU drivers, and GPU drivers for ARM-based SoCs aren't even mature on Android. They are going to suck on Linux.

Compared to the drivers for Mali, Adreno, and friends, Nvidia is a bunch of saints, and we know how much Nvidia drivers suck under Linux.

[–] humanspiral@lemmy.ca 1 points 4 days ago (1 children)

Asahi Linux is perhaps the only distro that is trying to support "desktop ARM". It's not just the GPU; it doesn't even POST on M3/M4 chips. Qualcomm doesn't have an OS protection racket and so could be more helpful to the project, but phone support (limited/tailored to each chip generation, it seems) doesn't mean all future ARM chips are automagically supported.

[–] squaresinger@lemmy.world 2 points 3 days ago (1 children)

There are quite a few more. For example, Debian, Ubuntu, Manjaro, Arch, Fedora, Alpine, and Kali all have ARM ports (and probably many others do too), and Raspberry Pi OS is purpose-built for the ARM desktop.

Asahi isn't specifically an ARM Linux, but an Apple Silicon Linux.

Apple Silicon is ARM, but it's also its own semi-custom thing that's not directly compatible with other ARM stuff.

That's the main issue with supporting ARM: You don't have one platform like x86/x64.

On x86/x64 there's an abstraction between the machine code language and the microcode that's actually executed in the CPU. There's a microcode translation layer in the CPU that translates one to the other, so x86/x64 chip designers have a lot of freedom when designing their actual CPU. The downside being that the translation layer consumes a little bit of performance.

There's also the UEFI system and a ton of other things that keep the platform stable and standardized, so that you can run essentially the same software on a 15yo Intel CPU and a modern AMD.

ARM is much more diverse. Some run Devicetree, some don't. There are also multiple different ARM architectures, and since they are customizable, there's just so much variety.
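That diversity shows up even at the naming level: different OSes and ports report ARM under several different machine strings, which any cross-arch build or install script has to normalize. A minimal sketch (the alias list is illustrative, not exhaustive):

```python
import platform

# Common machine strings reported for ARM by various OSes/ports;
# an x86 build only ever sees a couple of names (i686, x86_64, AMD64).
ARM_ALIASES = {"aarch64", "arm64", "armv7l", "armv6l"}

def is_arm(machine: str = "") -> bool:
    """Return True if the given (or current) machine string is an ARM variant."""
    m = (machine or platform.machine()).lower()
    return m in ARM_ALIASES or m.startswith("arm")

print(is_arm("aarch64"))  # True  (Linux 64-bit ARM)
print(is_arm("arm64"))    # True  (macOS / some BSDs)
print(is_arm("x86_64"))   # False
```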

[–] humanspiral@lemmy.ca 1 points 3 days ago (1 children)

Thank you for the correction. Do any Linux distributions support Qualcomm's first-gen (i.e. last-gen) X Elite Windows laptops/Chromebooks?

[–] squaresinger@lemmy.world 2 points 3 days ago

I don't have personal experience with that, but according to Google (https://www.linaro.org/blog/linux-on-snapdragon-x-elite) it is at least a thing.

Wouldn't expect it to be great though.

If it's anything like their Windows driver support, then it's also awful. Maybe things have improved in the last year or so, but has Qualcomm ever put real effort into making ARM Windows laptops good?

[–] flemtone@lemmy.world 4 points 4 days ago

When the Snapdragon GPU performance is on par with AMD's 780m or above then we can talk.

[–] VeloRama@feddit.org 5 points 5 days ago

Can't wait for Linux to support it and Tuxedo creating a laptop with it.

[–] itztalal@lemmings.world 4 points 4 days ago

desktop-class performance at mobile-class power draw

checks source

windowcentral.com

Nothing to see here, folks.

[–] Valmond@lemmy.world 3 points 4 days ago (1 children)

And here I am with my cheap old quad core doing my stuff.

Except for the theoretical interest, what are we supposed to do with stuff like that? Is it just more data centers? Do I sound like the "640KB is enough" guy?

[–] MuskyMelon@lemmy.world 2 points 4 days ago

In my experience, arm64 is nowhere close to x64 with heavy multi processing/threading loads.

Oh no, each new chip is going to be better at something than another chip, and vice versa. Anyway, what did people have for lunch?