this post was submitted on 04 Feb 2026
261 points (95.2% liked)

Technology

[–] wioum@lemmy.world 254 points 2 weeks ago (3 children)

I had to check the date on the article. They've been making GPUs for 3 years now, but I guess this announcement, weird as it is, is a sign that Arc is here to stay, which is good news.

[–] tomalley8342@lemmy.world 110 points 2 weeks ago (7 children)

This article was based on what the CEO said at the Second Annual AI Summit, following the news of their newly hired head of GPU, who says he "will lead GPU engineering with a focus on AI at Intel". The AI pivot is the actual news.

[–] SinningStromgald@lemmy.world 58 points 2 weeks ago (1 children)

Just what every consumer needs. More AI focused chips.

Intel is just trying to cash in on the AI hype to buoy the sinking ship, as far as investors are concerned.

[–] pastermil@sh.itjust.works 5 points 2 weeks ago

Don't worry, it's just a relabeling. The stuff is still the same.

[–] CosmoNova@lemmy.world 28 points 2 weeks ago (1 children)

Oh, so they actually will not focus on GPUs as end-consumer products for you and me. They're just like Nvidia and AMD. This news really just shows how cooked gaming is.

[–] atthecoast@feddit.nl 3 points 2 weeks ago

I don't know, perhaps gaming will get rejected AI chips with a few cores broken. The chip design requirements are slightly different, but not completely foreign.

[–] CIA_chatbot@lemmy.world 14 points 2 weeks ago

It feels like TechCrunch is letting a drunk AI write all its articles now.

[–] AdrianTheFrog@lemmy.world 9 points 2 weeks ago

It's not even a pivot. They've been focusing on AI already. I'm sure they want it to seem like a pivot (and build up hype); the times before, apparently just having the hardware and software wasn't enough. Nobody cared when the Gaudi cards came out, nobody uses SYCL or oneDNN, etc.

[–] ParlimentOfDoom@piefed.zip 8 points 2 weeks ago (1 children)

Weird, they're a bit late boarding this train as it's already starting to derail... MS just stumbled hard as their AI shit isn't paying off and is driving consumers away.

[–] Reygle@lemmy.world 4 points 2 weeks ago

focus on AI

Never mind guys, it's a nothing burger

[–] BarbecueCowboy@lemmy.dbzer0.com 9 points 2 weeks ago (2 children)

The actual chips are farmed out to TSMC; I don't believe they've made any in house, so I'm guessing they've decided they're going to start doing that themselves now? But then, even some of their CPUs are made by TSMC, so I could be on a very wrong path.

[–] ag10n@lemmy.world 15 points 2 weeks ago (3 children)

TSMC is how they stay competitive; that’s what everyone else uses

Intel is still catching up with 18A

The 18A production node itself is designed to prove that Intel can not only create a compelling CPU architecture but also manufacture it internally on a technology node competitive with TSMC's best offerings.

https://www.tomshardware.com/pc-components/cpus/intels-18a-production-starts-before-tsmcs-competing-n2-tech-heres-how-the-two-process-nodes-compare

[–] UnfortunateShort@lemmy.world 1 points 2 weeks ago

They want to make Celestial on 18A, no?

[–] fleem@piefed.zeromedia.vip 3 points 2 weeks ago

thanks for your effort

[–] imetators@lemmy.dbzer0.com 101 points 2 weeks ago (1 children)

Like if ARC has never existed before?

[–] Rooster326@programming.dev 2 points 2 weeks ago (2 children)
[–] treesquid@lemmy.world 7 points 2 weeks ago (1 children)

English clearly isn't their first language, but the intent is pretty obviously "As if they aren't already making ARC GPUs?"

[–] imetators@lemmy.dbzer0.com 6 points 2 weeks ago

Intel ARC is Intel's GPU brand; the cards are half the price of a typical Nvidia card at almost the same performance. They've been unpopular due to shaky drivers, but they have never been canceled. So stating that Intel will finally enter the GPU market is just plain misleading.

[–] Goodeye8@piefed.social 46 points 2 weeks ago (2 children)

Well that article was a waste of space. Intel has already stepped into the GPU market with their ARC cards, so at the very least the article should contain a clarification on what the CEO meant.

And I see people shitting on the arc cards. The cards are not bad. Last time I checked the B580 had performance comparable to the 4060 for half the cost. The hardware is good, it's simply meant for budget builds. And of course the drivers have been an issue, but drivers can be improved and last time I checked Intel is actually getting better with their drivers. It's not perfect but we can't expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.

All this is to say, I don't understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I'd like it to be someone other than Intel, but as long as the price comes down I don't care who brings it down.

And to be clear, if Intel's new strategy is keeping prices as they are, I'm all for "fuck Intel".

[–] Sineljora@sh.itjust.works 18 points 2 weeks ago (1 children)

The USA owns 10% of the company, which might turn off some.

This is a big part of it, imo. They kissed the ring.

The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in low-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.

[–] ZeDoTelhado@lemmy.world 1 points 2 weeks ago

CPU overhead is quite well known and significantly hurts the Arc cards' position in the budget class.

[–] Reygle@lemmy.world 45 points 2 weeks ago

Am I living in an alternate timeline? They've been making GPUs for quite some time, and the B580 was actually pretty good, incredibly good for the price.

[–] Jaysyn@lemmy.world 36 points 2 weeks ago

I guess the Arc a750 in my workstation is imaginary?

[–] ApplyingAutomation@lemmy.world 18 points 2 weeks ago
[–] Diplomjodler3@lemmy.world 11 points 2 weeks ago (1 children)

What the fuck? What kind of idiotic article is that? Did Techcrunch go down the drain too?

[–] LodeMike@lemmy.today 1 points 2 weeks ago* (last edited 2 weeks ago)

The comma should be replaced with " which will be"

[–] REDACTED 10 points 2 weeks ago

Slowpoke news

[–] Itdidnttrickledown@lemmy.world 9 points 2 weeks ago

The problem with Intel: they never just keep going. They announce some new GPU/graphics product, and when it falls short they don't or won't stick with it. They abandon it and use it as a write-off. They have done this multiple times, and I have no reason to believe they will do anything different. The last time was just a few years ago; when sales and performance lagged, they just quit.

[–] angrywaffle@piefed.social 7 points 2 weeks ago (1 children)

Doesn't Nvidia have a $5B stake in Intel? I wonder how that influences their decisions.

[–] tal@lemmy.today 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I don't know if "GPUs" is the right term, but the only area where we're seeing large gains in computational capacity now is in parallel compute, so I'd imagine that if Intel intends to be doing high performance computation stuff moving forward, they probably want to be doing parallel compute too.

[–] badabim@lemmy.world 3 points 2 weeks ago

The term you're looking for is GPGPU (General Purpose computing on GPU)

[–] fruitycoder@sh.itjust.works 2 points 2 weeks ago

Damn. I thought this thread was being hyperbolic, but they really wrote it like Intel will, for the first time in their history, be making GPUs lmao

[–] thedeadwalking4242@lemmy.world 2 points 2 weeks ago (1 children)

Aren't TPUs like dramatically better for any AI workload?

[–] AdrianTheFrog@lemmy.world 5 points 2 weeks ago (1 children)

Intel's Gaudi 3 datacenter GPU from late 2024 advertises about 1800 TOPS in FP8, at 3.1 TOPS/W. Google's mid-2025 TPU v7 advertises 4600 TOPS FP8, at 4.7 TOPS/W. Which is a difference, but not that dramatic of one. The reason the gap is so small is that GPUs are basically TPUs already; almost as much die space is allocated to matrix accelerators as to actual shader units, from what I've heard anecdotally.
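For scale, the advertised figures above can be turned into a rough back-of-envelope comparison. Note the wattage here is just derived from TOPS ÷ TOPS/W, not an official spec:

```python
# Advertised figures from the comment above (FP8 throughput and efficiency).
chips = {
    "Intel Gaudi 3 (late 2024)": {"fp8_tops": 1800, "tops_per_w": 3.1},
    "Google TPU v7 (mid 2025)": {"fp8_tops": 4600, "tops_per_w": 4.7},
}

for name, c in chips.items():
    # Implied board power: throughput divided by efficiency.
    implied_w = c["fp8_tops"] / c["tops_per_w"]
    print(f"{name}: {c['fp8_tops']} TOPS FP8, ~{implied_w:.0f} W implied")

# The gap: roughly 2.6x raw throughput, but only about 1.5x efficiency.
print(f"throughput ratio: {4600 / 1800:.2f}x, efficiency ratio: {4.7 / 3.1:.2f}x")
```

So most of the TPU's headline advantage is raw throughput per chip; per watt, the two are closer than the big numbers suggest.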

[–] thedeadwalking4242@lemmy.world 2 points 2 weeks ago (1 children)

At scale the power efficiency is probably really important though

[–] AdrianTheFrog@lemmy.world 2 points 2 weeks ago

Yes, it works out to a ton of power and money, but on the other hand, 2x the computation could be just a few percent better in results. So it's often a thing of orders of magnitude, because that's what is needed for a sufficiently noticeable difference in use.

Basing things on theoretical TOPS is also not particularly equivalent to performance in actual use; it just gives a very general idea of a perfect workload.

[–] ag10n@lemmy.world 2 points 2 weeks ago

Been looking at their Arc B50/B60 but still too expensive in Canada

[–] Paragone@piefed.social 1 points 2 weeks ago

From what I've read about the "quality" of their drivers, NVidia isn't under any threat whatsoever.

Years before bugs get fixed, etc.

(Linux, not MS-Windows, but it's Linux where the big compute gets done, so that's relevant.)

See https://www.phoronix.com/review/llama-cpp-vulkan-eoy2025/5 for some relevant graphs: Intel isn't a real competitor, and while they may work to change that, that lag behind NVidia is SERIOUSLY bad.

[–] RememberTheApollo_@lemmy.world 1 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

Oh great, some wildly overpriced and underperforming GPUs.

Edit: went looking at Intel’s desktop GPUs and found this gem:

Powerful AI Engines

Unlock new AI experiences with up to 233 TOPS of AI engine performance for content creation, real-time AI chat, editing, and upscaled gaming.

And I checked the performance specs of Intel's top cards (B580/A770) against a basic 3080 card (no OC/Ti, whatever): the Intel cards ranked well below the older 3080, and weren't even in the ballpark against upper-tier 4- and 5-series Nvidia cards. Plus they're missing features like DLSS, etc.

Good enough for non-FPS dependent gaming? Sure. Can’t beat the price, I was wrong about that. Want to play high-FPS demanding twitch gaming? No.

[–] PrivateNoob@sopuli.xyz 10 points 2 weeks ago

At least it's a third contender. Afaik the Arc series had decent enough pricing, although AMD's prices seemed better, but I'm not sure.

[–] notthebees@reddthat.com 7 points 2 weeks ago

They've been quite good on the pricing front?

[–] Zetta@mander.xyz 6 points 2 weeks ago* (last edited 2 weeks ago)

"oh great, competition in a market with no competition. Horrible."

Intel has already been making discrete GPUs for two generations and they are very cheap and aren't the most performant but fantastic for the price.

I'd rather a non-US player like Moore Threads enter the market, but because we capitalist assholes have been handicapping Chinese competition for a long time, they probably aren't going to be able to make cards that are up to our performance standards till the 2030s.
