this post was submitted on 05 Jan 2026
609 points (99.7% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

[–] RamRabbit@lemmy.world 240 points 5 days ago (5 children)

Yep. Intel sat on their asses for a decade pushing quad cores you had to pay extra to even overclock.

Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel's chips still do.

Intel cached in their lead by not investing in themselves and instead pushing the same tired crap year after year onto consumers.

[–] real_squids@sopuli.xyz 106 points 5 days ago (7 children)

Don't forget the awfully fast socket changes

[–] nokama@lemmy.world 44 points 5 days ago (1 children)

And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th gen CPU was borked and had to be kept underclocked.

[–] Junkers_Klunker@feddit.dk 21 points 5 days ago (1 children)

Even within the same socket family, looking at you LGA1151, you can run into compatibility problems.

[–] captain_aggravated@sh.itjust.works 37 points 5 days ago (4 children)

I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it's AM6. What came after the Intel LGA1151? It wasn't LGA1152.

[–] Junkers_Klunker@feddit.dk 16 points 5 days ago (3 children)

Yea, for the customer it really doesn't matter how many pins a certain socket has, only whether it's compatible or not.

[–] UnspecificGravity@piefed.social 17 points 5 days ago (3 children)

As a person who generally buys either mid-tier stuff or the flagship products from a couple years ago, I found it pretty fucking ridiculous to have to figure out which socket made sense for any given Intel chip. The apparently arbitrary naming convention didn't help.

[–] kieron115@startrek.website 12 points 5 days ago* (last edited 5 days ago)

I just read the other day that at least one motherboard manufacturer is bringing back AM4 since DDR4 is getting cheaper than DDR5, even with the "this isn't even manufactured anymore" price markup. That's only even possible because of how much long-term support AMD gave that socket.

[–] Valmond@lemmy.dbzer0.com 52 points 5 days ago (1 children)

They really segmented that market in the worst possible way: 2 cores or 4 cores only, whether you could use VMs or overclock, and so on. Add windoze eating up an extra 5% every year.

I remember buying the 2600 (maybe X) and it was so fast.

[–] halcyoncmdr@lemmy.world 25 points 5 days ago (5 children)

The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.

Really, it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and because it was just a 2nd-generation i7.

[–] Valmond@lemmy.dbzer0.com 1 points 2 days ago

Yes, that was a beast! I was poor and had to wait, so I got the generation after, the 3770K, and the segmentation was already there: I got overclocking but not the VM stuff...

[–] wccrawford@discuss.online 36 points 5 days ago (1 children)

All of the exploits against Intel processors didn't help either. Not only is it a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.

[–] MotoAsh@piefed.social 19 points 5 days ago (1 children)

Meltdown and Spectre? Those applied to AMD CPUs as well, just to a lesser degree (or rather, AMD had their own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips...

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 17 points 5 days ago (1 children)

Only one of them affected AMD, I forget which. But Intel knew about the vulnerabilities and chose not to fix the hardware ahead of their release.

[–] degen@midwest.social 2 points 3 days ago

cached in their lead

There are so many dimensions to this

[–] phoenixz@lemmy.ca 34 points 4 days ago (2 children)

So the editor asked AI to come up with an image for the title "Gamers desert Intel in droves" and so we get a half-baked pic of a CPU in the desert.

Am I close?

[–] echodot@feddit.uk 3 points 4 days ago (1 children)

Could be worse.

Could have been "gamers dessert Intel in droves"

[–] ShrimpCurler@lemmy.dbzer0.com 2 points 3 days ago

Looks like bad photoshop more than AI

[–] voytrekk@sopuli.xyz 118 points 5 days ago (7 children)

Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.

[–] rustydrd@sh.itjust.works 56 points 5 days ago* (last edited 5 days ago) (2 children)

Intel until they realized that other companies made CPUs, too

[–] Dettweiler42@lemmy.dbzer0.com 31 points 5 days ago (1 children)

They also bring a "dying transistor problem we don't feel like fixing" to the party

[–] burrito@sh.itjust.works 22 points 5 days ago (3 children)

And a constantly changing socket so you have to get a new motherboard every time.

[–] grue@lemmy.world 68 points 5 days ago (10 children)

I've been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.

That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You're welcome.

It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it's the underdog now, LOL.

[–] brucethemoose@lemmy.world 15 points 5 days ago* (last edited 5 days ago) (1 children)

AMD almost always had the better price/performance

Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.

[–] grue@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.

(I did later buy a couple of used Opteron 6272s, but that's different for multiple reasons.)

[–] imetators@lemmy.dbzer0.com 8 points 4 days ago

Intel and their last couple of processor generations were a failure. AMD, on the other hand, has been consistent. Look at all these tiny AMD APUs that can run C2077 on a 35W computer that fits in the palm of your hand. Valve is about to drop a nuclear bomb on nvidia, intel and microslop with the Gabecube.

[–] Lfrith@lemmy.ca 11 points 4 days ago

So happy I chose to go with an AM4 board years ago. I was able to go from a Zen+ CPU to an X3D CPU.

I remember people saying back then that people usually don't upgrade their CPU, so it's not much of a selling point. But people didn't upgrade because they couldn't, due to constant socket changes on the Intel side.

My fps numbers were very happy after the CPU upgrade, and I didn't have to get a new board and new set of ram.

[–] ViperActual@sh.itjust.works 41 points 5 days ago (4 children)

I switched to AMD because of Intel's chip stability issues. No problems since

[–] mesamunefire@piefed.social 44 points 5 days ago* (last edited 5 days ago) (1 children)

I remember it was a huge issue for programs. Developers were just not supporting other chipsets because Intel was faster than the competition and mostly cheaper. Then they got more expensive, did some shitty business to MINIX, and stayed the same speed-wise.

So now we see what actual competition does.

[–] DaddleDew@lemmy.world 46 points 5 days ago* (last edited 5 days ago) (4 children)

I do want them to stay alive and sort themselves out though. Otherwise in a few years it will be AMD who will start outputting overpriced crap and this time there will be no alternative on the market.

They're already not interested in seriously putting competitive pressure on NVidia's historically high GPU prices.

[–] mesamunefire@piefed.social 18 points 5 days ago

I'm personally hoping more 3rd parties start making affordable RISC-V chips. But yeah, I agree, having Intel stick around would be good for people, as you said.

[–] somethingold@lemmy.zip 23 points 5 days ago (2 children)

Just upgraded from an i7-6600k to a 7800X3D. Obviously a big upgrade whether I went AMD or Intel, but I'm loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I've always had a positive feeling towards AMD.

[–] NikkiDimes@lemmy.world 8 points 4 days ago (2 children)

AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II X6, though? Whew, that thing kicked ass.

[–] foodvacuum@lemmy.world 4 points 4 days ago

The Intel Pentium D era sucked compared to the Athlon 64 X2 from what I remember. I had an Athlon 64 3000+ just before the dual-core era. The Athlon 64 era was great.

[–] eli@lemmy.world 30 points 5 days ago

I know we shouldn't have brand loyalty, but after the near decade of quad-core-only CPUs from Intel, I can't help but feel absolute hate towards them as a company.

I had a 3770k until AMD released their Ryzen 1000 series and I immediately jumped over, and within the next generation Intel started releasing 8-core desktop CPUs with zero issues.

I haven't bought anything Intel since my 3770k and I don't think I ever will going forward.

[–] Sineljora@sh.itjust.works 20 points 5 days ago

The United States government owns 10% of Intel now.

[–] BootLoop@sh.itjust.works 32 points 5 days ago* (last edited 5 days ago) (1 children)

Pretty wild to see. Glad to see it though. Hope to see the same thing happen with GPUs against Nvidia as well.


I have to lower my 12th Gen CPU multiplier to stop constant crashing when playing UE games, because everything is overclocked at the factory so they could keep up with AMD performance. Fuck Intel.

[–] salacious_coaster@feddit.online 18 points 5 days ago

You can pry my Intel CPU from my cold, dead hands... because I'm never buying a new computer again. I have enough computers already to last until Armageddon.

[–] commander@lemmy.world 10 points 5 days ago* (last edited 5 days ago) (2 children)

I bought into AM5 in its first year with Zen 4, and I'm pretty confident Zen 7 will still be AM5. There's little chance DDR6 will be priced well by the end of the decade. I'm confident I'll be on AM5 for 10+ years, but way better than the Intel desktop I had for 10 years, because I'll actually have a great upgrade path on my motherboard. AM4 is still relevant, and that's getting to almost 10 years now; it'll still be a great platform for years to come. Really, if you bought early in the life of the first-gen chips on AM4/AM5, you're looking at a 15-year platform. Amazing.
