this post was submitted on 03 Dec 2024
233 points (97.6% liked)

Technology

If even half of Intel's claims are true, this could be a big shake-up in the midrange market, which has been entirely abandoned by both Nvidia and AMD.

[–] brucethemoose@lemmy.world 47 points 6 months ago* (last edited 6 months ago) (1 children)

If they double up the VRAM with a 24GB card, this would be great for a "self hosted LLM" home server.

3060 and 3090 prices have been rising like crazy because Nvidia is VRAM gouging and AMD inexplicably refuses to compete. Even ancient P40s (basically double-VRAM 1080 Tis with no display outputs) are getting expensive. 16GB on the A770 is kinda meager, but 24GB is the point where you can fit the Qwen 2.5 32B models that are starting to perform like the big corporate API ones.
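The 24GB figure is easy to sanity-check with back-of-envelope math. All numbers below are rough assumptions (roughly 4.5 bits per weight for a Q4-style GGUF quant, plus a flat allowance for KV cache and activations), not measured values:

```python
# Back-of-envelope VRAM estimate for a quantized 32B-parameter model.
# Assumptions: ~4.5 bits/weight for a Q4-style quant, plus a flat
# ~2GB allowance for KV cache and activations.

def model_vram_gb(params_billions: float, bits_per_weight: float,
                  overhead_gb: float = 2.0) -> float:
    """Weights (params * bits / 8) plus a flat overhead allowance, in GB."""
    return params_billions * bits_per_weight / 8 + overhead_gb

print(f"32B @ ~4.5-bit: ~{model_vram_gb(32, 4.5):.0f} GB")  # ~20 GB: fits in 24GB
print(f"32B @ FP16:     ~{model_vram_gb(32, 16):.0f} GB")   # ~66 GB: hopeless on one card
```

So a 4-bit 32B model squeezes into 24GB with a little room to spare, which is exactly why that capacity is the inflection point for this crowd.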

And if they could fit 48GB with new ICs... Well, it would sell like mad.

[–] Psythik@lemmy.world 29 points 6 months ago (4 children)

I always wondered who they were making those mid- and low-end cards with a ridiculous amount of VRAM for... It was you.

All this time I thought they were scam cards to fool people who believe that bigger number always = better.

[–] brucethemoose@lemmy.world 15 points 6 months ago

Also, "ridiculously" is relative lol.

The LLM/workstation crowd would buy a 48GB 4060 without even blinking, if that were possible. These workloads are basically completely VRAM constrained.
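A quick sketch of why these workloads are so VRAM and memory-bandwidth constrained: generating each token streams roughly every weight from VRAM once, so single-stream throughput is capped at bandwidth divided by model size. The bandwidth figures below are approximate public specs, used only for illustration:

```python
# Token generation is roughly memory-bound: each token reads ~all weights
# from VRAM once, so tok/s <= bandwidth / model size.
# Bandwidth numbers are approximate public specs, for illustration only.

def tokens_per_sec_upper_bound(model_gb: float, bandwidth_gb_s: float) -> float:
    """Theoretical ceiling on single-stream generation speed."""
    return bandwidth_gb_s / model_gb

model_gb = 18  # ~4-bit 32B model, weights only
print(f"B580-class (~456 GB/s): ~{tokens_per_sec_upper_bound(model_gb, 456):.0f} tok/s")
print(f"3090-class (~936 GB/s): ~{tokens_per_sec_upper_bound(model_gb, 936):.0f} tok/s")
```

Compute barely enters into it; the card's job is mostly to hold the weights and read them back fast, which is why capacity matters more than shader count here.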

[–] sugar_in_your_tea@sh.itjust.works 12 points 6 months ago (1 children)

Yeah, AMD and Intel should be running high VRAM SKUs for hobbyists. I doubt it'll cost them that much to double the RAM, and they could mark them up a bit.

I'd buy the B580 if it had 24GB RAM; at 12GB, I'll probably give it a pass because my 6650 XT is still fine.

[–] M600@lemmy.world 2 points 6 months ago (2 children)

Don’t you need nvidia cards to run ai stuff?

[–] sugar_in_your_tea@sh.itjust.works 12 points 6 months ago* (last edited 6 months ago)

Nah, ollama works w/ AMD just fine, just need a card w/ enough VRAM.

I'm guessing someone would get Intel to work as well if they had enough VRAM.

[–] brucethemoose@lemmy.world 1 points 6 months ago (1 children)

Like the 3060? And 4060 Ti?

It's ostensibly because they're "too powerful" for their VRAM to be cut in half (which would mean 6GB on the 3060 and 8GB on the 4060 Ti), but yes, more generally speaking these are the sweet spot for VRAM-heavy workstation/compute workloads. Local LLMs are just the most recent one.

Nvidia cuts vram at the high end to protect their server/workstation cards, AMD does it… Just because?

[–] Psythik@lemmy.world 1 points 6 months ago* (last edited 6 months ago) (1 children)

More like back in the day, when you would see vendors slapping 1GB on a card like the Radeon 9500 while the 9800 came with 128MB.

[–] brucethemoose@lemmy.world 3 points 6 months ago* (last edited 6 months ago)

Ah yeah, those were the good old days, when vendors were free to do that before AMD/Nvidia restricted them. It wasn't even that long ago; I remember some double-VRAM AMD 7970s.

And, again, I'd like to point out how insane this restriction is for AMD given their market struggles...