this post was submitted on 26 Dec 2025
315 points (97.9% liked)

Programming

[–] Jankatarch@lemmy.world 2 points 4 hours ago (1 children)

Software engineers and game designers should be allowed 4 GB of RAM.

[–] Kolanaki@pawb.social 3 points 4 hours ago

*4 kilobytes is all they will ever need.

[–] HugeNerd@lemmy.ca 12 points 14 hours ago

But how can I get anything done with these meager 128 GB computers?

[–] cassandrafatigue@lemmy.dbzer0.com 54 points 1 day ago* (last edited 20 hours ago) (3 children)

It's not running out. It's being hoarded for the entropy machine.

Edit: anyone know if entropy machine ram can be salvaged for human use? If they use the same sticks?

[–] mholiv@lemmy.world 6 points 15 hours ago (2 children)

Yes, but you’ll need special hardware. Enterprise systems use registered “RDIMM” modules that won’t work in consumer systems. Even if your system supports ECC, that’s just UDIMM, i.e. consumer-grade memory with error correction.

All that being said, I would bet you could find some cheap Epyc or Xeon chips plus an appropriate board if/when the crash comes.

[–] cassandrafatigue@lemmy.dbzer0.com 2 points 10 hours ago (1 children)

Okay so I'd just need an enterprise board.

[–] mholiv@lemmy.world 1 points 10 hours ago

Yah. And a CPU to match. Either Epyc or Xeon.

[–] fruitycoder@sh.itjust.works 1 points 9 hours ago

Engineering and qualification samples float around sometimes, which makes them more reasonably priced too. They sometimes have minor defects, but I've never had an issue that mattered.

[–] TehPers@beehaw.org 3 points 18 hours ago (1 children)

Server memory is probably reusable, though it's likely to be soldered and/or ECC modules. But a soldering iron and someone sufficiently skilled can probably handle it (if it isn't directly usable).

[–] cassandrafatigue@lemmy.dbzer0.com 3 points 17 hours ago (1 children)

So it's salvageable if they don't burn it out running everything at 500°C

[–] TehPers@beehaw.org 3 points 17 hours ago (1 children)

500°C would be way above the safe operating temps, but most likely yes.

[–] cassandrafatigue@lemmy.dbzer0.com 2 points 17 hours ago (1 children)

You think the slop cultists care?

[–] TehPers@beehaw.org 2 points 6 hours ago (1 children)

Yes, actually. Data centers are designed to cool down components pretty efficiently. They aren't cooking the RAM at 500°C.

[–] cassandrafatigue@lemmy.dbzer0.com

500 might be hyperbole, but they do burn the things pretty hard. Not at, like, a real data center, but the slop cultists do.

[–] bluesheep@sh.itjust.works 1 points 15 hours ago

I could use some extra memory. Just jam it into my head, I'm sure it'll work.

[–] favoredponcho@lemmy.zip 18 points 1 day ago

Just glad I invested in 64 GB when it only cost $200. The same RAM today is nearly $700.

[–] whelk@retrolemmy.com 21 points 1 day ago

TUI enthusiasts: "I've trained for this day."

P.S. Yes, I know a TUI program can still be bloated.

[–] Carnelian@lemmy.world 43 points 1 day ago

640k ought to be enough for anybody

[–] Xylight@lemdro.id 4 points 21 hours ago (1 children)

It might be time for me to learn GPUI. I wonder if it's any good.

[–] who@feddit.org 1 points 20 hours ago* (last edited 19 hours ago)

I was impressed with GPUI's description of how they render text, and hope that it either grows into a general purpose GUI toolkit, or inspires one with a similar approach. It has a long way to go, though.

You might find this interesting:
https://github.com/longbridge/gpui-component

In the meantime, Qt is still the only cross-platform desktop toolkit that does a convincing job of native look and feel. If you're not married to Rust, you might have a look at these new Qt bindings for Go and Zig:
https://github.com/mappu/miqt
https://github.com/rcalixte/libqt6zig

[–] magic_lobster_party@fedia.io 13 points 1 day ago

Hah, wishful thinking

[–] who@feddit.org 8 points 1 day ago
[–] melfie@lemy.lol 2 points 1 day ago (2 children)

Everyone better start learning Rust.

[–] SorteKanin@feddit.dk 8 points 1 day ago (1 children)

Rust programs can definitely still consume a lot of memory. Not using a garbage collector certainly helps with memory usage, but it's not going to change it from gigabytes to kilobytes. That requires completely rethinking how things are done.

That said I'm very much in favour of everyone learning Rust, as it's a great language - but for other reasons than memory usage :)
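
To make the "rethinking" part concrete, here's a hypothetical sketch: the data layout often matters more than the language. A million short strings stored as a Vec<String> pay a pointer/length/capacity header plus allocator overhead per entry; one flat buffer with an offset table pays that once.

```rust
fn main() {
    // One heap allocation per entry: each String carries a 24-byte header
    // (on 64-bit) plus capacity slack and allocator bookkeeping.
    let per_line: Vec<String> = (0..1_000_000).map(|i| format!("row {i}")).collect();

    // Same text in a single buffer, addressed by an offset table.
    let mut flat = String::new();
    let mut offsets = Vec::with_capacity(1_000_001);
    for i in 0..1_000_000 {
        offsets.push(flat.len());
        flat.push_str(&format!("row {i}"));
    }
    offsets.push(flat.len());

    // Reading entry 42 back out of the flat buffer:
    let entry = &flat[offsets[42]..offsets[43]];
    assert_eq!(entry, "row 42");

    drop(per_line);
}
```

Both versions hold the same text; only the representation changed, and with it the per-entry overhead.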

[–] melfie@lemy.lol 3 points 1 day ago

True, but memory will be freed in a more timely manner and memory leaks probably won’t happen.
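
A minimal sketch of that deterministic freeing (RAII via Drop; Buffer is a made-up type):

```rust
// Drop runs the moment the value goes out of scope, not at a GC's leisure.
struct Buffer(Vec<u8>);

impl Drop for Buffer {
    fn drop(&mut self) {
        println!("freeing {} bytes now, not at some future GC pause", self.0.len());
    }
}

fn main() {
    {
        let b = Buffer(vec![0u8; 1024 * 1024]); // 1 MiB
        println!("using {} bytes", b.0.len());
    } // `b` leaves scope here: Drop runs immediately, memory goes back.
    println!("buffer is already freed");
}
```

Leaks are still possible (Rc cycles, Box::leak, mem::forget), hence the "probably".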

[–] TheAgeOfSuperboredom@lemmy.ca 6 points 1 day ago (1 children)

.clone() everything!

I do kind of agree in a way though. Rust forces you to think a bit about memory, and the language does tend to guide you towards good design. But it's not magic, and it's easy to write inefficient Rust too, especially if you just clone everything. Personally, though, I find Rust a good mix: low-level control that still feels sufficiently high level.

Garbage collected languages can be memory efficient too though. Having easily shared references is great!
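
For anyone curious, a tiny hypothetical sketch (function names made up) of what cloning everything costs versus borrowing:

```rust
fn count_words_owned(text: String) -> usize {
    // Taking ownership means callers who still need `text` must clone it first.
    text.split_whitespace().count()
}

fn count_words_borrowed(text: &str) -> usize {
    // Borrowing reads the same bytes with no copy at all.
    text.split_whitespace().count()
}

fn main() {
    let doc = "it is not running out it is being hoarded".to_string();

    let a = count_words_owned(doc.clone()); // duplicates the whole heap buffer
    let b = count_words_borrowed(&doc);     // touches only the original allocation
    assert_eq!(a, b);
}
```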

[–] magic_lobster_party@fedia.io 2 points 15 hours ago

After you’ve cloned everything you’ll Arc<Mutex<>> everything.
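
And for anyone who hasn't hit that stage yet, a minimal sketch of the Arc<Mutex<T>> pattern (hypothetical counter example):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Arc shares one allocation across threads; Mutex serializes access to it.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter); // clones the pointer, not the data
            thread::spawn(move || {
                for _ in 0..1_000 {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    assert_eq!(*counter.lock().unwrap(), 4_000);
}
```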

I'm not sure which there's less excuse for: the software bloat or the memory running out.