Software engineers and game designers should be allowed 4 GB of RAM.
Programming
Welcome to the main community in programming.dev! Feel free to post anything relating to programming here!
Cross-posting is strongly encouraged on the instance. If you feel your post or another person's post makes sense in another community, cross-post it there.
Hope you enjoy the instance!
Rules
- Follow the programming.dev instance rules
- Keep content related to programming in some way
- If you're posting long videos, try to add some form of TL;DR for those who don't want to watch them
Wormhole
Follow the wormhole through a path of communities !webdev@programming.dev
*4 kilobytes is all they will ever need.
But how can I get anything done with these meager 128 GB computers?
It's not running out. It's being hoarded for the entropy machine.
Edit: anyone know if entropy machine RAM can be salvaged for human use? Do they use the same sticks?
Yes, but you'll need special hardware. Enterprise systems use registered "RDIMM" modules that won't work in consumer systems. Even if your system supports ECC, that's just UDIMM, a.k.a. consumer grade with error correction.
All that being said, I would bet you could find some cheap Epyc or Xeon chips + an appropriate board if/when the crash comes.
Okay so I'd just need an enterprise board.
Yah. And a CPU to match. Either Epyc or Xeon.
Engineering and quality samples float around sometimes, which makes for more reasonable prices too. They sometimes have minor defects, but I've never had an issue that mattered.
Server memory is probably reusable, though it's likely to be soldered and/or ECC modules. But a soldering iron and someone sufficiently smart can probably handle it (if it isn't directly usable).
So it's salvageable if they don't burn it out running everything at 500°C
500°C would be way above the safe operating temps, but most likely yes.
You think the slop cultists care?
Yes, actually. Data centers are designed to cool down components pretty efficiently. They aren't cooking the RAM at 500°C.
500 might be hyperbole, but they do run the things pretty hard. Maybe not at a real data center, but the slop cultists do.
I could use some extra memory. Just jam it into my head I'm sure it'll work
Just glad I invested in 64 GB when it only cost $200. The same RAM today is nearly $700.
TUI enthusiasts: "I've trained for this day."
P.S. Yes, I know a TUI program can still be bloated.
640k ought to be enough for anybody
It might be time for me to learn GPUI. I wonder if it's any good.
I was impressed with GPUI's description of how they render text, and hope that it either grows into a general purpose GUI toolkit, or inspires one with a similar approach. It has a long way to go, though.
You might find this interesting:
https://github.com/longbridge/gpui-component
In the meantime, Qt is still the only cross-platform desktop toolkit that does a convincing job of native look and feel. If you're not married to Rust, you might have a look at these new Qt bindings for Go and Zig:
https://github.com/mappu/miqt
https://github.com/rcalixte/libqt6zig
Hah, wishful thinking
Everyone better start learning Rust.
Rust programs can definitely still consume a lot of memory. Not using a garbage collector certainly helps with memory usage, but it's not going to change it from gigabytes to kilobytes. That requires completely rethinking how things are done.
That said I'm very much in favour of everyone learning Rust, as it's a great language - but for other reasons than memory usage :)
True, but memory will be freed in a more timely manner and memory leaks probably won’t happen.
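To make the "timely" part concrete, here's a minimal sketch of how deterministic freeing looks in Rust (the function and buffer size are invented purely for illustration):

```rust
// Minimal sketch: the allocation is freed at a known point in the code,
// not whenever a garbage collector gets around to it.
fn process() {
    let big = vec![0u8; 64 * 1024 * 1024]; // ~64 MB, owned by `big`
    println!("using {} bytes", big.len());
} // `big` goes out of scope here and the memory is returned immediately

fn main() {
    process();
    // By this point the 64 MB is already gone: no GC, no waiting for a collection cycle.
    println!("buffer has been dropped");
}
```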
.clone() everything!
I do kind of agree in a way, though. Rust forces you to think a bit about memory, and the language does tend to guide you towards good design. But it's not magic, and it's easy to write inefficient Rust too, especially if you just clone everything. Personally, I find Rust a good mix: low-level control that still feels sufficiently high level.
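For what it's worth, a toy sketch of the "just clone everything" trap (the function names are made up; both do the same work, one just forces a full copy first):

```rust
// Illustrative only: both functions count whitespace-separated words,
// but the first demands ownership, so callers end up cloning to satisfy it.
fn word_count_owned(text: String) -> usize {
    text.split_whitespace().count()
}

fn word_count_borrowed(text: &str) -> usize {
    text.split_whitespace().count()
}

fn main() {
    let article = "a fairly long document ".repeat(100_000);

    // The easy way out: .clone() duplicates the entire buffer just to satisfy the signature.
    let n1 = word_count_owned(article.clone());

    // Same answer, zero extra allocation.
    let n2 = word_count_borrowed(&article);

    assert_eq!(n1, n2);
    println!("{n1} words either way");
}
```

Both compile, both are safe Rust, but only one of them doubles the memory use for no reason.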
Garbage collected languages can be memory efficient too though. Having easily shared references is great!
After you’ve cloned everything you’ll Arc<Mutex<>> everything.
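For anyone who hasn't reached that stage yet, a rough sketch of what the Arc<Mutex<>> phase tends to look like (a made-up shared counter):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Arc gives shared ownership across threads; Mutex gives exclusive access while mutating.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..1_000 {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("total = {}", counter.lock().unwrap()); // 4000
}
```

It works, and it keeps the borrow checker happy, but every shared value now pays for an atomic refcount and a lock.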
I'm not sure what there's less excuse for, the software bloat or the memory running out.