Unused memory is wasted memory
Programmer Humor
Welcome to Programmer Humor!
This is a place where you can post jokes, memes, humor, etc. related to programming!
For sharing awful code there's also Programming Horror.
Rules
- Keep content in English
- No advertisements
- Posts must be related to programming or programmer topics
Cloud providers LOVE you with this one quick trick!
Also goes for mobile. You use more memory and apps get killed.
and with a good enough leak, the amount of unused memory will become negative!
Not freeing your memory at all is a memory management strategy. I think some LaTeX compilers use it as well as surprisingly many Java applications.
.net
Anything I run in C# or similar seems to allocate 512GB of virtual address space and then just populates what it actually uses.
That's the funny thing. I had a (still) very basic program and did not care at all about memory management. When I did some testing, I realized that for some reason, when I printed string 1, I also got characters from string 2.
That sounds like it could be memory corruption. It shouldn't happen, because every string should be terminated by a null byte.
This, non-sarcastically. The operating system is better at cleaning up memory than you are, and it's completely pointless to free all your allocations if you're about to exit the program. For certain workloads, not freeing anything can lead to cleaner, less buggy code.
It's important to know the difference between a "memory leak" and unfreed memory. A leak refers to memory that cannot be freed because you lost track of the address to it. Leaks are only really a problem if the amount of leaked memory is unbounded or huge. Every scenario is different.
Of course, that's not an excuse to be sloppy with memory management. You should only ever fail to free memory intentionally.
Absolutely. I once wrote a server for a factory machine that spawned child processes to work each job item. We intentionally did not free any memory in the child process, because it serves only one request and then exits anyway. It's much more efficient to let the OS clean up everything, and it provides a strong guarantee that nothing can be left behind accidentally, which matters on a system where uptime is money. Any code to manage memory was pointless line noise and extra developer effort.
In fact I think in the linker we specifically replaced free with a function that does nothing.
Upvoted. This is something I learned rather recently. Sometimes it's more performant to leak slowly and restart the process every n units of time than it would be to free everything properly.
Back when I was a kid and was learning C, I used to wonder why people considered pointers hard.
My usage of pointers was like:
void func(int *arg1)
{
    // do sth with arg1
}

int main()
{
    int x;
    func(&x);
    return 0;
}
I didn't know about stuff like malloc, and I never felt the need for it in the logic of the little things I made.
Pointers are not hard. Memory management makes it hard.
You haven't lived until you've produced a memory leak in JavaScript.
Valgrind to the rescue
Congratulations! Now you can get a job at Fortinet.
(Fortinet is a network security vendor...think firewalls, HLBs, etc. They get an ungodly amount of memory leak bugs, or at least far more than you would expect from an enterprise firewall)
Get a job playing Fortnite, got it
(Insert image of Kronk here)
RAII.
Can’t leak what never leaves the stack frame.
Isn’t this for C++?
Classes are just pretentious structs.
And then OP used valgrind
It'll be fun when you get to funny errors because you used freed memory.
When I was learning about linked lists and decided to use them in a project, I "removed" items by making the previous item's next point to this item's next, except I misplaced a call to free before using the fields. It somehow still worked most of the time on debug builds, but on optimized builds it would cause a segmentation fault 100% of the time.