this post was submitted on 05 Aug 2023
501 points (92.1% liked)
Programmer Humor
Makes sense, because a double can represent way bigger numbers than an integer.
Also, double can and does in fact represent integers exactly.
Only up to 2^53, though. A long can represent more integers than that. But a double can and does represent every int value exactly.
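Quick Java sketch of that boundary (class and variable names are just for illustration):

```java
public class DoublePrecision {
    public static void main(String[] args) {
        double limit = 9007199254740992d;               // 2^53

        // 2^53 is where exact integer representation ends:
        System.out.println(limit + 1 == limit);         // true  -- 2^53 + 1 rounds back down to 2^53
        System.out.println(limit - 1 == limit);         // false -- below 2^53 every integer is distinct

        // Every int survives a round trip through double unchanged:
        System.out.println((int) (double) Integer.MAX_VALUE == Integer.MAX_VALUE); // true

        // A long does not, because 64 bits exceed the 53-bit significand:
        long big = 9007199254740993L;                   // 2^53 + 1
        System.out.println((long) (double) big == big); // false -- precision lost
    }
}
```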
*long long, if we're gonna be talking about C types. A long is commonly limited to 32 bits.
C is irrelevant because this post is about Java, and in Java a long is 64 bits.
You should never be using those types in C anyway; the fixed-width types ((u?)int(8/16/32/64)_t) are way more sane.
Doubles can hold numbers way larger than even 64-bit ints.
A double can represent numbers up to ±1.7976931348623157×10^308, or roughly 18 with 307 zeroes behind it. You can't fit that into a long, or even into 128 bits. And even though rounding huge doubles is pointless, since only the first 15-17 significant digits or so are stored, using any kind of integer would lead to inconsistencies, and thus potentially bugs.
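Rough Java sketch of the scale difference (the value 1e300 is just an example):

```java
public class HugeDoubles {
    public static void main(String[] args) {
        System.out.println(Double.MAX_VALUE);     // 1.7976931348623157E308
        System.out.println(Long.MAX_VALUE);       // 9223372036854775807, i.e. ~9.2 * 10^18

        double huge = 1e300;
        // Only ~15-17 significant decimal digits are stored, so adding 1 changes nothing:
        System.out.println(huge + 1 == huge);     // true

        // Narrowing a too-large double to long just saturates at Long.MAX_VALUE:
        System.out.println((long) huge);          // 9223372036854775807
        System.out.println(Math.round(huge));     // 9223372036854775807 as well
    }
}
```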