JavaScript is known to have 2^53 (9007199254740992, or 0x20000000000000) as the largest integer value for its Number type, as discussed here. I still don't understand why Number(0x20000000000000)+1 produces 0x20000000000000, but Number(0x20000000000000)+2 produces 0x20000000000002 (9007199254740994). Can someone please explain?
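Here is a minimal snippet (run in a Node.js or browser console) that reproduces what I'm seeing:

```js
// 0x20000000000000 === 2^53 === 9007199254740992
var max = Number(0x20000000000000);

console.log(max + 1);        // 9007199254740992 – unchanged
console.log(max + 2);        // 9007199254740994 – i.e. 0x20000000000002
console.log(max + 1 === max); // true
```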
Read this article. – Moritz Roessler, Sep 17, 2013 at 8:56
Between 2^52 = 4,503,599,627,370,496 and 2^53 = 9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones. – Moritz Roessler, Sep 17, 2013 at 8:59
Thanks, that makes perfect sense. I'd accept it as an answer if you post it. – noseratio, Sep 17, 2013 at 9:05
You're welcome =) I posted the quote as an answer. – Moritz Roessler, Sep 17, 2013 at 9:18
1 Answer
Quoted from this Wikipedia article
Between 2^52=4,503,599,627,370,496 and 2^53=9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones.
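You can see this spacing directly in the console (a quick sketch; the halfway cases round to the nearest representable double, ties to even, since JavaScript Numbers are IEEE 754 binary64):

```js
var p53 = Math.pow(2, 53); // 9007199254740992

// Below 2^53 every integer is representable:
console.log(p53 - 2 + 1 === p53 - 1); // true

// From 2^53 to 2^54 only even integers are representable,
// so adding 1 rounds back down, while adding 2 lands exactly:
console.log(p53 + 1 === p53); // true
console.log(p53 + 2);         // 9007199254740994
console.log(p53 + 3);         // 9007199254740996 (rounds up to the next even)
```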