
JavaScript is known to have 2^53 (9007199254740992, or 0x20000000000000) as the largest integer value for its Number object, as discussed here. I still don't understand why Number(0x20000000000000)+1 produces 0x20000000000000, while Number(0x20000000000000)+2 produces 0x20000000000002 (9007199254740994). Can someone please explain?
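For reference, a minimal sketch that reproduces the behavior in a JavaScript console (Node or a browser; the variable name max is just illustrative):

    // 2^53 = 9007199254740992 = 0x20000000000000
    var max = Math.pow(2, 53);
    console.log(max + 1 === max);   // true  -- 2^53 + 1 rounds back to 2^53
    console.log(max + 2);           // 9007199254740994 (0x20000000000002)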

  • Read this Article Commented Sep 17, 2013 at 8:56
  • Between 2^52=4,503,599,627,370,496 and 2^53=9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones. Commented Sep 17, 2013 at 8:59
  • Thanks, that makes perfect sense. I'd accept it as an answer if you post it. Commented Sep 17, 2013 at 9:05
  • You're welcome =) I posted the quote as an answer. Commented Sep 17, 2013 at 9:18

1 Answer


Quoted from this Wikipedia article:

Between 2^52=4,503,599,627,370,496 and 2^53=9,007,199,254,740,992 the representable numbers are exactly the integers. For the next range, from 2^53 to 2^54, everything is multiplied by 2, so the representable numbers are the even ones.
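A short sketch of the practical consequence (assuming Node or a browser console; Number.MAX_SAFE_INTEGER is the standard built-in equal to 2^53 - 1):

    // Up to 2^53 every integer is exactly representable: the gap is 1.
    console.log(Number.MAX_SAFE_INTEGER);   // 9007199254740991 = 2^53 - 1
    // From 2^53 to 2^54 the gap is 2, so only even integers are representable.
    console.log(Math.pow(2, 53) + 1);       // 9007199254740992 -- rounds back down
    console.log(Math.pow(2, 53) + 2);       // 9007199254740994 -- exactly representable

In particular, 2^53 + 1 falls exactly halfway between the two nearest representable values, 2^53 and 2^53 + 2, and IEEE 754 round-to-nearest breaks such ties toward the value with an even significand, which here is 2^53. That is why adding 1 appears to do nothing while adding 2 works.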
