I open the browser console (Chrome, for example).
I write this:
var y = "11000011010101011";
"11000011010101011"
parseInt(y)
11000011010101012
I expected 11000011010101011, but it returned 11000011010101012.
Does anybody know why?
Every number in JavaScript is represented as a double-precision floating-point value. JavaScript can accurately represent integers only up to 9007199254740991 (2^53 - 1). Once you get over that limit, you will lose precision.
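You can see the limit directly in the console; Number.MAX_SAFE_INTEGER is a standard property (ES2015+), so this small sketch should work in current Chrome:

Number.MAX_SAFE_INTEGER                        // 9007199254740991
11000011010101011 > Number.MAX_SAFE_INTEGER   // true
11000011010101011 === 11000011010101012       // true, both literals round to the same double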
According to this page, all numbers in JavaScript are 64-bit floating-point numbers, and integers are represented by the 53-bit mantissa. Because of that, you can't store an integer larger than 2^53 - 1 or smaller than -(2^53 - 1) without losing precision (JavaScript rounds your number in order to be able to store it).
Your number is larger than 2^53 - 1. Even though a String can hold it exactly, storing it in a Number variable requires rounding, which loses precision and gives you a slightly different number.
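If you need the exact value, keep it as a string, or, assuming a BigInt-capable environment (Chrome has supported it since version 67), parse it as a BigInt instead of a Number. A rough sketch:

var y = "11000011010101011";
Number(y)             // 11000011010101012 (rounded, same as parseInt(y))
BigInt(y)             // 11000011010101011n (exact, arbitrary-precision integer)
BigInt(y).toString()  // "11000011010101011"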
9007199254740991 is the largest safe integer in JavaScript (Number.MAX_SAFE_INTEGER).
Above that limit you get cases like this:
9007199254740992 + 1 // 9007199254740992
9007199254740993 + 1 // 9007199254740992
9007199254740994 + 1 // 9007199254740996
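If you want to detect this before trusting a parsed value, Number.isSafeInteger (ES2015+) tells you whether a number lies in the exactly-representable range; a quick sketch:

Number.isSafeInteger(9007199254740991)               // true
Number.isSafeInteger(9007199254740992)               // false
Number.isSafeInteger(parseInt("11000011010101011"))  // false, the parsed value may have been rounded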
Please see these links for more details:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number
http://www.2ality.com/2012/04/number-encoding.html
Also see this one:
how addition of Number works on max limit numbers?
This is a duplicate question; see the original.