JavaScript represents all numeric literals as 64-bit floating-point numbers, and the storage layout and range are specified by IEEE 754.
On the other hand, I learned that a float has a range of roughly ±5×10^-324 (the smallest subnormal) up to ~1.8×10^308, with precision that is only as good as the format allows. Integers, however, are represented exactly in the range -2^53 to 2^53.
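The range stated above can be checked directly in a JS console using the built-in `Number` constants:

```javascript
// The extremes of the 64-bit float format, exposed by Number built-ins:
console.log(Number.MAX_VALUE);     // 1.7976931348623157e+308, largest finite double
console.log(Number.MIN_VALUE);     // 5e-324, smallest positive (subnormal) double
console.log(Number.MAX_VALUE * 2); // Infinity: overflow past the representable range
```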
IEEE 754 explains the behavior of float numbers, but I am confused about integers in JS. How does that range of exact integers arise from the 64-bit data format?
[Solved]
The integer value is stored in the fraction (significand) field. A double has 52 explicit fraction bits plus an implicit leading 1, giving 53 bits of precision in total.
1 is (1 + 0) × 2^0, 2 is (1 + 0) × 2^1, 3 is (1 + 2^-1) × 2^1, and so on. Any integer between -2^53 and 2^53 can be expressed exactly this way.
Because every normal number carries that implicit leading 1 in front of the 52 fraction bits, the exactly representable integer range is -2^53 to 2^53.
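The 53-bit cutoff can be observed directly: `Number.MAX_SAFE_INTEGER` is 2^53 - 1, and past 2^53 consecutive integers start to collide onto the same double.

```javascript
const limit = 2 ** 53; // 9007199254740992, the first integer past the safe range

// Every integer up to 2^53 - 1 has a unique exact double representation...
console.log(Number.MAX_SAFE_INTEGER === limit - 1); // true

// ...but beyond it, adjacent integers round to the same value:
console.log(limit + 1 === limit);             // true: 2^53 + 1 rounds back to 2^53
console.log(Number.isSafeInteger(limit - 1)); // true
console.log(Number.isSafeInteger(limit));     // false
```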