According to W3Schools:
Unlike many other programming languages, JavaScript does not define different types of numbers, like integers, short, long, floating-point etc.
JavaScript numbers are always stored as double precision floating point numbers, following the international IEEE 754 standard.
So my question is: when we perform a bitwise operation, how does JavaScript convert a 64-bit IEEE 754 double into a 32-bit integer, and how efficient is that conversion? Intuitively, converting between representations should be costly, so is bit shifting still more efficient than multiplying by 2^n?
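To make the question concrete, here is my understanding of what the conversion does, sketched as a userland function (`toInt32` below is my own re-implementation of the ECMAScript ToInt32 abstract operation, not an engine API):

```javascript
// Conceptually, the engine truncates the double toward zero,
// reduces it modulo 2^32, and reinterprets values >= 2^31 as negative.
function toInt32(x) {
  if (!Number.isFinite(x)) return 0;      // NaN and ±Infinity map to 0
  let n = Math.trunc(x) % 2 ** 32;        // truncate, then wrap modulo 2^32
  if (n < 0) n += 2 ** 32;
  return n >= 2 ** 31 ? n - 2 ** 32 : n;  // interpret as signed 32-bit
}

console.log(toInt32(3.9));          // 3  (fraction dropped)
console.log(toInt32(2 ** 32 + 5));  // 5  (wraps modulo 2^32)
console.log(toInt32(2 ** 31));      // -2147483648 (signed wrap-around)

// The built-in `x | 0` idiom applies the same conversion:
console.log(3.9 | 0);               // 3

// Note that shifting and multiplying only agree inside the 32-bit range:
console.log(3 << 1, 3 * 2);         // 6 6  — same for small integers
console.log((2 ** 30) << 2);        // 0    — the shift wraps at 32 bits
console.log((2 ** 30) * 4);         // 4294967296 — the multiply stays exact
```

So beyond raw speed, the two aren't even equivalent once results leave the 32-bit range, which seems relevant to the comparison.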
Thank you very much!