Why is this:
console.log("1100" ^ "0001")
=> 1101 // as expected
console.log("1100" ^ "1001")
=> 1957 // ???
Please explain. Thanks.
Those strings are converted to numbers and interpreted as decimal, not binary.
Try:
console.log(parseInt("1100", 2) ^ parseInt("1001", 2))
Of course the answer (binary 0101) is printed in decimal (5).
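To see what actually happened in your original snippet: the strings are coerced to the decimal numbers 1100 and 1001 before the XOR. A quick check (toString(2) just shows the bit patterns involved):
console.log(1100 ^ 1001);               // 1957, same as "1100" ^ "1001"
console.log((1100).toString(2));        // "10001001100"
console.log((1001).toString(2));        // "1111101001"
console.log((1100 ^ 1001).toString(2)); // "11110100101" (= 1957)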
The JavaScript token grammar supports number literals in decimal, octal, and hex, but not binary. Thus, the same values written in hex:
console.log(0xC ^ 0x9) // 5
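Note that newer engines (ES2015 and later) do accept binary literals with the 0b prefix, so this should also work there:
console.log(0b1100 ^ 0b1001) // 5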
The first one worked, by the way, only by coincidence: 1100 (decimal) XOR 1 (decimal) happens to be 1101 (decimal), which reads the same as the binary result you expected.
Note also that 1100 | 1001 = 1101 (OR), while 1100 ^ 1001 = 0101 (XOR).
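A quick check of both points, using the same parseInt approach as above:
console.log(1100 ^ 1);             // 1101 (the decimal coincidence in the first example)
const a = parseInt("1100", 2);     // 12
const b = parseInt("1001", 2);     // 9
console.log((a | b).toString(2));  // "1101" (OR)
console.log((a ^ b).toString(2));  // "101"  (XOR, i.e. 0101 = 5)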