In Chrome 87, checking length with the in operator on an array gives the following output:
length in ['test']
false
but if we try with more elements, it outputs the following:
length in ['test', 'test2']
true
Can somebody explain these different outputs?
The length you're using refers not to the 'length' property, but to the value of the global identifier length, which exists on window. Per MDN, window.length:
Returns the number of frames (either <frame> or <iframe> elements) in the window.
So, your code will produce that result if you have exactly one frame or iframe in the window: length is then 1, and the second array has a [1] property while the first array does not.
Live snippet illustrating this:
// NOTE THE EXISTENCE OF ONE IFRAME in the HTML
console.log(length in ['test']);
console.log(length in ['test', 'test2']);
<iframe></iframe>
Similarly:
const length = 3;
console.log(
length in [0, 1, 2, 3, 4, 5] // true; index 3 exists (more than 3 items)
);
console.log(
length in [0, 1] // false; no index 3 (fewer than 3 items)
);
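The underlying rule is that the in operator converts its left-hand operand to a property key (a string), so a numeric length variable checks for the matching array index, never the built-in length property. A minimal sketch (the array contents are illustrative):

```javascript
// `in` coerces the left operand to a property key (a string),
// so a number checks for the matching array index.
const arr = ['a', 'b'];
console.log(1 in arr);   // true: index 1 exists
console.log('1' in arr); // true: the same check, since array indices are string keys
console.log(2 in arr);   // false: arr has no index 2
```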
length in your snippets is a variable, which I assume is defined somewhere else and has some value. Depending on its value, length in [0, 1] can return true or false.
Since you mentioned Chrome, I assume you're running this snippet in the Chrome DevTools console. In that case, length is a global variable (actually a property of window: window.length). Check its value: it's probably 1. This explains why 1 in ['test'] is false (there is no element with index 1), but 1 in ['test1', 'test2'] is true.
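To make that concrete, here is a sketch where a local constant stands in for window.length being 1 (one iframe on the page); the real binding only exists in a browser:

```javascript
// Simulate window.length === 1 (the page contains one iframe).
// In the browser this value would come from the global window object.
const length = 1;
console.log(length in ['test']);           // false: ['test'] only has index 0
console.log(length in ['test1', 'test2']); // true: index 1 exists
```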
If you want to check for the presence of the "length" property, you need to write it as a string:
'length' in []
In this case it will return true for any array, because all arrays have the "length" property.
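For example:

```javascript
console.log('length' in []);         // true: even an empty array has length
console.log('length' in ['a', 'b']); // true
console.log('length' in {});         // false: a plain object has no length property
```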