I'm implementing a sort, and I've run into some unexpected behavior:
var searches = ['beta', 'alpha'];
var i = 0, j = 0, min = 0;
for (i = 0; i < searches.length; i++) {
    min = i;
    // first time through, i = 0
    alert(i);
    for (j = i; j < searches.length; j++);
    {
        // first time through, j = 2. If i = 0, how does j = 2?
        alert(j);
        // ... sort code
    }
}
In fact, j is always 2. Why isn't j being set to i when it enters the inner for loop?
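Stripping out the sort code entirely, this smaller snippet (variable names are mine, not from my real code) reproduces the same behavior:

```javascript
var count = 0;
var k;
for (k = 0; k < 3; k++);
{
    count++;
}
console.log(k);     // 3
console.log(count); // 1 -- the braced block only ran once
```

So whatever is happening, the braced block isn't running once per iteration of the loop above it.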
Here's the jsfiddle: http://jsfiddle.net/w2kK9/3/