I have an array of arrays declared as follows:
var rangeValues = [[0.001, 0.01], [0.0000001, 0.000001]];
This array is used to fill the values of a drop down, since I essentially need a tuple for each drop down item's value. The drop down is populated roughly like this (a simplified version of my code; the option text is just illustrative):
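var select = document.getElementById('rangeSelection');
for (var i = 0; i < rangeValues.length; i++) {
    var option = document.createElement('option');
    option.value = rangeValues[i];  // each option's value is set from the tuple
    option.text = rangeValues[i][0] + ' - ' + rangeValues[i][1];  // display text, not important here
    select.appendChild(option);
}
I then proceed to access the value of the selected item with the following: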
rangeTuple = document.getElementById('rangeSelection').value;
console.log(rangeTuple);
selectedMinRange = rangeTuple[0];
selectedMaxRange = rangeTuple[1];
console.log(selectedMinRange + " | " + selectedMaxRange[1]);
And I receive the following output:
0.001,0.01
0 | .
In my understanding (albeit limited with JS :) ), rangeTuple should be an array with two items in it. When rangeTuple is logged, it looks correct. However, when I try to assign the items in this tuple to a pair of global variables, the values are not the ones I expect.
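To be explicit, what I expected is the result I get when I index the original array directly:
var tuple = rangeValues[0];
console.log(tuple[0] + " | " + tuple[1]);  // prints: 0.001 | 0.01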
Any help is appreciated,