I am trying to create an array that begins at 0.05 and ends at 2.5. I want the values in between the min and max to grow in increments of 0.0245.
var min = 0.05;
var max = 2.5;
var increments = (max - min) / 100; // 0.0245
var arr = [];
The final output should look like this:
[0.05, 0.0745, 0.099, 0.1235, 0.148 ... 2.5]
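Something like the following loop is what I have in mind (just a sketch; the parseFloat/toFixed rounding is my own assumption, added to keep the values from drifting due to floating-point error):

var min = 0.05;
var max = 2.5;
var increment = (max - min) / 100; // 0.0245
var arr = [];
// Step from min to max in 100 equal increments.
for (var i = 0; i <= 100; i++) {
  // Round each value to 4 decimal places so repeated additions of 0.0245
  // don't accumulate floating-point error (this rounding is my assumption).
  arr.push(parseFloat((min + i * increment).toFixed(4)));
}
console.log(arr); // [0.05, 0.0745, 0.099, 0.1235, 0.148, ..., 2.5]

Computing each value as min + i * increment (instead of repeatedly adding the increment to the previous value) should also keep the last element landing exactly on 2.5.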