If I have an object of objects holding times in milliseconds, like this:
_m7q7sj2wp:
  { '1433891300733': 1433891300733,
    '1433891364133': 1433891364133,
    '1433891526736': 1433891526736,
    '1433891542793': 1433891542793,
    '1433891550861': 1433891550861 },
_7gh4d0f6l:
  { '1433891352741': 1433891352741,
    '1433891390836': 1433891390836,
    '1433891461772': 1433891461772,
    '1433891523376': 1433891523376,
    '1433891598007': 1433891598007 },
It was created with this:
// results comes from MongoDB; each element has a guid and a ts (timestamp)
for (var i = 0, j = results.length; i < j; i++) {
  var userguid = results[i].guid;
  var timestamp = results[i].ts;
  if (!guid_t[userguid]) {
    guid_t[userguid] = {}; // first time we've seen this guid
  }
  guid_t[userguid][timestamp] = timestamp; // doubled key/value, see below
}
If there is a better way to create this structure I'm all for it. I don't really like having to double up the key and value, but I don't know how else to do this.
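For what it's worth, one way to avoid the doubled key/value is to collect the timestamps into a plain array per guid instead. A minimal sketch, assuming each element of results has guid and ts fields as in the loop above:

var guid_t = {};
for (var i = 0, j = results.length; i < j; i++) {
  var userguid = results[i].guid;
  if (!guid_t[userguid]) {
    guid_t[userguid] = []; // an array of timestamps per guid
  }
  guid_t[userguid].push(results[i].ts);
}
// guid_t._m7q7sj2wp is now [1433891300733, 1433891364133, ...]

Arrays also make the later math easier, since the timestamps are already in a list.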
How would I go about getting the time between each item in the array?
For example, items 1-2, 2-3, 3-4, 4-5.
I'm trying to get the average time between each item.
The fun part is that there are about 400 of these objects, and each one may have a different number of items.
EDIT: I should mention these are full dates created with +new Date();
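For reference, the straightforward loop version of what I'm after. A minimal sketch, assuming the timestamps for each guid have been collected into an array sorted ascending; averageGap is an illustrative name:

function averageGap(times) {
  // times: array of millisecond timestamps, sorted ascending
  if (times.length < 2) return null; // no gaps to average
  var total = 0;
  for (var i = 1; i < times.length; i++) {
    total += times[i] - times[i - 1]; // gap between consecutive items
  }
  return total / (times.length - 1); // average gap in ms
}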
Conclusion: I've had time to test all three answers out, and each one worked! (Don't you just love coding?!)
Each of the following tests used approximately 7000-8000 arrays. I also had to convert each answer into a function to go through that many, but that addition was trivial and only took a few seconds to write. Then I added an additional for loop to combine all the averages into a single value.
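That combining step was just another loop over the per-array averages. A sketch; averages is an illustrative name for the collected per-array results:

// averages: the per-array averages collected from the runs above
var grandTotal = 0;
for (var k = 0; k < averages.length; k++) {
  grandTotal += averages[k];
}
var overall = grandTotal / averages.length; // single combined average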
Also, please keep in mind this is done with Node.js, with the data coming from MongoDB.
These are not the most rigorous tests, but they should give people an overview of the performance I saw with each answer.
Test 1: Initially, I chose Saumil Soni's answer because it was the first one I tried. The calculations took between 200ms and 1200ms, depending on the size of the arrays and the current server load.
Test 2: Next I tried Jack's answer, and while it performed well, I saw pretty much the same results, sometimes with a slight improvement. Calculations completed in about 175ms to 1100ms.
Test 3: Lastly I tried Keith A's answer. I'm not going to lie, I held off because I didn't understand reduce at all. After about 10 minutes of reading, it turned out to be dead simple: it's just another way to write a for loop, using a different pattern.
This is what I read: Understand Javascript Array Reduce in 1 Minute
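For anyone else who hasn't used it, these two snippets do the same thing (summing an array), which is what finally made reduce click for me:

var arr = [1, 2, 3];

// classic for loop
var total = 0;
for (var i = 0; i < arr.length; i++) {
  total += arr[i];
}

// same thing with reduce: the callback's return value becomes
// the accumulator (acc) on the next iteration; 0 is the starting value
var total2 = arr.reduce(function(acc, current) {
  return acc + current;
}, 0);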
So with reduce in my pocket, I tried his answer with a small addition to handle the case where an average isn't possible because the array has only one value. Performance was slightly better again, coming in at 75ms to 985ms total.
Here is the final code I arrived at with the help of all three answers.
// sum: an array of millisecond timestamps, sorted ascending
function avg(sum) {
  var avg = sum.reduce(function(total, current, index, array) {
    if (index < (array.length - 1)) {
      // accumulate the gap to the next timestamp
      return total + (array[index + 1] - current);
    } else {
      // last element: divide the accumulated gaps by their count
      return total / (array.length - 1);
    }
  }, 0);
  if (isNaN(avg)) {
    // A one-element array gives 0/0 = NaN. Gave it 30 seconds; making it 0
    // lowered the overall average by about 20%. Another option would be to
    // discard it, but I needed it. Change to your liking.
    avg = 30000;
  }
  console.log(avg);
}
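It gets called like this for one of the guid objects, sorting the keys numerically first since object key order isn't guaranteed:

var times = Object.keys(guid_t['_m7q7sj2wp']).map(Number).sort(function(a, b) {
  return a - b;
});
avg(times); // logs the average gap in ms for that guid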
In closing, I've changed my accepted answer to Keith A's because it worked better in my case. But all three answers are totally amazing!!!