I am using an API to compute the average time spent on a webpage: I sum all of the returned values and divide by the number of days requested from the API.
let avgtime = 0;
for (const i in data) {
  if (data[i].avg_time_on_site) {
    avgtime += data[i].avg_time_on_site;
  }
}
$("#avgtime").html(avgtime / datePeriod);
Here datePeriod is the number of days requested from the API. For example, if datePeriod is set to 30, the API returns the last 30 days of data. The line avgtime += data[i].avg_time_on_site; sums every avg_time_on_site value in the JSON, and $("#avgtime").html(avgtime / datePeriod); divides that sum by datePeriod to get the average over the selected period.
My issue, however, is that this only works if the webpage has a value for every day. If the webpage is not visited on a given day, as in the example JSON below, avgtime will be wrong, because the sum is divided by the total number of days rather than by the number of days that actually have rows.
{
  "2017-12-30": [],
  "2017-12-31": {
    "nb_uniq_visitors": 1,
    "nb_visits": 1,
    "avg_time_on_site": 41
  },
  "2018-01-01": {
    "nb_uniq_visitors": 1,
    "nb_visits": 2,
    "avg_time_on_site": 52
  },
  "2018-01-02": [],
  "2018-01-03": {
    "nb_uniq_visitors": 1,
    "nb_visits": 3,
    "avg_time_on_site": 83
  },
  "2018-01-04": []
}
I would like to replace the datePeriod divisor in $("#avgtime").html(avgtime / datePeriod); with a count of the rows in the JSON that actually have values, excluding any days that do not.
I have tried researching a solution first but am at a loss. Thank you for your help.
Here is my full script:
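A minimal sketch of the counting approach, assuming the response shape shown above (empty days come back as empty arrays, so avg_time_on_site is undefined for them; the helper name averageTimeOnSite is mine):

```javascript
// Sum avg_time_on_site and count only the days that actually have data,
// then divide by that count instead of by datePeriod.
function averageTimeOnSite(data) {
  let total = 0;
  let daysWithData = 0;
  for (const day in data) {
    // Empty days are [] and have no avg_time_on_site property,
    // so they are skipped and not counted.
    if (data[day].avg_time_on_site) {
      total += data[day].avg_time_on_site;
      daysWithData += 1;
    }
  }
  // Guard against a period with no visits at all.
  return daysWithData > 0 ? total / daysWithData : 0;
}
```

With the example JSON this gives (41 + 52 + 83) / 3 ≈ 58.7 instead of dividing by all six days.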
setTimeout(function() {
  $('#DateSelector').val('30').trigger("change");
}, 1000);

function datePeriod() {
  let datePeriod = $("#DateSelector").val();
  $("#DateShow").html(datePeriod);
  $.getJSON(`https://discovrbookings.innocraft.cloud/?module=API&method=API.get&format=json&idSite=2&period=day&date=last${datePeriod}&token_auth=68aa5bd12137f13255dcb98794b65dff`, (data) => {
    //console.log(data);
    let avgtime = 0;
    for (const i in data) {
      if (data[i].avg_time_on_site) {
        avgtime += data[i].avg_time_on_site;
      }
    }
    $("#avgtime").html(avgtime / datePeriod);
  });
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<select name="DateSelector" id="DateSelector" onchange="datePeriod();">
<option value="7">Last 7 Days</option>
<option value="14">Last 14 Days</option>
<option value="30">Last 30 Days</option>
<option value="90">Last 90 Days</option>
<option value="365">Last 365 Days</option>
</select>
<h2 id="avgtime"></h2>
Comments:
- You can count the days inside the if block: init a second var to 0 before the loop, then increase it by one whenever you add to avgtime, and divide avgtime by that count instead of datePeriod.
- So to count only the days that have an avg_time_on_site? I had tried to perform the sum inside of there but couldn't get a valid response.
- Note that averaging the daily averages weights every day equally. A visit-weighted average for the example data would be (0 + (1 * 41) + (2 * 52) + 0 + (3 * 83) + 0) / (0 + 1 + 2 + 0 + 3 + 0) = 394 / 6 ≈ 65.7, or you could use sum_visit_length from the API directly.
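A sketch of the visit-weighted variant, assuming each non-empty day carries both nb_visits and avg_time_on_site as in the example JSON (the helper name weightedAverageTimeOnSite is mine):

```javascript
// Weight each day's avg_time_on_site by its number of visits, so busy
// days count for more than quiet ones.
function weightedAverageTimeOnSite(data) {
  let weightedSum = 0;
  let totalVisits = 0;
  for (const day in data) {
    const row = data[day];
    // Empty days are [] and have neither property, so they contribute 0.
    if (row.avg_time_on_site && row.nb_visits) {
      weightedSum += row.nb_visits * row.avg_time_on_site;
      totalVisits += row.nb_visits;
    }
  }
  return totalVisits > 0 ? weightedSum / totalVisits : 0;
}
```

With the example JSON: (1 * 41 + 2 * 52 + 3 * 83) / (1 + 2 + 3) = 394 / 6 ≈ 65.7.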