I have an array of objects. Indexes are assigned automatically by .push(). Some objects may later be removed with delete allClients[index];, which leaves an undefined hole but keeps the remaining indexes stable. The array can contain anything from 0 to 10K+ entries.
Here is the declaration and some of the logic:
var allClients = [];

function someFunction(client) {
  allClients.push(client);
  var callbackFunction = function () {
    var index = allClients.indexOf(client);
    if (index > -1) {
      // delete leaves an undefined hole; it does not shift later indexes
      delete allClients[index];
    }
  };
}
function getClientsMeetingCriteria(some_unique_id) {
  var filteredClients = allClients.filter(function (client) {
    return client.id_list.indexOf(some_unique_id) > -1;
  });
  return filteredClients;
}
Now the code above (a very simplified version of what it is IRL) works well with 100-300 clients, but it becomes too slow for its purpose with 500+. The problem is that getClientsMeetingCriteria() may be called 10 times per second, asynchronously. I need to set up caching with some_unique_id as the key. I have the cache code, but I don't know how to store this data in it. Imagine this:
1. client = {id: 1, id_list: '123;124;'} is added to allClients[]
2. getClientsMeetingCriteria('123') returns [client {id: 1, ...}] and caches the response
3. delete allClients[0] is called and the client is removed from allClients[]
4. getClientsMeetingCriteria('123') is called again and returns the cached [client {id: 1, ...}], but that entry is no longer in allClients[]
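The failure mode above can be reproduced with a minimal sketch, assuming a plain object is used as the naive cache (the cache shape here is illustrative, not my real cache code):

```javascript
var allClients = [];
var cache = {}; // some_unique_id -> cached filter result

function getClientsMeetingCriteria(someUniqueId) {
  // Naive cache: once populated, it is never invalidated.
  if (cache[someUniqueId]) return cache[someUniqueId];
  var result = allClients.filter(function (client) {
    return client.id_list.indexOf(someUniqueId) > -1;
  });
  cache[someUniqueId] = result;
  return result;
}

allClients.push({ id: 1, id_list: '123;124;' });
var first = getClientsMeetingCriteria('123');  // hits the array, caches the result
delete allClients[0];                          // client removed, cache untouched
var second = getClientsMeetingCriteria('123'); // stale: still returns the removed client
```

After the delete, the second call still reports one matching client even though allClients[0] is now an undefined hole.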
I know that JS passes by value in this case (return filteredClients). So far I have come up with this: in getClientsMeetingCriteria() I loop through the array and find the indexes of the matching clients, then store that array of indexes in the cache under some_unique_id. I also keep a reverse index, keyed by client index, whose values are the lists of cached some_unique_id keys that reference that client. Before calling delete allClients[index], I fetch the reverse-index entry for that index and thus get the list of cache keys ('some_unique_id') that should be flushed. That seems like a fairly big overhead, but it is nothing compared to looping through 1K objects every 100 ms...
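The invalidation scheme I have in mind could look roughly like this sketch (removeClient and the two plain-object indexes are illustrative names, not my actual code):

```javascript
var allClients = [];
var cache = {};        // some_unique_id -> array of matching client indexes
var reverseIndex = {}; // client index -> cache keys referencing that client

function getClientsMeetingCriteria(someUniqueId) {
  if (cache[someUniqueId]) {
    // Resolve cached indexes back to live clients, skipping deleted holes.
    return cache[someUniqueId]
      .map(function (i) { return allClients[i]; })
      .filter(Boolean);
  }
  var indexes = [];
  allClients.forEach(function (client, i) { // forEach skips holes left by delete
    if (client.id_list.indexOf(someUniqueId) > -1) {
      indexes.push(i);
      (reverseIndex[i] = reverseIndex[i] || []).push(someUniqueId);
    }
  });
  cache[someUniqueId] = indexes;
  return indexes.map(function (i) { return allClients[i]; });
}

function removeClient(index) {
  // Flush every cache entry that references this client before deleting it.
  (reverseIndex[index] || []).forEach(function (key) { delete cache[key]; });
  delete reverseIndex[index];
  delete allClients[index];
}

allClients.push({ id: 1, id_list: '123;124;' });
var before = getClientsMeetingCriteria('123'); // fills cache['123'] with [0]
removeClient(0);                               // flushes the '123' cache entry
var after = getClientsMeetingCriteria('123');  // cache miss, client is gone
```

Here the second lookup correctly comes back empty, at the cost of maintaining the reverse index on every cache fill and every removal.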
Can you think of a better solution? Is there a way to build an object of references to array entries (indexes) that stays up to date - in other words, becomes null when the index is removed? Something like an iterator.
P.S. I also realize that I could load the list of client ids from the cache and check whether each index still exists, but that does not solve the problem of a new client being added with a matching 'some_unique_id' - getClientsMeetingCriteria() would still be too slow...