I am pulling JSON data from Salesforce. I can have roughly 10,000 records, but never more. To avoid hitting API limits and having to call Salesforce for every request, I thought I could query the data every hour and store it in memory. Obviously this will be much quicker and much less error prone.
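Roughly what I have in mind is the sketch below. fetchAllRecordsFromSalesforce is just a placeholder for however the Salesforce call ends up being made (e.g. a jsforce query); the stub is only there so the sketch runs on its own.

// cache.js - minimal sketch of an hourly-refreshed in-memory cache.

var cachedRecords = [];

// Placeholder: replace with the real Salesforce query (e.g. via jsforce).
function fetchAllRecordsFromSalesforce() {
    return Promise.resolve([]); // stub so the sketch runs standalone
}

function refreshCache() {
    return fetchAllRecordsFromSalesforce()
        .then(function (records) {
            cachedRecords = records; // swap in the fresh snapshot
            console.log('Cache refreshed:', records.length, 'records');
        })
        .catch(function (err) {
            console.error('Salesforce refresh failed, keeping old data', err);
        });
}

refreshCache();                            // warm the cache on startup
setInterval(refreshCache, 60 * 60 * 1000); // refresh every hour

module.exports = {
    getRecords: function () { return cachedRecords; }
};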
Each record has about 10 properties and possibly one nested JSON object with two or three properties.
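For illustration, a record looks roughly like this (the field names here are made up):

{
    Id: '0015800000Abc123',
    Name: 'Acme Ltd',
    Industry: 'Manufacturing',
    Region: 'EMEA',
    // ...roughly ten flat properties in total, plus one nested object:
    Owner: {
        Name: 'Jane Smith',
        Email: 'jane@example.com'
    }
}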
I am using methods similar to the one below to query the records.
// Returns the sorted, de-duplicated values of `property` across all records.
// Relies on Underscore (var _ = require('underscore');).
getUniqueProperty: function (data, property) {
    return _.chain(data)
        .sortBy(function (item) { return item[property]; }) // order by the property value
        .pluck(property)                                     // keep only that property
        .uniq()                                              // drop duplicates
        .value();
}
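For example, to get the distinct values of one field from the in-memory data (assuming the helper lives on an object I'll call utils here, and records is the cached array):

var regions = utils.getUniqueProperty(records, 'Region');
// e.g. ['APAC', 'EMEA', 'NA']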
My questions are:
What would the ramifications be of storing the data in memory and working with it there? I obviously don't want to block the server by running heavy filtering on the data.
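For reference, the kind of filtering I mean is along these lines (Status is another made-up field; records is the cached array), and I assume I could time a pass like this to check whether it actually blocks for long:

console.time('filter');
var active = _.filter(records, function (r) { return r.Status === 'Active'; });
console.timeEnd('filter'); // rough check of one synchronous pass over ~10,000 records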
I have never used Redis before, but would something like a caching database help?
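For context, this is what I imagine that would look like, using the node redis client's callback API ('sf:records' is just a key name I made up):

var redis = require('redis');
var client = redis.createClient();

// Store the whole Salesforce payload as one JSON string with a one-hour expiry.
function cacheRecords(records, done) {
    client.set('sf:records', JSON.stringify(records), 'EX', 3600, done);
}

// Read it back and parse it before filtering.
function getCachedRecords(done) {
    client.get('sf:records', function (err, json) {
        done(err, json ? JSON.parse(json) : []);
    });
}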
Would it be best to query the data every hour and store the JSON response in something like Mongo? I would then do all my querying against Mongo instead of in memory. Every hour, when I query Salesforce, I would just flush the database and reinsert the data.
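Something along these lines is what I have in mind, using the callback-style API of the native MongoDB driver (the database and collection names are made up):

var MongoClient = require('mongodb').MongoClient;

// Drop the old snapshot and insert the fresh one; run after every hourly Salesforce query.
function replaceSnapshot(records, done) {
    MongoClient.connect('mongodb://localhost:27017/sfcache', function (err, db) {
        if (err) return done(err);
        var collection = db.collection('records');
        collection.deleteMany({}, function (err) {
            if (err) { db.close(); return done(err); }
            collection.insertMany(records, function (err) {
                db.close();
                done(err);
            });
        });
    });
}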