
I have a result set returned from a database that takes a few seconds to run. To improve performance, I'm going to cache the results for up to 30 minutes. The call to the database takes a date parameter: a date within the next 60 days that the user selects from a calendar.

Here is a code sample that uses a list of strings to keep the example simple:

public List<String> GetConcertList(DateTime selectedDate)
{
    String cacheKey = "ConcertList" + selectedDate.Date.Ticks.ToString();

    List<String> concertList = HttpContext.Current.Cache[cacheKey] as List<String>;

    if (concertList == null)
    {
        concertList = new List<String>();

        // Normally this is the call to the database that passes the selected date
        concertList.Add("Just a test for " + selectedDate.ToString());

        HttpContext.Current.Cache.Insert(cacheKey, concertList, null, DateTime.Now.AddMinutes(30),
                                         System.Web.Caching.Cache.NoSlidingExpiration);
    }

    return concertList;
}

Is there a better approach than using the date in the key to cache each day's list?

    As a general rule of thumb, if you're caching the results of a stored procedure (directly), then the cache key should match the parameters to the stored proc. So your cache key looks good. But I agree with @o6tech's answer: depending on your data set, you could perform multi-level caching (cache the whole dataset, then cache portions of the dataset again). It all depends on your dataset and how often it will change. Commented Sep 1, 2010 at 1:29
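For illustration, here is a minimal sketch of that "key matches the parameters" rule. The helper name and the stored procedure name GetConcertsByDate are my own placeholders, not from the question:

    using System;
    using System.Linq;

    public static class CacheKeys
    {
        // One cache entry per distinct parameter combination, so the key
        // mirrors the stored procedure call exactly.
        public static string For(string procName, params object[] parameters)
        {
            return procName + "|" +
                   string.Join("|", parameters.Select(p => p.ToString()).ToArray());
        }
    }

    // e.g. CacheKeys.For("GetConcertsByDate", selectedDate.Date.Ticks)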

1 Answer


How large is the dataset we're talking about here? If it's not huge, you may want to Cache the whole list and then just pull out the subset based on the user's selected date.

A better way to do this would be to keep a master dataset that's an aggregate of all of the user-requested dates. Basically, you would just append any new, not-yet-requested results to your cached dataset and work from there.
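For illustration, a rough sketch of that master-dataset idea, using a single cached dictionary keyed by date. The class name and the LoadFromDatabase placeholder are assumptions, and locking (which you'd want under concurrent requests) is omitted for brevity:

    using System;
    using System.Collections.Generic;
    using System.Web;

    public class ConcertListCache
    {
        private const string MasterKey = "ConcertListMaster";

        public List<string> GetConcertList(DateTime selectedDate)
        {
            // One master cache entry holds every date requested so far.
            var master = HttpContext.Current.Cache[MasterKey]
                             as Dictionary<DateTime, List<string>>;

            if (master == null)
            {
                master = new Dictionary<DateTime, List<string>>();
                HttpContext.Current.Cache.Insert(MasterKey, master, null,
                    DateTime.Now.AddMinutes(30),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }

            DateTime day = selectedDate.Date;
            List<string> concerts;
            if (!master.TryGetValue(day, out concerts))
            {
                // Only dates not yet requested hit the database; the results
                // are appended to the shared master entry for later callers.
                concerts = LoadFromDatabase(day); // hypothetical data call
                master[day] = concerts;
            }

            return concerts;
        }

        private List<string> LoadFromDatabase(DateTime date)
        {
            // Placeholder for the slow database query from the question.
            return new List<string> { "Just a test for " + date.ToString() };
        }
    }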


1 Comment

Depending on the setup, if the dataset is on the larger side, letting SQL do the filtering can sometimes be almost as fast, provided the database has enough available RAM to cache the dataset and a good index on the date column. (Again, depending on the system.) As always, profile!
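For illustration, a minimal sketch of letting SQL Server do the date filtering with a parameterized query. The table, column, and connection-string names are assumptions, and an index on the date column is presumed to exist:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class ConcertQueries
    {
        public static List<string> GetConcertsForDate(string connectionString, DateTime selectedDate)
        {
            var results = new List<string>();

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Name FROM Concerts WHERE ConcertDate = @date", connection))
            {
                // Parameterized so the plan is reused and the date index can be seeked.
                command.Parameters.AddWithValue("@date", selectedDate.Date);

                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        results.Add(reader.GetString(0));
                    }
                }
            }

            return results;
        }
    }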
