Caching a Query Data Source in Kentico

In my opinion, Kentico’s caching engine is one of the (if not THE) most powerful and usable features of the platform. Kentico uses caching behind the scenes by default to improve performance and deliver content quickly. A wise developer will learn the ways of the Caching Jedi to speed up their custom code and ensure their site is running as optimally as possible.

Standard Caching

We use custom caching in nearly every module we build, and implementing it is pretty easy. Here’s an example of how to load a DataSet into cache with a few lines of code:
// Get user data
DataSet ds = null;
using (CachedSection<DataSet> cs = new CachedSection<DataSet>(ref ds, 60, true, null, Convert.ToString(CMSContext.CurrentUser.UserID) + "cachedata"))
{
    if (cs.LoadData)
    {
        ds = GetOurAwesomeData("UserID = " + CMSContext.CurrentUser.UserID, null);
        cs.Data = ds; // Save data to cache
    }
    ds = cs.Data;
}

Query Data Source Caching

Recently, I needed to cache things a little differently and had to overcome a small obstacle. I had a Kentico Query Data Source on a page. That QDS loaded a lot of data (on which I can’t use paging, for business reasons), and I wanted to find a way to prevent a user from having to load that data. Kentico’s standard caching will cache this data for a configured amount of time, but in the end someone will take the hit of pulling the data when the cached data expires.

To get around this issue, I figured I could find the cached value Kentico creates when the content is first requested and make a Scheduled Task that would fill that data at a set interval. Seems easy enough…

To do that, I needed to:
  • Get the QDS cached data name (using SQL Debugging)
  • Create a Scheduled Task
  • In my task, set the QDS dataset to cache on a time interval I specified.
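Before diving into the Kentico specifics, the general pattern behind those steps can be sketched in plain .NET: a background timer repopulates a cache entry on a fixed interval, with an expiration longer than the interval, so no visitor ever finds the entry missing. This is a generic illustration only (the class, key names, and loader below are made up for the example, not Kentico APIs):

```csharp
using System;
using System.Runtime.Caching;
using System.Threading;

// Generic .NET sketch of the pre-warming idea (not Kentico-specific).
// A background timer refreshes the cache entry before it can expire,
// so requests never take the hit of loading the data themselves.
public static class CachePreWarmer
{
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private static Timer refreshTimer;

    // Stand-in for the real (slow) data query.
    public static string LoadExpensiveData()
    {
        return "expensive result loaded at " + DateTime.UtcNow.ToString("o");
    }

    public static void Start(string cacheKey, TimeSpan interval)
    {
        // Refresh immediately, then again on every interval tick.
        refreshTimer = new Timer(_ =>
        {
            string data = LoadExpensiveData();
            // Expire well after the next scheduled refresh, so the entry
            // is never absent between refreshes.
            Cache.Set(cacheKey, data, DateTimeOffset.UtcNow.Add(interval + interval));
        }, null, TimeSpan.Zero, interval);
    }

    public static string Get(string cacheKey)
    {
        return Cache.Get(cacheKey) as string;
    }
}
```

In Kentico, the timer role is played by a Scheduled Task and the cache entry is the one the Query Data Source reads, but the shape of the solution is the same.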

With the above in place the site should always have the data cached, preventing a user from taking the performance hit. Seems awesome, right? Well, actually caching this data proved a little tricky.

From our sample code above, we use the Caching API to save the DataSet into cache:
// Get user data
DataSet ds = null;
using (CachedSection<DataSet> cs = new CachedSection<DataSet>(ref ds, 60, true, null, Convert.ToString(CMSContext.CurrentUser.UserID) + "cachedata"))

In this code, we are explicitly saying that we are caching a single object: a DataSet. This is where the problem comes in. When Kentico caches its data for a QDS, it actually caches an array of objects (the actual DataSet and an identifier). This is important because the identifier tells Kentico how to handle the cached value on the control side of things.

The Goods

The code to save the array is slightly different. Note that I'm also using the CacheHelper API directly to save this data.
System.Object[] obj = new object[2];
obj[0] = dsExtraAwesomeUserData;
obj[1] = 2;
// Load the new data into cache
CacheHelper.Add(strCacheName, obj, null, DateTime.Now.AddMinutes(intCacheTimeoutInterval), Cache.NoSlidingExpiration);

After updating my code to save the data as an array (using “2” as the identifier), my site worked great and did exactly what I wanted. The Scheduled Task is now “refreshing” the QDS data every 60 minutes (based on the config), which ensures that it’s always up to date. More importantly, users never have to wait for the page to refresh the data because the task is doing the heavy lifting for us.
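To show how the pieces fit together, here is a hedged sketch of what such a custom Scheduled Task class might look like. The cache key, timeout, and GetOurAwesomeData are placeholders standing in for your own values, and the exact namespaces (CMS.Scheduler for ITask/TaskInfo, CMS.GlobalHelper or CMS.Helpers for CacheHelper) vary by Kentico version, so treat this as a starting point rather than drop-in code:

```csharp
using System;
using System.Data;
using System.Web.Caching;
using CMS.Scheduler;      // ITask, TaskInfo (namespace varies by Kentico version)
using CMS.GlobalHelper;   // CacheHelper (CMS.Helpers in later versions)

public class RefreshQdsCacheTask : ITask
{
    public string Execute(TaskInfo task)
    {
        // Placeholder values: use the exact cache key Kentico generates for
        // your Query Data Source (found via SQL Debugging) and your own timeout.
        string strCacheName = "yourQdsCacheKey";
        int intCacheTimeoutInterval = 60;

        // Pull the data the QDS would normally load (placeholder method).
        DataSet dsExtraAwesomeUserData = GetOurAwesomeData();

        // Kentico expects a QDS cache entry to be a two-element array:
        // the DataSet itself plus the identifier.
        object[] obj = new object[2];
        obj[0] = dsExtraAwesomeUserData;
        obj[1] = 2;

        CacheHelper.Add(strCacheName, obj, null,
            DateTime.Now.AddMinutes(intCacheTimeoutInterval),
            Cache.NoSlidingExpiration);

        // An empty/null return tells the scheduler the task succeeded.
        return null;
    }

    private DataSet GetOurAwesomeData()
    {
        // Placeholder for the real data query.
        return new DataSet();
    }
}
```

Register the class as a custom Scheduled Task in the Kentico admin UI and set its interval to match (or slightly undercut) the cache timeout, so the refresh always lands before the entry expires.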

Wrapping it Up

This article shows how you can extend the Caching API a little further and work with different types of data and objects. Kentico follows a very standard approach, so taking a look at some of their code is a great place to start when you need to figure out how they are saving data behind the scenes. Hopefully this shows you how to further utilize the Caching API in your code and really improve performance!


Wiz E. Wig, Mascot & Director of Magic