In my first blog of this series I covered several topics related to Kentico CMS and Windows Azure. With more and more companies looking to adopt Microsoft’s cloud platform, I realized there’s a lot of information to know and not many places to find it. So to help the dev community out, I started this series to hopefully offer some guidance and answers to some of the tough areas when deploying to Azure. So, time for another round of tough Kentico / Azure questions!
“What is the best way to manage session in Windows Azure?”
Just as in any web farm scenario, deploying Kentico to Windows Azure requires an alternative approach to session management. First, a little about how session works in .NET. When a request comes into the site, the application will store a number of items in session to identify the user and customize their experience. In a web farm scenario, requests can be routed to any server, so this session information has to be available to all servers running the site.
With Kentico and Azure, the preferred method is to set up a cache in Windows Azure AppFabric. This cache is essentially a universally available memory store that the application writes session information to. Through the use of a “custom session provider” Kentico created, the application reads/writes data to this cache as session values are created and maintained.
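For reference, ASP.NET custom session providers of this kind are wired up in web.config. The snippet below is a sketch using the standard Windows Azure Caching provider names; the exact type and settings for Kentico’s bundled provider may differ, so treat the names here as assumptions and check the Kentico Azure documentation for your version:

```xml
<sessionState mode="Custom" customProvider="AppFabricCacheSessionStoreProvider">
  <providers>
    <!-- Type/assembly names below are the standard Windows Azure Caching
         session provider; Kentico's custom provider may use different names. -->
    <add name="AppFabricCacheSessionStoreProvider"
         type="Microsoft.Web.DistributedCache.DistributedCacheSessionStateStoreProvider, Microsoft.Web.DistributedCache"
         cacheName="default" />
  </providers>
</sessionState>
```

Once the provider is registered, session reads/writes flow through the cache automatically; no application code changes are required.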
Another option is to use SQL Database (formerly SQL Azure) to store session. This is very similar to using “SQL Server Session Management,” a very common practice for traditional, on-premises hosting. In this solution, scripts are run against the database to create the tables / stored procedures for holding and maintaining the session information. The application then connects to SQL to read/write session directly in the database.
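For comparison, the traditional SQL Server approach is enabled entirely in web.config once the session tables exist (on-premises they are typically created with aspnet_regsql.exe). A typical entry looks like the following; the server name and credentials are placeholders for illustration:

```xml
<sessionState mode="SQLServer"
              sqlConnectionString="Server=tcp:yourserver.database.windows.net;Database=ASPState;User ID=youruser;Password=yourpassword;"
              timeout="20" />
```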
It is possible to use SQL Database for session management; however, it’s a little more complicated than the traditional SQL Server method. Certain scripts have to be run / adapted in order to properly connect to and use SQL Database. This method is currently untested with Kentico and not the preferred method. I am working on some prototypes for this solution and will definitely update the community when I have some answers.
“Why do my CDN-stored files always cache and never update?”
Microsoft created the Windows Azure Storage CDN to be as efficient as possible. This means it utilizes internal caching heavily to improve performance and cut down on excessive processing. For every file copied to storage, there is a TTL (Time-To-Live) assigned to the file. This tells Azure how long to cache the file and when to “refresh” it from the main storage file. If a TTL is not specified when a file is copied to storage, Windows Azure assigns the max TTL of 72 hours. This means the file will be cached on the CDN for up to 72 hours after it was originally copied.
Kentico uses the Azure API to write files to storage. In the current version, Kentico is not supplying a TTL when copying the files up. This results in the “max” TTL being applied, and therefore files are cached for the full duration. This is not necessarily a bad thing. Caching js files and background images can significantly reduce the processing and bandwidth a site uses, and Kentico has no knowledge of which files will change often within your site. I have spoken with Kentico developers about this, and I suspect there will be a way to specify a TTL in future versions of the platform.
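For context, the CDN’s TTL is driven by the standard HTTP Cache-Control header on the underlying blob. If a future Kentico version (or your own code using the storage API) were to set that property when uploading, a two-hour TTL would surface on CDN responses as a header like:

```
Cache-Control: public, max-age=7200
```

Until a TTL setting is exposed, a practical workaround for files that must update immediately is to rename the file (or publish it under a versioned name), which forces the CDN to fetch a fresh object.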
“Should I store files in the database or file system in Azure?”
One of the best features of Windows Azure is its flexibility and scalability. Microsoft has done a great job of building out the platform in a way that makes sense for applications. Part of this solution is Windows Azure Storage and the CDN. This built-in functionality allows applications to deliver content quickly and efficiently using a network of 26+ data centers around the world. For this reason alone, using the “file system” for file storage within Kentico is the preferred method. When deploying to Azure, Kentico adds the ability to specify a CDN path that will automatically be appended to media library calls to utilize the CDN functionality. This is a great way to leverage this functionality with minimal configuration on the application side.
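For illustration, settings like the CDN path typically live in the appSettings section of web.config. The key name below is hypothetical, used only to show the shape of the configuration; consult the Kentico Azure documentation for the exact key your version expects:

```xml
<appSettings>
  <!-- Hypothetical key name; check Kentico's documentation for the exact setting. -->
  <add key="CMSAzureCDNEndpoint" value="http://az12345.vo.msecnd.net" />
</appSettings>
```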
Additionally, SQL Database space is at a premium (there is currently a 150 GB limit on databases), so not storing files in the database reduces its size and saves money. Windows Azure Storage has a 150 TB limit and costs pennies per GB. This means that using Windows Azure Storage (file system) for your Kentico files is a much better use of the platform and much more cost-effective.
“How does Kentico handle licensing for multiple servers in Windows Azure?”
Running Kentico on multiple servers in Azure works the same as in an on-premises deployment. Each server you want to run the site on requires its own base license. If you want to run on 3 servers in Azure, you will need to buy the appropriate licenses through Kentico so your key will allow for 3 web farm servers.
With Windows Azure, Kentico did make a modification to the web farm functionality to accommodate the dynamic aspects of the Azure environment. As servers are allocated / de-allocated in Azure (by the underlying platform as part of Microsoft’s normal operations), Kentico will automatically register / de-register the servers. This eliminates the need to modify each server’s web.config file and manually register web farm servers.
OK, that’s enough of the Kentico / Azure novel for now. I’ll post another installment in the series as I get more great questions from partners.
You can check out the first edition of this blog series here.