ASP.NET Core Caching

ASP.NET Core comes with support for three types of caching:

  • Local in-memory caching
  • Distributed caching with SQL Server
  • Distributed caching with Redis

In-memory caching will have the best performance, but you will need to consider maintaining cache consistency between servers in a web farm; ASP.NET Core will not do this for you. Multiple servers can share a distributed cache, so you don't have to worry about cache coherence, but you trade off performance, as we'll see.

At the moment, Redis caching is not supported when targeting .NET Core, so I will only include sample code for Visual Studio/Windows. ASP.NET Core relies on StackExchange.Redis, which will support .NET Core (there is an alpha release with that support); once a stable release is available, I will add example projects for .NET Core/Linux.

You can find source code for the projects in this post here:

Local Memory Caching

Before writing this, I thought the only valid use case for a local, in-memory cache would be a small site with limited traffic, where you knew a single server would be sufficient. Otherwise, when scaling out to multiple servers in a web farm, you would have to worry about maintaining consistency between local caches, and that didn't seem worth the effort given the ease of setting up a distributed cache.

After running the code in this post, I've had a change of heart. The difference in performance is so great (three orders of magnitude) that I can imagine cases where an application needs to scale to multiple servers and still maintain the highest possible performance offered by an in-memory local cache. In this scenario, you would have to come up with a method to keep the local caches in sync.

To use local memory caching, add the following dependency to the project.json file:
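Something along these lines, with the version adjusted to match the release of ASP.NET Core you are targeting:

    "dependencies": {
      "Microsoft.Extensions.Caching.Memory": "1.0.0"
    }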

In the Startup.ConfigureServices method, add support for caching:
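The registration is a single call to AddMemoryCache:

    public void ConfigureServices(IServiceCollection services)
    {
        // Registers IMemoryCache with the dependency injection container.
        services.AddMemoryCache();
    }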

Now, when you want to use the cache, use dependency injection to get an instance of IMemoryCache, which has methods and extension methods to set, get, and remove items from the cache.  In our example, we request a cache instance in the Startup.Configure method:
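A sketch of the method signature (the parameter name cache is arbitrary; the Map branches in the next snippets close over it):

    // Requires the Microsoft.AspNetCore.Builder, Microsoft.AspNetCore.Http,
    // and Microsoft.Extensions.Caching.Memory namespaces.
    public void Configure(IApplicationBuilder app, IMemoryCache cache)
    {
        // The runtime resolves IMemoryCache from DI and passes it in here.
    }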

We use Map to configure ad hoc middleware to work with the cache based on the path and query string.  If the path starts with /set, we treat the query string as key-value pairs to add to the cache:
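A sketch of that branch; the response text is illustrative:

    app.Map("/set", setApp => setApp.Run(async context =>
    {
        // Treat each query-string field as a key/value pair to cache,
        // e.g. /set?mykey1=myvalue1 caches "myvalue1" under "mykey1".
        foreach (var pair in context.Request.Query)
        {
            cache.Set(pair.Key, pair.Value.ToString());
        }
        await context.Response.WriteAsync("Values cached.");
    }));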

If the path starts with /get, we look for all query string fields named “key” and retrieve the values associated with them from the cache:
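A sketch that writes one line per requested key (the exact response format in the sample project may differ):

    app.Map("/get", getApp => getApp.Run(async context =>
    {
        // /get?key=mykey1&key=mykey2 -> look up each requested key.
        foreach (var key in context.Request.Query["key"])
        {
            string value;
            var found = cache.TryGetValue(key, out value);
            await context.Response.WriteAsync(
                $"{key}: {(found ? value : "<not found>")}\n");
        }
    }));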

Finally, if the path starts with /del, we remove any items for the associated keys from the cache: 
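Assuming the same ?key= query format as /get, the branch might look like this:

    app.Map("/del", delApp => delApp.Run(async context =>
    {
        // Remove every key named in the query string.
        foreach (var key in context.Request.Query["key"])
        {
            cache.Remove(key);
        }
        await context.Response.WriteAsync("Values removed.");
    }));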

Running the project, we can set values using the following path and query string: /set?mykey1=myvalue1&mykey2=myvalue2

[Screenshot: memory-cache-set - response after setting values]

We then test retrieval of cached values using /get?key=mykey1&key=mykey2&key=mykey3:

[Screenshot: memory-cache-get - retrieved values]

Distributed Caching with Redis

We will only be using Redis as a distributed cache in these examples, but as an in-memory NoSQL database it is capable of much more. If you are not familiar with its capabilities and want to learn more, the Redis documentation is a great place to start.

The easiest way to install Redis on Windows is with chocolatey.  If you don’t have chocolatey, run the following command from an administrative PowerShell window:
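Something like the one-liner published on chocolatey.org (check the site for the current command):

    iwr https://chocolatey.org/install.ps1 -UseBasicParsing | iex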

Install and run Redis:
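Assuming the Chocolatey package id is still redis-64:

    choco install redis-64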

    redis-server

From a second PowerShell window, you can run redis-cli and use the ping command.  You should get a response of PONG from the server:

    redis-cli

Now that we have Redis installed and running, we can use it as a distributed cache.  Add the following dependency to the project.json file:
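Again, with the version matching the framework version you are targeting:

    "dependencies": {
      "Microsoft.Extensions.Caching.Redis": "1.0.0"
    }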

We configure Redis in the Startup.ConfigureServices method:
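A sketch, assuming a version of the package that provides the AddDistributedRedisCache extension method; the instance name shown is just an example:

    public void ConfigureServices(IServiceCollection services)
    {
        // Registers an IDistributedCache implementation backed by Redis.
        services.AddDistributedRedisCache(options =>
        {
            options.Configuration = "localhost";   // Redis server to connect to
            options.InstanceName = "CacheSample:"; // prefix added to every key
        });
    }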

The InstanceName option partitions the key space, as we’ll see later.

Now we can request an instance of IDistributedCache:
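The Configure signature changes accordingly (the parameter name is arbitrary):

    public void Configure(IApplicationBuilder app, IDistributedCache cache)
    {
        // IDistributedCache is resolved from DI and is backed by whichever
        // distributed cache implementation was registered in ConfigureServices.
    }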

The body of the Startup.Configure method is largely the same as the memory cache example, except we are using the async methods provided by the IDistributedCache interface, which work with byte arrays.  
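For example, inside the /set, /get, and /del branches (this fragment assumes the System.Text and Microsoft.Extensions.Caching.Distributed namespaces, and that pair and key come from the query string as in the memory-cache example):

    // Inside /set: store the string value as UTF-8 bytes.
    await cache.SetAsync(pair.Key, Encoding.UTF8.GetBytes(pair.Value.ToString()));

    // Inside /get: read the bytes back and decode them, if the key was found.
    var bytes = await cache.GetAsync(key);
    var value = bytes == null ? "<not found>" : Encoding.UTF8.GetString(bytes);

    // Inside /del: remove the entry.
    await cache.RemoveAsync(key);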

When we run the project and use the same path and query string to cache values as we did for memory caching, we can use the redis-cli to check the keys stored in Redis, and verify that they are the same ones we used:

[Screenshot: redis-cache-keys - keys listed in redis-cli]

Notice that the key names are the ones we used, prefixed with the InstanceName value from the configuration.

Using Azure Redis Cache

To use Azure Redis Cache, follow these instructions to create a new Redis Cache (ignore the section about configuring clients).

After you create your Azure Redis Cache, click "Show access keys" and, in the Manage keys section, copy the "Primary connection string (StackExchange.Redis)" value (in the bottom-right corner of the screenshot below).

[Screenshot: azure-redis-cache - access keys in the Azure portal]

Since ASP.NET Core uses StackExchange.Redis for its Redis caching implementation, we can take the StackExchange.Redis connection string and copy it to the Redis options, replacing the "localhost" value we used earlier:
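A sketch with a placeholder connection string; substitute the value copied from the portal:

    services.AddDistributedRedisCache(options =>
    {
        // Paste the "Primary connection string (StackExchange.Redis)" value
        // copied from the Azure portal (placeholder shown here).
        options.Configuration =
            "<cache-name>.redis.cache.windows.net:6380,password=<access-key>,ssl=True,abortConnect=False";
        options.InstanceName = "CacheSample:";
    });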

That’s the only change needed; now when you run the project, you will be caching to the Azure Redis cache.

Caching with SQL Server

Using SQL Server as a distributed cache offers no technical advantage over Redis. I can't think of a technical reason to choose it; the decision usually comes down to non-technical constraints, e.g., you are not allowed to bring Redis into your IT infrastructure.

Before you can use SQL Server as a cache, you must create a table with this schema:
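The script below is approximately what the dotnet sql-cache create tool (from Microsoft.Extensions.Caching.SqlConfig.Tools) generates; the schema and table names (dbo, TestCache) are placeholders and must match what you configure in the next step:

    CREATE TABLE [dbo].[TestCache] (
        [Id] NVARCHAR (449) NOT NULL,
        [Value] VARBINARY (MAX) NOT NULL,
        [ExpiresAtTime] DATETIMEOFFSET (7) NOT NULL,
        [SlidingExpirationInSeconds] BIGINT NULL,
        [AbsoluteExpiration] DATETIMEOFFSET (7) NULL,
        PRIMARY KEY CLUSTERED ([Id] ASC)
    );

    CREATE NONCLUSTERED INDEX [Index_ExpiresAtTime]
        ON [dbo].[TestCache]([ExpiresAtTime] ASC);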

Add the following dependency to your project.json file:
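As before, match the version to the framework you are targeting:

    "dependencies": {
      "Microsoft.Extensions.Caching.SqlServer": "1.0.0"
    }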

In the Startup.ConfigureServices method we configure SQL Server caching:
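A sketch with placeholder connection, schema, and table names (they must match the table you created above):

    public void ConfigureServices(IServiceCollection services)
    {
        // Registers an IDistributedCache implementation backed by SQL Server.
        services.AddDistributedSqlServerCache(options =>
        {
            options.ConnectionString =
                @"Server=(localdb)\MSSQLLocalDB;Database=CacheSample;Trusted_Connection=True;";
            options.SchemaName = "dbo";
            options.TableName = "TestCache";
        });
    }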

The Startup.Configure method remains largely unchanged from the code we used in the Redis example.  The only difference is that we have to provide options when we set a value:
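For example (the 30-minute sliding expiration is an arbitrary choice):

    // The SQL Server cache requires an expiration, so pass
    // DistributedCacheEntryOptions when setting a value.
    await cache.SetAsync(
        pair.Key,
        Encoding.UTF8.GetBytes(pair.Value.ToString()),
        new DistributedCacheEntryOptions
        {
            SlidingExpiration = TimeSpan.FromMinutes(30)
        });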

After you run this example and set some values, you can query the cache table and verify that the values are stored in the database:

[Screenshot: sql-cache-table - rows in the cache table]

Performance Comparison

All of the projects contain middleware to execute a simple performance test:  a value is saved to the cache, and then retrieved a number of times (1,000,000 times for the local memory cache and 10,000 times for the distributed caches).  The total elapsed time in ticks is divided by the number of retrievals to get an average number of ticks per retrieval.  The goal here wasn’t to do an exhaustive performance test, but rather to get a quick idea of the relative performance between the three methods available to us.
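The sketch below shows the general shape of such a loop for the distributed caches; it is not the exact code from the sample projects, and for the SQL Server cache the SetAsync call would also need the expiration options discussed earlier:

    // Cache one value, read it back N times, and report average ticks per read.
    const int iterations = 10000;
    var key = "perf-test-key";
    await cache.SetAsync(key, Encoding.UTF8.GetBytes("perf-test-value"));

    var stopwatch = Stopwatch.StartNew();
    for (var i = 0; i < iterations; i++)
    {
        var bytes = await cache.GetAsync(key);
    }
    stopwatch.Stop();

    // Elapsed.Ticks are 100-nanosecond TimeSpan ticks.
    var ticksPerRead = (double)stopwatch.Elapsed.Ticks / iterations;
    await context.Response.WriteAsync($"Average ticks per read: {ticksPerRead}");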

Method        Ticks/Read
Memory             2.338
Redis          2,305.663
SQL Server     6,611.593

From these results we see how much faster the local memory cache is (roughly a thousand times faster than Redis), and that Redis is nearly three times faster than SQL Server. Regarding the SQL Server cache, I wanted to try a memory-optimized table for caching, but the varbinary(max) and datetimeoffset data types are not supported for use in memory-optimized tables. If you do use SQL Server for caching, keep in mind that DBCC PINTABLE is deprecated and has no effect in current versions of SQL Server, so you are relying on the buffer pool to keep a frequently read cache table in memory.
