Quick Overview:
Distributed caching is a widely used technique for improving application performance. .NET provides a dedicated interface, “IDistributedCache”, which works particularly well with Redis. In this blog, you will gain a fundamental understanding of distributed caching in .NET and how to configure it with Redis.

Distributed Caching with Redis in .NET

Distributed caching sits near the top of the list of strategies for improving application performance. It is a convenient way to optimize not just one but multiple .NET applications at the same time, and when it is backed by a Redis cache, the speed is exceptional.

In this blog, we’re going to dig right into distributed caching in .NET using Redis to understand its benefits and configuration procedure.

What is Distributed Caching in .NET?

The distributed caching mechanism uses separate infrastructure to provide caching services to an application, and that infrastructure can be shared by multiple applications or servers at the same time. .NET developers typically use distributed caching for ASP.NET Core applications hosted on a server farm or on cloud infrastructure, where several instances need to see the same cached data.

In addition, there are several ways to implement distributed caching for .NET software, such as NCache, Redis, and SQL Server caching. However, Redis is often considered the most viable and reliable option due to its excellent compatibility, high speed, and scalability.

Furthermore, in .NET apps, the “IDistributedCache” interface is used with Redis to implement distributed caching. It lets you define expiration times and manage cached data as key-value pairs.
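Conceptually, a distributed cache behaves like a shared key-value store where every entry carries an expiration time. The sketch below is not the real “IDistributedCache” (which lives on a separate server and stores byte arrays remotely); it is a simplified in-memory stand-in that illustrates the key-value-plus-expiration contract:

```csharp
using System;
using System.Collections.Generic;

// Illustrative stand-in for the key-value-plus-expiration contract a
// distributed cache exposes; a real cache lives on separate infrastructure.
var store = new Dictionary<string, (byte[] Value, DateTime ExpiresAt)>();

void Set(string key, byte[] value, TimeSpan ttl) =>
    store[key] = (value, DateTime.UtcNow.Add(ttl));

byte[]? Get(string key) =>
    store.TryGetValue(key, out var entry) && entry.ExpiresAt > DateTime.UtcNow
        ? entry.Value
        : null; // missing or expired entries behave as a cache miss

// A fresh entry is served from the cache...
Set("employeeList", new byte[] { 1, 2, 3 }, TimeSpan.FromMinutes(5));
Console.WriteLine(Get("employeeList") != null); // True

// ...while an already-expired entry behaves as a miss.
Set("stale", new byte[] { 9 }, TimeSpan.FromMinutes(-1));
Console.WriteLine(Get("stale") == null); // True
```

Because the real store sits outside any single application, any number of servers can share it, which is what the benefits below build on.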

The Benefits of Distributed Caching

The following are the main advantages of using distributed caching.

1: Accessible by multiple servers

As a distributed cache is configured outside of a .NET application, it can be accessed by other servers as well. This means multiple servers or applications can share a single cache to improve their performance and respond in minimal time.

2: High Performance

It helps improve overall software performance and speed. Whenever an end user requests data, the app first tries to fetch it from the cache, avoiding a round trip to the database, which takes considerably longer than a cache lookup.

3: No Data Loss

The distributed cache runs on another server, so if a web server goes down, the data in the cache retains its state. Users can continue to be served in such cases for as long as the required data remains in the cache.

The Redis Distributed Caching Configuration

Let’s walk through the Redis distributed caching configuration in an ASP.NET Core Web API project. To start, use Visual Studio to create a project from the provided templates. In addition, install Redis on your system, then follow the steps below.

Step 1: First, add the extension class “DistributedCacheExtensions” shown below. It lets you work with typed objects on top of the byte-array API that “IDistributedCache” exposes.

The following are the main components of the code.

  • SetAsync(): a wrapper that serializes the object to a byte array before passing it to the cache.
  • TryGetValue(): retrieves the byte array from the cache and deserializes it into the requested type.
  • GetJsonSerializerOptions(): initializes the JSON serializer options. This method exists only for the purposes of this example; in larger apps, it’s recommended to cache the options in a static field instead of rebuilding them on every call.
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Caching.Distributed;

public static class DistributedCacheExtensions
{
    public static Task SetAsync<T>(this IDistributedCache cache, string key, T value)
    {
        return SetAsync(cache, key, value, new DistributedCacheEntryOptions());
    }

    public static Task SetAsync<T>(this IDistributedCache cache, string key, T value, DistributedCacheEntryOptions options)
    {
        var bytes = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(value, GetJsonSerializerOptions()));
        return cache.SetAsync(key, bytes, options);
    }

    public static bool TryGetValue<T>(this IDistributedCache cache, string key, out T? value)
    {
        var val = cache.Get(key);
        value = default;

        if (val == null) return false;

        value = JsonSerializer.Deserialize<T>(val, GetJsonSerializerOptions());

        return true;
    }

    private static JsonSerializerOptions GetJsonSerializerOptions()
    {
        return new JsonSerializerOptions()
        {
            PropertyNamingPolicy = null,
            WriteIndented = true,
            AllowTrailingCommas = true,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        };
    }
}
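The serialization step these extensions rely on can be exercised on its own. The sketch below uses the same System.Text.Json calls to show an object surviving the round trip through a byte array, which is exactly what happens on the way into and out of the cache (the sample data is illustrative):

```csharp
using System;
using System.Text;
using System.Text.Json;

// SetAsync's job: serialize the object and encode it as the byte[] the cache stores.
var employees = new List<string> { "Jane", "John" };
byte[] bytes = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(employees));

// TryGetValue's job: decode the bytes back into the requested type.
var roundTripped = JsonSerializer.Deserialize<List<string>>(bytes);

Console.WriteLine(string.Join(",", roundTripped!)); // Jane,John
```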

Step 2:  Open the controller and add the code below to support the distributed caching mechanism. The “ILogger” and “IDistributedCache” are injected into the controller along with the “SemaphoreSlim” object to manage concurrency.

private const string employeeListCacheKey = "employeeList";

private readonly IDataRepository<Employee> _dataRepository;
private readonly IDistributedCache _cache;
private readonly ILogger<EmployeeController> _logger;
private static readonly SemaphoreSlim semaphore = new(1, 1);

public EmployeeController(IDataRepository<Employee> dataRepository, IDistributedCache cache,
    ILogger<EmployeeController> logger)
{
    _dataRepository = dataRepository ?? throw new ArgumentNullException(nameof(dataRepository));
    _cache = cache ?? throw new ArgumentNullException(nameof(cache));
    _logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
Step 3: Add the “Get” action, as we are implementing caching in a Web API project.

public async Task<IActionResult> GetAsync()
{
    _logger.Log(LogLevel.Information, "Trying to fetch the list of employees from cache.");

    if (_cache.TryGetValue(employeeListCacheKey, out IEnumerable<Employee>? employees))
    {
        _logger.Log(LogLevel.Information, "Employee list found in cache.");
    }
    else
    {
        try
        {
            await semaphore.WaitAsync();

            if (_cache.TryGetValue(employeeListCacheKey, out employees))
            {
                _logger.Log(LogLevel.Information, "Employee list found in cache.");
            }
            else
            {
                _logger.Log(LogLevel.Information, "Employee list not found in cache. Fetching from database.");

                employees = _dataRepository.GetAll();

                // Expiration values are examples; tune them to your workload.
                var cacheEntryOptions = new DistributedCacheEntryOptions()
                    .SetSlidingExpiration(TimeSpan.FromSeconds(60))
                    .SetAbsoluteExpiration(TimeSpan.FromSeconds(3600));

                await _cache.SetAsync(employeeListCacheKey, employees, cacheEntryOptions);
            }
        }
        finally
        {
            semaphore.Release();
        }
    }

    return Ok(employees);
}

The above code first looks for the requested data in the cache. If it is not available, the app fetches the data from the database, stores it in the cache, and then returns it. The semaphore is acquired and released around the database call to ensure that only one request repopulates the cache at a time.

In addition, the expiration is also configured using the absolute and sliding expiration values.
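The two expiration settings interact in a specific way: sliding expiration is refreshed on every read, but the absolute expiration still caps the entry's total lifetime. The sketch below models that rule with plain DateTime arithmetic rather than the caching library itself (the 60-second and 3600-second values are examples):

```csharp
using System;

var slidingWindow = TimeSpan.FromSeconds(60);      // example sliding expiration
var absoluteLifetime = TimeSpan.FromSeconds(3600); // example absolute expiration

var createdAt = DateTime.UtcNow;
var lastAccess = createdAt;

// An entry expires when EITHER limit is hit, whichever comes first.
bool IsExpired(DateTime now) =>
    now - lastAccess > slidingWindow || now - createdAt > absoluteLifetime;

// Reads inside the sliding window keep the entry alive...
lastAccess = createdAt.AddSeconds(50);
Console.WriteLine(IsExpired(createdAt.AddSeconds(100))); // False

// ...but no amount of activity can outlive the absolute limit.
lastAccess = createdAt.AddSeconds(3599);
Console.WriteLine(IsExpired(createdAt.AddSeconds(3601))); // True
```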

Step 4: Configure the POST action with the following source code.

public IActionResult Post([FromBody] Employee employee)
{
    if (employee == null)
    {
        return BadRequest("Employee is null.");
    }

    _dataRepository.Add(employee);       // persist the new employee (assumed repository method)
    _cache.Remove(employeeListCacheKey); // evict the stale cached list

    return new ObjectResult(employee) { StatusCode = (int)HttpStatusCode.Created };
}
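A write must keep the cache consistent: the common pattern is to persist the entity and then evict the cached list so the next read repopulates it from the database. The sketch below demonstrates that invalidation flow with a plain dictionary standing in for the cache and a list standing in for the database (all names are illustrative):

```csharp
using System;
using System.Collections.Generic;

var database = new List<string> { "Jane" };            // stand-in for the data store
var cache = new Dictionary<string, List<string>>();    // stand-in for the distributed cache

List<string> GetAll()
{
    if (cache.TryGetValue("employeeList", out var hit)) return hit; // cache hit
    var fromDb = new List<string>(database);           // simulated database read
    cache["employeeList"] = fromDb;                    // repopulate the cache
    return fromDb;
}

void Add(string employee)
{
    database.Add(employee);                            // persist the write
    cache.Remove("employeeList");                      // evict the stale list
}

GetAll();                          // first read populates the cache
Add("John");                       // write + invalidate
Console.WriteLine(GetAll().Count); // 2
```

Without the eviction, readers would keep seeing the old one-employee list until the entry expired on its own.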

Step 5: Test the distributed cache implementation in the .NET application. Navigate to “/api/employee” to start testing; it will produce output similar to the following.

info: Trying to fetch the list of employees from cache.
info: Employee list not found in cache. Fetching from database.

The first API call will take longer, as the data has to be fetched from the data store, and you will see an output similar to the following snippet.

Status: 200 OK  Time: 4.89 s  Size: 591 B

Again, test the distributed caching by sending another API request.

info: Trying to fetch the list of employees from cache.
info: Employee list found in cache.

This time, the time required to retrieve details will be less, as the data is available in the cache.

Status: 200 OK  Time: 31 ms  Size: 591 B

As you can see, the first GET request took 4.89 seconds to return the data, but once the data was available in the cache, the same operation took only about 31 milliseconds.

However, the process is not yet complete. We still need to configure the Redis cache so multiple servers can share it.

Step 6: Open the Azure portal and create an Azure Cache for Redis service. Leave the settings at their defaults and enter a DNS name for the cache.

In addition, choose a plan according to your budget and navigate to the “Access Keys” menu to copy the connection string. Then open the “appsettings.json” file and paste the copied string into it. Lastly, open the file containing the “Program” class and register the Redis cache there using the “AddStackExchangeRedisCache()” method.

builder.Services.AddStackExchangeRedisCache(options =>
{
    // "Redis" is an example key; use whatever name you gave the
    // connection string in appsettings.json.
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
    options.InstanceName = "EmployeeApi_"; // example prefix applied to cache keys
});
Wrapping Up

Distributed caching is an excellent way to let multiple applications use a single cache service. It helps optimize performance and reduces the time spent fetching data from the database. To implement distributed caching in a .NET Core application, the “IDistributedCache” interface is used with a Redis cache: first the interface-based extensions are implemented, and then Redis is configured on the Azure platform. As a result, you improve the application's speed and scalability.

Parag Mehta

Verified Expert in Software & Web App Engineering

Parag Mehta, the CEO and Founder of Positiwise Software Pvt Ltd, has extensive knowledge of the development niche. He implements custom strategies to craft highly appealing and robust applications for his clients and supports employees in growing and excelling at their tasks. He is a consistent learner and always delivers best-in-quality solutions, accelerating productivity.
