Caching is a critical technique for enhancing the performance and scalability of applications by temporarily storing frequently accessed data. In .NET Core, Microsoft provides several caching mechanisms to suit different application architectures and requirements.

1. In-Memory Caching

Description:

In-memory caching stores data within the application’s memory, making it ideal for single-server applications where data consistency across multiple instances isn’t a concern.

Implementation:
// Register in-memory caching services in the DI container
services.AddMemoryCache();
public class MyService
{
    private readonly IMemoryCache _cache;

    // Constructor injection of the IMemoryCache service
    public MyService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public string GetData()
    {
        // Try to get the value from the cache
        if (!_cache.TryGetValue("myKey", out string value))
        {
            // If the key is not found in the cache, set a new value
            value = "Expensive data";

            // Set the value in the cache with an expiration time of 5 minutes
            _cache.Set("myKey", value, TimeSpan.FromMinutes(5));
        }

        // Return the cached or newly created value
        return value;
    }
}
When to Use:
  • Single-server applications.
  • Data that doesn’t need to persist beyond the application’s lifetime.
Limitations:
  • Not suitable for multi-server or cloud-based applications, because each instance keeps its own independent copy of the cached data.
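
For finer-grained control over eviction, IMemoryCache also offers the GetOrCreateAsync extension, where expiration can be set directly on the cache entry. Below is a minimal sketch of that pattern; the ReportService class, the "reportKey" key, and the expiration values are illustrative assumptions, not part of the example above.
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class ReportService
{
    private readonly IMemoryCache _cache;

    public ReportService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public async Task<string?> GetReportAsync()
    {
        // GetOrCreateAsync runs the factory only on a cache miss
        return await _cache.GetOrCreateAsync("reportKey", async entry =>
        {
            // Evict 30 minutes after creation, or after 5 idle minutes, whichever comes first
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30);
            entry.SlidingExpiration = TimeSpan.FromMinutes(5);

            // Placeholder for an expensive operation (e.g. a database or API call)
            await Task.Delay(100);
            return "Expensive report data";
        });
    }
}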

2. Distributed Caching

Description:

Distributed caching stores data in an external system accessible by multiple application instances, essential for applications running in cloud environments or across multiple servers.

Common Providers:
  • Redis: A high-performance, open-source in-memory data store.
  • SQL Server: Uses a relational database for caching.
Implementation (Redis Example):
// Register Redis distributed caching in the DI container
services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";      // Redis server address
    options.InstanceName = "SampleInstance";       // Optional instance prefix
});
public class MyService
{
    private readonly IDistributedCache _cache;

    // Constructor injection of the IDistributedCache service
    public MyService(IDistributedCache cache)
    {
        _cache = cache;
    }

    public async Task<string> GetDataAsync()
    {
        // Attempt to retrieve the value from the distributed cache
        var value = await _cache.GetStringAsync("myKey");

        if (value == null)
        {
            // If not found, simulate an expensive operation
            value = "Expensive data";

            // Store the value in cache with a 5-minute expiration
            await _cache.SetStringAsync("myKey", value, new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });
        }

        return value;
    }
}
When to Use:
  • Multi-server or cloud-based applications.
  • Data that needs to be shared across different instances.
Limitations:
  • Introduces network latency.
  • Requires external infrastructure.
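
Because IDistributedCache only stores strings or byte arrays, complex objects must be serialized before caching. Below is a minimal sketch using System.Text.Json for that step; the Product record, the key format, and the expiration value are illustrative assumptions rather than part of the Redis example above.
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public record Product(int Id, string Name);

public class ProductCache
{
    private readonly IDistributedCache _cache;

    public ProductCache(IDistributedCache cache)
    {
        _cache = cache;
    }

    public async Task<Product?> GetProductAsync(int id)
    {
        var key = $"product:{id}";
        var json = await _cache.GetStringAsync(key);

        if (json != null)
        {
            // Cache hit: deserialize the stored JSON back into an object
            return JsonSerializer.Deserialize<Product>(json);
        }

        // Cache miss: load the product (stubbed here), then cache it for 10 minutes
        var product = new Product(id, "Sample product");
        await _cache.SetStringAsync(key, JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
            });

        return product;
    }
}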

3. Response Caching

Description:

Response caching stores the entire HTTP response, including headers and content, to serve future requests more efficiently.

Implementation:
// Register response caching services in the DI container
services.AddResponseCaching();

// Add the middleware to the request pipeline so matching responses are cached on the server
app.UseResponseCaching();
// Cache the response for 60 seconds at any location (client/proxy/server)
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any)]
public IActionResult Get()
{
    // This response will be cached and served for 60 seconds without re-processing
    return Ok("Cached response");
}
When to Use:
  • APIs or pages with content that doesn’t change frequently.
  • Public content that can be cached across different users.
Limitations:
  • Not suitable for personalized or dynamic content.
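
In addition to Duration and Location, the [ResponseCache] attribute can vary cache entries by query string via VaryByQueryKeys, which requires the response caching middleware to be in the pipeline. The sketch below shows a minimal end-to-end setup; the ProductsController and the "category" parameter are illustrative names, and it assumes the .NET 6+ minimal hosting model with implicit usings.
using Microsoft.AspNetCore.Mvc;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
builder.Services.AddResponseCaching();

var app = builder.Build();

// The middleware enables server-side caching; VaryByQueryKeys requires it.
// Without it, [ResponseCache] only emits Cache-Control headers for clients and proxies.
app.UseResponseCaching();
app.MapControllers();
app.Run();

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    // One cache entry per distinct "category" value, each valid for 60 seconds
    [HttpGet]
    [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any,
                   VaryByQueryKeys = new[] { "category" })]
    public IActionResult GetByCategory(string category)
    {
        return Ok($"Products in category '{category}'");
    }
}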

4. Output Caching (ASP.NET Core 7+)

Description:

Output caching stores the result of an action method, allowing subsequent requests to be served without re-executing the action.

Implementation:
// Register output caching services and define a custom policy
services.AddOutputCache(options =>
{
    options.AddPolicy("Default", policy => policy
        .Expire(TimeSpan.FromMinutes(5))         // Cache duration
        .SetVaryByQuery("id"));                  // Vary the cache by query parameter "id"
});

// Add the middleware to the request pipeline so output is actually stored
app.UseOutputCache();
// Apply the "Default" output cache policy to this endpoint
[OutputCache(PolicyName = "Default")]
public IActionResult Get(int id)
{
    // Response will be cached separately for each value of "id"
    return Ok($"Data for {id}");
}
When to Use:
  • ASP.NET Core 7+ applications with responses that are expensive to generate.
  • Endpoints whose output varies by a small, known set of parameters (such as a query string value).
Limitations:
  • Requires ASP.NET Core 7 or later.
  • Not suitable for highly dynamic or personalized content.
  • The built-in store is in-memory, so cached output is not shared across servers unless a custom cache store is configured.
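
Output caching also works with minimal API endpoints via the CacheOutput extension, and cached entries can be evicted on demand through IOutputCacheStore tags. The sketch below assumes ASP.NET Core 7+ with the minimal hosting model and implicit usings; the route paths, the "Products" policy name, and the "products" tag are illustrative.
using Microsoft.AspNetCore.OutputCaching;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    options.AddPolicy("Products", policy => policy
        .Expire(TimeSpan.FromMinutes(5))   // Cache duration
        .SetVaryByQuery("id")              // Vary the cache by the "id" query parameter
        .Tag("products"));                 // Tag entries so they can be evicted as a group
});

var app = builder.Build();

// The middleware must run for output to be stored and served from the cache
app.UseOutputCache();

// "id" binds from the query string, matching the SetVaryByQuery setting above
app.MapGet("/products", (int id) => Results.Ok($"Data for {id}"))
   .CacheOutput("Products");

// Evict every cached entry tagged "products", e.g. after a write operation
app.MapPost("/products", async (IOutputCacheStore store, CancellationToken ct) =>
{
    await store.EvictByTagAsync("products", ct);
    return Results.NoContent();
});

app.Run();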

Conclusion

Selecting the appropriate caching strategy depends on your application’s architecture and requirements:

  • In-Memory Caching: Best for single-server applications with non-critical data.
  • Distributed Caching: Ideal for cloud-based or multi-server applications requiring shared data access.
  • Response Caching: Suitable for APIs or pages with static content to improve performance.
  • Output Caching: Beneficial for ASP.NET Core 7+ applications with expensive-to-generate content.

By leveraging these caching strategies, you can significantly enhance your application’s performance and scalability.
