Backend Systems · Beginner

Caching Fundamentals — Why, When, and How

Understand caching layers in .NET: in-memory cache, distributed cache (Redis), cache-aside pattern, TTL, cache invalidation, and when not to cache.

Asma Hafeez · April 17, 2026 · 4 min read
dotnet · caching · redis · performance · aspnet-core

Caching Fundamentals

Caching stores results of expensive operations so the next request gets the result from memory instead of recomputing or re-querying the database.


Why Cache?

  • Database queries — many APIs spend most of their request time waiting on the DB
  • External API calls — rate limits, latency, costs
  • Expensive computations — reports, aggregations
  • Static or slowly changing data — product catalog, config, translations

The Cache-Aside Pattern

This is the most common pattern — the application manages the cache manually.

1. Check cache for key
2. If found (cache hit) → return cached value
3. If not found (cache miss):
   a. Fetch from source (DB, API)
   b. Store in cache with TTL
   c. Return value
C#
public async Task<Product?> GetProductAsync(int id)
{
    var cacheKey = $"product:{id}";

    // 1. Check cache
    if (_cache.TryGetValue(cacheKey, out Product? cached))
        return cached;

    // 2. Cache miss — fetch from DB
    var product = await _db.Products.FindAsync(id);
    if (product is null) return null;

    // 3. Store in cache for 5 minutes
    _cache.Set(cacheKey, product, TimeSpan.FromMinutes(5));
    return product;
}

In-Memory Cache (IMemoryCache)

Best for single-server apps. Lost on restart.

C#
// Register
builder.Services.AddMemoryCache();

// Inject
public class ProductService(IMemoryCache cache) { }

// Basic set/get
cache.Set("key", value, TimeSpan.FromMinutes(10));
cache.TryGetValue("key", out MyType? result);
cache.Remove("key");

// With size limit and priority
var options = new MemoryCacheEntryOptions
{
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
    SlidingExpiration = TimeSpan.FromMinutes(2),  // reset TTL on access
    Priority = CacheItemPriority.Normal,
    Size = 1  // requires setting SizeLimit on the cache
};
cache.Set("key", value, options);
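The manual check-miss-fetch-store dance from the cache-aside example can be collapsed with the built-in `GetOrCreateAsync` extension, which runs the factory only on a miss and caches its result. A minimal sketch — `LoadProductFromDbAsync` is a hypothetical loader standing in for your DB call:

```csharp
// The factory runs only when the key is absent; its result is cached.
var product = await cache.GetOrCreateAsync($"product:{id}", entry =>
{
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
    return LoadProductFromDbAsync(id);  // hypothetical DB loader
});
```

This removes the subtle bug class where you forget to set a TTL on one of several manual `Set` call sites.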

Distributed Cache (IDistributedCache)

Works across multiple servers. Survives restarts. Use Redis in production.

Bash
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
C#
// Register Redis
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
});

// Use (works with same interface — easy to swap)
public class ProductService(IDistributedCache cache)
{
    public async Task<Product?> GetAsync(int id)
    {
        var key  = $"product:{id}";
        var json = await cache.GetStringAsync(key);

        if (json is not null)
            return JsonSerializer.Deserialize<Product>(json);

        var product = await _db.Products.FindAsync(id);
        if (product is null) return null;

        await cache.SetStringAsync(key,
            JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });

        return product;
    }
}

Cache Invalidation

The hardest part of caching. You need to remove stale data when the source changes.

C#
// Remove on update
public async Task UpdateProductAsync(Product product)
{
    _db.Products.Update(product);   // attach and mark modified
    await _db.SaveChangesAsync();
    await _cache.RemoveAsync($"product:{product.Id}");  // invalidate after the write commits
}

// Pattern: invalidate by prefix (requires Redis)
// All product-related keys: "product:1", "product:2", "product:list"
// Use a version tag approach:
public async Task InvalidateProductsAsync()
{
    var version = await _cache.GetStringAsync("products:version") ?? "1";
    var newVersion = (int.Parse(version) + 1).ToString();
    await _cache.SetStringAsync("products:version", newVersion);
    // Keys now include version: "products:v2:list"
}
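On the read side, the current version becomes part of every key, so bumping it orphans all the old entries at once — they simply age out via their TTLs. A sketch of the key construction; `BuildProductListKeyAsync` is a hypothetical helper, not an API from this post:

```csharp
// Reads look up the current version and embed it in the key.
// After InvalidateProductsAsync bumps "products:version" to 2,
// readers ask for "products:v2:list" and old "v1" keys are never hit again.
public async Task<string> BuildProductListKeyAsync()
{
    var version = await _cache.GetStringAsync("products:version") ?? "1";
    return $"products:v{version}:list";
}
```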

TTL Strategies

| Strategy | When to use |
|----------|-------------|
| Short TTL (1-5 min) | Frequently updated data |
| Long TTL (1-24h) | Reference data, product catalog |
| Sliding expiration | Session-like data — reset on access |
| No expiration | Static data with explicit invalidation |
| Never cache | Personalized, financial, real-time data |


When NOT to Cache

✗ User-specific data without per-user keys
✗ Financial balances or inventory counts (stale = wrong)
✗ Sensitive PII (cache eviction isn't guaranteed)
✗ Data that changes on every request
✗ Small sets where DB is just as fast
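The first point is really about key design rather than caching itself: user-specific data is cacheable as long as the user ID is baked into the key. A sketch — the key layout here is an illustrative assumption, not a convention from this post:

```csharp
// Shared key — wrong for per-user data: every user would see
// whichever user's dashboard was cached first.
var sharedKey = "dashboard";

// Per-user key — each user gets an isolated cache entry.
var userKey = $"dashboard:user:{userId}";
```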

Key Takeaways

  1. Cache-aside is the default pattern — check cache, on miss fetch + store
  2. IMemoryCache for single-server; IDistributedCache with Redis for scaled deployments
  3. Always set a TTL — unbounded caches eventually OOM your server
  4. Cache invalidation is hard — prefer short TTLs over complex invalidation logic
  5. Cache your most-read, least-changed data: product catalog, user profiles, configuration
