Caching Fundamentals: Why, When, and How
Understand caching layers in .NET: in-memory cache, distributed cache (Redis), cache-aside pattern, TTL, cache invalidation, and when not to cache.
Caching stores results of expensive operations so the next request gets the result from memory instead of recomputing or re-querying the database.
Why Cache?
- Database queries – many APIs spend the bulk of their request time waiting on the database
- External API calls – rate limits, latency, per-call costs
- Expensive computations – reports, aggregations
- Static or slowly changing data – product catalogs, config, translations
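The payoff is easy to estimate with a quick expected-latency calculation (the costs and hit ratio below are illustrative assumptions, not measurements):

```csharp
// Expected latency = hitRatio * hitCost + (1 - hitRatio) * missCost
// Assumed costs: in-memory hit ~1 ms, DB query ~50 ms, 90% hit ratio
double hitMs = 1.0, missMs = 50.0, hitRatio = 0.9;
double effectiveMs = hitRatio * hitMs + (1 - hitRatio) * missMs;
Console.WriteLine($"Effective latency: {effectiveMs:F1} ms"); // 5.9 ms vs 50 ms uncached
```

Even a modest hit ratio moves average latency most of the way toward the cache's cost, which is why the data sources above are such good candidates.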
The Cache-Aside Pattern
This is the most common pattern: the application manages the cache manually.
1. Check cache for key
2. If found (cache hit) → return the cached value
3. If not found (cache miss):
   a. Fetch from source (DB, API)
   b. Store in cache with a TTL
   c. Return the value

```csharp
public async Task<Product?> GetProductAsync(int id)
{
    var cacheKey = $"product:{id}";

    // 1. Check cache
    if (_cache.TryGetValue(cacheKey, out Product? cached))
        return cached;

    // 2. Cache miss → fetch from DB
    var product = await _db.Products.FindAsync(id);
    if (product is null) return null;

    // 3. Store in cache for 5 minutes
    _cache.Set(cacheKey, product, TimeSpan.FromMinutes(5));
    return product;
}
```

In-Memory Cache (IMemoryCache)
Best for single-server apps. Lost on restart.
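The manual check/fetch/store sequence from the cache-aside example can also be collapsed into a single call with the `GetOrCreateAsync` extension (a sketch, reusing the `_cache` and `_db` fields and the product `id` from the earlier example):

```csharp
// GetOrCreateAsync checks the cache, runs the factory only on a miss,
// stores the result with the options set on the entry, and returns it
var product = await _cache.GetOrCreateAsync($"product:{id}", async entry =>
{
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
    return await _db.Products.FindAsync(id);
});
```

One caveat worth checking in your version: a null factory result is cached too, so guard for missing rows if that matters.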
```csharp
// Register
builder.Services.AddMemoryCache();

// Inject (primary constructor)
public class ProductService(IMemoryCache cache) { }

// Basic set/get
cache.Set("key", value, TimeSpan.FromMinutes(10));
cache.TryGetValue("key", out MyType? result);
cache.Remove("key");

// With size limit and priority
var options = new MemoryCacheEntryOptions
{
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
    SlidingExpiration = TimeSpan.FromMinutes(2), // reset TTL on access
    Priority = CacheItemPriority.Normal,
    Size = 1 // requires setting SizeLimit on the cache
};
cache.Set("key", value, options);
```

Distributed Cache (IDistributedCache)
Works across multiple servers. Survives restarts. Use Redis in production.
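The registration below reads a connection string named "Redis". In appsettings.json that looks like the following (host and port are placeholders for your Redis endpoint):

```json
{
  "ConnectionStrings": {
    "Redis": "localhost:6379"
  }
}
```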
```shell
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
```

```csharp
// Register Redis
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
});

// Use (same interface as other IDistributedCache providers, so the backing store is easy to swap)
public class ProductService(IDistributedCache cache, AppDbContext db) // AppDbContext = your EF Core context
{
    public async Task<Product?> GetAsync(int id)
    {
        var key = $"product:{id}";

        var json = await cache.GetStringAsync(key);
        if (json is not null)
            return JsonSerializer.Deserialize<Product>(json);

        var product = await db.Products.FindAsync(id);
        if (product is null) return null;

        await cache.SetStringAsync(key,
            JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });
        return product;
    }
}
```

Cache Invalidation
The hardest part of caching. You need to remove stale data when the source changes.
```csharp
// Remove on update
public async Task UpdateProductAsync(Product product)
{
    _db.Products.Update(product);
    await _db.SaveChangesAsync();
    await _cache.RemoveAsync($"product:{product.Id}"); // invalidate
}

// Pattern: invalidate a whole key family at once,
// e.g. "product:1", "product:2", "product:list".
// IDistributedCache has no delete-by-prefix (key scans are Redis-specific),
// so use a version-tag approach:
public async Task InvalidateProductsAsync()
{
    var version = await _cache.GetStringAsync("products:version") ?? "1";
    var newVersion = (int.Parse(version) + 1).ToString();
    await _cache.SetStringAsync("products:version", newVersion);
    // Keys include the version ("products:v2:list"), so bumping it
    // orphans every old entry; they simply age out via TTL
}
```

TTL Strategies
| Strategy | When to use |
|----------|-------------|
| Short TTL (1-5 min) | Frequently updated data |
| Long TTL (1-24 h) | Reference data, product catalog |
| Sliding expiration | Session-like data; TTL resets on access |
| No expiration | Static data with explicit invalidation |
| Never cache | Personalized, financial, real-time data |
When NOT to Cache
- User-specific data without per-user keys
- Financial balances or inventory counts (stale = wrong)
- Sensitive PII (cache eviction isn't guaranteed)
- Data that changes on every request
- Small data sets where the DB is just as fast

Key Takeaways
- Cache-aside is the default pattern: check the cache; on a miss, fetch and store
- IMemoryCache for single-server apps; IDistributedCache with Redis for scaled deployments
- Always set a TTL; unbounded caches eventually OOM your server
- Cache invalidation is hard; prefer short TTLs over complex invalidation logic
- Cache your most-read, least-changed data: product catalog, user profiles, configuration