The Simplified Tech
Caching Strategies: Redis, Memcached, and Cache Invalidation

"There are only two hard problems in computer science: cache invalidation and naming things." — Phil Karlton

🎯Key Takeaways
Caching eliminates database load for hot data — 80% of traffic typically hits 20% of data.
Cache-Aside (Lazy Loading) is the production default: check cache → miss → load DB → populate cache.
Always set TTL on cache entries. Redis without TTL = memory leak.
Cache invalidation: delete on write for critical data; TTL-based expiry for eventually-consistent data.
Redis supports Strings, Hashes, Lists, Sets, Sorted Sets — use the right data structure for the problem.
Redis is in-memory: not a database replacement. Always use with a durable source of truth.


Why Caches Exist: The Speed of Electrons vs. the Speed of Disks

Twitter's homepage once required 800 database queries to render. After a Redis caching layer was added, most requests touched the database zero times. That is the power of caching, and it is how Twitter could serve 500 million tweets per day on commodity hardware.

| Storage Layer | Latency | Throughput | Use Case |
| --- | --- | --- | --- |
| CPU L1 cache | 0.5 ns | ∞ | In-process variables |
| RAM (in-process cache) | 100 ns | ∞ | Application memory cache |
| SSD (local) | 150 µs | 10k IOPS | Write-through cache |
| Redis (same datacenter) | 0.1–1 ms | 100k+ ops/sec | Distributed sessions, hot data |
| Database (network query) | 5–50 ms | 1k–10k QPS | Source of truth |
| Database (cross-region) | 50–200 ms | 1k QPS | Disaster-recovery reads |

The 80/20 Rule of Caching

80% of traffic typically hits 20% of data. Caching that hot 20% eliminates 80% of database load. You don't need to cache everything — just the hot path.

Cache Patterns: How Data Gets In and Out

| Pattern | How It Works | Pros | Cons |
| --- | --- | --- | --- |
| Cache-Aside (Lazy Loading) | App checks the cache first; on a miss, loads from the DB and populates the cache | Only caches what is actually requested; resilient to cache failure | Cache-miss penalty; potential stale data; thundering herd on cold start |
| Write-Through | App writes to cache and DB synchronously on every write | Cache always has fresh data; no stale reads | Higher write latency; caches data that may never be read |
| Write-Behind (Write-Back) | App writes to the cache immediately; DB is updated asynchronously | Lowest write latency | Data loss if the cache fails before the DB write |
| Read-Through | Cache loads from the DB automatically on a miss (middleware pattern) | Simpler app code; consistent cache population | Less control; requires a cache-aware library |
| Refresh-Ahead | Cache proactively refreshes hot data before it expires | No miss penalty for hot keys | More complex; may keep refreshing data nobody reads anymore |
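The first two rows are the ones most worth internalizing. Here is a minimal sketch of the difference, using plain Maps as stand-ins for Redis and the database so it runs with no infrastructure; all names here are illustrative, not from the lesson:

```typescript
// In-memory stand-ins for the cache and the source-of-truth database.
const cache = new Map<string, string>();
const db = new Map<string, string>();

// Cache-Aside: the app populates the cache only on a read miss.
function readAside(key: string): string | undefined {
  const hit = cache.get(key);
  if (hit !== undefined) return hit;        // fast path: no DB touched
  const value = db.get(key);                // miss: go to the source of truth
  if (value !== undefined) cache.set(key, value); // populate for next time
  return value;
}

function writeAside(key: string, value: string): void {
  db.set(key, value);
  cache.delete(key); // invalidate; the next read refills from the DB
}

// Write-Through: every write updates the cache AND the DB synchronously,
// so reads are never stale, but every write pays the double cost.
function writeThrough(key: string, value: string): void {
  db.set(key, value);
  cache.set(key, value);
}
```

Note how cache-aside only ever caches keys that were read, while write-through caches every key written, read or not; that is exactly the tradeoff in the table.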

Cache-Aside Is the Production Default

Cache-Aside (Lazy Loading) is the most common production pattern because it is simple, requires no special infrastructure, and degrades gracefully: if the cache is down, reads fall through to the database. Use it as your default, and reach for the more complex patterns only when you have a specific need they solve.

cache-aside.ts

```ts
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL);

// Cache-Aside pattern with TTL and error handling
async function getUser(id: string): Promise<User> {
  const cacheKey = `user:${id}`;

  // 1. Check cache first: the fast path returns without touching the DB
  const cached = await redis.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  // 2. Cache miss: load from the database
  const user = await db.users.findById(id);
  if (!user) throw new Error('User not found');

  // 3. Populate the cache with a TTL. Always set one: Redis with no TTL
  //    is a memory leak. 1 hour is reasonable for user profiles.
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}

// Invalidate on write: simpler and safer than updating the cached value
async function updateUser(id: string, data: Partial<User>): Promise<User> {
  const user = await db.users.update(id, data);

  // Delete the cache entry; the next read goes to the DB and refills it
  await redis.del(`user:${id}`);

  return user;
}

// Thundering herd prevention with a distributed lock. Without it, 1000
// simultaneous cache misses become 1000 identical DB queries.
async function getUserWithLock(id: string): Promise<User> {
  const cacheKey = `user:${id}`;
  const lockKey = `lock:user:${id}`;

  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Only one process refills the cache at a time.
  // NX = set only if the key does Not eXist; EX 5 = the lock expires after
  // 5 seconds, so a crashed lock holder cannot deadlock everyone else.
  const acquired = await redis.set(lockKey, '1', 'EX', 5, 'NX');

  if (!acquired) {
    // Another process is loading: wait briefly and retry
    await new Promise((r) => setTimeout(r, 50));
    return getUserWithLock(id);
  }

  try {
    const user = await db.users.findById(id);
    if (!user) throw new Error('User not found');
    await redis.setex(cacheKey, 3600, JSON.stringify(user));
    return user;
  } finally {
    await redis.del(lockKey);
  }
}
```

Redis: Much More Than a Cache

Redis Data Structures for Backend Engineers

  • 📦 String — GET/SET. Session storage, simple caching, counters (INCR is atomic).
  • 📋 Hash — HGET/HSET. User profile objects. More memory-efficient than separate string keys.
  • 📝 List — LPUSH/RPOP. Job queues, activity feeds, recent items (bounded with LTRIM).
  • 🎯 Set — SADD/SMEMBERS. Unique visitors, tag membership, social graphs (SINTER for mutual friends).
  • 🏆 Sorted Set (ZSet) — ZADD/ZRANGE. Leaderboards, sliding-window rate limiting, delayed job scheduling.
  • 🔴 Pub/Sub — PUBLISH/SUBSCRIBE. Real-time notifications. Not durable — messages are lost if a subscriber disconnects.
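To make the Sorted Set bullet concrete: the classic sliding-window rate limiter stores one timestamp per request as a score (ZADD), drops entries older than the window (ZREMRANGEBYSCORE), and counts what remains (ZCARD). Here is a sketch of that same logic against a plain in-memory array with an injected clock, so it runs without a Redis server; the names are illustrative:

```typescript
// Sliding-window rate limiter: allow at most `limit` requests per `windowMs`.
// Mirrors the Redis Sorted Set recipe: evict old scores, count, add the new one.
function makeLimiter(limit: number, windowMs: number) {
  const hits: number[] = []; // stands in for the Sorted Set's scores, oldest first

  return function allow(now: number): boolean {
    // Drop timestamps that fell out of the window
    // (ZREMRANGEBYSCORE key -inf (now - windowMs) in Redis terms).
    while (hits.length > 0 && hits[0] <= now - windowMs) hits.shift();

    if (hits.length >= limit) return false; // ZCARD >= limit: reject
    hits.push(now);                         // ZADD key now now: admit
    return true;
  };
}
```

In production the eviction, count, and add would run atomically (e.g. in a Lua script or MULTI block) so concurrent requests cannot slip past the limit between steps.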

Redis Is Not a Database

Redis stores data in RAM. Without RDB/AOF persistence configured, data is lost on restart. Never use Redis as the sole data store for critical data. Always use Redis as a cache or ephemeral store, with a real database as the source of truth.
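If a Redis instance does hold data you would rather not lose on restart, persistence is opt-in via redis.conf. These are standard Redis directives; the values are a reasonable starting point, not a recommendation for every workload:

```
# AOF: append every write to a log, fsync once per second
appendonly yes
appendfsync everysec

# RDB: also snapshot to disk if at least 1 key changed in 900 seconds
save 900 1
```

Even with both enabled, treat Redis as a cache in front of a durable database, as described above.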

Cache Invalidation: The Hard Problem

| Data Type | Acceptable Staleness | Strategy | TTL |
| --- | --- | --- | --- |
| User profile | 5 minutes OK | TTL + invalidate on update | 300s |
| Product catalog | 1 hour OK | TTL-based | 3600s |
| User session | Must be live | Short TTL + sliding expiry | 900s (refreshed on activity) |
| Account balance | Never stale | No cache, or write-through with instant invalidation | 0 (no cache) |
| Product inventory | Seconds OK | Short TTL + explicit invalidation on purchase | 30s |
| Leaderboard rankings | Minutes OK | Redis Sorted Set as cache (atomic updates) | 600s |
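The "sliding expiry" strategy in the session row means the TTL is reset on every access, so active sessions stay alive while idle ones age out. In Redis this is a GET followed by EXPIRE; here is the same idea sketched against an in-memory store with an injected clock so it runs standalone (names are illustrative):

```typescript
// Session store with sliding expiry: every successful read pushes the
// deadline out again, so only idle sessions ever actually expire.
function makeSessions(ttlMs: number) {
  const expiry = new Map<string, number>(); // session id -> absolute deadline

  return {
    set(id: string, now: number): void {
      expiry.set(id, now + ttlMs);
    },
    // Returns true if the session is live, and slides its TTL forward
    // (the GET + EXPIRE pair in Redis terms).
    touch(id: string, now: number): boolean {
      const deadline = expiry.get(id);
      if (deadline === undefined || deadline <= now) {
        expiry.delete(id); // expired: behave like a cache miss
        return false;
      }
      expiry.set(id, now + ttlMs); // sliding: refresh the TTL on activity
      return true;
    },
  };
}
```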

The Practical Rule

Use TTL-based expiry when stale-for-N-seconds is acceptable. Use explicit deletion for data where stale reads cause correctness issues (financial balances, inventory). Never cache financial data that must be real-time.

How this might come up in interviews

Caching questions are almost universal in backend interviews. Demonstrate you understand not just "add Redis" but specific patterns, invalidation strategies, and failure modes.

Common questions:

  • Describe the cache-aside pattern and when you would use it
  • How do you handle cache invalidation when data is updated?
  • What is the thundering herd problem and how would you solve it?
  • Design a rate limiter using Redis sorted sets

Strong answers include:

  • Discusses write-through vs cache-aside tradeoffs
  • Mentions TTL selection based on data characteristics
  • Knows Redis sorted sets for rate limiting windows
  • Mentions cache warming strategies for new deployments

Red flags:

  • "Just cache everything" — no understanding of invalidation
  • Doesn't know TTL or thinks Redis is a database
  • No understanding of thundering herd or cache warming



From the books

Designing Data-Intensive Applications — Martin Kleppmann (2017)

Chapter 5: Replication

Caches introduce eventual consistency. Understanding when stale reads are acceptable vs when they cause correctness issues is the key to good caching design.
