Caching
**Caching** stores copies of frequently accessed data in a faster storage layer so future requests can be served with lower latency and less load on the database.
**Cache levels:**
- **Client-side**: Browser cache, mobile app cache
- **CDN**: Edge servers cache static content globally
- **Application**: In-memory cache (Redis, Memcached)
- **Database**: Query cache, buffer pool
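At the application level, the core idea can be sketched with a minimal in-memory cache that expires entries after a TTL. This is an illustrative stand-in for Redis or Memcached, not their API; the class and method names are made up for the example.

```python
import time

class TTLCache:
    """Minimal application-level in-memory cache with per-entry TTL (illustrative)."""

    def __init__(self, default_ttl=60.0):
        self.default_ttl = default_ttl
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl=None):
        expires_at = time.monotonic() + (ttl if ttl is not None else self.default_ttl)
        self._store[key] = (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict expired entries on read
            return None
        return value

cache = TTLCache(default_ttl=0.05)
cache.set("user:1", {"name": "Ada"})
print(cache.get("user:1"))  # within TTL: cache hit
time.sleep(0.06)
print(cache.get("user:1"))  # TTL elapsed: entry expired, returns None
```

Real caches add eviction under memory pressure, serialization, and network access, but the get/set-with-expiry contract is the same.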
**Caching strategies:**
- **Cache-aside (Lazy loading)**: App checks cache → miss → read DB → populate cache
- **Write-through**: Write to cache + DB simultaneously. Consistent but slower.
- **Write-behind**: Write to cache, async write to DB. Fast but risk of data loss.
- **Read-through**: Cache automatically loads from DB on miss.
- **Refresh-ahead**: Proactively refresh before expiry.
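The cache-aside flow above can be sketched in a few lines. Plain dicts stand in for the real cache and database, and `db_query` is a hypothetical helper, but the check → miss → read → populate sequence is the pattern itself.

```python
db = {"user:1": {"name": "Ada"}}  # stand-in for the database
cache = {}                        # stand-in for the cache (e.g., Redis)

def db_query(key):
    """Pretend database read (illustrative helper)."""
    return db.get(key)

def get_with_cache_aside(key):
    value = cache.get(key)    # 1. application checks the cache first
    if value is not None:
        return value          #    cache hit: done
    value = db_query(key)     # 2. miss: fall back to the database
    if value is not None:
        cache[key] = value    # 3. populate the cache for later requests
    return value

get_with_cache_aside("user:1")  # first call: miss, reads DB, fills cache
get_with_cache_aside("user:1")  # second call: served from cache
```

Write-through and write-behind differ only in where the write path goes; in cache-aside, writes go to the database and the cached copy is invalidated or overwritten.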
**Eviction policies:** LRU (most common), LFU, FIFO, TTL-based.
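LRU, the most common policy, can be sketched with an ordered map: every access moves the key to the "recent" end, and when the cache is full the entry at the other end is evicted. A minimal illustration, not a production implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # drop the least recently used entry

c = LRUCache(capacity=2)
c.put("a", 1)
c.put("b", 2)
c.get("a")         # touch "a", so "b" is now least recently used
c.put("c", 3)      # over capacity: evicts "b"
print(c.get("b"))  # → None
```

LFU would track access counts instead of recency, and FIFO would simply evict in insertion order.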
**Challenges:** Cache invalidation, thundering herd, cold start, data consistency.
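Two common mitigations for the thundering herd are a per-key lock (only one caller recomputes an expired hot key while the rest wait) and TTL jitter (staggering expirations so many keys don't expire at once). A sketch of both, with illustrative names and a plain dict as the cache:

```python
import random
import threading

cache = {}
locks = {}
locks_guard = threading.Lock()

def jittered_ttl(base_ttl, spread=0.1):
    """Randomize a TTL by +/- spread so hot keys don't all expire together."""
    return base_ttl * (1 + random.uniform(-spread, spread))

def get_or_compute(key, compute):
    value = cache.get(key)
    if value is not None:
        return value
    with locks_guard:  # get (or create) the lock object for this key
        lock = locks.setdefault(key, threading.Lock())
    with lock:                  # only one thread recomputes the value
        value = cache.get(key)  # re-check: another thread may have filled it
        if value is None:
            value = compute()
            cache[key] = value
    return value
```

With many concurrent callers of `get_or_compute` on the same missing key, `compute` runs once; everyone else either hits the fast path or waits on the lock and finds the value on the re-check.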
Common Use Cases
- Database query result caching
- Session storage
- API response caching
- Computed result caching (e.g., news feed)
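For computed-result caching inside a single Python process, the standard library's `functools.lru_cache` covers the simple case; `render_feed` here is a hypothetical stand-in for an expensive computation like feed assembly:

```python
from functools import lru_cache

@lru_cache(maxsize=256)
def render_feed(user_id):
    # Stand-in for an expensive computation (DB reads, ranking, templating).
    return f"feed-for-{user_id}"

render_feed(1)  # first call: computed and cached
render_feed(1)  # repeat call: served from the cache
print(render_feed.cache_info().hits)  # → 1
```

Cross-process use cases (session storage, API response caching) need an external store such as Redis instead, since an in-process cache is lost on restart and not shared between servers.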
Advantages
- Dramatically reduces latency (ms vs 100s of ms)
- Reduces database load
- Handles read-heavy workloads efficiently
- CDN caching reduces bandwidth costs
Disadvantages
- Data staleness (cache vs DB inconsistency)
- Cache invalidation complexity
- Memory cost for large datasets
- Cold start problem after cache flush