Redis Caching Strategies: Boosting Application Performance

12 April 2025 · CodeMatic Team

Redis is one of the most popular in-memory data stores, perfect for caching, session storage, and real-time applications. This guide covers advanced Redis caching strategies to maximize your application's performance.

Why Redis for Caching?

  • Speed: In-memory storage provides sub-millisecond latency
  • Data Structures: Strings, hashes, lists, sets, sorted sets
  • Persistence: Optional disk persistence (RDB, AOF)
  • Scalability: Clustering and replication support
  • Versatility: Caching, sessions, queues, pub/sub

Caching Patterns

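The examples below share a few assumed globals: a redis client exposing ioredis-style commands (get, setex, hset, zadd, pipeline) and a db database client with Prisma-style users and products models. A minimal sketch of that assumed setup (names are illustrative):

import Redis from 'ioredis';
import { PrismaClient } from '@prisma/client';

// Illustrative shared clients; the exact model names (db.users, db.products)
// depend on your own schema.
const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
const db = new PrismaClient();
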
Cache-Aside (Lazy Loading)

async function getUser(userId: string) {
  // 1. Check cache first
  const cached = await redis.get(`user:${userId}`);
  if (cached) {
    return JSON.parse(cached);
  }
  
  // 2. If not in cache, fetch from database
  const user = await db.users.findUnique({ where: { id: userId } });
  
  // 3. Store in cache for future requests
  if (user) {
    await redis.setex(
      `user:${userId}`,
      3600, // TTL: 1 hour
      JSON.stringify(user)
    );
  }
  
  return user;
}

Write-Through

async function updateUser(userId: string, data: UserData) {
  // 1. Update database
  const user = await db.users.update({
    where: { id: userId },
    data,
  });
  
  // 2. Update cache
  await redis.setex(
    `user:${userId}`,
    3600,
    JSON.stringify(user)
  );
  
  return user;
}

Write-Back (Write-Behind)

async function updateUser(userId: string, data: UserData) {
  // 1. Update cache immediately
  const user = { ...data, id: userId };
  await redis.setex(
    `user:${userId}`,
    3600,
    JSON.stringify(user)
  );
  
  // 2. Queue database update (async)
  await queue.add('update-user', { userId, data });
  
  return user;
}
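
The snippet above only queues the write; a separate consumer has to flush it to the database. A minimal consumer sketch, assuming the queue is a BullMQ queue named 'user-updates' (the queue name and connection details are illustrative, and db is the same database client used earlier):

import { Worker } from 'bullmq';
import { PrismaClient } from '@prisma/client';

const db = new PrismaClient();

// Hypothetical worker for the queue.add('update-user', ...) call above
const worker = new Worker(
  'user-updates',
  async (job) => {
    if (job.name === 'update-user') {
      const { userId, data } = job.data;
      // Flush the cached write to the database asynchronously
      await db.users.update({ where: { id: userId }, data });
    }
  },
  { connection: { host: 'localhost', port: 6379 } }
);

// Failed write-backs should be surfaced, since the cache and database now disagree
worker.on('failed', (job, err) => {
  console.error(`Write-back failed for job ${job?.id}:`, err);
});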

Cache Invalidation Strategies

Time-Based Expiration (TTL)

// Set expiration time
await redis.setex('key', 3600, 'value'); // Expires in 1 hour

// Or use SET with EX option
await redis.set('key', 'value', 'EX', 3600);

// Different TTLs for different data
await redis.setex('user:123', 3600, JSON.stringify(userData)); // Users: 1 hour
await redis.setex('product:456', 86400, JSON.stringify(productData)); // Products: 24 hours
await redis.setex('session:789', 1800, JSON.stringify(sessionData)); // Sessions: 30 minutes

Event-Based Invalidation

async function updateProduct(productId: string, data: ProductData) {
  // Update database
  const product = await db.products.update({
    where: { id: productId },
    data,
  });
  
  // Invalidate related cache keys
  await redis.del(`product:${productId}`);
  await redis.del(`products:list`);
  await redis.del(`products:category:${product.categoryId}`);
  
  return product;
}

Eviction Policies

Configure how Redis handles memory when it's full:

  • allkeys-lru: Evict least recently used keys (a good default when Redis is a pure cache)
  • volatile-lru: Evict LRU among keys with expiration
  • allkeys-lfu: Evict least frequently used keys
  • volatile-ttl: Evict keys with shortest TTL
  • noeviction: Reject writes with an error when memory is full (the default policy)
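
The maxmemory policy can be set in redis.conf or changed at runtime with CONFIG SET. A minimal sketch, assuming the ioredis-style redis client used throughout (the 256mb limit is just an example):

// Equivalent redis.conf settings:
//   maxmemory 256mb
//   maxmemory-policy allkeys-lru
await redis.call('CONFIG', 'SET', 'maxmemory', '256mb');
await redis.call('CONFIG', 'SET', 'maxmemory-policy', 'allkeys-lru');

// Confirm the active policy
console.log(await redis.call('CONFIG', 'GET', 'maxmemory-policy'));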

Advanced Data Structures

Hashes for Object Caching

// Store user object as hash
await redis.hset('user:123', {
  name: 'John Doe',
  email: 'john@example.com',
  age: '30',
});

// Get specific fields
const email = await redis.hget('user:123', 'email');

// Get all fields
const user = await redis.hgetall('user:123');

// Update single field
await redis.hset('user:123', 'age', '31');

Sorted Sets for Leaderboards

// Add scores
await redis.zadd('leaderboard', 100, 'user:1');
await redis.zadd('leaderboard', 200, 'user:2');
await redis.zadd('leaderboard', 150, 'user:3');

// Get top 10
const top10 = await redis.zrevrange('leaderboard', 0, 9, 'WITHSCORES');

// Get user rank
const rank = await redis.zrevrank('leaderboard', 'user:1');

Redis Clustering

Scale Redis horizontally with clustering:

  • Automatic sharding across nodes
  • High availability with replication
  • Linear scalability
  • Automatic failover
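
With ioredis, connecting to a cluster is a small client-side change: a cluster-aware client hashes each key to a slot and routes the command to the node that owns it. A minimal sketch (the node addresses are placeholders):

import Redis from 'ioredis';

const cluster = new Redis.Cluster([
  { host: 'redis-node-1', port: 6379 },
  { host: 'redis-node-2', port: 6379 },
  { host: 'redis-node-3', port: 6379 },
]);

// Commands look the same as on a single node; routing is handled by the client
await cluster.setex('user:123', 3600, JSON.stringify({ name: 'John Doe' }));
const cached = await cluster.get('user:123');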

Performance Optimization

Pipelining

// Instead of multiple round trips
const pipeline = redis.pipeline();
pipeline.get('user:1');
pipeline.get('user:2');
pipeline.get('user:3');
const results = await pipeline.exec(); // Single round trip; each entry is an [error, result] pair

Connection Pooling

import { createClient } from 'redis';

// This snippet uses the node-redis client; the other examples use ioredis-style commands
const redis = createClient({
  url: process.env.REDIS_URL,
  socket: {
    reconnectStrategy: (retries) => Math.min(retries * 50, 1000),
  },
});

await redis.connect(); // node-redis requires an explicit connect

// Reuse one client per process instead of opening a new connection per request
let client: ReturnType<typeof createClient> | undefined;

async function getRedisClient() {
  if (!client) {
    client = createClient({ url: process.env.REDIS_URL });
    await client.connect();
  }
  return client;
}

Real-World Example

E-commerce platform caching strategy:

  • Product data: 24-hour TTL with event-based invalidation
  • User sessions: 30-minute TTL
  • Shopping cart: 7-day TTL
  • Search results: 1-hour TTL
  • Leaderboards: Real-time with sorted sets
  • Result: 80% reduction in database load, 5x faster page loads

Best Practices

  • Use appropriate TTLs based on data freshness requirements
  • Implement cache warming for critical data
  • Monitor cache hit rates
  • Use consistent key naming conventions
  • Handle cache misses gracefully
  • Consider cache stampede prevention (see the sketch after this list)
  • Use Redis persistence for critical data
  • Monitor memory usage and eviction rates
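
A common way to prevent a cache stampede (many requests rebuilding the same expired key at once) is a short-lived lock around the rebuild. A minimal sketch using the ioredis-style redis client from the earlier examples; getWithLock is a hypothetical helper, and the lock TTL and retry delay are illustrative:

async function getWithLock(
  key: string,
  ttl: number,
  load: () => Promise<string>
): Promise<string> {
  const cached = await redis.get(key);
  if (cached) return cached;

  // Only the caller that wins this lock recomputes the value
  const locked = await redis.set(`lock:${key}`, '1', 'EX', 10, 'NX');
  if (!locked) {
    // Someone else is rebuilding; wait briefly, then re-read or fall back to the source
    await new Promise((resolve) => setTimeout(resolve, 100));
    return (await redis.get(key)) ?? load();
  }

  try {
    const value = await load();
    await redis.setex(key, ttl, value);
    return value;
  } finally {
    await redis.del(`lock:${key}`);
  }
}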

Conclusion

Redis caching can dramatically improve application performance. Choose the right caching pattern, implement proper invalidation strategies, and optimize for your specific use case. Monitor performance and adjust strategies based on real-world usage patterns.