Beyond the Basics: Building a Custom Redis Cache Module in NestJS - Andy Primawan
Ditch the limitations of NestJS's default cache module. Let's build a powerful, reusable Redis cache module from scratch using `ioredis` to unlock advanced features and seriously boost your app's performance.

If you've worked with a backend framework like NestJS for any length of time, you know the drill. Your application grows, more users come on board, and suddenly your database starts to feel the strain. Queries to your trusty PostgreSQL or MySQL database that were once instant now introduce noticeable latency. The dreaded bottleneck has arrived.
The first and often most effective line of defense is caching. Storing frequently accessed data in a fast, in-memory store can dramatically improve response times and reduce the load on your primary database.
Enter Redis. It's so much more than a simple key-value cache: it's a versatile, open-source, in-memory data store that can also function as a database or even a message broker. It has a rich feature set, supporting powerful data structures like hashes, lists, sets, and even advanced tools like Bloom filters. This opens the door to solutions for rate limiting, real-time leaderboards, job queues, and more.
Now, NestJS has a built-in CacheModule (@nestjs/cache-manager) which is fantastic for getting started. But what happens when you need more control? What if you want to tap into those advanced Redis commands or structure your caching logic in a very specific way? That's when the built-in abstraction can feel a bit limiting.
Today, we're going to roll up our sleeves and build our own custom, injectable CachesModule using ioredis. This will give us full control and a solid foundation for any Redis-based feature we want to implement down the road.
The Blueprint: Folder Structure
Before we write any code, let's visualize our new module. We'll create a dedicated directory for our caching logic to keep things clean and modular.
src/
├── app.module.ts
├── main.ts
│
├── caches/
│   ├── interfaces/caches-module-options.interface.ts
│   ├── caches.module.ts
│   ├── caches.service.ts
│   └── constants.ts
│
└── posts/
    ├── posts.controller.ts
    ├── posts.module.ts
    └── posts.service.ts
Let's Get Our Hands Dirty: The Implementation
Step 1: Install ioredis
First things first, we need to add the ioredis library to our project. It's a robust and popular Redis client for Node.js.
# if you use npm
$ npm install ioredis
# if you use pnpm
$ pnpm add ioredis
# if you use yarn
$ yarn add ioredis
Step 2: Create the Custom CachesModule
This is the heart of our setup. The CachesModule will be responsible for creating and configuring the Redis connection and making our CachesService available to the rest of the application through dependency injection.
Create src/caches/caches.module.ts:
// src/caches/caches.module.ts
import { DynamicModule, Global, Module, Provider } from '@nestjs/common';
import Redis from 'ioredis';
import { CachesService } from './caches.service';
import { CACHE_CLIENT } from './constants';
import {
  CachesModuleAsyncOptions,
  CachesModuleOptions,
} from './interfaces/caches-module-options.interface';

@Global() // Making the module available throughout the application
@Module({})
export class CachesModule {
  /**
   * Synchronous configuration of the module.
   */
  static forRoot(options: CachesModuleOptions): DynamicModule {
    const redisProvider: Provider = {
      provide: CACHE_CLIENT,
      useFactory: () => {
        const client = new Redis(options);
        // Basic error handler so connection problems are logged
        client.on('error', (error) => {
          console.error('Redis connection error:', error);
        });
        return client;
      },
    };

    return {
      module: CachesModule,
      providers: [redisProvider, CachesService],
      exports: [CachesService], // Export CachesService so it can be injected
    };
  }

  /**
   * Asynchronous configuration of the module (e.g. fetching options from ConfigService).
   */
  static forRootAsync<T = any>(
    options: CachesModuleAsyncOptions<T>,
  ): DynamicModule {
    const redisProvider: Provider = {
      provide: CACHE_CLIENT,
      useFactory: async (...args: T[]) => {
        const redisOptions = await options.useFactory(...args);
        const client = new Redis(redisOptions);
        // Basic error handler so connection problems are logged
        client.on('error', (error) => {
          console.error('Redis connection error:', error);
        });
        return client;
      },
      inject: options.inject || [],
    };

    return {
      module: CachesModule,
      imports: options.imports,
      providers: [redisProvider, CachesService],
      exports: [CachesService],
    };
  }
}
What's happening here?
- @Module({}): We define a standard NestJS module.
- providers: This is where the magic starts. We're defining two providers.
  - The first one uses the custom provider token CACHE_CLIENT. The useFactory function creates and returns our ioredis client instance. This is great because it allows us to inject the raw ioredis client anywhere we might need it, and it keeps connection logic neatly encapsulated. We've also added a basic error handler.
  - The second provider is our CachesService, which we'll create next.
- exports: We export the CachesService so that any other module that imports CachesModule can inject and use it.
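The module above also relies on two small supporting files from our folder structure that we haven't written yet: the CACHE_CLIENT token in constants.ts and the options interfaces. Here's a minimal sketch of how they might look, matching the signatures used in the module (the exact shapes, such as aliasing RedisOptions directly, are an assumption):
// src/caches/constants.ts
// Injection token for the raw ioredis client
export const CACHE_CLIENT = 'CACHE_CLIENT';

// src/caches/interfaces/caches-module-options.interface.ts
import { ModuleMetadata } from '@nestjs/common';
import { RedisOptions } from 'ioredis';

// Options are passed straight through to the ioredis constructor
export type CachesModuleOptions = RedisOptions;

export interface CachesModuleAsyncOptions<T = any>
  extends Pick<ModuleMetadata, 'imports'> {
  inject?: any[];
  useFactory: (
    ...args: T[]
  ) => CachesModuleOptions | Promise<CachesModuleOptions>;
}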
Step 3: Build the CachesService
The service will contain the methods we'll use to interact with Redis (e.g., get, set, del). It acts as a wrapper around the ioredis client, allowing us to create a clean, business-logic-oriented API for caching.
Create src/caches/caches.service.ts:
// src/caches/caches.service.ts
import { Inject, Injectable, OnModuleDestroy } from '@nestjs/common';
import Redis, { RedisKey } from 'ioredis';
import { CACHE_CLIENT } from './constants';

@Injectable()
export class CachesService implements OnModuleDestroy {
  // Inject the Redis client using the defined token
  constructor(@Inject(CACHE_CLIENT) private readonly redisClient: Redis) {}

  /**
   * Closes the Redis connection when the application is stopped.
   */
  onModuleDestroy() {
    this.redisClient.quit().catch((error) => {
      console.error('Error closing Redis connection:', error);
    });
  }

  /**
   * Gets data from the cache.
   * @param key Cache key
   * @returns Stored data or null if not found
   */
  async get<T>(key: RedisKey): Promise<T | null> {
    const data = await this.redisClient.get(key);
    if (!data) {
      return null;
    }
    // Assuming data is stored as a JSON string
    return JSON.parse(data) as T;
  }

  /**
   * Saves data to the cache.
   * @param key Cache key
   * @param value Data to be saved (will be serialized to JSON)
   * @param ttl Time-to-live in seconds (optional)
   */
  async set(key: RedisKey, value: any, ttl?: number): Promise<'OK'> {
    const stringValue = JSON.stringify(value);
    if (ttl) {
      return this.redisClient.set(key, stringValue, 'EX', ttl);
    } else {
      return this.redisClient.set(key, stringValue);
    }
  }

  /**
   * Deletes data from the cache.
   * @param key Cache key
   * @returns Number of keys deleted
   */
  async del(key: RedisKey): Promise<number> {
    return this.redisClient.del(key);
  }

  /**
   * Gives direct access to the ioredis client if you need to run a command that is not wrapped.
   * @returns The ioredis client instance
   */
  getClient(): Redis {
    return this.redisClient;
  }
}
Key Points:
- We use @Inject(CACHE_CLIENT) to get the Redis instance we created in our module.
- We're using JSON.stringify before setting and JSON.parse after getting because Redis stores data as strings. This is a crucial step!
- Our set method includes an optional ttl (Time To Live) in seconds. When provided, we pass the EX option to the SET command, which is a neat Redis feature for automatically expiring keys. This is perfect for cache invalidation.
Step 4: Put It to Use!
Now for the payoff. Let's use our new CachesService in a hypothetical PostsService.
First, import the CachesModule into your feature module.
// src/posts/posts.module.ts
import { Module } from '@nestjs/common';
import { PostsService } from './posts.service';
import { PostsController } from './posts.controller';
import { CachesModule } from '../caches/caches.module'; // <-- Import it

@Module({
  imports: [CachesModule], // <-- Add it to imports
  controllers: [PostsController],
  providers: [PostsService],
})
export class PostsModule {}
Next, inject CachesService into your PostsService and use it to implement a cache-aside strategy.
// src/posts/posts.service.ts
import { Injectable } from '@nestjs/common';
import { CachesService } from '../caches/caches.service';
// Assume you have a Post type and a database service
import { Post } from './post.entity';

@Injectable()
export class PostsService {
  constructor(private readonly cachesService: CachesService) {}

  async findOne(id: number): Promise<Post> {
    const cacheKey = `post:${id}`;

    // 1. Check cache first
    const cachedPost = await this.cachesService.get<Post>(cacheKey);
    if (cachedPost) {
      console.log('Serving from cache!');
      return cachedPost;
    }

    // 2. If not in cache, fetch from database
    console.log('Fetching from database...');
    // const postFromDb = await this.database.query(...);
    const postFromDb: Post = { id, title: 'My Awesome Post', content: '...' }; // Dummy data

    // 3. Store the result in the cache for next time (e.g., for 1 hour)
    await this.cachesService.set(cacheKey, postFromDb, 3600);

    return postFromDb;
  }
}
And that's it! The first time findOne(1) is called, it will log "Fetching from database..." and hit your DB. Every subsequent call within the next hour will log "Serving from cache!" and return instantly, without ever touching the database.
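One thing the read path above doesn't cover is invalidation: when a post changes, the stale cache entry should be removed so the next read picks up fresh data. A hypothetical update method (the database call is only sketched) could use the del wrapper like this:
// src/posts/posts.service.ts (hypothetical addition)
async update(id: number, changes: Partial<Post>): Promise<Post> {
  // Persist the changes to your database here
  // const updatedPost = await this.database.update(id, changes);
  const updatedPost: Post = { id, title: 'My Awesome Post', content: '...', ...changes }; // Dummy data

  // Remove the stale cache entry so the next findOne() repopulates it
  await this.cachesService.del(`post:${id}`);

  return updatedPost;
}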
Conclusion: Power and Flexibility
While NestJS's built-in CacheModule is a great tool, building your own gives you a new level of power and flexibility. With this custom module, you have a solid, reusable foundation. You're no longer limited to basic GET/SET operations. You can easily extend your CachesService to use Redis Hashes for storing user sessions, Lists for implementing simple job queues, or Sets for tracking unique page views, all while keeping your code clean, modular, and testable.
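As a taste of what that could look like, here's a hypothetical pair of methods (not part of the module above, and the key pattern is just an example) that uses a Redis Set to count unique viewers of a post:
// Hypothetical additions to CachesService, using a Redis Set
async trackUniqueView(postId: number, userId: number): Promise<void> {
  // SADD ignores duplicates, so each user is only counted once
  await this.redisClient.sadd(`post:${postId}:viewers`, userId);
}

async countUniqueViews(postId: number): Promise<number> {
  // SCARD returns the number of members in the set
  return this.redisClient.scard(`post:${postId}:viewers`);
}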
You've now built a component that not only boosts performance but also gives you direct access to the full power of Redis, right within the elegant structure of your NestJS application.
See the Code in Action
Want to see this in action? I've put together a complete working example on my GitHub. Check out the Dojotek AI Chatbot repository to clone the code, run it yourself, and see how everything fits together.