Caching can be a thorn in any developer’s side. I’ve spent too many hours wrestling with slow APIs and overburdened databases, searching for a solution that’s both effective and easy to implement.
That’s why I was thrilled when Karol (karol71927), a talented member of our open-source organization, Nestixis, created @nestixis/cache-manager.
This lightweight, Redis-powered library has streamlined caching in my NestJS projects, and I’m eager to share how it’s made a difference.
The Challenge: Caching Complexity in NestJS
The scenario is all too familiar: your application runs smoothly until traffic surges, and suddenly your database buckles under the load. Caching is the obvious fix—store data once, serve it quickly—but integrating it into NestJS often feels cumbersome. Redis offers powerful capabilities, but setting it up typically involves wrangling configurations, managing expiration policies, and defining custom cache keys.
I needed a tool that simplified the process while allowing precise control over what gets cached, like request parameters or queries.
Karol designed @nestixis/cache-manager to address exactly these pain points with a clean, efficient approach. The package exposes a straightforward API for managing Redis caching, complete with configurable TTLs and support for advanced strategies like caching by request details. It’s available on GitHub, and its design reflects our team’s commitment to practical, reusable tools.
Getting Started: Seamless Setup
Installation is as simple as it gets:
npm i @nestixis/cache-manager
To integrate it into your NestJS app, register it in a module:
import { Module } from '@nestjs/common';
import { CacheModule } from '@nestixis/cache-manager';
import { ConfigModule, ConfigService } from '@nestjs/config';

@Module({
  imports: [
    CacheModule.registerAsync({
      isGlobal: true,
      imports: [ConfigModule],
      useFactory: (configService: ConfigService) => ({
        redis: {
          host: configService.get('REDIS_HOST') || 'localhost',
          port: +configService.get('REDIS_PORT') || 6379,
        },
        cachePrefix: 'cache:',
        defaultCacheTTL: 1000, // 1-second default
      }),
      inject: [ConfigService],
    }),
  ],
})
export class AppModule {}
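The factory above reads just two environment variables through ConfigService. For local development, a minimal .env could look like this (the file and the values shown are only placeholders for illustration):

REDIS_HOST=localhost
REDIS_PORT=6379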
This sets up Redis caching across your app with minimal effort. From there, you can interact with it manually in a service:
import { Injectable } from '@nestjs/common';
import { CacheManager } from '@nestixis/cache-manager';

@Injectable()
export class MyService {
  constructor(private readonly cacheManager: CacheManager) {}

  async getData(key: string) {
    const cached = await this.cacheManager.get(key);
    if (cached) return cached;

    const data = await this.fetchData();
    await this.cacheManager.add(key, data, 1000); // Cache for 1 second
    return data;
  }

  async clearData(key: string) {
    await this.cacheManager.remove(key);
  }

  private fetchData() {
    return new Promise((resolve) => setTimeout(() => resolve('Data'), 2000));
  }
}
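To make the flow concrete, here’s a short sketch of a caller. The ExampleController below is hypothetical and not part of the package; it simply consumes the MyService defined above:

import { Controller, Get } from '@nestjs/common';
import { MyService } from './my.service';

// Hypothetical consumer of MyService, shown only to illustrate the caching behavior
@Controller('example')
export class ExampleController {
  constructor(private readonly myService: MyService) {}

  @Get()
  async getExample() {
    // First call: cache miss, waits ~2 seconds for fetchData, then stores the result for 1 second.
    // Any call within that second: cache hit, answered straight from Redis.
    return this.myService.getData('example-key');
  }
}

Within the one-second TTL, repeat calls never reach fetchData, which is exactly the load-shedding the cache is there for.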
What sets this package apart is its ability to cache based on specific request details—a feature Karol thoughtfully included. Take this controller:
import { Controller, Get, Post, Delete, Param, UseInterceptors } from '@nestjs/common';
import { CacheInterceptor, CacheRemoveInterceptor, CacheTrackBy } from '@nestixis/cache-manager';
import { MyService } from './my.service';

@Controller('site/:token')
@CacheTrackBy({
  prefix: 'site',
  ttl: 10000, // 10 seconds
  by: [
    {
      by: 'param',
      name: 'token',
    },
  ],
})
export class SiteController {
  constructor(private readonly service: MyService) {}

  @Get()
  @UseInterceptors(CacheInterceptor)
  async get(@Param('token') token: string) {
    return this.service.getData(`site:${token}`);
  }

  // Clears the cached entry whenever the resource is added or removed, keeping state fresh
  @Post()
  @UseInterceptors(CacheRemoveInterceptor)
  async add(@Param('token') token: string) {
    await this.service.getData(`site:${token}`); // Refresh data
  }

  @Delete()
  @UseInterceptors(CacheRemoveInterceptor)
  async remove(@Param('token') token: string) {
    await this.service.clearData(`site:${token}`);
  }
}
The @CacheTrackBy decorator is the key here. It ensures caching is tied to the :token parameter, so /site/abc and /site/xyz each get their own cache entry.
You can adjust it to use queries or other criteria instead, offering the flexibility I’d always wanted. The CacheInterceptor handles GET requests, while CacheRemoveInterceptor clears the cache on updates—elegant and intuitive.
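For example, to key the cache on a query string rather than a route parameter, the same shape should carry over. The sketch below assumes the decorator accepts by: 'query' alongside 'param' (check the package’s README for the exact supported values), and the controller, prefix, and term parameter are all made up for illustration:

import { Controller, Get, Query, UseInterceptors } from '@nestjs/common';
import { CacheInterceptor, CacheTrackBy } from '@nestixis/cache-manager';

@Controller('search')
@CacheTrackBy({
  prefix: 'search',
  ttl: 10000, // 10 seconds
  by: [
    {
      by: 'query', // assumption: mirrors the 'param' option shown above
      name: 'term',
    },
  ],
})
export class SearchController {
  @Get()
  @UseInterceptors(CacheInterceptor)
  async search(@Query('term') term: string) {
    // /search?term=redis and /search?term=nest would each get their own cache entry
    return { term };
  }
}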