Modern Angular applications communicate heavily with backend APIs. Without a good caching strategy, the application repeatedly fetches the same data, leading to unnecessary network calls, slow rendering, and a poor user experience. A well-defined Client-Side Caching Framework helps Angular teams impose structure, reduce waste, and ensure predictable behaviour across modules.
This article explains how to design and implement a production-grade caching framework that supports:
TTL (time-to-live) expiration
LRU (least recently used) eviction
LFU (least frequently used) eviction
Route-level and service-level caching
Auto-invalidation triggers
Full Angular integration using services, interceptors, and RxJS
Real-world patterns for enterprise apps
The goal is to make a reusable caching layer that any Angular module can plug into without rewriting logic.
1. Why Angular Needs a Client-Side Caching Framework
Most teams start caching in an ad-hoc way:
Storing API results inside components
Using BehaviorSubject as a semi-cache
Using sessionStorage or localStorage randomly
Forgetting TTL invalidation
Not handling memory pressure
Over time, this creates inconsistent behaviour and hard-to-debug issues.
A centralised caching framework solves this by:
Reducing API calls
Improving time-to-render
Making offline mode easier
Offering deterministic invalidation
Making caching rules configurable
Ensuring consistent patterns across feature teams
2. Requirements for a Universal Caching Framework
A good Angular caching framework must support:
Pluggable strategies (LRU / LFU / TTL).
Cache scopes: per route, per service, global.
Auto-invalidations: TTL expiry, event-based triggers, manual clearing.
Persistence options: memory, localStorage, sessionStorage.
Integration with HttpInterceptor, so caching works automatically for GET requests.
Observability: ability to inspect cache entries, size, eviction logs.
Strong typing for TypeScript.
Predictable behaviour even under high usage.
3. High-Level Architecture
The framework consists of:
A CacheEntry model describing each stored item
A CacheStore that holds entries in memory
Strategy modules for TTL expiry and LRU/LFU eviction
A CacheService exposing get/set/clear and enforcing the configured strategy
A CacheInterceptor that applies caching transparently to HTTP GET requests
A CacheConfig object that makes size, strategy, and default TTL configurable
4. Architecture Workflow Diagram
        +---------------------+
        | Angular Application |
        +----------+----------+
                   |
                   v
        +----------+----------+
        |   HttpInterceptor   |
        +----------+----------+
                   |
        Cache Hit  |  Cache Miss
         +---------+---------+
         |                   |
         v                   v
+--------+-------+   +-------+--------+
|   CacheStore   |   |  Backend API   |
+--------+-------+   +-------+--------+
         |                   |
         +---------+---------+
                   |
                   v
        +----------+----------+
        |   Response to App   |
        +---------------------+
5. Flowchart: Cache Decision Process
+---------------------------+
| Incoming HTTP GET Request |
+-------------+-------------+
              |
              v
+-------------+-------------+
|  Is caching enabled for   |
|      this request?        |
+-------------+-------------+
              |
        Yes   |   No ----> forward request without caching
              v
+-------------+-------------+
|    Generate Cache Key     |
+-------------+-------------+
              |
              v
+-------------+-------------+
|  Entry exists in cache?   |
+-------------+-------------+
              |
        Yes   |   No ------------------------+
              v                              |
+-------------+-------------+               |
| Is entry expired by TTL?  |               |
+-------------+-------------+               |
              |                             |
         No   |   Yes ----------------------+
              v                             v
+-------------+-------------+   +-----------+------------+
|  Return cached response   |   |    Call Backend API    |
+---------------------------+   +-----------+------------+
                                            |
                                            v
                                +-----------+------------+
                                |  Store response in     |
                                |  CacheStore using LRU/ |
                                |  LFU/TTL strategy      |
                                +-----------+------------+
                                            |
                                            v
                                +-----------+------------+
                                |  Return fresh response |
                                +------------------------+
6. Core Building Blocks of the Framework
6.1 Cache Entry Structure
export interface CacheEntry<T> {
  key: string;
  value: T;
  lastAccessed: number;
  frequency: number;
  createdAt: number;
  ttl?: number; // in milliseconds
}
6.2 CacheStore (in-memory map)
export class CacheStore {
  private store = new Map<string, CacheEntry<any>>();

  get<T>(key: string): CacheEntry<T> | undefined {
    return this.store.get(key);
  }

  set<T>(key: string, entry: CacheEntry<T>) {
    this.store.set(key, entry);
  }

  delete(key: string) {
    this.store.delete(key);
  }

  keys(): string[] {
    return Array.from(this.store.keys());
  }

  size(): number {
    return this.store.size;
  }
}
7. Implementing Caching Strategies
7.1 TTL Strategy (Time-To-Live)
TTL defines how long a cached item remains fresh.
export function isTTLExpired(entry: CacheEntry<any>): boolean {
  if (!entry.ttl) return false;
  return (Date.now() - entry.createdAt) > entry.ttl;
}
7.2 LRU Strategy (Least Recently Used)
Remove entries that were accessed least recently.
export function evictLRU(store: CacheStore, maxSize: number) {
  // Evict the least recently accessed entries until the store fits maxSize.
  while (store.size() > maxSize) {
    const entries = store.keys()
      .map(k => store.get<any>(k)!)
      .sort((a, b) => a.lastAccessed - b.lastAccessed);
    store.delete(entries[0].key);
  }
}
7.3 LFU Strategy (Least Frequently Used)
Remove entries with lowest usage.
export function evictLFU(store: CacheStore, maxSize: number) {
  // Evict the least frequently used entries until the store fits maxSize.
  while (store.size() > maxSize) {
    const entries = store.keys()
      .map(k => store.get<any>(k)!)
      .sort((a, b) => a.frequency - b.frequency);
    store.delete(entries[0].key);
  }
}
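The two policies differ only in the sort key. A self-contained sketch of the selection step, operating on a plain entry array rather than the CacheStore (the `pickVictim` name is illustrative):

```typescript
// Only the bookkeeping fields the eviction policies look at.
interface EvictableEntry {
  key: string;
  lastAccessed: number;
  frequency: number;
}

// Pick the eviction victim under each policy: oldest access wins for LRU,
// lowest frequency wins for LFU.
function pickVictim(entries: EvictableEntry[], strategy: 'LRU' | 'LFU'): string {
  const sorted = [...entries].sort((a, b) =>
    strategy === 'LRU'
      ? a.lastAccessed - b.lastAccessed
      : a.frequency - b.frequency
  );
  return sorted[0].key;
}
```

Note that the same pair of entries can produce different victims under each policy: a frequently used but old entry survives LFU yet falls to LRU.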
8. Combined Strategy Implementation
A realistic framework uses TTL + LRU or TTL + LFU.
export type CacheStrategy = 'LRU' | 'LFU';

export interface CacheConfig {
  maxSize: number;
  strategy: CacheStrategy;
  defaultTTL?: number;
}
Eviction selector:
function evict(store: CacheStore, config: CacheConfig) {
  if (config.strategy === 'LRU') {
    evictLRU(store, config.maxSize);
  } else {
    evictLFU(store, config.maxSize);
  }
}
9. Building the Core CacheService
import { Inject, Injectable } from '@angular/core';

@Injectable({ providedIn: 'root' })
export class CacheService {
  private store = new CacheStore();

  constructor(@Inject('CACHE_CONFIG') private config: CacheConfig) {}

  get<T>(key: string): T | undefined {
    const entry = this.store.get<T>(key);
    if (!entry) return undefined;
    if (isTTLExpired(entry)) {
      this.store.delete(key);
      return undefined;
    }
    entry.frequency++;
    entry.lastAccessed = Date.now();
    return entry.value;
  }

  set<T>(key: string, value: T, ttl?: number) {
    const entry: CacheEntry<T> = {
      key,
      value,
      frequency: 1,
      lastAccessed: Date.now(),
      createdAt: Date.now(),
      ttl: ttl ?? this.config.defaultTTL
    };
    this.store.set(key, entry);
    evict(this.store, this.config);
  }

  clear() {
    this.store = new CacheStore();
  }
}
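Because the service reads its configuration through the 'CACHE_CONFIG' injection token, the application must provide a value for it. A minimal provider entry could look like this (the values shown are illustrative defaults, not recommendations):

```typescript
// Add to the providers array of your root module or bootstrapApplication call.
{
  provide: 'CACHE_CONFIG',
  useValue: {
    maxSize: 200,
    strategy: 'LRU',
    defaultTTL: 60_000 // one minute
  } as CacheConfig
}
```

Using an InjectionToken<CacheConfig> instead of the string token would give stronger typing, at the cost of one more export.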
10. Integrating with HttpInterceptor (Transparent Caching)
10.1 Interceptor logic
import { Injectable } from '@angular/core';
import {
  HttpEvent, HttpHandler, HttpInterceptor, HttpRequest, HttpResponse
} from '@angular/common/http';
import { Observable, of, tap } from 'rxjs';

@Injectable()
export class CacheInterceptor implements HttpInterceptor {
  constructor(private cache: CacheService) {}

  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {
    // Only GET requests are safe to cache.
    if (req.method !== 'GET') {
      return next.handle(req);
    }
    const cacheKey = req.urlWithParams;
    const cachedResponse = this.cache.get<HttpResponse<any>>(cacheKey);
    if (cachedResponse) {
      return of(cachedResponse.clone());
    }
    return next.handle(req).pipe(
      tap(event => {
        if (event instanceof HttpResponse) {
          this.cache.set(cacheKey, event);
        }
      })
    );
  }
}
10.2 Provide the interceptor
{
  provide: HTTP_INTERCEPTORS,
  useClass: CacheInterceptor,
  multi: true
}
11. Route-Level and Service-Level Caching
Route-level cache example
Attach caching rules to routes with a resolver that consults the cache before fetching, or store a route-based TTL configuration that maps route prefixes to TTL values.
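One lightweight option is a plain TTL lookup table keyed by URL prefix, which the interceptor can consult when storing a response. The names below (`routeTTLs`, `getTTLForUrl`) and the TTL values are illustrative, not part of the framework above:

```typescript
// Hypothetical route-based TTL table: URL prefix -> TTL in milliseconds.
const routeTTLs: Record<string, number> = {
  '/api/products': 5 * 60_000,  // product lists stay fresh for 5 minutes
  '/api/lookups': 60 * 60_000   // reference data can live for an hour
};

// Return the TTL for the longest matching prefix, or undefined so the
// caller falls back to the framework's defaultTTL.
export function getTTLForUrl(url: string): number | undefined {
  const match = Object.keys(routeTTLs)
    .filter(prefix => url.startsWith(prefix))
    .sort((a, b) => b.length - a.length)[0];
  return match ? routeTTLs[match] : undefined;
}
```

Sorting by prefix length means a more specific route like '/api/products/featured' could later get its own entry without breaking the general rule.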
Service-level example using decorator
export function Cached(ttl?: number) {
  return function (target: any, key: string, descriptor: PropertyDescriptor) {
    const original = descriptor.value;
    descriptor.value = function (...args: any[]) {
      // Assumes the host class injects CacheService as `this.cache`
      // and that the decorated method returns an Observable.
      const cacheKey = `${key}:${JSON.stringify(args)}`;
      const cached = this.cache.get(cacheKey);
      if (cached) return of(cached);
      return original.apply(this, args).pipe(
        tap((result: any) => this.cache.set(cacheKey, result, ttl))
      );
    };
    return descriptor;
  };
}
12. Persistence Options (Memory, LocalStorage, SessionStorage)
When to use memory-only
Short-lived data
Sensitive data
When to use localStorage
Non-sensitive data that should survive reloads and new browser sessions, such as reference or lookup data
When to use sessionStorage
Data that should survive reloads but stay scoped to a single tab or session, such as wizard state or active filters
You can extend CacheStore to plug in alternative stores.
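A minimal sketch of such an alternative store follows. The `StorageLike` interface mirrors the subset of the Web Storage API used here, so the same class works with `window.localStorage`, `window.sessionStorage`, or an in-memory stub; all names in this snippet are illustrative:

```typescript
// Minimal subset of the Web Storage API that the store needs.
interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Sketch of a persistent cache store. Values are JSON-serialised, so only
// plain-data values survive the round trip (no class instances or functions).
class PersistentCacheStore {
  constructor(private backend: StorageLike, private prefix = 'cache:') {}

  get<T>(key: string): T | undefined {
    const raw = this.backend.getItem(this.prefix + key);
    return raw === null ? undefined : (JSON.parse(raw) as T);
  }

  set<T>(key: string, value: T): void {
    this.backend.setItem(this.prefix + key, JSON.stringify(value));
  }

  delete(key: string): void {
    this.backend.removeItem(this.prefix + key);
  }
}

// In-memory stub with the same shape, useful in unit tests.
class MemoryStorage implements StorageLike {
  private map = new Map<string, string>();
  getItem(k: string) { return this.map.has(k) ? this.map.get(k)! : null; }
  setItem(k: string, v: string) { this.map.set(k, v); }
  removeItem(k: string) { this.map.delete(k); }
}
```

The key prefix keeps cache entries from colliding with anything else the application stores under the same origin.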
13. Invalidating Cache Automatically
13.1 Event-based invalidation
Examples:
User logs out → clear user-specific cache
Feature flag updates → wipe relevant cached APIs
Settings update → invalidate affected resources
13.2 Server-driven invalidation
Backends send ETag or Last-Modified.
Interceptor compares and updates.
13.3 Manual invalidation
Expose:
clearKey(key: string)
clearGroup(prefix: string)
clearAll()
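The prefix-based group clear is the only non-obvious one of the three. A standalone sketch over a plain Map (in the framework these methods would live on CacheService and delegate to CacheStore; the class name here is illustrative):

```typescript
// Standalone sketch of the manual invalidation helpers.
class InvalidationDemoCache {
  private store = new Map<string, unknown>();

  set(key: string, value: unknown) { this.store.set(key, value); }
  has(key: string) { return this.store.has(key); }

  clearKey(key: string): void {
    this.store.delete(key);
  }

  // Remove every entry whose key starts with the given prefix,
  // e.g. clearGroup('/api/users') when the current user logs out.
  clearGroup(prefix: string): void {
    for (const key of Array.from(this.store.keys())) {
      if (key.startsWith(prefix)) this.store.delete(key);
    }
  }

  clearAll(): void {
    this.store.clear();
  }
}
```

Because the interceptor uses `urlWithParams` as the cache key, URL prefixes make natural group identifiers.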
14. Monitoring and Debugging
Add a small debug panel, accessible only in development, that lists cache entries with their ages and TTLs, shows the current store size, and surfaces eviction logs.
This encourages developers to understand and respect cache behaviour.
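A sketch of the snapshot such a panel could render, assuming the CacheEntry fields from section 6 (the `snapshotCache` name and the optional `now` parameter are illustrative additions):

```typescript
// Only the fields the debug view needs, inlined to keep the sketch standalone.
interface DebugEntry {
  key: string;
  createdAt: number;
  ttl?: number;
}

// Produce a render-friendly snapshot: key, age, and remaining TTL.
// remainingTtlMs is null for entries that never expire.
function snapshotCache(entries: DebugEntry[], now: number = Date.now()) {
  return entries.map(e => ({
    key: e.key,
    ageMs: now - e.createdAt,
    remainingTtlMs: e.ttl === undefined ? null : e.ttl - (now - e.createdAt)
  }));
}
```

A negative remainingTtlMs immediately flags entries that have expired but not yet been evicted.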
15. Performance Metrics to Track
Useful metrics include cache hit rate, miss rate, eviction count, and cache size over time. You can publish them via console logs, Angular dev tools, or custom dashboards.
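A small counter that CacheService could update on every lookup is enough to start with. The class name and API below are illustrative, not part of the framework above:

```typescript
// Hit/miss/eviction counters a CacheService could update on every get().
class CacheMetrics {
  private hits = 0;
  private misses = 0;
  private evictions = 0;

  recordHit() { this.hits++; }
  recordMiss() { this.misses++; }
  recordEviction() { this.evictions++; }

  // Hit rate as a fraction of all lookups; 0 when nothing was recorded yet.
  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }

  summary() {
    return {
      hits: this.hits,
      misses: this.misses,
      evictions: this.evictions,
      hitRate: this.hitRate()
    };
  }
}
```

A persistently low hit rate is the clearest signal that TTLs are too short or that the maxSize is forcing premature evictions.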
16. Production Recommendations
Always use TTL + LRU for general data.
Use LFU for lookup APIs that are accessed frequently.
Cache only GET requests.
Keep cache maxSize small: 100 to 300 entries for most SPAs.
Avoid caching large objects (> 1 MB).
Do not cache authenticated or otherwise sensitive endpoints without strong controls.
Always implement an easy way to disable caching when debugging issues.
17. Conclusion
A well-designed caching framework significantly improves the performance and stability of Angular applications. Using TTL, LRU, and LFU together creates predictable and memory-efficient caching behaviour. By integrating the system with HttpInterceptor and providing service-level decorators, developers gain a unified and reusable approach across the entire codebase.
Such a framework also allows advanced features such as offline support, route-based caching, and distributed invalidation. Most importantly, it brings structure and governance to client-side caching so teams can scale confidently.