# @asaidimu/utils-cache v2.0.4

An intelligent, configurable in-memory cache library for Node.js and browser environments, designed for optimal performance, data consistency, and developer observability.
## Quick Links

- Overview & Features
- Installation & Setup
- Usage Documentation
- Project Architecture
- Development & Contributing
- Additional Information
## Overview & Features

### Detailed Description

`@asaidimu/utils-cache` provides a robust, in-memory caching solution designed for applications that require efficient data retrieval, resilience against network failures, and state persistence across sessions or processes. It implements common caching patterns like stale-while-revalidate and Least Recently Used (LRU) eviction, along with advanced features such as automatic retries for failed fetches, an extensible persistence mechanism, and a comprehensive event system for real-time monitoring.

Unlike simpler caches, `Cache` manages data freshness intelligently, allowing you to serve stale data immediately while a fresh copy is being fetched in the background. Its pluggable persistence layer enables you to save and restore the cache state, making it ideal for client-side applications that need to maintain state offline or server-side applications that need rapid startup with pre-populated data. With built-in metrics and events, `@asaidimu/utils-cache` offers deep insight into cache performance and lifecycle, ensuring both speed and data integrity.
### Key Features

- Configurable In-Memory Store: Provides fast access to cached data with an underlying `Map` structure.
- Stale-While-Revalidate (SWR): Serve existing data immediately while fetching new data in the background, minimizing perceived latency and improving user experience.
- Automatic Retries with Exponential Backoff: Configurable retry attempts and an exponentially increasing delay between retries for `fetchFunction` failures, enhancing resilience to transient network issues.
- Pluggable Persistence: Seamlessly integrates with any `SimplePersistence` implementation (e.g., LocalStorage, IndexedDB via `@asaidimu/utils-persistence`, or a custom backend) to save and restore cache state across application restarts or sessions.
- Debounced Persistence Writes: Optimizes write frequency to the underlying persistence layer, reducing I/O operations and improving performance.
- Remote Update Handling: Automatically synchronizes cache state when the persistence layer is updated externally by other instances or processes.
- Custom Serialization/Deserialization: Provides options to serialize and deserialize complex data types (e.g., `Date`, `Map`, custom classes) for proper storage and retrieval.
- Configurable Eviction Policies:
  - Time-Based (TTL): Automatically evicts entries that haven't been accessed for a specified `cacheTime`, managing memory efficiently.
  - Size-Based (LRU): Evicts least recently used items when the `maxSize` limit is exceeded, preventing unbounded memory growth.
- Comprehensive Event System: Subscribe to granular cache events (e.g., `'hit'`, `'miss'`, `'fetch'`, `'error'`, `'eviction'`, `'invalidation'`, `'set_data'`, `'persistence'`) for real-time logging, debugging, analytics, and advanced reactivity.
- Performance Metrics: Built-in tracking of `hits`, `misses`, `fetches`, `errors`, `evictions`, and `staleHits`, providing insight into cache efficiency with calculated hit rates.
- Flexible Query Management: Register asynchronous `fetchFunction`s for specific keys, allowing the `Cache` instance to intelligently manage their data lifecycle, including fetching, caching, and invalidation.
- Imperative Control: Offers direct methods for `invalidate` (making data stale), `prefetch` (loading data proactively), `refresh` (forcing a re-fetch), `setData` (manual data injection), and `remove` operations.
- TypeScript Support: Fully typed API for enhanced developer experience, compile-time safety, and autocompletion.
## Installation & Setup

### Prerequisites

- Node.js (v14.x or higher)
- npm, yarn, or bun

### Installation Steps

Install `@asaidimu/utils-cache` using your preferred package manager:

```bash
bun add @asaidimu/utils-cache
# or
npm install @asaidimu/utils-cache
# or
yarn add @asaidimu/utils-cache
```
### Configuration

`Cache` is initialized with a `CacheOptions` object, allowing you to customize its behavior globally. Individual queries registered via `registerQuery` can override these options for specific data keys.

```typescript
import { Cache } from '@asaidimu/utils-cache';
// Example persistence layer (install separately, e.g., @asaidimu/utils-persistence)
import { IndexedDBPersistence } from '@asaidimu/utils-persistence';

const myCache = new Cache({
  staleTime: 5 * 60 * 1000,  // Data considered stale after 5 minutes
  cacheTime: 30 * 60 * 1000, // Data evicted if not accessed for 30 minutes
  retryAttempts: 2,          // Retry fetch up to 2 times on failure
  retryDelay: 2000,          // 2-second initial delay between retries (doubles each attempt)
  maxSize: 500,              // Maximum 500 entries in cache (LRU eviction)
  enableMetrics: true,       // Enable performance tracking

  // Persistence options (optional but recommended for stateful caches)
  persistence: new IndexedDBPersistence('my-app-db'), // Plug in your persistence layer
  persistenceId: 'my-app-cache-v1',                   // Unique ID for this cache instance in persistence
  persistenceDebounceTime: 1000,                      // Debounce persistence writes by 1 second

  // Custom serializers/deserializers for non-JSON-serializable data (optional)
  serializeValue: (value: any) => {
    if (value instanceof Map) return { _type: 'Map', data: Array.from(value.entries()) };
    if (value instanceof Date) return { _type: 'Date', data: value.toISOString() };
    return value;
  },
  deserializeValue: (value: any) => {
    if (typeof value === 'object' && value !== null) {
      if (value._type === 'Map') return new Map(value.data);
      if (value._type === 'Date') return new Date(value.data);
    }
    return value;
  },
});

// Negative option values are automatically clamped to 0 with a console warning.
const invalidCache = new Cache({ staleTime: -100, cacheTime: -1, maxSize: -5 });
// A console.warn is emitted for each negative value.
```
### Verification

To verify that `Cache` is installed and initialized correctly, you can run a simple test:

```typescript
import { Cache } from '@asaidimu/utils-cache';

const cache = new Cache();
console.log('Cache initialized successfully!');

// Register a simple query
cache.registerQuery('hello', async () => {
  console.log('Fetching "hello" data...');
  return 'world';
});

// Try to fetch data
cache.get('hello').then(data => {
  console.log(`Fetched 'hello': ${data}`); // Expected: Fetching "hello" data... then Fetched 'hello': world
}).catch(error => {
  console.error('Error fetching:', error);
});
```
## Usage Documentation

### Basic Usage

The core of `Cache` involves registering queries (data-fetching functions) and then retrieving data using those queries.

```typescript
import { Cache } from '@asaidimu/utils-cache';

const myCache = new Cache({
  staleTime: 5000,  // Data becomes stale after 5 seconds
  cacheTime: 60000, // Data will be garbage collected if not accessed for 1 minute
});

// 1. Register a query with a unique string key and an async function to fetch the data.
myCache.registerQuery('user/123', async () => {
  console.log('--- Fetching user data from API... ---');
  // Simulate network delay
  await new Promise(resolve => setTimeout(resolve, 1000));
  return { id: 123, name: 'Alice', email: 'alice@example.com' };
}, { staleTime: 2000 }); // Override staleTime for this specific query to 2 seconds

// 2. Retrieve data from the cache using `get()`.
async function getUserData(label: string) {
  console.log(`\n${label}: Requesting user/123`);
  const userData = await myCache.get('user/123'); // Default: stale-while-revalidate
  console.log(`${label}: User data received:`, userData);
}

// First call: data is not in cache (miss); triggers a fetch.
getUserData('Initial Call');

// Subsequent calls (within staleTime): data is returned instantly from cache; no fetch.
setTimeout(() => getUserData('Cached Call'), 500);

// Call after the query's staleTime: data is returned instantly, but a background fetch is triggered.
setTimeout(() => getUserData('Stale & Background Fetch'), 2500);

// Example of waiting for fresh data
async function getFreshUserData() {
  console.log('\n--- Requesting FRESH user data (waiting for fetch)... ---');
  try {
    const freshUserData = await myCache.get('user/123', { waitForFresh: true });
    console.log('Fresh user data received:', freshUserData);
  } catch (error) {
    console.error('Failed to get fresh user data:', error);
  }
}

// Waits for the background fetch triggered by the previous call (if still ongoing) or triggers a new one.
setTimeout(() => getFreshUserData(), 3000);
```
### API Usage

#### `new Cache(defaultOptions?: CacheOptions)`

Creates a new `Cache` instance with global default options.

```typescript
import { Cache } from '@asaidimu/utils-cache';

const cache = new Cache({
  staleTime: 5 * 60 * 1000,  // 5 minutes
  cacheTime: 30 * 60 * 1000, // 30 minutes
  maxSize: 1000,
});
```

#### `cache.registerQuery<T>(key: string, fetchFunction: () => Promise<T>, options?: CacheOptions): void`

Registers a data-fetching function associated with a unique `key`. This `fetchFunction` will be called when data for the `key` is not in cache, is stale, or is explicitly invalidated or refreshed.

- `key`: A unique string identifier for the data.
- `fetchFunction`: An `async` function that returns a `Promise` resolving to the data of type `T`.
- `options`: Optional `CacheOptions` to override the instance's default options for this specific query (e.g., a shorter `staleTime` for frequently changing data).

```typescript
cache.registerQuery('products/featured', async () => {
  const response = await fetch('https://api.example.com/products/featured');
  if (!response.ok) throw new Error('Failed to fetch featured products');
  return response.json();
}, {
  staleTime: 60 * 1000, // This query's data is stale after 1 minute
  retryAttempts: 5,     // It will retry fetching up to 5 times
});
```
#### `cache.get<T>(key: string, options?: { waitForFresh?: boolean; throwOnError?: boolean }): Promise<T | undefined>`

Retrieves data for a given `key`.

- If the data is fresh, it is returned immediately.
- If the data is stale (and `waitForFresh` is `false` or unset), it is returned immediately and a background refetch is triggered (stale-while-revalidate).
- If the data is not in cache (miss), a fetch is triggered.

Options:

- `waitForFresh`: If `true`, the method will await the `fetchFunction` and return fresh data. If `false` (default), it returns existing stale data immediately if available, otherwise `undefined` while a fetch runs in the background.
- `throwOnError`: If `true` and the `fetchFunction` fails after all retries, the promise returned by `get` rejects with the error. If `false` (default), it returns `undefined` on fetch failure, or the last successfully fetched data if available.

```typescript
// Basic usage (stale-while-revalidate)
const post = await cache.get('posts/latest');

// Wait for fresh data, throw if the fetch fails
try {
  const userProfile = await cache.get('user/profile', { waitForFresh: true, throwOnError: true });
  console.log('Latest user profile:', userProfile);
} catch (error) {
  console.error('Could not get fresh user profile due to an error:', error);
}
```
#### `cache.peek<T>(key: string): T | undefined`

Retrieves data from the cache without triggering any fetches or updating the `lastAccessed` time or `accessCount`. Useful for quick synchronous checks.

```typescript
const cachedValue = cache.peek('some-config-key');
if (cachedValue) {
  console.log('Value is in cache:', cachedValue);
} else {
  console.log('Value not found in cache.');
}
```
#### `cache.has(key: string): boolean`

Checks whether a non-stale, non-loading entry exists in the cache for the given `key`.

```typescript
if (cache.has('config/app')) {
  console.log('App config is ready and fresh.');
} else {
  console.log('App config is missing, stale, or currently loading.');
}
```
#### `cache.invalidate(key: string, refetch = true): Promise<void>`

Marks a specific cache entry as stale, forcing the next `get` call for that key to trigger a refetch. Optionally triggers an immediate background refetch.

- `key`: The cache key to invalidate.
- `refetch`: If `true` (default), triggers an immediate background fetch for the invalidated key using its registered `fetchFunction`.

```typescript
// After updating a user, invalidate their profile data so the next fetch is fresh
await cache.invalidate('user/123/profile');

// Invalidate without refetching until `get` is explicitly called later
await cache.invalidate('admin/dashboard/stats', false);
```
#### `cache.invalidatePattern(pattern: RegExp, refetch = true): Promise<void>`

Invalidates all cache entries whose keys match the given regular expression. Like `invalidate`, it optionally triggers immediate background refetches for all matched keys.

- `pattern`: A `RegExp` object to match against cache keys.
- `refetch`: If `true` (default), triggers immediate background fetches for all matched keys.

```typescript
// Invalidate all product-related data (e.g., after a mass product update)
await cache.invalidatePattern(/^products\//); // Matches 'products/1', 'products/list', etc.

// Invalidate all items containing 'temp' in their key, without an immediate refetch
await cache.invalidatePattern(/temp/, false);
```
#### `cache.prefetch(key: string): Promise<void>`

Triggers a background fetch for a `key` if it is not already in cache or is stale. Useful for loading data proactively before it is explicitly requested.

```typescript
// On application startup or route change, prefetch common data
cache.prefetch('static-content/footer');
cache.prefetch('user/notifications/unread');
```
#### `cache.refresh<T>(key: string): Promise<T | undefined>`

Forces a re-fetch of data for a given `key`, bypassing staleness checks and any existing fetch promises. This ensures you always get the latest data. Returns the fresh data, or `undefined` if the fetch ultimately fails.

```typescript
// After an API call modifies a resource, force-update its cached version
const updatedUser = await cache.refresh('user/current');
console.log('User data refreshed:', updatedUser);
```
#### `cache.setData<T>(key: string, data: T): void`

Manually sets or updates data in the cache for a given `key`. This immediately updates the cache entry, marks it as fresh (by setting `lastUpdated` to `Date.now()`), and triggers persistence if configured. It bypasses any registered `fetchFunction`.

```typescript
// Manually update a shopping-cart item count after a local UI interaction
cache.setData('cart/item-count', 5);

// Directly inject data fetched from another source or computed locally
const localConfig = { theme: 'dark', fontSize: 'medium' };
cache.setData('app/settings', localConfig);
```
#### `cache.remove(key: string): boolean`

Removes a specific entry from the cache. Returns `true` if an entry was found and removed, `false` otherwise. Also clears any ongoing fetches for that key and triggers persistence.

```typescript
// When a user logs out, remove their specific session data
cache.remove('user/session');
```
#### `cache.on<EType extends CacheEventType>(event: EType, listener: (ev: Extract<CacheEvent, { type: EType }>) => void): void`

Subscribes a listener function to specific cache events.

- `event`: The type of event to listen for (e.g., `'hit'`, `'miss'`, `'error'`, `'persistence'`). See `CacheEventType` in `types.ts` for all available types.
- `listener`: A callback function that receives the specific event payload for the subscribed event type.

```typescript
import { Cache, CacheEvent, CacheEventType } from '@asaidimu/utils-cache';

const myCache = new Cache();

myCache.on('hit', (e) => {
  console.log(`[CacheEvent] HIT for ${e.key} (isStale: ${e.isStale})`);
});
myCache.on('miss', (e) => {
  console.log(`[CacheEvent] MISS for ${e.key}`);
});
myCache.on('error', (e) => {
  console.error(`[CacheEvent] ERROR for ${e.key} (attempt ${e.attempt}):`, e.error.message);
});
myCache.on('persistence', (e) => {
  if (e.event === 'save_success') {
    console.log(`[CacheEvent] Persistence: Cache state saved successfully for ID: ${e.key}`);
  } else if (e.event === 'load_fail') {
    console.error(`[CacheEvent] Persistence: Failed to load cache state for ID: ${e.key}`, e.error);
  } else if (e.event === 'remote_update') {
    console.log(`[CacheEvent] Persistence: Cache state updated from remote source for ID: ${e.key}`);
  }
});

// For demonstration, register a query and trigger events
myCache.registerQuery('demo-item', async () => {
  console.log('--- Fetching demo-item ---');
  await new Promise(r => setTimeout(r, 200));
  return 'demo-data';
}, { staleTime: 100 });

myCache.get('demo-item');                        // Triggers miss, fetch, set_data
setTimeout(() => myCache.get('demo-item'), 50);  // Triggers hit
setTimeout(() => myCache.get('demo-item'), 150); // Triggers stale hit, background fetch
```
#### `cache.off<EType extends CacheEventType>(event: EType, listener: (ev: Extract<CacheEvent, { type: EType }>) => void): void`

Unsubscribes a previously registered listener from a cache event. The `listener` reference must be the exact same function that was passed to `on()`.

```typescript
const myHitLogger = (e: any) => console.log(`[Log] Cache Hit: ${e.key}`);
myCache.on('hit', myHitLogger);

// Later, when you no longer need the listener:
myCache.off('hit', myHitLogger);
```
#### `cache.getStats(): { size: number; metrics: CacheMetrics; hitRate: number; staleHitRate: number; entries: Array<{ key: string; lastAccessed: number; lastUpdated: number; accessCount: number; isStale: boolean; isLoading?: boolean; error?: boolean }> }`

Returns current cache statistics and detailed metrics.

- `size`: Number of active entries in the cache.
- `metrics`: An object containing raw counts (`hits`, `misses`, `fetches`, `errors`, `evictions`, `staleHits`).
- `hitRate`: Ratio of hits to total requests (hits + misses).
- `staleHitRate`: Ratio of stale hits to total hits.
- `entries`: An array of objects providing details for each cached item (key, lastAccessed, lastUpdated, accessCount, isStale, isLoading, error status).

```typescript
const stats = myCache.getStats();
console.log('Cache Size:', stats.size);
console.log('Metrics:', stats.metrics);
console.log('Overall Hit Rate:', (stats.hitRate * 100).toFixed(2) + '%');
console.log('Entries details:', stats.entries);
```
#### `cache.clear(): Promise<void>`

Clears all data from the in-memory cache, resets metrics, and attempts to clear the associated persisted state via the `persistence` layer.

```typescript
console.log('Clearing cache...');
await myCache.clear();
console.log('Cache cleared. Current size:', myCache.getStats().size);
```
#### `cache.destroy(): void`

Shuts down the cache instance: clears all data, stops the automatic garbage-collection timer, unsubscribes from persistence updates, and clears all internal maps. Call this when the cache instance is no longer needed (e.g., on application shutdown or component unmount) to prevent memory leaks and ensure proper cleanup.

```typescript
myCache.destroy();
console.log('Cache instance destroyed. All timers stopped and data cleared.');
```
### Configuration Examples

The `CacheOptions` interface provides extensive control over the cache's behavior:

```typescript
import { Cache, CacheOptions, SimplePersistence, SerializableCacheState } from '@asaidimu/utils-cache';

// A mock persistence layer for demonstration purposes.
// In a real application, you'd use an actual implementation like IndexedDBPersistence.
class MockPersistence implements SimplePersistence<SerializableCacheState> {
  private store = new Map<string, SerializableCacheState>();
  private subscribers = new Map<string, Array<(data: SerializableCacheState) => void>>();

  async get(id: string): Promise<SerializableCacheState | undefined> {
    console.log(`[MockPersistence] Getting state for ID: ${id}`);
    return this.store.get(id);
  }

  async set(id: string, data: SerializableCacheState): Promise<void> {
    console.log(`[MockPersistence] Setting state for ID: ${id}`);
    this.store.set(id, data);
    // Simulate remote-update notification to all subscribed instances
    this.subscribers.get(id)?.forEach(cb => cb(data));
  }

  async clear(id?: string): Promise<void> {
    console.log(`[MockPersistence] Clearing state ${id ? 'for ID: ' + id : '(all)'}`);
    if (id) {
      this.store.delete(id);
    } else {
      this.store.clear();
    }
  }

  subscribe(id: string, callback: (data: SerializableCacheState) => void): () => void {
    console.log(`[MockPersistence] Subscribing to ID: ${id}`);
    if (!this.subscribers.has(id)) {
      this.subscribers.set(id, []);
    }
    this.subscribers.get(id)?.push(callback);
    // Return an unsubscribe function
    return () => {
      const callbacks = this.subscribers.get(id);
      if (callbacks) {
        this.subscribers.set(id, callbacks.filter(cb => cb !== callback));
      }
      console.log(`[MockPersistence] Unsubscribed from ID: ${id}`);
    };
  }
}

const fullOptions: CacheOptions = {
  staleTime: 1000 * 60 * 5,  // 5 minutes: after this time, data is stale; a background fetch is considered.
  cacheTime: 1000 * 60 * 60, // 1 hour: items idle (not accessed) for this long are eligible for garbage collection.
  retryAttempts: 3,          // Max 3 fetch attempts (initial + 2 retries) on network/fetch failures.
  retryDelay: 1000,          // 1-second initial delay for retries (doubles each subsequent attempt).
  maxSize: 2000,             // Keep up to 2000 entries; LRU eviction kicks in beyond this limit.
  enableMetrics: true,       // Enable performance tracking (hits, misses, fetches, etc.).

  persistence: new MockPersistence(),        // An instance of your persistence-layer implementation.
  persistenceId: 'my-unique-cache-instance', // A unique identifier for this cache instance within the persistence store.
  persistenceDebounceTime: 750,              // Wait 750ms after a cache change before writing to persistence, to batch writes.

  // Custom serializers/deserializers for data that isn't natively JSON-serializable (e.g., Maps, Dates, custom classes).
  serializeValue: (value: any) => {
    // Convert Date objects to ISO strings for JSON serialization
    if (value instanceof Date) {
      return { _type: 'Date', data: value.toISOString() };
    }
    // Convert Map objects to an entries array for JSON serialization
    if (value instanceof Map) {
      return { _type: 'Map', data: Array.from(value.entries()) };
    }
    return value; // Return other types as-is
  },
  deserializeValue: (value: any) => {
    // Convert ISO strings back to Date objects
    if (typeof value === 'object' && value !== null && value._type === 'Date') {
      return new Date(value.data);
    }
    // Convert entries arrays back to Map objects
    if (typeof value === 'object' && value !== null && value._type === 'Map') {
      return new Map(value.data);
    }
    return value; // Return other types as-is
  },
};

const configuredCache = new Cache(fullOptions);
```
### Common Use Cases

#### Caching API Responses with SWR (Stale-While-Revalidate)

This is the default and most common pattern: you prioritize immediate responsiveness while ensuring data freshness in the background.

```typescript
import { Cache } from '@asaidimu/utils-cache';

const apiCache = new Cache({
  staleTime: 5 * 60 * 1000,  // Data considered stale after 5 minutes
  cacheTime: 30 * 60 * 1000, // Idle data garbage collected after 30 minutes
  retryAttempts: 3,          // Retry fetching on network failures
});

// Register a query for a list of blog posts
apiCache.registerQuery('blog/posts', async () => {
  console.log('--- Fetching ALL blog posts from API... ---');
  const response = await fetch('https://api.example.com/blog/posts');
  if (!response.ok) throw new Error('Failed to fetch blog posts');
  return response.json();
});

// Function to display blog posts
async function displayBlogPosts(source: string) {
  console.log(`\nDisplaying blog posts from: ${source}`);
  const posts = await apiCache.get('blog/posts'); // Uses SWR by default
  if (posts) {
    console.log(`Received ${posts.length} posts (first 2):`, posts.slice(0, 2).map((p: any) => p.title));
  } else {
    console.log('No posts yet, waiting for initial fetch...');
  }
}

displayBlogPosts('Initial Load'); // First `get`: cache miss, triggers fetch.
setTimeout(() => displayBlogPosts('After 1 sec (cached)'), 1000); // Second `get`: cache hit, returns instantly.
setTimeout(() => displayBlogPosts('After 6 mins (stale & background fetch)'), 6 * 60 * 1000); // After `staleTime`: returns cached data, triggers background fetch.
```
#### Using `waitForFresh` for Critical Data

For scenarios where serving outdated data is unacceptable (e.g., user permissions, critical configuration).

```typescript
import { Cache } from '@asaidimu/utils-cache';

const criticalCache = new Cache({ retryAttempts: 5, retryDelay: 1000 });

criticalCache.registerQuery('user/permissions', async () => {
  console.log('--- Fetching user permissions from API... ---');
  // Simulate potential network flakiness
  if (Math.random() > 0.7) {
    throw new Error('Network error during permission fetch!');
  }
  await new Promise(resolve => setTimeout(resolve, 500));
  return { canEdit: true, canDelete: false, roles: ['user', 'editor'] };
});

async function checkPermissionsBeforeAction() {
  console.log('\nAttempting to get FRESH user permissions...');
  try {
    // We MUST have the latest permissions before proceeding with a sensitive action
    const permissions = await criticalCache.get('user/permissions', { waitForFresh: true, throwOnError: true });
    console.log('User permissions received:', permissions);
    // Proceed with the action based on permissions
  } catch (error) {
    console.error('CRITICAL: Failed to load user permissions:', error);
    // Redirect to an error page, show a critical alert, or disable functionality
  }
}

checkPermissionsBeforeAction();
// Call this repeatedly in a test scenario to observe retries and eventual success/failure
setInterval(() => checkPermissionsBeforeAction(), 3000);
```
#### Real-time Monitoring with Events

Use the comprehensive event system to log, monitor, or react to cache lifecycle events.

```typescript
import { Cache } from '@asaidimu/utils-cache';

const monitorCache = new Cache({ enableMetrics: true });

monitorCache.registerQuery('stock/AAPL', async () => {
  const price = Math.random() * 100 + 150;
  console.log(`--- Fetching AAPL price: $${price.toFixed(2)} ---`);
  return { symbol: 'AAPL', price: parseFloat(price.toFixed(2)), timestamp: Date.now() };
}, { staleTime: 1000 }); // Very short staleTime for frequent fetches

// Subscribe to various cache events
monitorCache.on('fetch', (e) => {
  console.log(`[EVENT] Fetching ${e.key} (attempt ${e.attempt})`);
});
monitorCache.on('hit', (e) => {
  console.log(`[EVENT] Cache hit for ${e.key}. Stale: ${e.isStale}`);
});
monitorCache.on('miss', (e) => {
  console.log(`[EVENT] Cache miss for ${e.key}`);
});
monitorCache.on('eviction', (e) => {
  console.log(`[EVENT] Evicted ${e.key} due to ${e.reason}`);
});
monitorCache.on('set_data', (e) => {
  console.log(`[EVENT] Data for ${e.key} manually set. Old price: ${e.oldData?.price}, New price: ${e.newData.price}`);
});
monitorCache.on('persistence', (e) => {
  if (e.event === 'save_success') console.log(`[EVENT] Persistence: ${e.message || 'Save successful'}`);
});

// Continuously request data (triggers fetches due to the short staleTime)
setInterval(() => {
  monitorCache.get('stock/AAPL');
}, 500);

// Manually set data to trigger 'set_data' and 'persistence' events
setTimeout(() => {
  monitorCache.setData('stock/AAPL', { symbol: 'AAPL', price: 160.00, timestamp: Date.now() });
}, 3000);

// Log cache statistics periodically
setInterval(() => {
  const stats = monitorCache.getStats();
  console.log(`\n--- CACHE STATS ---`);
  console.log(`Size: ${stats.size}, Hits: ${stats.metrics.hits}, Misses: ${stats.metrics.misses}, Fetches: ${stats.metrics.fetches}`);
  console.log(`Hit Rate: ${(stats.hitRate * 100).toFixed(2)}%, Stale Hit Rate: ${(stats.staleHitRate * 100).toFixed(2)}%`);
  console.log(`Active entries: ${stats.entries.map(e => `${e.key} (stale:${e.isStale})`).join(', ')}`);
  console.log(`-------------------\n`);
}, 5000); // Log stats every 5 seconds
```
## Project Architecture

The `@asaidimu/utils-cache` library is structured to provide a clear separation of concerns, making it modular, testable, and extensible.

### Directory Structure

```
src/cache/
├── cache.ts       # Main Cache class implementation
├── index.ts       # Entry point for the module (re-exports Cache)
├── types.ts       # TypeScript interfaces and types for options, entries, events, etc.
└── cache.test.ts  # Unit tests for the Cache class
package.json       # Package metadata and dependencies for this specific module
```
### Core Components

- `Cache` class (`cache.ts`): The central component of the library. It orchestrates all caching logic, including:
  - Managing the in-memory `Map` (`this.cache`) that stores `CacheEntry` objects.
  - Handling data fetching, retries, and staleness checks.
  - Implementing time-based (TTL) and size-based (LRU) garbage collection.
  - Integrating with the pluggable persistence layer.
  - Emitting detailed cache events.
  - Tracking performance metrics.
- `CacheOptions` (`types.ts`): An interface defining the configurable parameters for a `Cache` instance or individual queries. This includes `staleTime`, `cacheTime`, `retryAttempts`, `maxSize`, persistence settings, and custom serialization/deserialization functions.
- `CacheEntry` (`types.ts`): Represents a single item stored within the cache. It encapsulates the actual `data`, the `lastUpdated` and `lastAccessed` timestamps, the `accessCount`, and flags like `isLoading` or `error` status.
- `QueryConfig` (`types.ts`): Stores the `fetchFunction` and the resolved `CacheOptions` (merged with instance defaults) for each registered query, enabling tailored behavior per data key.
- `CacheMetrics` (`types.ts`): Defines the structure for tracking cache performance statistics, including hits, misses, fetches, errors, and evictions.
- `SimplePersistence<SerializableCacheState>` (from `@asaidimu/utils-persistence`): An external interface that `Cache` relies on for persistent storage. It requires implementations of `get()`, `set()`, `clear()`, and optionally `subscribe()` methods to handle data serialization and deserialization for the specific storage medium (e.g., IndexedDB, LocalStorage, or a remote backend).
- `CacheEvent` / `CacheEventType` (`types.ts`): A union type defining all possible events emitted by the cache (e.g., `'hit'`, `'miss'`, `'fetch'`, `'error'`, `'eviction'`, `'invalidation'`, `'set_data'`, `'persistence'`). This enables a fine-grained observability model for the cache's lifecycle.
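To make the relationships above concrete, here is a hypothetical sketch of the entry shape together with a staleness check, inferred from the component descriptions in this section. It is illustrative only and not the library's actual type definitions (see `types.ts` for those):

```typescript
// Hypothetical shapes inferred from the descriptions above — NOT the
// library's exact definitions.
interface CacheEntry<T> {
  data: T;
  lastUpdated: number;   // set when a fetch or setData() succeeds
  lastAccessed: number;  // refreshed on each get()
  accessCount: number;
  isLoading?: boolean;
  error?: unknown;
}

// An entry is stale once staleTime has elapsed since its last update.
function isStale(entry: CacheEntry<unknown>, staleTime: number, now: number = Date.now()): boolean {
  return now - entry.lastUpdated > staleTime;
}

const entry: CacheEntry<string> = {
  data: 'hello',
  lastUpdated: Date.now() - 10_000, // last updated 10 seconds ago
  lastAccessed: Date.now(),
  accessCount: 1,
};
console.log(isStale(entry, 5_000));  // stale: 10s elapsed > 5s staleTime
console.log(isStale(entry, 60_000)); // fresh: 10s elapsed < 60s staleTime
```

The same timestamps also drive eviction: `lastAccessed` feeds the TTL and LRU policies described under Data Flow below.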
### Data Flow

1. Initialization:
   - The `Cache` constructor sets up global default options, initializes performance metrics, and starts the automatic garbage-collection timer.
   - If a `persistence` layer is configured, it attempts to load a previously saved state using `persistence.get()`.
   - It then subscribes via `persistence.subscribe()` (if available) to listen for remote state changes from the underlying storage, ensuring cache consistency across multiple instances or processes.
2. `registerQuery`:
   - When `registerQuery(key, fetchFunction, options)` is called, the `fetchFunction` and its specific `options` (merged with the global `defaultOptions`) are stored internally in the `this.queries` map. This prepares the cache to handle requests for that `key`.
3. `get` Request:
   - When `get(key, options)` is invoked, `Cache` first checks `this.cache` for an existing `CacheEntry` for the `key`.
   - Cache Hit: If an entry exists, `lastAccessed` and `accessCount` are updated, a `'hit'` event is emitted, and metrics are incremented. The entry's staleness is then evaluated against `staleTime`:
     - If `waitForFresh` is `true` and the entry is stale or still loading, it proceeds to `fetchAndWait`.
     - If `waitForFresh` is `false` (default) and the entry is stale, the cached data is returned immediately, and a background `fetch` is triggered to update it.
     - If `waitForFresh` is `false` and the entry is fresh, the cached data is returned immediately.
   - Cache Miss: If no entry exists, a `'miss'` event is emitted. A placeholder `CacheEntry` (marked `isLoading`) is created, and a `fetch` is immediately triggered to retrieve the data.
4. `fetch` / `fetchAndWait`:
   - These methods ensure that only one `fetchFunction` runs concurrently for a given `key` by tracking ongoing fetches in `this.fetching`.
   - They delegate the actual data retrieval and retry logic to `performFetchWithRetry`.
- `performFetchWithRetry`:
    - This is where the registered `fetchFunction` is executed. It attempts to call the `fetchFunction` multiple times (up to `retryAttempts`) with exponential backoff (`retryDelay`).
    - Before each attempt, a `'fetch'` event is emitted and the `fetches` metric is updated.
    - On Success: The `CacheEntry` is updated with the new `data` and `lastUpdated` timestamp, and its `isLoading` flag is set to `false`. The cache then calls `schedulePersistState()` to save the updated state and `enforceSizeLimit()` to maintain `maxSize`.
    - On Failure: If the `fetchFunction` fails, an `'error'` event is emitted and the `errors` metric is updated. If retry attempts remain, it waits for the backoff `delay` and retries. After all attempts are exhausted, the `CacheEntry` is updated with the last `error`, `isLoading` is set to `false`, and `schedulePersistState()` is called.
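The retry loop can be sketched as follows. The doubling schedule (`retryDelay * 2^i`) is an assumption about how the exponential backoff is computed; the library's exact formula and naming may differ:

```typescript
// Assumed backoff schedule: attempt i (zero-based) waits retryDelay * 2^i.
function backoffDelays(retryAttempts: number, retryDelay: number): number[] {
  return Array.from({ length: retryAttempts }, (_, i) => retryDelay * 2 ** i);
}

// Hypothetical retry loop: one initial attempt plus retryAttempts retries.
async function performFetchWithRetrySketch<T>(
  fetchFn: () => Promise<T>,
  retryAttempts: number,
  retryDelay: number,
): Promise<T> {
  let lastError: unknown;
  for (const delay of [0, ...backoffDelays(retryAttempts, retryDelay)]) {
    if (delay > 0) await new Promise((r) => setTimeout(r, delay));
    try {
      // A 'fetch' event would be emitted here before each attempt.
      return await fetchFn();
    } catch (err) {
      lastError = err; // an 'error' event and errors metric update go here
    }
  }
  throw lastError; // all attempts exhausted: entry stores this error
}
```

With `retryAttempts: 3` and `retryDelay: 1000`, the waits between attempts would be 1000 ms, 2000 ms, and 4000 ms under this assumed schedule.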
- `schedulePersistState`:
    - This method debounces write operations to the `persistence` layer. It prevents excessive writes by waiting for a configurable `persistenceDebounceTime` before serializing the current cache state (using `serializeCache` and `serializeValue`) and writing it via `persistence.set()`. Appropriate `'persistence'` events (`save_success` / `save_fail`) are emitted.
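The debouncing mechanic looks roughly like this. `DebouncedWriter` and its `flush()` helper are hypothetical names for illustration, not the library's API:

```typescript
// Collapses bursts of schedule() calls into a single write after a
// quiet period of debounceMs, mirroring schedulePersistState.
class DebouncedWriter {
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private write: () => void, // e.g. persistence.set(serializeCache(state))
    private debounceMs: number,
  ) {}

  schedule(): void {
    // Each new call resets the window, so only the last one fires.
    if (this.timer !== null) clearTimeout(this.timer);
    this.timer = setTimeout(() => this.flush(), this.debounceMs);
  }

  flush(): void {
    if (this.timer !== null) clearTimeout(this.timer);
    this.timer = null;
    this.write();
  }
}
```

Three rapid cache mutations therefore cost one storage write instead of three, at the price of a small persistence lag bounded by the debounce window.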
- `handleRemoteStateChange`:
    - This callback is invoked by the `persistence` layer's `subscribe` mechanism when an external change to the persisted state is detected. It deserializes the `remoteState` (using `deserializeValue`) and intelligently updates the local `this.cache` to reflect the external changes, emitting a `'persistence'` event (`remote_update`).
- `garbageCollect`:
    - Running on a `setInterval` timer (`gcTimer`), this method periodically scans `this.cache`. It removes any `CacheEntry` that has not been accessed (`lastAccessed`) for longer than its (or the global) `cacheTime`, emitting `'eviction'` events.
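The sweep itself reduces to a time comparison per entry, sketched here with a minimal entry shape (the real `CacheEntry` carries more fields):

```typescript
// Illustrative entry: only the idle-tracking field matters for GC.
interface IdleEntry {
  lastAccessed: number;
}

// Removes entries idle longer than cacheTime; returns evicted keys.
function sweep(
  cache: Map<string, IdleEntry>,
  cacheTime: number,
  now: number = Date.now(),
): string[] {
  const evicted: string[] = [];
  for (const [key, entry] of cache) {
    if (now - entry.lastAccessed > cacheTime) {
      cache.delete(key); // an 'eviction' event would be emitted here
      evicted.push(key);
    }
  }
  return evicted;
}
```

Deleting during iteration is safe for JavaScript `Map`s, so a single pass suffices.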
- `enforceSizeLimit`:
    - Triggered after successful data updates (`fetch` success or `setData`). If `cache.size` exceeds `maxSize`, it evicts the Least Recently Used (LRU) entries until the limit is satisfied, emitting `'eviction'` events.
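LRU eviction picks the entry with the oldest `lastAccessed` timestamp until the cache fits. A sketch under the same illustrative entry shape:

```typescript
// Illustrative entry: only recency matters for LRU selection.
interface LruEntry {
  lastAccessed: number;
}

// Evicts least-recently-used entries until cache.size <= maxSize.
function enforceSizeLimitSketch(
  cache: Map<string, LruEntry>,
  maxSize: number,
): string[] {
  const evicted: string[] = [];
  while (cache.size > maxSize) {
    let lruKey: string | null = null;
    let lruTime = Infinity;
    for (const [key, entry] of cache) {
      if (entry.lastAccessed < lruTime) {
        lruTime = entry.lastAccessed;
        lruKey = key;
      }
    }
    if (lruKey === null) break;
    cache.delete(lruKey); // an 'eviction' event would be emitted here
    evicted.push(lruKey);
  }
  return evicted;
}
```

A linear scan per eviction is fine at typical cache sizes; a heap or ordered structure would only pay off for very large `maxSize` values.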
### Extension Points

The design of `@asaidimu/utils-cache` provides several powerful extension points for customization and integration:

- `SimplePersistence` Interface: This is the primary mechanism for integrating `Cache` with various storage backends. By implementing this interface, you can use `Cache` with `localStorage`, `IndexedDB` (e.g., via `@asaidimu/utils-persistence`), a custom database, a server-side cache, or any other persistent storage solution.
- `serializeValue` / `deserializeValue` Options: These functions within `CacheOptions` allow you to define custom logic for how your specific data types are converted to and from a serializable format (e.g., JSON-compatible strings or objects) before being passed to and received from the `persistence` layer. This is crucial for handling `Date` objects, `Map`s, `Set`s, or custom class instances.
- Event Listeners (`on` / `off`): The comprehensive event system allows you to subscribe to a wide range of cache lifecycle events. This enables powerful integrations for:
    - Logging: Detailed logging of cache activity (hits, misses, errors, evictions).
    - Analytics: Feeding cache performance metrics into an analytics platform.
    - UI Reactivity: Updating UI components in response to cache changes (e.g., showing a "stale data" indicator or a "refreshing" spinner).
    - Debugging: Gaining deep insights into cache behavior during development.
    - External Synchronization: Triggering side effects or synchronizing with other systems based on cache events.
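To illustrate the first extension point, here is a hypothetical in-memory `SimplePersistence` implementation. The interface shape below is an assumption inferred from the operations this README mentions (`get`, `set`, `clear`, `subscribe`); check the package's type definitions for the real signatures:

```typescript
type Unsubscribe = () => void;

// Assumed interface shape, not the library's published one.
interface SimplePersistenceSketch<T> {
  get(): T | undefined;
  set(state: T): void;
  clear(): void;
  subscribe(listener: (state: T) => void): Unsubscribe;
}

// A trivial backend: state lives in a field, subscribers are notified
// synchronously on every set(). A localStorage or IndexedDB backend
// would follow the same contract.
class MemoryPersistence<T> implements SimplePersistenceSketch<T> {
  private state: T | undefined;
  private listeners = new Set<(state: T) => void>();

  get(): T | undefined {
    return this.state;
  }

  set(state: T): void {
    this.state = state;
    for (const l of this.listeners) l(state); // fan out to subscribers
  }

  clear(): void {
    this.state = undefined;
  }

  subscribe(listener: (state: T) => void): Unsubscribe {
    this.listeners.add(listener);
    return () => this.listeners.delete(listener);
  }
}
```

The `subscribe` half is what powers cross-instance synchronization: a real backend would invoke listeners when the underlying storage changes (e.g., the browser's `storage` event for `localStorage`).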
## 🤝 Development & Contributing

We welcome contributions to `@asaidimu/utils-cache`! Whether it's a bug fix, a new feature, or an improvement to the documentation, your help is appreciated.
### Development Setup

To set up the development environment for `@asaidimu/utils-cache`:

- Clone the monorepo:
  ```bash
  git clone https://github.com/asaidimu/erp-utils.git
  cd erp-utils
  ```
- Navigate to the cache package:
  ```bash
  cd src/cache
  ```
- Install dependencies:
  ```bash
  npm install   # or: yarn install / bun install
  ```
- Build the project:
  ```bash
  npm run build   # or: yarn build / bun run build
  ```
### Scripts

The following `npm` scripts are typically available in this project's setup:

- `npm run build`: Compiles TypeScript source files from `src/` to JavaScript output in `dist/`.
- `npm run test`: Runs the test suite using Vitest.
- `npm run test:watch`: Runs tests in watch mode for continuous feedback during development.
- `npm run lint`: Runs ESLint to check for code style issues and potential errors.
- `npm run format`: Formats code using Prettier according to the project's style guidelines.
### Testing

Tests are written using Vitest. To run them:

```bash
npm test   # or: yarn test / bun test
```

We aim for high test coverage. Please ensure that new features or bug fixes come with appropriate unit and/or integration tests to maintain code quality and prevent regressions.
### Contributing Guidelines

Please follow these steps to contribute:

- Fork the repository on GitHub.
- Create a new branch for your feature or bug fix: `git checkout -b feature/my-awesome-feature` or `bugfix/resolve-issue-123`.
- Make your changes, ensuring they adhere to the existing code style and architecture.
- Write or update tests to cover your changes and ensure existing functionality is not broken.
- Ensure all tests pass locally by running `npm test`.
- Run lint and format checks (`npm run lint` and `npm run format`) and fix any reported issues.
- Write clear, concise commit messages following the Conventional Commits specification (e.g., `feat: add new caching strategy`, `fix: correct staleTime calculation`).
- Push your branch to your fork.
- Open a Pull Request against the `main` branch of the original repository, with a detailed description of your changes and why they are necessary.
### Issue Reporting
Found a bug, have a feature request, or need clarification? Please open an issue on our GitHub Issues page.
When reporting a bug, please include:
- A clear and concise description of the issue.
- Detailed steps to reproduce the behavior.
- The expected behavior.
- Any relevant screenshots or code snippets.
- Your environment details (Node.js version, OS, browser, package version).
## 📚 Additional Information

### Troubleshooting
- "No query registered for key: `key`" Error:
    - Cause: This error occurs if you call `get()`, `prefetch()`, or `refresh()` with a `key` that has not previously been associated with a `fetchFunction` via `cache.registerQuery()`.
    - Solution: Call `cache.registerQuery(key, fetchFunction)` for every `key` you intend to use before attempting to retrieve data.
- Data not persisting:
    - Cause: The cache state is not being correctly saved to or loaded from the underlying storage.
    - Solution:
        - `persistence` instance: Double-check that you are passing a valid `SimplePersistence` instance to the `Cache` constructor's `persistence` option.
        - `persistenceId`: Ensure you've provided a unique `persistenceId` if multiple cache instances share the same persistence layer.
        - Serialization: Verify that your data types are correctly handled by the `serializeValue` and `deserializeValue` options, especially non-JSON-serializable types like `Map`s, `Date` objects, or custom classes.
        - Persistence layer: Confirm your `SimplePersistence` implementation correctly handles `get()`, `set()`, `clear()`, and `subscribe()` for the specific storage medium (e.g., local storage quota, IndexedDB permissions).
        - Event errors: Check for `persistence` event errors in your browser's or Node.js console (`cache.on('persistence', ...)`).
- Cache not evicting data:
    - Cause: Eviction policies might be disabled or configured with very long durations.
    - Solution:
        - `cacheTime`: Ensure `cacheTime` in `CacheOptions` is a finite, non-zero positive number (in milliseconds). `Infinity` or `0` for `cacheTime` disables time-based garbage collection.
        - `maxSize`: Ensure `maxSize` is a finite, non-zero positive number. `Infinity` disables size-based LRU eviction, and `0` means the cache will always be empty (evicting immediately).
        - Garbage collection interval: Garbage collection runs only periodically, so recently expired entries may linger until the next pass. If entries never seem to be collected, verify that `cacheTime` is not set so high that they effectively never expire.
- Event listeners not firing:
    - Cause: The listener may have been removed, or the expected event is not actually occurring.
    - Solution:
        - Correct event type: Ensure you are subscribing to the exact `CacheEventType` you expect (e.g., `'hit'`, `'error'`).
        - `enableMetrics`: If you expect metric-related events or updates, ensure `enableMetrics` is not set to `false` in your `CacheOptions`.
        - Listener reference: When using `off()`, ensure the listener function is the exact same reference that was passed to `on()`.
### FAQ

**Q: How does `staleTime` differ from `cacheTime`?**

A: `staleTime` determines when a cached entry is considered "stale." Once `Date.now() - entry.lastUpdated` exceeds `staleTime`, the data is marked stale. If `waitForFresh` is `false` (the default), `get()` will return the stale data immediately while triggering a background refetch. `cacheTime`, on the other hand, determines how long an item may remain unaccessed (idle) before it becomes eligible for garbage collection and removal from the cache. An item can be stale yet still within its `cacheTime`.
**Q: When should I use `waitForFresh`?**

A: Use `waitForFresh: true` when your application absolutely needs the most up-to-date data before proceeding and cannot tolerate serving stale data. This blocks until the `fetchFunction` successfully resolves. For most UI display purposes, where latency matters and a slightly outdated view is acceptable, `waitForFresh: false` (the default SWR behavior) is usually preferred, as it provides an immediate response.
**Q: Can I use `Cache` in a web worker?**

A: Yes, `Cache` is designed to be environment-agnostic. Its persistence mechanism is pluggable, so you can implement a `SimplePersistence` that works within a web worker (e.g., using IndexedDB directly or communicating with the main thread via `postMessage`).
**Q: Is `Cache` thread-safe (or safe with concurrent access)?**

A: JavaScript is single-threaded within a runtime, and `Cache` manages its internal state with `Map`s and `Promise`s. For concurrent `get` requests to the same key, it ensures only one `fetchFunction` runs at a time via the `this.fetching` map, preventing redundant fetches. It is therefore safe for concurrent access within a single JavaScript runtime context. Across multiple runtimes (e.g., different browser tabs or Node.js processes), the `persistence` layer's `subscribe` mechanism handles synchronization.
### Changelog/Roadmap
For a detailed history of changes, new features, and bug fixes, please refer to the CHANGELOG.md file in the repository root. Our future plans are outlined in the ROADMAP.md (if available).
### License
This project is licensed under the MIT License - see the LICENSE file for details.
### Acknowledgments

- Inspired by modern data fetching and caching libraries like React Query and SWR.
- Uses the `uuid` library for generating unique cache instance IDs.