# @keyvhq/memoize

Memoize any function using Keyv as storage backend.
## Install

```
npm install --save @keyvhq/memoize
```

## Usage
```js
const memoize = require('@keyvhq/memoize')

const memoizedRequest = memoize(request)

memoizedRequest('http://example.com').then(res => { /* from request */ })
memoizedRequest('http://example.com').then(res => { /* from cache */ })
```

You can pass a Keyv instance, or Keyv options, to be used as the second argument:
```js
const memoize = require('@keyvhq/memoize')
const Keyv = require('@keyvhq/core')

memoize(request, { store: new Keyv({ namespace: 'ssr' }) })
```

## Defining the key
By default, the first argument of your function call is used as the cache key.

You can pass a key function to define how the cache key is computed. The key function will be called with the same arguments as the memoized function.
```js
const sum = (n1, n2) => n1 + n2

const memoized = memoize(sum, new Keyv(), {
  key: (n1, n2) => `${n1}+${n2}`
})

// cached as { '1+2': 3 }
memoized(1, 2)
```

The library uses flood protection internally, based on the result of the key function. This means you can make as many simultaneous calls as you want while being sure you won't flood your async resource.
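For instance, concurrent calls that resolve to the same key should share a single in-flight call to the underlying function. A minimal sketch, assuming flood protection works as described above (`slowFetch` is a hypothetical stand-in for your async resource):

```js
const memoize = require('@keyvhq/memoize')
const Keyv = require('@keyvhq/core')

// Hypothetical slow async resource, used only for illustration.
let calls = 0
const slowFetch = async url => {
  calls++
  await new Promise(resolve => setTimeout(resolve, 100))
  return `body of ${url}`
}

const memoizedFetch = memoize(slowFetch, new Keyv())

// Both calls map to the same key ('http://example.com'),
// so only one call to slowFetch is expected.
Promise.all([
  memoizedFetch('http://example.com'),
  memoizedFetch('http://example.com')
]).then(() => console.log(calls)) // expected: 1
```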
## Setup your TTL

Set `ttl` to a number for a static TTL value.

```js
const memoizedRequest = memoize(request, new Keyv(), { ttl: 60000 })

// cached for 60 seconds
memoizedRequest('http://example.com')
```

Set `ttl` to a function for a dynamic TTL value.
```js
const memoizedRequest = memoize(request, new Keyv(), {
  ttl: (res) => res.statusCode === 200 ? 60000 : 0
})

// cached for 60 seconds, only if the response was 200 OK
memoizedRequest('http://example.com')
```

## Stale support
Set `staleTtl` to any number of milliseconds.

If the remaining TTL of a requested resource is below this staleness threshold, the stale value is still returned while the value is asynchronously refreshed in the background.
```js
const memoizedRequest = memoize(request, new Keyv(), {
  ttl: 60000,
  staleTtl: 10000
})

// cached for 60 seconds
memoizedRequest('http://example.com')

// … 55 seconds later, our cached copy will expire in 5 seconds,
// which is below the staleness threshold of 10 seconds:
// the cached result is returned and the cache is refreshed in the background
memoizedRequest('http://example.com')
```

When the `staleTtl` option is set, expired items won't be deleted either; the same logic as above applies, as shown below.
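In other words, even after the TTL has fully elapsed, a call may still resolve with the stale value while a refresh happens in the background. A minimal sketch of that behavior, with assumed timings:

```js
const memoizedRequest = memoize(request, new Keyv(), {
  ttl: 60000,
  staleTtl: 10000
})

memoizedRequest('http://example.com')

// … 61 seconds later, past the 60 second TTL:
// the expired item is still around, so the stale result is returned
// while the cache is refreshed in the background
memoizedRequest('http://example.com')
```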
## API

### memoize(fn, [keyvOptions], [options])
#### fn

Type: `Function`
Required

Promise-returning or async function to be memoized.
#### keyvOptions

Type: `Object`

The Keyv instance or Keyv options to be used.
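Either form should work. A minimal sketch, assuming plain options are used to create a Keyv instance internally (the namespace value is just an example):

```js
const memoize = require('@keyvhq/memoize')
const Keyv = require('@keyvhq/core')

// Pass an existing Keyv instance …
const memoizedWithInstance = memoize(request, new Keyv({ namespace: 'ssr' }))

// … or plain Keyv options instead.
const memoizedWithOptions = memoize(request, { namespace: 'ssr' })
```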
#### options
##### key

Type: `Function`
Default: `identity`

It defines how the cache key will be obtained. The function should return a `String` to be used as the key associated with the cached copy:

```js
key: ({ req }) => req.url
```

In case you need more granular control, you can return an `Array`, where the second value determines the expiration behavior:
```js
key: ({ req }) => [req.url, req.query.forceExpiration]
```

##### objectMode
Type: `Boolean`
Default: `false`

When set to `true`, the result will be an `Array`, with the second item containing information about the cached item:
```js
const fn = () => Promise.reject(new Error('NOPE'))

const keyv = new Keyv()
const memoizedSum = memoize(fn, keyv, { staleTtl: 10, objectMode: true })

const [sum, info] = await memoizedSum(1, 2)

console.log(info)
// {
//   hasValue: true,
//   key: 1,
//   isExpired: false,
//   isStale: true,
//   staleError: Error: NOPE
// }
```

##### staleTtl
Type: `Number` or `Function`
Default: `undefined`

The staleness threshold below which the stale value is still returned while the value is asynchronously refreshed in the background.

When you provide a function, the value will be passed as the first argument.
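For example, the staleness window could depend on the value itself. A minimal sketch, assuming the stored value is a response object with a `statusCode` field:

```js
const memoizedRequest = memoize(request, new Keyv(), {
  ttl: 60000,
  // Hypothetical rule: allow a 10 second staleness window for 200 OK
  // responses, and no stale serving for anything else.
  staleTtl: res => (res.statusCode === 200 ? 10000 : 0)
})
```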
##### ttl

Type: `Number` or `Function`
Default: `undefined`

The time-to-live, in milliseconds, during which the value will be considered fresh.
##### value

Type: `Function`
Default: `identity`

A decorator function applied to the value before it is stored.
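For instance, you could keep only the fields you care about before the value hits the store. A minimal sketch; the field names on `res` are assumptions:

```js
const memoizedRequest = memoize(request, new Keyv(), {
  ttl: 60000,
  // Store only the pieces of the response we actually need.
  value: res => ({ statusCode: res.statusCode, body: res.body })
})
```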
## License
@keyvhq/memoize © Dieter Luypaert, released under the MIT License. Maintained by Microlink with help from contributors.
microlink.io · GitHub microlinkhq · X @microlinkhq