redux-memoize v2.3.1
Memoizes action creators for Redux, letting you dispatch common/thunk/promise/async actions whenever you want, without worrying about duplication.
Installation
npm install --save redux-memoize
Then create the redux-memoize middleware.
import { createStore, applyMiddleware } from 'redux';
import createMemoizeMiddleware, { memoize } from 'redux-memoize';
// a common action creator
const increment = () => {
return {
type: 'INCREMENT',
payload: 1,
};
};
// This is a memoized action creator.
const memoizeIncrement = memoize({ ttl: 100 }, increment);
// Reducer
function counter(state = 0, action) {
switch (action.type) {
case 'INCREMENT':
return state + action.payload;
default:
return state;
}
}
const store = createStore(
counter,
applyMiddleware(createMemoizeMiddleware({ ttl: 200 })),
);
store.dispatch(increment());
console.info(store.getState()); // OUTPUT: 1
store.dispatch(increment());
console.info(store.getState()); // OUTPUT: 2
const promise1 = store.dispatch(memoizeIncrement()); // return a cached Promise
console.info(store.getState()); // OUTPUT: 3
const promise2 = store.dispatch(memoizeIncrement()); // return previous cached Promise
console.info(store.getState()); // OUTPUT: 3, increment() didn't run
console.info(promise1 === promise2); // OUTPUT: true
// NOTICE: TTL-based eviction only works in the browser.
// To prevent memory leaks, cached action creators are not evicted on the server side by default,
// so the following code would output 3 on the server side.
// To enable eviction on the server, use createMemoizeMiddleware({ disableTTL: false }).
setTimeout(() => {
store.dispatch(memoizeIncrement());
console.info(store.getState()); // OUTPUT: 4
}, 500);
It works perfectly with redux-thunk
import { createStore, applyMiddleware } from 'redux';
import createMemoizeMiddleware, { memoize } from 'redux-memoize';
import thunk from 'redux-thunk';
import rootReducer from './rootReducer';
const fetchUserSuccess = (user) => {
return {
type: 'FETCH_USER/SUCCESS',
payload: user,
};
};
let creatorCalled = 0;
let thunkCalled = 0;
const fetchUserRequest = memoize({ ttl: 1000 }, (username) => {
creatorCalled += 1;
return (dispatch, getState) => {
thunkCalled += 1;
return fetch(`https://api.github.com/users/${username}`)
.then(res => res.json())
.then((user) => {
dispatch(fetchUserSuccess(user));
});
};
});
const store = createStore(
rootReducer,
applyMiddleware(createMemoizeMiddleware({ ttl: 200 }), thunk),
);
// Component1
const promise1 = store.dispatch(fetchUserRequest('kouhin'))
.then(() => {
// do something
});
// Component2
const promise2 = store.dispatch(fetchUserRequest('kouhin'))
.then(() => {
// do something
});
Promise.all([promise1, promise2])
.then(() => {
console.info(creatorCalled); // OUTPUT: 1
console.info(thunkCalled); // OUTPUT: 1
});
API
memoize(opts, actionCreator)
Memoizes actionCreator and returns a memoized action creator. When you dispatch an action created by the memoized action creator, dispatch() returns a Promise.
Arguments
- opts (Object | Number)
  - ttl (Number | Function): The time to live for the cached action creator. When ttl is a function, getState will be passed as an argument, and it must return a number.
  - enabled (Boolean | Function): Whether to use the memoized action creator or not. When false, the cache will be ignored and the result of the original action creator will be dispatched without caching. When enabled is a function, getState will be passed as an argument, and it must return a boolean.
  - isEqual (Function): The arguments of the action creator are used as the cache key. lodash.isEqual is used to find an existing cached action creator. You can customize this function.
  - If opts is a number, that number specifies the ttl (see the sketch below this list).
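A minimal sketch combining these options (the settings selectors used here are hypothetical and not part of this library):
import isEqual from 'lodash/isEqual';
import { memoize } from 'redux-memoize';

const fetchItem = memoize({
  // ttl as a function: receives getState and must return a number
  ttl: (getState) => getState().settings.cacheTTL,
  // enabled as a function: receives getState and must return a boolean
  enabled: (getState) => getState().settings.cacheEnabled,
  // custom equality check for the cache key (the action creator's arguments)
  isEqual: (a, b) => isEqual(a, b),
}, (id) => {
  return {
    type: 'FETCH_ITEM',
    payload: id,
  };
});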
Returns
- (Function): memoized actionCreator. The original action creator can be accessed by memoize(actionCreator).unmemoized, e.g.
const actionCreator = () => {};
const memoized = memoize(actionCreator);
console.info(memoized.unmemoized === actionCreator); // OUTPUT: true
createMemoizeMiddleware(globalOpts)
Creates a Redux middleware.
Arguments
- globalOpts (Object): Default opts for memoize().
  - Default: { ttl: 0, enabled: true, isEqual: lodash.isEqual }. ttl is REQUIRED; you SHOULD set a ttl > 0 in milliseconds.
  - There is another option, disableTTL. The default value is true on the server and false in the browser. By default, cached action creators will not be evicted by setTimeout with TTL on the server, in order to prevent memory leaks. You can enable eviction there for test purposes (see the sketch below).
  - You can pass a customized cache via cache instead of the default cache, new WeakMap().
Returns
- (Function): Redux middleware.
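A minimal sketch of a store configuration that sets these global options, assuming eviction is wanted in server-side tests (rootReducer is assumed to exist in your app):
import { createStore, applyMiddleware } from 'redux';
import createMemoizeMiddleware from 'redux-memoize';
import thunk from 'redux-thunk';
import rootReducer from './rootReducer';

const store = createStore(
  rootReducer,
  applyMiddleware(
    createMemoizeMiddleware({
      ttl: 5000,         // global default TTL in milliseconds (REQUIRED, should be > 0)
      disableTTL: false, // allow setTimeout-based eviction, e.g. in server-side tests
    }),
    thunk,
  ),
);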
You can find more examples in test files.
Motivation
redux-thunk and redux-saga are two popular libraries for handling asynchronous flow. redux-saga monitors dispatched actions and performs side effects. It's very powerful, and you can use it to control almost every detail of an asynchronous flow, but it is a little complex. redux-thunk is simple and artfully designed; it's implemented in only 11 lines of code. I like it very much, but it can't solve the problem of duplicated requests.
In 2016, I wrote a library called redux-dataloader. It monitors dispatched actions and avoids duplicated requests. We use it in our project and it works well, but I think it's still a little complex, and I wanted to make something as simple as redux-thunk: for a single task we had to create a data loader and three actions, and switching between actions and data loaders got tedious. So I created this middleware just to reduce duplicated thunk calls. It works well with redux-thunk and common actions, and may even work with other middleware such as redux-promise.
Why not memoize utils such as _.memoize?
Of course, memoize utilities such as lodash/memoize can solve duplicated requests in the browser. However, _.memoize only puts the result of the action creator into its cache, and the result of dispatch() cannot be cached. When the result of an action creator is a function (a thunk), that function will still be executed by the thunk middleware. This means _.memoize can't cache the thunk's work, and the async action will still be duplicated, as the sketch below shows. Besides, it may cause memory problems on the server side: on the server we create a new store for each request, and since this library holds its cache in a middleware instance created together with the store, the cache is garbage-collected after the request, so it won't cause a memory leak. What's more, it supports dynamic ttl and enabled via store.getState(), so you can change these options from a remote API when needed.
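To make the difference concrete, here is a minimal sketch assuming a store with redux-thunk applied, as in the example above. _.memoize returns the same thunk function for the same argument, but every dispatch still executes that thunk, so the request is duplicated anyway.
import _ from 'lodash';

const fetchUser = _.memoize((username) => {
  // _.memoize caches this returned thunk function...
  return (dispatch) => {
    // ...but the thunk middleware executes it on every dispatch.
    return fetch(`https://api.github.com/users/${username}`)
      .then(res => res.json())
      .then(user => dispatch({ type: 'FETCH_USER/SUCCESS', payload: user }));
  };
});

store.dispatch(fetchUser('kouhin')); // runs the fetch
store.dispatch(fetchUser('kouhin')); // same cached thunk, but the fetch runs again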
LICENSE
MIT