# @lato/cacheable-proxy v0.0.2

`cacheableProxy` is a workaround for the suboptimal performance of lookups on proxies with a `get` trap.
## Usage

### Proxy-like objects
```js
import { cacheableProxy } from '@lato/cacheable-proxy';

const { makeProxy } = cacheableProxy();

const fieldLikeProxy = makeProxy(property =>
  console.log(`field "${property}" looked up`)
);

// regular lookup with bad performance
fieldLikeProxy.someProperty;
// prints:
// field "someProperty" looked up

// by calling fieldLikeProxy with a property name, the handler is
// called directly without hitting a proxy trap
fieldLikeProxy("someProperty");
// prints:
// field "someProperty" looked up

const methodLikeProxy = makeProxy(property => (x, y) =>
  console.log(`.${property}("${x}", "${y}") call`)
);

// calling the handler may appear to be a method call
methodLikeProxy.someMethod("foo", "bar");
// prints:
// .someMethod("foo", "bar") call

// faster alternative
methodLikeProxy("someMethod")("foo", "bar");
// prints:
// .someMethod("foo", "bar") call
```
### As a member of a prototype

It is possible to assign such objects to a prototype and have `this` bound to the originator:
```js
import { cacheableProxy, makePrototypeGetter } from '@lato/cacheable-proxy';

const makeLoggingGetter = makePrototypeGetter(function(prop) {
  return x => {
    console.log(`.${prop}("${x}") call`);
    console.log(`"this.foo" is ${this.foo}`);
  };
});

const Type = function(){};
// or `Object.defineProperties` for multiple proxy-like fields
Object.defineProperty(Type.prototype, "magic", {
  get: makeLoggingGetter(cacheableProxy().makeProxy)
});

const object = new Type;
object.foo = "bar";

object.magic.someMethod("some argument");
// prints:
// .someMethod("some argument") call
// "this.foo" is bar
// `this` inside the handler points to `object`

// faster alternative
object.magic("someMethod")("some argument");
```
## Caching

This contraption wouldn't be of much use if you always had to choose between performance and familiar syntax. By caching, it is possible to reduce the number of lookups on a proxy object.

A separate cache is created for each call to `cacheableProxy`. A `cache` object is returned that allows cache management.
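As background, here is a self-contained sketch of the general technique such a library can use to let cached lookups avoid the trap entirely; this is an illustration of the idea, not this library's actual implementation. When the Proxy sits in the prototype chain, any own property defined on the front object resolves before the chain is walked, so the trap is never invoked for it:

```js
// Sketch of the general technique, NOT this library's code:
// the Proxy sits in the prototype chain, so lookups that find an
// own property on `front` never reach the trap at all.
function makeCachingLookup(handler) {
  const trap = new Proxy(Object.create(null), {
    get(_target, prop) {
      const value = handler(prop);
      // cache as an own property; the next lookup of `prop`
      // resolves on `front` before the prototype chain is walked
      Object.defineProperty(front, prop, { value });
      return value;
    }
  });
  const front = Object.create(trap);
  return front;
}

const upper = makeCachingLookup(p => String(p).toUpperCase());
console.log(upper.foo);                         // "FOO", trap hit
console.log(Object.getOwnPropertyNames(upper)); // ["foo"], now cached
console.log(upper.foo);                         // "FOO", trap not hit
```

A proxy can legally serve as a prototype, which is what makes this trick work.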
### Basic caching
```js
import { cacheableProxy } from '@lato/cacheable-proxy';

const cacheAll = cacheableProxy();
cacheAll.setTrap((prop, handler) => {
  cacheAll.cache.set(prop, function(){ return this(prop); });
  return handler(prop);
});

const toUpperProxy = cacheAll.makeProxy(property => property.toUpperCase());
// first lookup of "FoO": trap is hit, getter for "FoO" is cached
toUpperProxy.FoO; // "FOO"
// second lookup of "FoO": trap is not hit, much faster
toUpperProxy.FoO; // "FOO"

const toLowerProxy = cacheAll.makeProxy(property => property.toLowerCase());
// the correct handler is still picked, despite `toUpperProxy`
// and `toLowerProxy` sharing the same cache;
// the proxy trap is not hit
toLowerProxy.FoO; // "foo"

// another cache, independent of `cacheAll`
const cacheM = cacheableProxy();
cacheM.setTrap((prop, handler) => {
  if(prop.startsWith("m"))
    cacheM.cache.set(prop, function(){ return this(prop); });
  return handler(prop);
});

const toLowerMProxy = cacheM.makeProxy(property => property.toLowerCase());
// trap is hit, but the getter is not cached
toLowerMProxy.FoO; // "foo"
// first lookup of "mOo": trap is hit, getter is cached
toLowerMProxy.mOo; // "moo"
// second lookup of "mOo": trap is not hit
toLowerMProxy.mOo; // "moo"
```
Apart from explicit calls to `cacheableProxy`, this library does not track any internal state. Anything that `cacheableProxy` and `makePrototypeGetter` touch should be subject to normal garbage collection once all references go out of scope.
### Getters

Entries added to the cache with `cache.set` should be getters that are independent of any handler (a cache could be shared between multiple proxies with different handlers), usually:

```js
function(){ return this(prop); }
```

During a lookup, `this` will be bound to the correct handler.
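To see why this convention matters, here is a stand-alone sketch (not the library's internals) of a cache of handler-independent getters: because each getter only calls `this`, the same cached getter can serve several proxies, each binding its own handler at lookup time.

```js
// Stand-alone sketch: one cache of handler-independent getters,
// each invoked with `this` bound to the handler doing the lookup.
const sharedCache = new Map();

function lookup(handler, prop) {
  let getter = sharedCache.get(prop);
  if (!getter) {
    getter = function () { return this(prop); }; // no handler captured
    sharedCache.set(prop, getter);
  }
  return getter.call(handler); // bind `this` to this proxy's handler
}

const upperHandler = p => p.toUpperCase();
const lowerHandler = p => p.toLowerCase();

console.log(lookup(upperHandler, "FoO")); // "FOO"
console.log(lookup(lowerHandler, "FoO")); // "foo", same cached getter
```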
However, this library doesn't impose this limitation: you are allowed to break things. While a proxy trap is being hit, it is possible to create a getter that depends on the handler. This can be useful when the handler is slow and only one proxy uses the cache.
```js
import { cacheableProxy } from '@lato/cacheable-proxy';

const { makeProxy, setTrap, cache } = cacheableProxy();
setTrap((prop, handler) => {
  const result = handler(prop);
  cache.set(
    prop,
    function() { return result; } // getter not using `this`
                                  // (the handler to be bound)
  );
  return result;
});

// plain abuse: memoized Fibonacci
const fiboProxy = makeProxy(n =>
  (n < 2) ? Number(n) : (fiboProxy[n - 2] + fiboProxy[n - 1])
);
console.log(fiboProxy["50"]);
// 12586269025

const broken = makeProxy(prop => prop);
console.log(broken("10")); // "10", ok
console.log(broken["10"]); // 55, wrong
```
Cache management
cache.get
and cache.remove
work as expected.
```js
import { cacheableProxy } from '@lato/cacheable-proxy';

const fooGetter = function(){ return this("foo"); };
const constGetter = function(){ return 42; };

const { makeProxy, setTrap, cache } = cacheableProxy();
setTrap((prop, handler) => {
  cache.set(prop, constGetter);
  return 44;
});

// precache "foo"
cache.set("foo", fooGetter);

const toUpperProxy = makeProxy(prop => prop.toUpperCase());

// proxy trap not hit
toUpperProxy.foo; // "FOO"
// proxy trap hit
toUpperProxy.bar; // 44
// proxy trap not hit
toUpperProxy.bar; // 42

// cache.get("foo") === fooGetter
cache.set("bar", cache.get("foo"));
// proxy trap not hit
toUpperProxy.bar; // "FOO"

cache.remove("foo");
// proxy trap hit
toUpperProxy.foo; // 44
// proxy trap not hit
toUpperProxy.foo; // 42
```
### LRU cache

With these tools you can implement caching strategies as complex as you like.
```js
import { cacheableProxy } from '@lato/cacheable-proxy';

const { makeProxy, setTrap, cache } = cacheableProxy();

const cacheSize = 3;
const history = new Set();

setTrap((prop, handler) => {
  cache.set(prop, function() {
    history.delete(prop); // bubble up
    history.add(prop);    //
    return this(prop);
  });
  history.add(prop);
  if(history.size > cacheSize) {
    const oldest = history.values().next().value;
    cache.remove(oldest);
    history.delete(oldest);
  }
  return handler(prop);
});

const p = makeProxy(prop => prop);
p.foo;
p.bar;
p.foo;
p.baz;
p.qux;
// the "bar" getter was removed;
// the cache now contains entries for ["foo", "baz", "qux"]
```
This is just an example. This strategy would probably perform worse than expected (not to mention its big-O complexity), because every trap hit changes the shape of the cache, invalidating inline caches even for already cached getters (todo: check).
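The history bookkeeping above relies on `Set` preserving insertion order. For comparison, the same eviction discipline can be sketched stand-alone with a `Map` (which also iterates in insertion order), independent of this library:

```js
// Stand-alone sketch of the same LRU bookkeeping, using a Map.
function makeLru(capacity) {
  const entries = new Map();
  return {
    get(key) {
      if (!entries.has(key)) return undefined;
      const value = entries.get(key);
      entries.delete(key); // bubble up: re-insert at the end
      entries.set(key, value);
      return value;
    },
    set(key, value) {
      entries.delete(key);
      entries.set(key, value);
      if (entries.size > capacity) {
        const oldest = entries.keys().next().value; // first = least recent
        entries.delete(oldest);
      }
    },
    has: key => entries.has(key),
  };
}

const lru = makeLru(3);
["foo", "bar", "foo", "baz", "qux"].forEach(k => lru.set(k, k));
console.log(lru.has("bar"));                               // false, evicted
console.log(["foo", "baz", "qux"].every(k => lru.has(k))); // true
```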
## Test

A simple test was run to get a rough performance comparison between access times of various objects:

```js
for(let i = 0; i < bigNumber; ++i)
  o.foo;
```
| Object being accessed | Node | Chrome |
|---|---|---|
| `{ foo: a }` | 1.00 | 1.10 |
| `new T` where `T` sets `foo` | 1.07 | 1.00 |
| `new T` where `T.prototype` contains a `foo` field | 1.48 | 1.58 |
| `{ get foo() { return a; } }` | 2.36 | 4.33 |
| proxy with `foo` cached | 2.47 | 3.93 |
| `new T` where `T.prototype` contains a `foo` getter | 2.82 | 3.61 |
| proxy without caching | 4.66 | 6.73 |
A 47% speedup on Node, but those numbers should be considered rather inaccurate; the test was too simplistic.
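For reference, a minimal harness along those lines might look like the following; the labels, objects, and iteration count here are illustrative, not the original test:

```js
// Illustrative micro-benchmark, not the original test: times repeated
// property access on a few kinds of objects (Node only, uses hrtime).
function timeAccess(label, o, iterations = 1e7) {
  let sink;                               // keep reads observable
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; ++i)
    sink = o.foo;
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(1)} ms (last value: ${sink})`);
  return ms;
}

timeAccess("plain field",    { foo: 42 });
timeAccess("getter",         { get foo() { return 42; } });
timeAccess("proxy get trap", new Proxy({}, { get: () => 42 }));
```

Real comparisons should use many interleaved runs and warm-up iterations; a single loop like this mostly measures whatever the JIT decided to do that day.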