mem
Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input
Memory is automatically released when an item expires or the cache is cleared.
By default, only the memoized function's first argument is considered via strict equality comparison. If you need to cache multiple arguments or cache objects by value, have a look at alternative caching strategies below.
If you want to memoize Promise-returning functions (like async functions), you might be better served by p-memoize.
Install
$ npm install mem
Usage
import mem from 'mem';
let index = 0;
const counter = () => ++index;
const memoized = mem(counter);
memoized('foo');
//=> 1
// Cached as it's the same argument
memoized('foo');
//=> 1
// Not cached anymore as the argument changed
memoized('bar');
//=> 2
memoized('bar');
//=> 2
// Only the first argument is considered by default
memoized('bar', 'foo');
//=> 2
Works well with Promise-returning functions
But you might want to use p-memoize for more Promise-specific behaviors.
import mem from 'mem';
let index = 0;
const counter = async () => ++index;
const memoized = mem(counter);
console.log(await memoized());
//=> 1
// The return value didn't increase as it's cached
console.log(await memoized());
//=> 1
import mem from 'mem';
import got from 'got';
import delay from 'delay';
const memGot = mem(got, {maxAge: 1000});
await memGot('https://sindresorhus.com');
// This call is cached
await memGot('https://sindresorhus.com');
await delay(2000);
// This call is not cached as the cache has expired
await memGot('https://sindresorhus.com');
Caching strategy
By default, only the first argument is compared via exact equality (===) to determine whether a call is identical.
const power = mem((a, b) => Math.pow(a, b));
power(2, 2); // => 4, stored in cache with the key 2 (number)
power(2, 3); // => 4, retrieved from cache at key 2 (number), it's wrong
You will have to use the cache and cacheKey options appropriate to your function. In this specific case, the following could work:
const power = mem((a, b) => Math.pow(a, b), {
	cacheKey: arguments_ => arguments_.join(',')
});
power(2, 2); // => 4, stored in cache with the key '2,2' (both arguments as one string)
power(2, 3); // => 8, stored in cache with the key '2,3'
More advanced examples follow.
Example: Options-like argument
If your function accepts an object, it won't be memoized out of the box:
const heavyMemoizedOperation = mem(heavyOperation);
heavyMemoizedOperation({full: true}); // Stored in cache with the object as key
heavyMemoizedOperation({full: true}); // Stored in cache with the object as key, again
// The objects look the same but for JS they're two different objects
You might want to serialize or hash them, for example using JSON.stringify or something like serialize-javascript, which can also serialize RegExp, Date, and so on.
const heavyMemoizedOperation = mem(heavyOperation, {cacheKey: JSON.stringify});
heavyMemoizedOperation({full: true}); // Stored in cache with the key '[{"full":true}]' (string)
heavyMemoizedOperation({full: true}); // Retrieved from cache
The same solution also works if it accepts multiple serializable objects:
const heavyMemoizedOperation = mem(heavyOperation, {cacheKey: JSON.stringify});
heavyMemoizedOperation('hello', {full: true}); // Stored in cache with the key '["hello",{"full":true}]' (string)
heavyMemoizedOperation('hello', {full: true}); // Retrieved from cache
Example: Multiple non-serializable arguments
If your function accepts multiple arguments that aren't supported by JSON.stringify (e.g. DOM elements and functions), you can instead extend the initial exact equality (===) to work on multiple arguments using many-keys-map:
import ManyKeysMap from 'many-keys-map';
const addListener = (emitter, eventName, listener) => emitter.on(eventName, listener);
const addOneListener = mem(addListener, {
	cacheKey: arguments_ => arguments_, // Use *all* the arguments as key
	cache: new ManyKeysMap() // Correctly handles all the arguments for exact equality
});
addOneListener(header, 'click', console.log); // `addListener` is run, and it's cached with the `arguments` array as key
addOneListener(header, 'click', console.log); // `addListener` is not run again
addOneListener(mainContent, 'load', console.log); // `addListener` is run, and it's cached with the `arguments` array as key
Better yet, if your function's arguments are compatible with WeakMap, you should use deep-weak-map instead of many-keys-map. This will help avoid memory leaks.
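For illustration, here is a sketch of the listener example adapted to deep-weak-map, assuming its default export provides the same Map-like .has/.get/.set/.delete interface as ManyKeysMap:
import mem from 'mem';
import DeepWeakMap from 'deep-weak-map'; // Assumed name for the default export

const addListener = (emitter, eventName, listener) => emitter.on(eventName, listener);
const addOneListener = mem(addListener, {
	cacheKey: arguments_ => arguments_, // Use *all* the arguments as key
	cache: new DeepWeakMap() // Entries can be garbage-collected once an argument becomes unreachable
});
addOneListener(header, 'click', console.log); // Cached as before, but without retaining `header` forever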
API
mem(fn, options?)
fn
Type: Function
Function to be memoized.
options
Type: object
maxAge
Type: number
Default: Infinity
Milliseconds until the cache expires.
cacheKey
Type: Function
Default: arguments_ => arguments_[0]
Example: arguments_ => JSON.stringify(arguments_)
Determines the cache key for storing the result based on the function arguments. By default, only the first argument is considered.
A cacheKey function can return any type supported by Map (or whatever structure you use in the cache option).
Refer to the caching strategies section for more information.
cache
Type: object
Default: new Map()
Use a different cache storage. Must implement the following methods: .has(key), .get(key), .set(key, value), .delete(key), and optionally .clear(). You could for example use a WeakMap instead, or quick-lru for an LRU cache.
Refer to the caching strategies section for more information.
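For example, a bounded cache can be plugged in with quick-lru. A minimal sketch, assuming its default export accepts a maxSize option:
import mem from 'mem';
import QuickLRU from 'quick-lru';

const compute = n => n * 2; // Stand-in for an expensive function

const memoized = mem(compute, {
	cache: new QuickLRU({maxSize: 100}) // Keeps at most 100 results; least-recently-used entries are evicted first
});

memoized(1); // Computed and stored
memoized(1); // Retrieved from the cache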
memDecorator(options)
Returns a decorator to memoize class methods or static class methods.
Notes:
- Only class methods and getters/setters can be memoized, not regular functions (they aren't part of the proposal);
- Only TypeScript’s decorators are supported, not Babel’s, which use a different version of the proposal;
- Being an experimental feature, they need to be enabled with --experimentalDecorators; follow TypeScript's docs.
options
Type: object
Same as options for mem().
import {memDecorator} from 'mem';
class Example {
	index = 0;

	@memDecorator()
	counter() {
		return ++this.index;
	}
}

class ExampleWithOptions {
	index = 0;

	@memDecorator({maxAge: 1000})
	counter() {
		return ++this.index;
	}
}
memClear(fn)
Clear all cached data of a memoized function.
fn
Type: Function
Memoized function.
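Example, assuming memClear is a named export alongside the default export, like memDecorator above:
import mem, {memClear} from 'mem';

let index = 0;
const counter = () => ++index;
const memoized = mem(counter);

memoized('foo');
//=> 1

memClear(memoized);

// Recomputed because the cache was emptied
memoized('foo');
//=> 2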
Tips
Cache statistics
If you want to know how many times your cache had a hit or a miss, you can make use of stats-map as a replacement for the default cache.
Example
import mem from 'mem';
import StatsMap from 'stats-map';
import got from 'got';
const cache = new StatsMap();
const memGot = mem(got, {cache});
await memGot('https://sindresorhus.com');
await memGot('https://sindresorhus.com');
await memGot('https://sindresorhus.com');
console.log(cache.stats);
//=> {hits: 2, misses: 1}
Related
- p-memoize - Memoize promise-returning & async functions
Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.