# extra-promise
Utilities for JavaScript Promise and async functions.
## Install
```sh
npm install --save extra-promise
# or
yarn add extra-promise
```

## API
```ts
interface INonBlockingChannel<T> {
  send(value: T): void
  receive(): AsyncIterable<T>
  close: () => void
}

interface IBlockingChannel<T> {
  send(value: T): Promise<void>
  receive(): AsyncIterable<T>
  close: () => void
}

interface IDeferred<T> {
  resolve(value: T): void
  reject(reason: unknown): void
}
```

### Functions
#### isPromise
```ts
function isPromise<T>(val: unknown): val is Promise<T>
function isntPromise<T>(val: T): val is Exclude<T, Promise<unknown>>
```

#### isPromiseLike
```ts
function isPromiseLike<T>(val: unknown): val is PromiseLike<T>
function isntPromiseLike<T>(val: T): val is Exclude<T, PromiseLike<unknown>>
```

#### delay
```ts
function delay(timeout: number, signal?: AbortSignal): Promise<void>
```

A simple wrapper for `setTimeout`.
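A short usage sketch (`tryOnce` is a hypothetical operation used only for illustration):

```ts
import { delay } from 'extra-promise'

// A hypothetical operation that may need retrying.
const tryOnce = async (): Promise<boolean> => Math.random() > 0.5

for (let attempt = 1; attempt <= 3; attempt++) {
  if (await tryOnce()) break
  await delay(1000) // wait one second between attempts
}
```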
#### timeout
```ts
function timeout(ms: number, signal?: AbortSignal): Promise<never>
```

It throws a `TimeoutError` after `ms` milliseconds.
```ts
try {
  result = await Promise.race([
    fetchData()
  , timeout(5000)
  ])
} catch (e) {
  if (e instanceof TimeoutError) ...
}
```

#### pad
```ts
function pad<T>(ms: number, fn: () => Awaitable<T>): Promise<T>
```

Run a function, but wait at least `ms` milliseconds before returning.
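A short usage sketch:

```ts
import { pad } from 'extra-promise'

// The task finishes immediately, but the result is not returned
// until at least 500 milliseconds have passed.
const result = await pad(500, async () => 'done')
```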
#### parallel
```ts
function parallel(
  tasks: Iterable<() => Awaitable<unknown>>
, concurrency: number = Infinity
): Promise<void>
```

Perform tasks in parallel.

The valid range of `concurrency` is `[1, Infinity]`. Invalid values will throw an `Error`.
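A short usage sketch (`download` is a hypothetical task):

```ts
import { parallel } from 'extra-promise'

// A hypothetical task.
const download = async (url: string): Promise<void> => {
  console.log(`downloading ${url}`)
}

const urls = [
  'https://example.com/a'
, 'https://example.com/b'
, 'https://example.com/c'
]

// Run at most two downloads at the same time.
await parallel(urls.map(url => () => download(url)), 2)
```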
#### parallelAsync
```ts
function parallelAsync(
  tasks: AsyncIterable<() => Awaitable<unknown>>
, concurrency: number // concurrency must be a finite number
): Promise<void>
```

Same as `parallel`, but `tasks` is an `AsyncIterable`.
#### series
```ts
function series(
  tasks: Iterable<() => Awaitable<unknown>>
       | AsyncIterable<() => Awaitable<unknown>>
): Promise<void>
```

Perform tasks in order. Equivalent to `parallel(tasks, 1)`.
#### waterfall
```ts
function waterfall<T>(
  tasks: Iterable<(result: unknown) => Awaitable<unknown>>
       | AsyncIterable<(result: unknown) => Awaitable<unknown>>
): Promise<T | undefined>
```

Perform tasks in order; the return value of the previous task becomes the parameter of the next task. If `tasks` is empty, it returns `Promise<undefined>`.
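A short usage sketch:

```ts
import { waterfall } from 'extra-promise'

// Each task receives the previous task's return value.
const result = await waterfall<number>([
  () => 1
, (x: unknown) => (x as number) + 1
, (x: unknown) => (x as number) * 10
])
// result === 20
```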
#### each
```ts
function each<T>(
  iterable: Iterable<T>
, fn: (element: T, i: number) => Awaitable<unknown>
, concurrency: number = Infinity
): Promise<void>
```

The async `each` operator for `Iterable`.

The valid range of `concurrency` is `[1, Infinity]`. Invalid values will throw an `Error`.
#### eachAsync
```ts
function eachAsync<T>(
  iterable: AsyncIterable<T>
, fn: (element: T, i: number) => Awaitable<unknown>
, concurrency: number // concurrency must be a finite number
): Promise<void>
```

Same as `each`, but `iterable` is an `AsyncIterable`.
#### map
```ts
function map<T, U>(
  iterable: Iterable<T>
, fn: (element: T, i: number) => Awaitable<U>
, concurrency: number = Infinity
): Promise<U[]>
```

The async `map` operator for `Iterable`.

The valid range of `concurrency` is `[1, Infinity]`. Invalid values will throw an `Error`.
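A short usage sketch (`fetchJSON` is a hypothetical helper):

```ts
import { map } from 'extra-promise'

// A hypothetical helper.
const fetchJSON = async (url: string): Promise<unknown> => ({ url })

// Process at most two elements at the same time.
const results = await map(
  ['https://example.com/a', 'https://example.com/b', 'https://example.com/c']
, url => fetchJSON(url)
, 2
)
```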
#### mapAsync
```ts
function mapAsync<T, U>(
  iterable: AsyncIterable<T>
, fn: (element: T, i: number) => Awaitable<U>
, concurrency: number // concurrency must be a finite number
): Promise<U[]>
```

Same as `map`, but `iterable` is an `AsyncIterable`.
#### filter
```ts
function filter<T, U = T>(
  iterable: Iterable<T>
, fn: (element: T, i: number) => Awaitable<boolean>
, concurrency: number = Infinity
): Promise<U[]>
```

The async `filter` operator for `Iterable`.

The valid range of `concurrency` is `[1, Infinity]`. Invalid values will throw an `Error`.
#### filterAsync
```ts
function filterAsync<T, U = T>(
  iterable: AsyncIterable<T>
, fn: (element: T, i: number) => Awaitable<boolean>
, concurrency: number // concurrency must be a finite number
): Promise<U[]>
```

Same as `filter`, but `iterable` is an `AsyncIterable`.
#### all
```ts
function all<T extends { [key: string]: PromiseLike<unknown> }>(
  obj: T
): Promise<{ [Key in keyof T]: UnpackedPromiseLike<T[Key]> }>
```

It is similar to `Promise.all`, but the first parameter is an object.

```ts
const { task1, task2 } = await all({
  task1: invokeTask1()
, task2: invokeTask2()
})
```

#### promisify
```ts
type Callback<T> = (err: any, result?: T) => void

function promisify<Result, Args extends any[] = unknown[]>(
  fn: (...args: [...args: Args, callback?: Callback<Result>]) => unknown
): (...args: Args) => Promise<Result>
```

The well-known `promisify` function.
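A short usage sketch (`readConfig` is a hypothetical callback-style function):

```ts
import { promisify } from 'extra-promise'

// A hypothetical callback-style function.
const readConfig = (
  path: string
, callback?: (err: any, result?: string) => void
): void => {
  callback?.(null, `config for ${path}`)
}

const readConfigAsync = promisify<string, [string]>(readConfig)
const config = await readConfigAsync('app.json')
```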
#### callbackify
```ts
type Callback<T> = (err: any, result?: T) => void

function callbackify<Result, Args extends any[] = unknown[]>(
  fn: (...args: Args) => Awaitable<Result>
): (...args: [...args: Args, callback: Callback<Result>]) => void
```

The `callbackify` function, as opposed to `promisify`.
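A short usage sketch:

```ts
import { callbackify } from 'extra-promise'

const addAsync = async (a: number, b: number): Promise<number> => a + b

// Turn the async function back into callback style.
const addCallback = callbackify<number, [number, number]>(addAsync)
addCallback(1, 2, (err, result) => {
  if (err) throw err
  console.log(result) // 3
})
```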
#### asyncify
```ts
function asyncify<Args extends any[], Result, This = unknown>(
  fn: (this: This, ...args: Args) => Awaitable<Result>
): (this: This, ...args: Promisify<Args>) => Promise<Result>
```

Turn sync functions into async functions.
```ts
const a = 1
const b = Promise.resolve(2)
const add = (a: number, b: number) => a + b

// BAD
add(a, await b) // 3

// GOOD
const addAsync = asyncify(add) // (a: number | PromiseLike<number>, b: number | PromiseLike<number>) => Promise<number>
await addAsync(a, b) // Promise<3>
```

It can also be used to eliminate the call stack:
```ts
// OLD
function count(n: number, i: number = 0): number {
  if (i < n) return count(n, i + 1)
  return i
}
count(10000) // RangeError: Maximum call stack size exceeded

// NEW
const countAsync = asyncify((n: number, i: number = 0): Awaitable<number> => {
  if (i < n) return countAsync(n, i + 1)
  return i
})
await countAsync(10000) // 10000
```

#### spawn
```ts
function spawn<T>(
  num: number
, create: (id: number) => Awaitable<T>
): Promise<T[]>
```

Sugar for creating multiple values in parallel.
The parameter `id` ranges from `1` to `num`.
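A short usage sketch:

```ts
import { spawn } from 'extra-promise'

// Create three values in parallel; `id` is 1, 2 and 3.
const workers = await spawn(3, async id => `worker-${id}`)
```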
#### limitConcurrencyByQueue
```ts
function limitConcurrencyByQueue<T, Args extends any[]>(
  concurrency: number
, fn: (...args: Args) => PromiseLike<T>
): (...args: Args) => Promise<T>
```

Limit the number of concurrent calls; calls that exceed the concurrency limit will be queued and run in order.
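A short usage sketch (`request` is a hypothetical function):

```ts
import { limitConcurrencyByQueue } from 'extra-promise'

// A hypothetical function.
const request = async (url: string): Promise<string> => `response from ${url}`

// At most two calls run at the same time; extra calls wait in a queue.
const limitedRequest = limitConcurrencyByQueue(2, request)
const responses = await Promise.all([
  limitedRequest('https://example.com/a')
, limitedRequest('https://example.com/b')
, limitedRequest('https://example.com/c')
])
```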
#### reusePendingPromises
```ts
type VerboseResult<T> = [value: T, isReuse: boolean]

interface IReusePendingPromisesOptions<Args> {
  createKey?: (args: Args) => unknown
  verbose?: true
}

function reusePendingPromises<T, Args extends any[]>(
  fn: (...args: Args) => PromiseLike<T>
, options: IReusePendingPromisesOptions<Args> & { verbose: true }
): (...args: Args) => Promise<VerboseResult<T>>
function reusePendingPromises<T, Args extends any[]>(
  fn: (...args: Args) => PromiseLike<T>
, options: IReusePendingPromisesOptions<Args> & { verbose: false }
): (...args: Args) => Promise<T>
function reusePendingPromises<T, Args extends any[]>(
  fn: (...args: Args) => PromiseLike<T>
, options: Omit<IReusePendingPromisesOptions<Args>, 'verbose'>
): (...args: Args) => Promise<T>
function reusePendingPromises<T, Args extends any[]>(
  fn: (...args: Args) => PromiseLike<T>
): (...args: Args) => Promise<T>
```

Returns a function that will return the same Promise for calls with the same parameters if the Promise is pending.
Cache keys are generated by the `options.createKey` function; its default value is a stable `JSON.stringify` implementation.
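A short usage sketch (`fetchUser` is a hypothetical function):

```ts
import { reusePendingPromises } from 'extra-promise'

// A hypothetical function.
const fetchUser = async (id: number): Promise<string> => `user ${id}`

const reusedFetchUser = reusePendingPromises(fetchUser)

// The second call with the same argument reuses the pending Promise
// created by the first call instead of calling fetchUser again.
const [a, b] = await Promise.all([reusedFetchUser(1), reusedFetchUser(1)])
```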
### Classes
#### StatefulPromise
```ts
enum StatefulPromiseState {
  Pending = 'pending'
, Fulfilled = 'fulfilled'
, Rejected = 'rejected'
}

class StatefulPromise<T> extends Promise<T> {
  static from<T>(promise: PromiseLike<T>): StatefulPromise<T>

  get state(): StatefulPromiseState

  constructor(
    executor: (
      resolve: (value: T) => void
    , reject: (reason: any) => void
    ) => void
  )

  isPending(): boolean
  isFulfilled(): boolean
  isRejected(): boolean
}
```

A subclass of Promise, mainly intended for testing, that lets you inspect the state of the Promise.
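A short usage sketch:

```ts
import { delay, StatefulPromise } from 'extra-promise'

// Wrap a Promise so its state can be inspected.
const promise = StatefulPromise.from(delay(100))
console.log(promise.isPending()) // true

await promise
console.log(promise.isFulfilled()) // true
console.log(promise.state) // 'fulfilled'
```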
#### Channel
```ts
class Channel<T> implements IBlockingChannel<T>
```

Implements MPMC (multi-producer, multi-consumer) FIFO queue communication with Promise and AsyncIterable.

- `send`: Send a value to the channel; it blocks until the value is taken out by a consumer.
- `receive`: Receive values from the channel.
- `close`: Close the channel.

If the channel is closed, `send` and `receive` will throw `ChannelClosedError`.
AsyncIterators that have already been created do not throw `ChannelClosedError`; they return `{ done: true }` instead.
```ts
const chan = new Channel<string>()

queueMicrotask(async () => {
  await chan.send('hello')
  await chan.send('world')
})

for await (const value of chan.receive()) {
  console.log(value)
}
```

#### BufferedChannel
```ts
class BufferedChannel<T> implements IBlockingChannel<T> {
  constructor(bufferSize: number)
}
```

Implements MPMC (multi-producer, multi-consumer) FIFO queue communication with Promise and AsyncIterable.
When the amount of buffered data exceeds `bufferSize`, `send` blocks until data in the buffer is taken out by a consumer.

- `send`: Send a value to the channel; if the buffer is full, it blocks.
- `receive`: Receive values from the channel.
- `close`: Close the channel.

If the channel is closed, `send` and `receive` will throw `ChannelClosedError`.
AsyncIterators that have already been created do not throw `ChannelClosedError`; they return `{ done: true }` instead.
```ts
const chan = new BufferedChannel<string>(1)

queueMicrotask(async () => {
  await chan.send('hello')
  await chan.send('world')
})

for await (const value of chan.receive()) {
  console.log(value)
}
```

#### UnlimitedChannel
```ts
class UnlimitedChannel<T> implements INonBlockingChannel<T>
```

Implements MPMC (multi-producer, multi-consumer) FIFO queue communication with Promise and AsyncIterable.

UnlimitedChannel provides three channel methods:

- `send`: Send a value to the channel. There is no size limit on the buffer, so all sends return immediately.
- `receive`: Receive values from the channel.
- `close`: Close the channel.

If the channel is closed, `send` and `receive` will throw `ChannelClosedError`.
AsyncIterators that have already been created do not throw `ChannelClosedError`; they return `{ done: true }` instead.
```ts
const chan = new UnlimitedChannel<string>()

queueMicrotask(() => {
  chan.send('hello')
  chan.send('world')
})

for await (const value of chan.receive()) {
  console.log(value)
}
```

#### Deferred
```ts
class Deferred<T> implements PromiseLike<T>, IDeferred<T>
```

Deferred is a Promise that separates `resolve()` and `reject()` from the constructor.
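A short usage sketch, assuming `Deferred` takes no constructor arguments:

```ts
import { Deferred } from 'extra-promise'

// Assumes a no-argument constructor; resolve the Deferred from outside.
const deferred = new Deferred<number>()
setTimeout(() => deferred.resolve(42), 100)

const value = await deferred // 42
```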
#### MutableDeferred
```ts
class MutableDeferred<T> implements PromiseLike<T>, IDeferred<T>
```

MutableDeferred is similar to Deferred, but its `resolve()` and `reject()` can be called multiple times to change the value.
```ts
const deferred = new MutableDeferred()

deferred.resolve(1)
deferred.resolve(2)

await deferred // resolved(2)
```

#### ReusableDeferred
```ts
class ReusableDeferred<T> implements PromiseLike<T>, IDeferred<T>
```

ReusableDeferred is similar to MutableDeferred, but its internal Deferred will be overwritten with a new pending Deferred after each call.
```ts
const deferred = new ReusableDeferred()

deferred.resolve(1)
queueMicrotask(() => deferred.resolve(2))

await deferred // pending, resolved(2)
```

#### DeferredGroup
```ts
class DeferredGroup<T> implements IDeferred<T> {
  add(deferred: IDeferred<T>): void
  remove(deferred: IDeferred<T>): void
  clear(): void
}
```

#### LazyPromise
```ts
class LazyPromise<T> implements PromiseLike<T> {
  then: PromiseLike<T>['then']

  constructor(
    executor: (
      resolve: (value: T) => void
    , reject: (reason: any) => void
    ) => void
  )
}
```

The LazyPromise constructor is the same as Promise's.
The difference is that a LazyPromise only runs its executor after the `then` method is called.
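A short usage sketch:

```ts
import { LazyPromise } from 'extra-promise'

const lazy = new LazyPromise<number>(resolve => {
  console.log('executor runs')
  resolve(1)
})

// Nothing has been logged yet; awaiting calls `then`, which runs the executor.
const value = await lazy // logs 'executor runs'; value === 1
```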
#### Semaphore
```ts
type Release = () => void

class Semaphore {
  constructor(count: number)

  acquire(): Promise<Release>
  acquire<T>(handler: () => Awaitable<T>): Promise<T>
}
```
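A short usage sketch, assuming the returned `Release` function frees the acquired slot (`work` is a hypothetical task):

```ts
import { Semaphore } from 'extra-promise'

const semaphore = new Semaphore(2) // at most two holders at a time

// A hypothetical unit of work.
const work = async (id: number): Promise<void> => {
  console.log(`working on ${id}`)
}

// Acquire and release manually.
const release = await semaphore.acquire()
try {
  await work(1)
} finally {
  release()
}

// Or pass a handler and let the semaphore release the slot afterwards.
await semaphore.acquire(() => work(2))
```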
#### Mutex

```ts
type Release = () => void

class Mutex extends Semaphore {
  acquire(): Promise<Release>
  acquire<T>(handler: () => Awaitable<T>): Promise<T>
}
```

#### DebounceMicrotask
```ts
class DebounceMicrotask {
  queue(fn: () => void): void
  cancel(fn: () => void): boolean
}
```

`queue` creates a microtask; as long as the microtask has not run yet, queueing the same function again only schedules it once.
`cancel` cancels a microtask before it runs.
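A short usage sketch:

```ts
import { DebounceMicrotask } from 'extra-promise'

const debounce = new DebounceMicrotask()
const flush = () => console.log('flushed')

// `flush` is queued twice but runs only once, on the next microtask.
debounce.queue(flush)
debounce.queue(flush)
```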
#### DebounceMacrotask
```ts
class DebounceMacrotask {
  queue(fn: () => void): void
  cancel(fn: () => void): boolean
}
```

`queue` creates a macrotask; as long as the macrotask has not run yet, queueing the same function again only schedules it once.
`cancel` cancels a macrotask before it runs.
#### TaskRunner
```ts
class TaskRunnerDestroyedError extends CustomError {}

class TaskRunner {
  constructor(
    concurrency: number = Infinity
  , rateLimit?: {
      duration: number
      limit: number
    }
  )

  /**
   * @throws {TaskRunnerDestroyedError}
   */
  run<T>(task: (signal: AbortSignal) => Awaitable<T>, signal?: AbortSignal): Promise<T>

  destroy(): void
}
```

A task runner that executes tasks in FIFO order.
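A short usage sketch (`fetchPage` is a hypothetical task; the `rateLimit` options are omitted):

```ts
import { TaskRunner } from 'extra-promise'

// At most two tasks run at the same time.
const runner = new TaskRunner(2)

// A hypothetical task that checks the AbortSignal provided by the runner.
const fetchPage = async (url: string, signal: AbortSignal): Promise<string> => {
  if (signal.aborted) throw new Error('aborted')
  return `page: ${url}`
}

const pages = await Promise.all([
  runner.run(signal => fetchPage('https://example.com/a', signal))
, runner.run(signal => fetchPage('https://example.com/b', signal))
, runner.run(signal => fetchPage('https://example.com/c', signal))
])

runner.destroy()
```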