@everymundo/limited-concurrency v0.1.1

License: ISC • Repository: github • Last release: 9 months ago

limited-concurrency

Allows Node.js code to cap the number of concurrent async calls.

This is particularly useful when you are calling external Web APIs that enforce a maximum number of concurrent/simultaneous requests and you want to make many more requests than that limit in an optimized way.

Installation

npm i @everymundo/limited-concurrency

Usage

const limitedConcurrency = require('@everymundo/limited-concurrency')
const httpClient = require('@everymundo/http-client')
const MAX_CONCURRENT_CALLS = +process.env.MAX_CONCURRENT_CALLS

async function queryInParallel (url, departureDates = []) {
  const queryOnce = async (departureDate) => {
    const fares = await httpClient.post(url, { departureDate })
    console.log({ fares })

    return fares
  }

  const allFares = await limitedConcurrency.processArray(queryOnce, departureDates, MAX_CONCURRENT_CALLS)

  return allFares
}

Or a shorter version

async function queryInParallel (url, departureDates = []) {
  const queryOnce = (departureDate) => httpClient.post(url, { departureDate })

  return limitedConcurrency.processArray(queryOnce, departureDates, MAX_CONCURRENT_CALLS)
}

Common Alternative

A common alternative is to split your array into chunks, as you can see below:

const httpClient = require('@everymundo/http-client')
const MAX_CONCURRENT_CALLS = +process.env.MAX_CONCURRENT_CALLS

async function alternativeParallel (url, departureDates = []) {
  let chunk
  const results = []

  while ((chunk = departureDates.splice(0, MAX_CONCURRENT_CALLS)).length > 0) {
    const promises = chunk.map(async (departureDate) => {
      const fares = await httpClient.post(url, { departureDate })
      console.log({ fares })

      return fares
    })

    results.push(...await Promise.all(promises))
  }

  return results
}

Although this approach works, it is not optimal: each chunk takes as long as its slowest request, leaving every other slot idle until that request finishes. This package instead processes items one by one, starting a new call as soon as any previous one has ended.
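To illustrate the difference, here is a minimal sketch of that sliding-window pattern: a fixed number of workers each pull the next item as soon as their previous call resolves, so a slow request never stalls the other slots. `processArraySketch` is an illustrative stand-in, not this package's actual implementation.

```javascript
// Illustrative sliding-window concurrency limiter (not the package source).
// Runs `fn` over `items` with at most `maxConcurrency` calls in flight.
async function processArraySketch (fn, items, maxConcurrency) {
  const results = new Array(items.length)
  let nextIndex = 0

  // Each worker grabs the next unclaimed index as soon as it is free.
  const worker = async () => {
    while (nextIndex < items.length) {
      const index = nextIndex++
      results[index] = await fn(items[index])
    }
  }

  const workers = Array.from({ length: Math.min(maxConcurrency, items.length) }, worker)
  await Promise.all(workers)

  return results
}
```

Because `nextIndex` is claimed synchronously before each `await`, the workers never process the same item twice, and results come back in input order.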

Another advantage of this package is that it accepts generators as input, so you don't always need to materialize an array.
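The sketch below shows why generator input is useful: by pulling items from a shared iterator, the workers consume values lazily, so a large or unbounded sequence never has to be built up front. Again, this is an illustration of the idea, not the package's own code.

```javascript
// Illustrative limiter over any sync iterable (arrays, generators, etc.).
// Workers share one iterator, so items are produced lazily on demand.
async function processIterableSketch (fn, iterable, maxConcurrency) {
  const iterator = iterable[Symbol.iterator]()
  const results = []

  const worker = async () => {
    // iterator.next() is called synchronously, so two workers
    // can never claim the same value.
    for (let step = iterator.next(); !step.done; step = iterator.next()) {
      results.push(await fn(step.value))
    }
  }

  await Promise.all(Array.from({ length: maxConcurrency }, worker))

  // Note: push order follows completion order, not input order.
  return results
}

// A hypothetical generator of departure-date labels.
function * dateRange (count) {
  for (let i = 0; i < count; i++) yield `day-${i}`
}
```

With `processIterableSketch(queryOnce, dateRange(10000), MAX_CONCURRENT_CALLS)`, only the items currently in flight exist in memory at any time.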

Version history

0.1.1 (9 months ago)
0.1.0 (2 years ago)
0.0.1 (2 years ago)