# lambda-customer-reports v1.2.1
This project is a lambda that runs the customer reports system: it evaluates the criteria of existing reports, then sends transmissions out when those criteria are met. Although it is called a reports system, it is essentially a process that checks criteria and sends emails to users, and so it could easily be adapted for running alerts; indeed, deprecating lambda-alerts and supplanting it with this project is the long-term goal!
Some key takeaways for this repo:
There is a high degree of isolation between functions, almost as if they were components; this keeps them independent of one another. Only a few things are shared across functions:
- the Postgres client
- the BigQuery client
- some utility functions like unit converters and an errorLogger
The decoupling should help reduce bugs where changing one function breaks another. Even the SQL files a function runs live inside that function's folder (see the note below on importing). The code should also be easier to reason about, since there are no larger abstractions, and if/when testing is added, tests should be easier to write.
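To illustrate the isolation described above, a per-function layout might look roughly like this (the exact file names here are assumptions for illustration, not a prescription):

```
functions/
  weeklyTrapCatches/
    index.js
    stations.sql       <- SQL used only by this function lives alongside it
shared-utils/
  importQueryFile.js
  errorLogger.js
db/
  index.js             <- shared Postgres / BigQuery clients
```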
The SQL that checks which reports should be sent is somewhat sophisticated. For example, in the weeklyTrapCatches report, a single SQL query fetches all the reports with their contacts, criteria, etc., while also checking:
- is this the right day of the week for the respective property?
- which properties trap which insects?
- which properties can the owner of the report create trap catch reports for?
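A rough, hypothetical sketch of how such a combined query might be shaped (all table and column names here are invented for illustration; the real schema will differ):

```sql
SELECT r.id, r.property_id, c.email
FROM reports r
JOIN report_contacts c ON c.report_id = r.id
JOIN properties p ON p.id = r.property_id
WHERE r.report_type = 'weeklyTrapCatches'
  -- is this the right day of the week for the respective property?
  AND EXTRACT(DOW FROM now() AT TIME ZONE p.timezone) = r.send_day_of_week
  -- does this property trap the insects the report covers?
  AND EXISTS (
    SELECT 1 FROM property_insects pi
    WHERE pi.property_id = p.id AND pi.insect_id = r.insect_id
  )
```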
- big data checking is done via direct calls to BQ/DataLayer
- Postmark email templates are stored in this repo with some test JSON for template models
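For example, the test JSON for a template model might look like this (the field names and values here are invented for illustration):

```json
{
  "propertyName": "Example Orchard",
  "weekEnding": "2021-06-13",
  "trapCatches": [
    { "insect": "codling moth", "count": 12 }
  ]
}
```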
We strive to keep the dependency list small and light.
The SQL for PG and BQ queries can be imported using the `importQueryFile` function. GQL files for dataLayer follow a similar pattern:

```javascript
const { pgdb } = require('./db')
const { importQueryFile } = require('../shared-utils/importQueryFile')

const stationsQuery = importQueryFile.pg('./stations.sql')

const someFunction = async ({ propertyIds }) => {
  const stations = await pgdb.query(stationsQuery, { propertyIds })
}
```
## Building and deploying
To only build the lambda, run `npm run build`. To build and deploy, run `npm run deploy SOME_TARGET`, where `SOME_TARGET` is either `prod` or `replica`. This uploads the function to AWS Lambda; you will need the requisite permissions.
## .zipignore
Since we build using `zip` and then upload the archive, a `.zipignore` file lists the files we don't want to include. Note that ignoring folders at the root of this project requires specific syntax, e.g. `/.idea/*`.
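An illustrative `.zipignore` (the entries besides `/.idea/*` are invented for this example):

```
/.idea/*
/test/*
*.md
```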