event-service-tasks v1.21.6 • Published 3 years ago

event-service-tasks

Task and Process code for event-service to support tasks, including for customer tenants.

This repository is meant to be turned into a Docker image, and then made available as a Task in AWS ECS/Fargate, Docker or Kubernetes via event-service. This service has Triggers monitoring S3 uploads (from SFTP) and other events, and then launching the code in this repository to deal with files as they are uploaded.

The main entry point is expected to be the sbin/runTask.sh shell script, to which the task spawned by event-service should pass a further argument to distinguish which file is being processed.

Adding/Updating New Processes/Tasks/Handlers

  1. Modify sbin/runTask.sh to launch custom code when called with a unique argument (such as a task name).
  2. Add custom code to handle the processing that can be called from the runTask.sh script.

Environment Variables

The following environment variables will be provided by event-service (refer to the launch method of the FeedModel object in event-service):

  • LOG_LEVEL - Set to the current logging level that event-service is at, defaulting to info. Possible values are fatal, error, warn, info, debug, trace or silent.
  • LOG_TIMESTAMP - Set to the current timestamp flag within event-service, defaulting to false. Possible values are true and false -- where true will cause log entries to include a timestamp.
  • AIX_FEED_ID - String value for the unique Feed identifier under which the currently executing Task is being run.
  • AIX_REQUEST_ID - String value for the unique request identifier under which the currently executing Task is being run. (Best practice is to include this value in your log statements: const log = require('@trade-platform/log').child({requestid: process.env.AIX_REQUEST_ID});)
  • AIX_TRIGGERING_EVENT - JSON-encoded string for the event object that triggered the Task.
  • AIX_BASE_URL - The base URL for the event-service that is invoking the Task. This can be used with the [event-client](https://www.npmjs.com/package/@trade-platform/event-client) library to communicate information back to event-service.
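As a minimal sketch, a task script might read these variables like this. The variable names come from the list above; the shape of the parsed triggering event (s3Bucket/s3Key) is the S3-upload case described later in this README, and other triggers may look different.

```typescript
// Sketch: read the environment that event-service provides to a task.
interface TaskEnv {
  logLevel: string;
  feedId?: string;
  requestId?: string;
  triggeringEvent?: Record<string, unknown>;
  baseUrl?: string;
}

function readTaskEnv(env: Record<string, string | undefined>): TaskEnv {
  return {
    logLevel: env.LOG_LEVEL ?? 'info', // event-service defaults this to "info"
    feedId: env.AIX_FEED_ID,
    requestId: env.AIX_REQUEST_ID,
    triggeringEvent: env.AIX_TRIGGERING_EVENT
      ? JSON.parse(env.AIX_TRIGGERING_EVENT)
      : undefined,
    baseUrl: env.AIX_BASE_URL,
  };
}
```

In a real task you would call readTaskEnv(process.env) once at startup and pass the result around, rather than reading process.env throughout the code.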

The runTask.sh script will also pull the following values from AWS SSM Parameter Store and export them as environment variables:

  • AIX_DB_URI - a URI, of the form mysql://username:password@host:port/database, for reaching a dedicated MySQL database.
  • AUTH0_TENANT - the Auth0 tenant, needed to obtain a token for event-service.
  • AUTH0_AUDIENCE - the Auth0 audience, needed to obtain a token for event-service.
  • AUTH0_CLIENT_ID - the Auth0 client id, needed to obtain a token for event-service.
  • AUTH0_CLIENT_SECRET - the Auth0 client secret, needed to obtain a token for event-service.

Note that the above values can differ between environments, since the AWS SSM Parameter Store lookup takes the value of the NODE_ENV environment variable into account.

Refer to sbin/setSecrets.sh for pushing these values up to the parameter store.
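Since AIX_DB_URI follows the standard mysql://username:password@host:port/database shape, it can be decomposed with Node's built-in WHATWG URL parser. This is a sketch of one way to do it, not code from the repository:

```typescript
// Sketch: split AIX_DB_URI (mysql://username:password@host:port/database)
// into connection parameters using Node's WHATWG URL parser.
function parseDbUri(uri: string) {
  const u = new URL(uri);
  return {
    user: decodeURIComponent(u.username),
    password: decodeURIComponent(u.password),
    host: u.hostname,
    port: Number(u.port || 3306), // assume the default MySQL port if none given
    database: u.pathname.replace(/^\//, ''),
  };
}
```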

Debugging

Things go wrong. The code in this repository is (should be) designed to help make debugging problems a little easier.

  1. You can check out the code repository, configure a .env file in the root directory with necessary environment variables defined (see .env.example), and execute any of the scripts. (Note this means any new scripts added should support the dotenv convention.)
  2. You can also launch the specific Docker container version (or even the AWS ECS Task), with proper environment variables defined, against a local or testing instance.
  3. The scripts should use @trade-platform/log to log output, and you can adjust the LOG_LEVEL environment variable to get more (or less) information.
  4. The logs generated by the scripts should end up in CloudWatch and then LogDNA, and each entry should include a unique requestid and other values to assist with tracing. (Note this means any scripts added should support putting AIX_REQUEST_ID from the environment into all log messages.)
  5. Script output should also end up in the Activity object associated to the Feed in event-service, so each invocation of a Task should have script output available for debugging.
  6. Scripts should be designed so they can be run multiple times with the same data without creating problems -- that is, they should be [idempotent](https://en.wikipedia.org/wiki/Idempotence).
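For point 6, one common way to make database-loading scripts idempotent in MySQL is an upsert, so re-running the same file updates rows in place instead of inserting duplicates. A sketch (the table and column names here are illustrative, not from the real schema):

```typescript
// Sketch: build an idempotent MySQL upsert statement. Re-running a task with
// the same input rows then updates them in place rather than duplicating them.
function buildUpsert(
  table: string,
  row: Record<string, unknown>,
): { sql: string; values: unknown[] } {
  const cols = Object.keys(row);
  const placeholders = cols.map(() => '?').join(', ');
  const updates = cols.map((c) => `${c} = VALUES(${c})`).join(', ');
  return {
    sql:
      `INSERT INTO ${table} (${cols.join(', ')}) VALUES (${placeholders}) ` +
      `ON DUPLICATE KEY UPDATE ${updates}`,
    values: cols.map((c) => row[c]),
  };
}
```

This relies on the table having a suitable primary or unique key, which is exactly what the key flag in the DataSpec section below is for.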

Local Debugging (Without Docker)

  1. Setup your local .env file based on what is provided in the .env.example file.
  2. In your .env file, ensure that AIX_TRIGGERING_EVENT contains the correct eventName (task trigger name) and, more importantly, the correct s3Key and s3Bucket, which determine what file gets passed to the task you're attempting to trigger.
  3. Run npm run build to build the application code.
  4. If you would like to test against the Events Service UAT database instead of your local database, run the ./sbin/set-rds-sg script to gain access.
  5. Run ./sbin/runTask.sh <task_trigger_name> to fire the task(s) you wish to run locally.
  6. NOTE: Each time you change any application code, you will need to run npm run build again to pick up those changes before re-running the ./sbin/runTask.sh <task_trigger_name> script.
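A sketch of composing the AIX_TRIGGERING_EVENT value for step 2. The eventName, s3Bucket and s3Key fields are the ones called out above; the real event object may carry additional fields, and the sample values here are made up. Depending on your dotenv parser you may need to wrap the JSON value in single quotes in the .env file.

```typescript
// Sketch: build an AIX_TRIGGERING_EVENT line for a local .env file.
// The sample values below are hypothetical.
function triggeringEventLine(eventName: string, s3Bucket: string, s3Key: string): string {
  return `AIX_TRIGGERING_EVENT=${JSON.stringify({ eventName, s3Bucket, s3Key })}`;
}
```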

Building the Docker Image

docker build -t event-handler-tasks:latest .

I want to...

...add a new client

  1. Figure out a "short name" for the client -- preferably one that is used everywhere that client is referred to.
  2. Create a directory under src/tasks/ for that "short name".
  3. Add the processing script(s) to src/tasks/<short name>.
  4. Update sbin/runTask.sh to add a unique string to the switch statement. This string should start with the "short name", and the body of the case should execute the script you created (along with any command-line parameters).
  5. Make sure there is a Feed for the client defined in event-service. You can use Postman/Insomnia/etc. to access the API, or you can use the es CLI from @trade-platform/event-service-lib.
  6. Add a Trigger to the Feed that uses the latest tag on the (TBD: Docker image name) image with the unique string as a command-line argument to the runTask.sh script.
  7. Test by crafting an event that will match the Trigger and sending it to the event-service API (again, using Postman/etc. or the es CLI). You should be able to find output logs in CloudWatch, LogDNA, and/or by using the API to request Activity objects on the Feed.

...write a basic processing script

  1. In src/tasks/<short name>/, add a TypeScript file with a class that extends Application (import { Application } from '@/utils/Application').
  2. Implement the run method and have the script do what you need. Be sure to explore the methods available in the Application base class for getting a database handle, an AIXEventClient object, sending content to SFTP or S3, etc.
  3. Test by creating a .env file with the values you need to have set. You'll need to manually craft all of the environment variables normally set by event-service and runTask.sh. Then execute your script via ts-node.

Examples: see anything in src/tasks/fsis or src/tasks/ifpartners.
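The steps above can be sketched as follows. The real base class lives in '@/utils/Application' and offers the DB/SFTP/S3 helpers mentioned in step 2; the local stub here only mirrors the run() contract so the shape of a task is visible, and everything in it is illustrative rather than the repository's actual API.

```typescript
// Local stand-in for the real Application base class (hypothetical contract).
abstract class ApplicationStub {
  abstract run(): Promise<void>;
  // Run the task and translate success/failure into an exit code.
  async main(): Promise<number> {
    try {
      await this.run();
      return 0;
    } catch (err) {
      console.error(err);
      return 1;
    }
  }
}

// A task's shape: implement run() and do the work there.
class ExampleTask extends ApplicationStub {
  processed = 0;
  async run(): Promise<void> {
    // A real task would get a database handle, read the triggering event, etc.
    this.processed += 1;
  }
}
```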

...write a script that line-by-line processes a file uploaded via SFTP

  1. Extend ReadlineApplication instead of Application, then implement the onLine and onClose methods instead of run. The onLine method will be called once for each line in the file, and then onClose called once all of the lines have been processed.
  2. To test, your .env will need an AIX_TRIGGERING_EVENT that has a stringified S3EventModel (from @trade-platform/event-service-lib) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via AWS_DEFAULT_PROFILE or AWS_ACCESS_KEY, etc.).

Example: see src/tasks/fsis/friends_and_family_generate_accounts.ts
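A sketch of the onLine/onClose contract described above. The driver here is a stand-in that splits an in-memory string; the real ReadlineApplication streams the S3 object named by the triggering event.

```typescript
// Hypothetical stand-in for ReadlineApplication: onLine fires once per line,
// then onClose fires once after all lines are processed.
abstract class ReadlineStub {
  abstract onLine(line: string): void;
  abstract onClose(): void;
  process(text: string): void {
    for (const line of text.split(/\r?\n/)) {
      if (line.length > 0) this.onLine(line);
    }
    this.onClose();
  }
}

class CountLines extends ReadlineStub {
  count = 0;
  closed = false;
  onLine(_line: string): void { this.count += 1; }
  onClose(): void { this.closed = true; }
}
```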

...write a script that processes a CSV file uploaded via SFTP

  1. Extend CsvApplication instead of Application, then implement the onRecord and onClose methods instead of run. The onRecord method will be called once for each record in the file, and then onClose called once all of the records have been processed.
  2. To test, your .env will need an AIX_TRIGGERING_EVENT that has a stringified S3EventModel (from @trade-platform/event-service-lib) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via AWS_DEFAULT_PROFILE or AWS_ACCESS_KEY, etc.).

Example: see src/utils/CsvUploaderApplication.ts and how it implements onRecord and onClose.
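A sketch of the onRecord/onClose contract. The naive comma-split parser here (no quoting support) is a stand-in for the real streaming CSV parser; it only illustrates that onRecord receives one header-keyed record per data row.

```typescript
// Hypothetical stand-in for CsvApplication: onRecord fires once per record,
// then onClose fires once after all records are processed.
abstract class CsvStub {
  abstract onRecord(record: Record<string, string>): void;
  abstract onClose(): void;
  process(csv: string): void {
    const [headerLine, ...rows] = csv.trim().split(/\r?\n/);
    const headers = headerLine.split(','); // naive split: no quoted fields
    for (const row of rows) {
      const fields = row.split(',');
      const record: Record<string, string> = {};
      headers.forEach((h, i) => { record[h] = fields[i] ?? ''; });
      this.onRecord(record);
    }
    this.onClose();
  }
}

class CollectAccounts extends CsvStub {
  ids: string[] = [];
  done = false;
  onRecord(r: Record<string, string>): void { this.ids.push(r['account_id']); }
  onClose(): void { this.done = true; }
}
```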

...write a script that uploads a CSV file directly into MySQL

  1. Extend CsvUploaderApplication instead of Application, then implement the tableName and dataSpec properties.
  2. The tableName property should be a string that is the name of the table to create/update in MySQL. Stick to a naming convention where the "short name" of the client is the first part of the table name (e.g. fsis_accounts or ifpartners_owners).
  3. The dataSpec property is an array of DataSpec objects, one per field in the CSV record (or extra calculated field).
    1. The csv value is the field name in the CSV record
    2. The db value is the field name as you want it in the database table
    3. The type value is the SQL datatype (excluding modifiers like NULL or PRIMARY KEY)
    4. The key value is a boolean flag to indicate the (composite) primary key
    5. The virtual flag indicates if the field appears in the database but not the CSV file (a constant, calculated or derived value, for example)
    6. value is an optional function used to set a constant, calculate a value, etc.
  4. To test, your .env will need an AIX_TRIGGERING_EVENT that has a stringified S3EventModel (from @trade-platform/event-service-lib) that points to a file in S3. Make sure while testing that your general AWS CLI environment is set correctly (via AWS_DEFAULT_PROFILE or AWS_ACCESS_KEY, etc.).
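To make the DataSpec fields concrete, here is a sketch of the shape described above and how such a spec could map to a CREATE TABLE statement. The field names (csv, db, type, key, virtual, value) are from this README; the interface definition, the helper, and the exact DDL the real framework emits are assumptions.

```typescript
// Illustrative DataSpec shape, one entry per field (see the list above).
interface DataSpec {
  csv?: string;                                    // field name in the CSV record
  db: string;                                      // column name in MySQL
  type: string;                                    // SQL datatype, no modifiers
  key?: boolean;                                   // part of the (composite) primary key
  virtual?: boolean;                               // in the DB but not in the CSV
  value?: (record: Record<string, string>) => unknown; // constant/calculated value
}

// Hypothetical helper: derive a CREATE TABLE statement from a spec.
function createTableSql(tableName: string, spec: DataSpec[]): string {
  const cols = spec.map((s) => `${s.db} ${s.type}`);
  const keys = spec.filter((s) => s.key).map((s) => s.db);
  const pk = keys.length ? [`PRIMARY KEY (${keys.join(', ')})`] : [];
  return `CREATE TABLE IF NOT EXISTS ${tableName} (${[...cols, ...pk].join(', ')})`;
}
```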

Note: The framework is implemented to favor large file sizes, rather than being particularly fast.

Examples: see src/tasks/ifpartners/upload-*.ts

...split my processing across several scripts

  1. To have one processing script call another, use the spawnSubTask method found in the Application base class.
  2. This method takes an environment override, so you can pass a different Event using the AIX_TRIGGERING_EVENT environment variable.
  3. This method also takes an array of strings to pass to the runTask.sh script, which means you may need to modify runTask.sh to add a new case to the switch statement.
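The environment-override idea in step 2 amounts to merging a new AIX_TRIGGERING_EVENT over the parent's environment. spawnSubTask itself belongs to the Application base class; this sketch only shows composing the override object, and the helper name is hypothetical.

```typescript
// Sketch: compose the environment for a sub-task so it sees a different
// triggering Event while inheriting everything else from the parent.
function subTaskEnv(
  parentEnv: Record<string, string | undefined>,
  event: object,
): Record<string, string | undefined> {
  return { ...parentEnv, AIX_TRIGGERING_EVENT: JSON.stringify(event) };
}
```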

Examples: see src/tasks/ifpartners/upload-account-owner.ts and src/tasks/ifpartners/export-contacts.ts.

...send a file somewhere via SFTP

  1. Use the sendToSFTP method of the Application base class.

Examples: see src/tasks/ifpartners/export-contacts.ts

...upload a file to an S3 bucket

  1. Use the sendToS3 method of the Application base class.

Examples: see src/tasks/fsis/friends_and_family_accounts_report.ts

Tasks

FSIS / FS Investments

Friends and Family

IFP / IFPartners

NOTE

Make sure your PR titles are semrel formatted when you squash merge.

