eq-product-pricing v0.0.0-development (MIT license, last released 10 months ago)

EqProductPricing

Purpose

This project processes product pricing changes and sends them to Equinox.

Environment vars

This project uses the following environment variables:

| Name | Description | Required | Default Value | Accepted Values |
| --- | --- | --- | --- | --- |
| NODE_ENV | Environment | Yes | DEV | DEV, TEST, PROD |
| STORE_IDS | Equinox Store IDs | Yes | | |
| BUSINESS_ID | Equinox Business ID | Yes | | |
| OKTA_CLIENT_ID | Okta Client ID | Yes | | |
| OKTA_CLIENT_SECRET | Okta Client Secret | Yes | | |
| SOLACE_CONNECT_VPN_NAME | Solace Connect VPN Name | Yes | | |
| SOLACE_CONNECT_USERNAME | Solace Connect Username | Yes | | |
| SOLACE_CONNECT_PASSWORD | Solace Connect Password | Yes | | |
| SOLACE_CONNECT_HOST | Solace Connect Host | Yes | | |
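Since all of these variables are required, the application could fail fast when one is missing. A minimal sketch of such a startup guard (the `missingEnvVars` helper is illustrative, not part of this project):

```javascript
// Illustrative startup guard: report any required variable that is unset.
const REQUIRED_VARS = [
  'NODE_ENV',
  'STORE_IDS',
  'BUSINESS_ID',
  'OKTA_CLIENT_ID',
  'OKTA_CLIENT_SECRET',
  'SOLACE_CONNECT_VPN_NAME',
  'SOLACE_CONNECT_USERNAME',
  'SOLACE_CONNECT_PASSWORD',
  'SOLACE_CONNECT_HOST',
];

function missingEnvVars(env = process.env) {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

const missing = missingEnvVars();
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(', ')}`);
}
```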

Pre-requisites

Getting started

Initialize and start application

You will need to have CODEARTIFACT_AUTH_TOKEN set as an environment variable. To do this run:

export CODEARTIFACT_AUTH_TOKEN=`aws codeartifact get-authorization-token --domain nextgen --domain-owner 425432027451 --query authorizationToken --output text --profile nse-shd`

Replace nse-shd with the name of the AWS profile you use to access the nse-shared-services environment. It is also helpful to add an alias to your .bash_profile (if you use bash) or .zshrc (if you use zsh). That would look something like this:

alias set-codeartifact='export CODEARTIFACT_AUTH_TOKEN=`aws codeartifact get-authorization-token --domain nextgen --domain-owner 425432027451 --query authorizationToken --output text --profile nse-shd`'

Install dependencies

yarn

Create a .env file and update the variable values

NODE_ENV=dev
STORE_IDS='{"US":406, "CA":407}'
OKTA_CLIENT_ID='fakeId'
OKTA_CLIENT_SECRET='fakeSecret'
SOLACE_CONNECT_PASSWORD='fakePassword'
SOLACE_CONNECT_USERNAME=solace-cloud-client
SOLACE_CONNECT_HOST='wss://fakeHost:443'
SOLACE_CONNECT_VPN_NAME='fakeVpnName'
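Note that STORE_IDS holds a JSON object mapping country codes to Equinox store IDs. A sketch of how it might be parsed at startup (the helper name is illustrative):

```javascript
// Illustrative parser for the STORE_IDS variable shown above.
// Expects a JSON object mapping country code -> Equinox store ID.
function parseStoreIds(raw) {
  const parsed = JSON.parse(raw);
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    throw new Error('STORE_IDS must be a JSON object, e.g. {"US":406, "CA":407}');
  }
  return parsed;
}

const storeIds = parseStoreIds('{"US":406, "CA":407}');
console.log(storeIds.US); // 406
```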

Build and run the project

yarn start

Project Structure

The folder structure of this app is explained below:

| Name | Description |
| --- | --- |
| `__mocks__` | Contains Jest mocks |
| `__tests__` | Contains Jest tests |
| `config` | Application configuration, including environment-specific configs |
| `src` | Contains source code |
| `publisher` | Contains code to do a test publish |
| `src/index.js` | Entry point to the application |
| `Dockerfile` | File used by Docker to containerize the application |
| `.gitignore` | List of files and directories for Git to ignore |
| `jest.config.js` | Configuration options for Jest |
| `.eslintrc` | Configuration options for ESLint |
| `.prettierrc` | Configuration options for Prettier |
| `package.json` | Contains npm dependencies as well as build scripts |
| `.prettierignore` | List of files and directories for Prettier to ignore |
| `cx.config` | File needed for the common pipeline to pass |
| `yarn.lock` | Yarn state snapshot |
| `.gitlab-ci.yml` | CI pipeline configuration |
| `node_modules` | Contains package dependencies |

Project Scripts

All the build steps are orchestrated via yarn scripts, which let us call (and chain) terminal commands through the package manager.

| yarn Script | Description |
| --- | --- |
| start | Runs node on src/index.js. Can be invoked with yarn start |
| pub | Publishes test messages that you can use locally to test. Can be invoked with yarn pub |
| test | Runs tests with coverage using Jest. Can be invoked with yarn test |
| lint | Runs ESLint on project files. Can be invoked with yarn lint |

Testing

The tests and assertions use Jest.

Running tests using yarn Scripts

yarn test

ESLint

ESLint is a code linter that helps catch minor code quality and style issues.

ESLint rules

All rules are configured through .eslintrc.

Running ESLint

To run ESLint you can call the ESLint task.

yarn lint  # runs only ESLint

Logging

A dashboard has been added to Datadog for monitoring the application logs: Datadog Dashboard. This dashboard combines the logs for EqProductInventory, EqProductPricing, and EqProductData.

Logging Metrics

To allow for better log visibility and accessibility, metric data is included in the application logs. Elements in the metric object can be used to filter, sort, and perform actions on application logs in Datadog.

For example, you could enter something like this when filtering logs in the Search for field in Datadog:

service:eq-product-data @metric.destination:"nuskin/product/pricing/created/v1/US/sap"
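As a sketch of how such a metric object might travel with each log line (the logger shape here is illustrative, not the project's actual implementation):

```javascript
// Illustrative: attach a metric object to each log entry so Datadog can
// filter and sort on fields such as @metric.destination or @metric.skuId.
function logWithMetric(message, metric) {
  const entry = {
    timestamp: Date.now(),
    message,
    metric, // e.g. { country, destination, correlationId, eventType, skuId }
  };
  console.log(JSON.stringify(entry));
  return entry;
}

logWithMetric('Pricing update processed', {
  country: 'US',
  destination: 'nuskin/product/pricing/created/v1/US/sap',
  correlationId: '02020064-28b7-1eee-8b94-f2cdb6fae8de',
  eventType: 'created',
  skuId: '01102945',
});
```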

Here is a description of the elements in the metric object:

| Name | Description |
| --- | --- |
| country | The country passed on the event |
| destination | The Solace topic the message being processed came from |
| correlationId | A unique ID that ties all the logs in a transaction together |
| eventType | The event type, taken from the destination in the metric |
| skuId | The SKU that the update has been sent for |
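Judging from the sample metric data, the eventType and country appear to be positional segments of the destination topic. A hypothetical parser (the segment positions are inferred from the single sample topic, not confirmed by this project's code):

```javascript
// Hypothetical: derive eventType and country from a destination topic such as
// "nuskin/product/pricing/created/v1/US/sap".
// Segment positions are inferred from the sample metric data below.
function parseDestination(destination) {
  const segments = destination.split('/');
  return {
    eventType: segments[3], // e.g. "created"
    country: segments[5],   // e.g. "US"
  };
}

console.log(parseDestination('nuskin/product/pricing/created/v1/US/sap'));
// → { eventType: 'created', country: 'US' }
```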

Sample metric data

{
    "applicationMessageId": "ID:AMQP_NO_PREFIX:02020064-28b7-1eee-8b94-f2cdb6fae8de",
    "country": "US",
    "destination": "nuskin/product/pricing/created/v1/US/sap",
    "correlationId": "02020064-28b7-1eee-8b94-f2cdb6fae8de",
    "eventType": "created",
    "senderTimestamp": 1690480880443,
    "skuId": "01102945"
}