@bespoken-api/data-access

Overview

This package contains the data access layer for the Bespoken API. It is responsible for all interactions with Bespoken data stores.

Data sources

  • MySql 8 Database server: mysql.bespoken.io (refer to Batch Tester MySQL for credentials)
  • Firebase. This includes libraries for:
    • admin
    • client
  • MongoDB

Usage

This package runs in isolation and requires no configuration beyond setting the environment it will run against.

| Environment | Value | Description |
| --- | --- | --- |
| DA_ENV | dev or prod | Indicates the environment it will connect to. Depending on its value, either the encrypted-dev.env or the encrypted-prod.env file is used. |

Note: All env vars required at runtime are saved in the repository (they are encrypted using SOPS and AGE).

For reference, the env vars and their sources are listed below:

| Environment | Description |
| --- | --- |
| DA_ENV | Must be set in the project where the package is used. Indicates the environment to connect to; determines whether the encrypted-dev.env or encrypted-prod.env file is used. |
| DA_PRISMA_BESPOKEN_DB | URL to connect to the MySql database on mysql8.bespoken.io. |
| DA_PRISMA_BESPOKEN_DB_SHADOW | URL to connect to the shadow MySql database on mysql8.bespoken.io. Used only in development, when migrations are calculated. |
| DA_FIREBASE_KEY | Firebase configuration for the admin API level. Grants access to manipulate data or create accounts. |
| DA_FIREBASE_EMAIL | Firebase configuration for the admin API level. Grants access to manipulate data or create accounts. |
| DA_FIREBASE_PROJECT | Firebase configuration for the admin API level. Grants access to manipulate data or create accounts. |
| DA_FIREBASE_URL | Firebase configuration for the admin API level. Grants access to manipulate data or create accounts. |
| DA_FIREBASE_CLIENT_API_KEY | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_AUTH_DOMAIN | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_DATABASE_URL | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_MESSAGING_SENDER_ID | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_STORAGE_BUCKET | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_FIREBASE_CLIENT_TOKEN | Firebase configuration for the client API level. Used to validate Firebase tokens (JWT-like authentication when a user logs in to the dashboard). |
| DA_MONGO_URL | Connection URL for MongoDB. |
| DA_GITHUB_ACCESS_TOKEN | Used to connect to the GitHub API to manage test suite files. |
| DA_GITHUB_ORGANIZATION | Used to connect to the GitHub API to manage test suite files. |
| DA_JWT_PRIVATE_KEY_BASE64 | JWT key for authentication, encoded in base64. Supports the internal-api authentication. |
| DA_INTUIT_CLIENT_ID | QuickBooks API credentials. |
| DA_INTUIT_CLIENT_SECRET | QuickBooks API credentials. |
| DA_INTUIT_ENVIRONMENT | QuickBooks API credentials. |
| DA_INTUIT_REDIRECTURI | QuickBooks API credentials. |
| DA_INTUIT_COMPANY_ID | QuickBooks API credentials. |
| DA_DYNAMODB_REGION | Configuration to access AWS DynamoDB. A single database is used for both dev and prod; the origin field differentiates between environments. |
| DA_DYNAMODB_ACCESS_KEY_ID | Configuration to access AWS DynamoDB. |
| DA_DYNAMODB_SECRET_ACCESS_KEY | Configuration to access AWS DynamoDB. |
| DA_VIRTUALDEVICE_ALEXA_OAUTH_URL | Alexa configuration for creating access tokens when a virtual device is created (for Bots; console: https://developer.amazon.com/alexa/console/avs/products/AlexaBot/details/info). |
| DA_VIRTUALDEVICE_ALEXA_AVS_CLIENT_ID | Alexa configuration for creating access tokens when a virtual device is created (for Bots). |
| DA_VIRTUALDEVICE_ALEXA_AVS_CLIENT_SECRET | Alexa configuration for creating access tokens when a virtual device is created (for Bots). |
| DA_VIRTUALDEVICE_ALEXA_AVS_PRODUCT_ID | Alexa configuration for creating access tokens when a virtual device is created (for Bots). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_OAUTH_URL | Alexa configuration for creating access tokens when a virtual device is created (for the Music device; console: https://developer.amazon.com/alexa/console/avs/products/VirtualDeviceMusic/details/info). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_CLIENT_ID | Alexa configuration for creating access tokens when a virtual device is created (for the Music device). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_CLIENT_SECRET | Alexa configuration for creating access tokens when a virtual device is created (for the Music device). |
| DA_VIRTUALDEVICE_ALEXAMUSIC_AVS_PRODUCT_ID | Alexa configuration for creating access tokens when a virtual device is created (for the Music device). |
| DA_VIRTUALDEVICE_GOOGLE_OAUTH_URL | Google configuration for creating access tokens when a virtual device is created (console: https://console.cloud.google.com/apis/credentials/oauthclient/969501293302-726fr1b001sg3lg1ouk4u3l0m77h3skh.apps.googleusercontent.com?authuser=2&project=silent-echo). |
| DA_VIRTUALDEVICE_GOOGLE_ASSISTANT_CLIENT_ID | Google configuration for creating access tokens when a virtual device is created. |
| DA_VIRTUALDEVICE_GOOGLE_ASSISTANT_CLIENT_SECRET | Google configuration for creating access tokens when a virtual device is created. |
| DA_VIRTUALDEVICE_CRYPT_KEY_REFESH_TOKEN | Google configuration for creating access tokens when a virtual device is created. |
| DA_VIRTUALDEVICE_JWT_STATE_BASE64 | Google configuration for creating access tokens when a virtual device is created. |

Example of usage

# 1. Set the environment where it will run
export DA_ENV=dev

# 2. Add the package to your project
pnpm --filter=@bespoken-api/package-x add @bespoken-api/data-access

// 3. Import the DAO class from the package and use it
const { AppSettingsDao } = require("@bespoken-api/data-access")
const dao = new AppSettingsDao()
const dto = await dao.readAppSettings('dashboard')
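
Since await is only valid inside an async function in CommonJS, a complete script would wrap the DAO call shown above. A minimal runnable sketch, assuming the AppSettingsDao API from the example and that DA_ENV is already exported:

// Minimal runnable sketch of the usage example above (assumes DA_ENV is already set)
const { AppSettingsDao } = require("@bespoken-api/data-access")

async function main() {
  const dao = new AppSettingsDao()
  // 'dashboard' identifies the app whose settings are read, as in the example above
  const dto = await dao.readAppSettings('dashboard')
  console.log(dto)
}

main().catch(console.error)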

Development

Adding new env vars

To add new env vars, you must decrypt the file into .env. Then copy the .env content, use it to replace the content of the corresponding encrypted-*.env file, and encrypt that file again.

Use the prefix DA_ for all vars.

:exclamation: BE CAREFUL NOT TO ENCRYPT THE FILE TWICE BY MISTAKE.

This requires SOPS and AGE to be installed on your machine, along with the public and private keys to encrypt/decrypt the files. See: README.md

There is a file for each environment, encrypted-dev.env and encrypted-prod.env. To add a new env var, you need to add it to each of the encrypted files.

You can use the following steps to add a new env var:

  1. Decrypt one of the files, for example encrypted-dev.env (cwd: data-access folder)
pnpm -w run sops:decrypt --file=./encrypted-dev.env
  2. Open the .env file and copy its content (this content will be used to replace the content of encrypted-dev.env)
## Prisma
DA_ENV="dev"
DA_OTHER_VAR="other value"
(...)
  3. Replace the content of encrypted-dev.env with the content of .env and add the new env var
## Prisma
DA_ENV="dev"
DA_OTHER_VAR="other value"
(...)
## New group
DA_NEW_VAR="new value"
  4. Encrypt the file again
pnpm -w run sops:encrypt --file=./encrypted-dev.env
  5. Repeat steps 1 to 4 for encrypted-prod.env

Reading env vars from code

The reading process is done by the sopsConfig function. It is called once, in the package's index.js file, and sets the values on the process.env object, using either the encrypted-dev.env or the encrypted-prod.env file depending on the value of the DA_ENV env var.
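
As an illustration only (not the package's actual implementation), the selection logic described above can be pictured roughly as follows, assuming sopsConfig takes the path of the encrypted file and copies the decrypted values into process.env:

// Illustrative sketch of how index.js picks the file based on DA_ENV.
// sopsConfig is assumed to be the internal helper described above and is passed in here.
const path = require('path')

function loadEnvironment(sopsConfig) {
  const env = process.env.DA_ENV === 'prod' ? 'prod' : 'dev'
  const file = path.join(__dirname, `encrypted-${env}.env`)
  // Decrypts the SOPS/AGE-encrypted file and writes its values into process.env (called once)
  sopsConfig(file)
}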

Then modify the config class in /libs/utils/configuration.js:

(...)
const config = new class {
  _cache = {}

  get DA_ENV() { return process.env.DA_ENV }
  (...)
  
  // Add the new variable
  get DA_NEW_VAR() { return process.env.DA_NEW_VAR }

  // If you need to read a multi-line env var, encode it in base64 and then decode it in this class
  get DA_JWT_PRIVATE_KEY() {
    if (isUndefined(this._cache?.DA_JWT_PRIVATE_KEY)) {
      this._cache.DA_JWT_PRIVATE_KEY = Buffer.from(toString(process?.env?.DA_JWT_PRIVATE_KEY_BASE64), 'base64').toString('utf-8')
    }
    return this._cache?.DA_JWT_PRIVATE_KEY
  }

}
(...)

Then to use it in code, import the config object and use it as follows:

const { config: { DA_NEW_VAR } } = require('../utils/configuration')

console.log(DA_NEW_VAR)

Adding new data sources

New data sources should be added in the /libs/datasources/ folder.
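
As a rough, hypothetical sketch (the file name, export style, and relative path to the configuration helper are assumptions, not the package's actual conventions), a new data source module could look like this:

// Hypothetical data source module, e.g. /libs/datasources/mongodb.js (names are illustrative)
const { MongoClient } = require('mongodb')
const { config } = require('../utils/configuration')

let client

// Lazily creates and reuses a single MongoDB connection using DA_MONGO_URL
async function getMongoClient() {
  if (!client) {
    client = new MongoClient(config.DA_MONGO_URL)
    await client.connect()
  }
  return client
}

module.exports = { getMongoClient }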

Adding new DAOs

New DAOs should be added in the /libs/daos/ folder.
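
A hypothetical DAO skeleton is sketched below; the class name, the data-source helper it imports, and the way the Prisma client is obtained are illustrative assumptions, loosely based on the AppSetting model and the DAO usage shown elsewhere in this README.

// Hypothetical DAO, e.g. /libs/daos/ExampleSettingsDao.js (names and imports are illustrative)
const { getPrismaClient } = require('../datasources/mysql')

class ExampleSettingsDao {
  // Reads one row from the app_settings table via the generated Prisma client
  async readByApp(app) {
    const prisma = getPrismaClient()
    return prisma.appSetting.findFirst({ where: { app } })
  }
}

module.exports = { ExampleSettingsDao }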

Develop database schema modifications

Read the Prisma documentation for more information about Prisma Migrations.

Prisma requires the client library to be prebuilt before execution. This is done automatically when the prisma:migrate or prisma:draft commands run, and it can be done manually with prisma:generate. The prebuilt library depends on the platform where it runs; this is configured in the prisma/schema.prisma file under client.binaryTargets. These files should be committed to git.

The following commands are provided for development and publishing purposes. EACH COMMAND WILL AUTOMATICALLY DECRYPT THE encrypted-{env}.env FILE REQUIRED FOR ITS EXECUTION.

Create and run a migration for development purposes

pnpm --filter=@bespoken-api/data-access run prisma:migrate

Create a customized migration

pnpm --filter=@bespoken-api/data-access run prisma:draft

Deploy changes in production

pnpm --filter=@bespoken-api/data-access run prisma:draft

Generate client library

pnpm --filter=@bespoken-api/data-access run prisma:generate

Usage examples

Add a new column
  1. Modify the schema.prisma file, adding the new column (set it as nullable or give it a default value)
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  name    String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  2. Run the prisma:migrate command

    Remember: changes are applied to the bespoken_dev database

The command will ask for the migration name. Set a descriptive name for it and commit the changes to git.

pnpm --filter=@bespoken-api/data-access run prisma:migrate
Add a new column from values in another column
  1. Modify the schema.prisma file, adding the new column (set it as nullable or give it a default value)
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  name    String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  2. Run the prisma:migrate command

    Remember: changes are applied to the bespoken_dev database

The command will ask for the migration name; set a descriptive name for it.

pnpm --filter=@bespoken-api/data-access run prisma:migrate
  3. Create a custom migration to copy the data

    The command will ask for the migration name. Set a descriptive name for it.

pnpm --filter=@bespoken-api/data-access run prisma:draft
  4. Edit the migration file created in the previous step (find the new migration file in the prisma/migrations folder)
-- This is an empty migration.
UPDATE app_settings
SET new_column = name;
  5. Run the migration
pnpm --filter=@bespoken-api/data-access run prisma:migrate
  6. Remove the old column (name, in this example) from the schema
(...)
model AppSetting {
  id      Int    @id @default(autoincrement())
  app     String
  content Json
  // new column
  new_column   String?

  @@map("app_settings")
}
(...)
  7. Run the migration
pnpm --filter=@bespoken-api/data-access run prisma:migrate

Other commands for development purposes are provided.

MySql using Prisma

The package uses Prisma to interact with the MySql database. Prisma is an ORM that generates a client library based on the database schema; that client library is used to interact with the database.
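
For illustration, a direct query through the generated client might look like the sketch below, using the AppSetting model shown earlier and assuming the client is generated to the default @prisma/client location; in this package such calls are normally wrapped by DAOs.

// Sketch of querying the app_settings table with the generated Prisma client
const { PrismaClient } = require('@prisma/client')

const prisma = new PrismaClient()

// Returns all settings rows for the given app, e.g. 'dashboard'
async function listAppSettings(app) {
  return prisma.appSetting.findMany({ where: { app } })
}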

NPM Scripts

Generate Client Library

This command generates the native client library based on the Prisma configuration. It runs automatically when the push or migrate commands are executed.

pnpm --filter=@bespoken-api/data-access run prisma:generate