@nutshelllab/aurora-store v0.0.1

License: ISC

Goal

This repo was originally made to provide a serverless Aurora PostgreSQL connector for Keyro Login, designed with strong backward compatibility with our DynamoDB connector in mind. By implementing the same API, it makes it easy to switch between Aurora and DynamoDB.

Usage

Installation

yarn add @nutshelllab/aurora-store

This module allows you to run your code against a local database. To do so, you will also need a working PostgreSQL installation running.

Configuration

Configuration is done through environment variables, so you can let your serverless environment handle it as code:

Access to the database

Variable          Description
DB_HOST           The IP address or domain name where the DBMS runs. It may be an AWS Aurora cluster.
DB_PORT           Defaults to PostgreSQL's default port (5432).
DB_NAME           The name of your PostgreSQL database.
DB_USER           A user who has been granted the proper privileges in your database.
DB_PASSWD         Password of the above user.
SLS_STAGE         If set to local, the LOCAL_* variables below are used instead of the ones above.
LOCAL_DB_HOST     The IP address or domain name where the DBMS runs. It may be 127.0.0.1.
LOCAL_DB_PORT     Defaults to PostgreSQL's default port (5432).
LOCAL_DB_NAME     The name of your local PostgreSQL database.
LOCAL_DB_USER     A user who has been granted the proper privileges in your local database.
LOCAL_DB_PASSWD   Password of the above user.
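
For local development with SLS_STAGE set to local, the store reads the LOCAL_* variables. Here is a minimal sketch of such an environment set from Node itself (the values are purely illustrative):

process.env.SLS_STAGE = 'local'
process.env.LOCAL_DB_HOST = '127.0.0.1'        // local PostgreSQL instance
process.env.LOCAL_DB_PORT = '5432'             // PostgreSQL default port
process.env.LOCAL_DB_NAME = 'aurora_store_dev' // illustrative database name
process.env.LOCAL_DB_USER = 'pguser'           // matches the Contribute section below
process.env.LOCAL_DB_PASSWD = 'pguser'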

Migrations

You can easily configure a migration source to handle your database schema. Migrations are handled internally through Knex, so the configuration is identical. Here's an example with a webpack context:

import { configure } from '@nutshelllab/aurora-store' // The module exposes a simple configure method

const context = require.context('../../migrations', false, /\.js$/)

const makeMigrationSource = context => {
  return {
    getMigrations() {
      return Promise.resolve(context.keys().sort())
    },
    getMigrationName(migration) {
      return migration
    },
    getMigration(migration) {
      return context(migration)
    }
  }
}

configure({
  migrationSource: makeMigrationSource(context) // configure mutates module state; keep it in your initialization code as much as possible
})
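
Since migrations run through Knex, each file the migration source resolves is expected to follow the usual Knex shape with up and down functions. A minimal sketch (the file name and table are hypothetical):

// migrations/20210101_create_teams.js (hypothetical file name and table)
exports.up = knex =>
  knex.schema.createTable('teams', table => {
    table.string('id').primary()
    table.string('name')
  })

exports.down = knex => knex.schema.dropTable('teams')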

Initialize a store

To initialize a store, just call the default export of the module:

import createStore from '@nutshelllab/aurora-store'

// [...] configuration has been done at this point

const teamStore = createStore({ kind: 'teams', validate: makeTeam }) // makeTeam is your own validator
const result = await teamStore.find({ id: 'team-00007' })

Options you can pass to the default store builder function are:

Option     Description                                                              Default
kind       The table name you want to manipulate.                                   No default, must be set
validate   A validator function called on data before writing and after reading.    Identity function (does nothing)
idField    The primary key, given as an array of column names.                      'id'
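
Since the default validate option is the identity function, a custom validator is assumed to receive the data and return the (possibly normalized) object, throwing when the data is invalid. A minimal sketch of the makeTeam validator used above (the fields are hypothetical):

const makeTeam = data => {
  if (!data.id || typeof data.id !== 'string') {
    throw new Error('a team needs a string id')
  }
  // return the normalized object; it is what gets written and read
  return { id: data.id, name: data.name || 'unnamed team' }
}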

Features

findIn(field, values)

Returns all the records whose value for the given field is contained in the provided array of values.

store.findIn('id', ['001', '002', '003'])

findAll(filters)

Returns all the records matching the given filters.

store.findAll({ contracted: true })

find(filters)

Returns exactly one record matching the filters, or throws an error whose name attribute is set to 'NOT_FOUND'.

store.find({ id: '001' })
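
A minimal sketch of handling the missing-record case, based on the NOT_FOUND error name described above:

try {
  await store.find({ id: 'does-not-exist' })
} catch (err) {
  if (err.name === 'NOT_FOUND') {
    // handle the missing record
  } else {
    throw err
  }
}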

put(data)

Writes an object to the store. If the record does not exist, it is created.

store.put({ id: 'movie-0001', name: 'Black Widow' })

update(data)

Updates the record matching the data's id (the 'id' field by default; see the idField option to change it).

store.update({id: 'movie-0001', name: 'Black Windows' })

create(data)

Inserts a new record with the given object values. If the record already exists, an error is thrown.

store.create({id: 'movie-0002', name: 'Black Windows' })

findOrCreate(filters, data)

Fetches a record with the given filters. If nothing is found, a creation attempt is made with the given data. Bad filtering may result in an already-exists error being thrown.

store.findOrCreate({ name: 'Black Windows' }, { id: 'newId-002', name: 'Black Windows', company: 'marvelcrosoft' })
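
A hedged sketch of the pitfall mentioned above: when the filters cannot match the record that gets created, every subsequent call misses on the find step and then fails on the create step.

// The filters look for 'Black Windows', but the record is created as 'Black Widow',
// so an identical second call will never find it and will try to insert id 'movie-0003' again.
await store.findOrCreate({ name: 'Black Windows' }, { id: 'movie-0003', name: 'Black Widow' })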

remove(filters)

Permanently destroys all the records matching the filters.

store.remove({name: 'Black Windows'})

truncate(options)

Permanently destroys all the records of the table. You must pass the force attribute as true, as a way to make it clear that you know what you're doing.

store.truncate() // won't work
store.truncate({ force: true })

knexClient()

The point of using a relational database is to go beyond DynamoDB's key-value restrictions. This function gives you a Knex query builder, which you can use to manipulate any table in a SQL-query manner:

store.knexClient().then(db =>
  db('movies')
    .innerJoin('authors', 'authors.id', 'movies.authorId')
    .where('authors.name', 'marvelcrosoft')
    .select('movies.*')
)

See the Knex documentation for further features.

Combine with serverless-offline

With serverless-offline, this module can be used to provide 100% local execution, which may greatly increase your productivity. Keyro Login has been tested and runs successfully on localhost with a local copy of our remote dev environment's database.

Contribute

  1. Install PostgreSQL
  2. Create a pguser user with a pguser password
  3. Install dependencies: yarn
  4. Run the integration tests: yarn test