@runnerty/executor-postgres v3.1.2

PostgreSQL executor for Runnerty:

Installation:

Through NPM

npm i @runnerty/executor-postgres

You can also add modules to your project with runnerty-cli:

npx runnerty-cli add @runnerty/executor-postgres

This command installs the module in your project, adds an example configuration to your config.json, and creates an example plan of use.

If you have installed runnerty-cli globally, you can include the module with this command:

rty add @runnerty/executor-postgres

Configuration:

Add in config.json:

{
  "id": "postgres_default",
  "type": "@runnerty-executor-postgres",
  "user": "postgresusr",
  "password": "postgrespass",
  "database": "MYDB",
  "host": "myhost.com",
  "port": "5432"
}

With SSL:

{
  "id": "postgres_default",
  "type": "@runnerty-executor-postgres",
  "user": "postgresusr",
  "password": "postgrespass",
  "database": "MYDB",
  "host": "myhost.com",
  "port": "5432",
  "ssl": {
    "ca": "./ssl/my.ca"
  }
}

Configuration params:

  • user: The postgres user to authenticate as.
  • password: The password of that postgres user.
  • database: Name of the database to use for this connection. (Optional)
  • host: The hostname of the database you are connecting to.
  • port: The port number to connect to. (Default: 5432)
  • encoding: The encoding for the connection. (Default: 'utf8')
  • application_name: Name reported to PostgreSQL as the connection's application_name. (Default: runnerty)
  • connectionTimeoutMillis: Milliseconds to wait for a connection before timing out. (Default: 60000)
  • query_timeout: Milliseconds before a query call times out. (Default: false)
  • statement_timeout: Milliseconds before a statement times out on the server. (Default: false)
  • idle_in_transaction_session_timeout: Milliseconds before terminating a session with an open idle transaction. (Default: false)
  • keepAlive: Enable TCP keep-alive on the connection. (Default: false)
  • keepAliveInitialDelayMillis: Initial delay before the first TCP keep-alive probe. (Default: 0)
  • ssl/ca: SSL CA file. (Optional)
  • ssl/cert: SSL CERT file. (Optional)
  • ssl/key: SSL KEY file. (Optional)
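
For instance, a configuration that tunes timeouts and keep-alive and authenticates with client certificates could look like the sketch below. The id postgres_secure and all values are illustrative, not defaults from the module:

{
  "id": "postgres_secure",
  "type": "@runnerty-executor-postgres",
  "user": "postgresusr",
  "password": "postgrespass",
  "database": "MYDB",
  "host": "myhost.com",
  "port": "5432",
  "encoding": "utf8",
  "application_name": "runnerty",
  "connectionTimeoutMillis": 60000,
  "keepAlive": true,
  "keepAliveInitialDelayMillis": 10000,
  "ssl": {
    "ca": "./ssl/my.ca",
    "cert": "./ssl/my.cert",
    "key": "./ssl/my.key"
  }
}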

Plan sample:

Add in plan.json:

{
  "id": "postgres_default",
  "command_file": "./sql/test.sql"
}

Or with an inline command:

{
  "id": "postgres_default",
  "command": "SELECT * FROM generate_series(1,10)"
}
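
In the first sample, command_file points at a SQL file on disk that contains the SQL to run. Purely as an illustration, ./sql/test.sql could hold something like:

-- hypothetical contents of ./sql/test.sql
SELECT *
  FROM generate_series(1, 10);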

Generation of files:

The results obtained from a query can be saved to a file in CSV, XLSX or JSON format. These files are generated with streams; you only have to indicate the corresponding property in the parameters:

XLSX

XLSX Format

  • xlsxFileExport: Path of the xlsx file export.
  • xlsxAuthorName: Author name of the file. (Optional)
  • xlsxSheetName: Name of the sheet. (Optional)

Sample:

{
  "id": "postgres_default",
  "command": "SELECT * FROM USERS",
  "xlsxFileExport": "./my_output.xlsx",
  "xlsxAuthorName": "Runnerty",
  "xlsxSheetName": "MySheetSample"
}

CSV

CSV Format

  • csvFileExport: Path of the csv file export.
  • csvOptions/headers: Type: boolean/string[]. The headers will be auto-detected from the first row, or you can provide a headers array: ['h1name', 'h2name', ...].
  • csvOptions/delimiter: Alternate delimiter. (Default: ',')
  • csvOptions/quote: Alternate quote. (Default: '"')
  • csvOptions/alwaysWriteHeaders: Set to true if you always want headers written, even if no rows are written. (Default: false)
  • csvOptions/rowDelimiter: Specify an alternate row delimiter (e.g. '\r\n'). (Default: '\n')
  • csvOptions/quoteHeaders: If true, all headers will be quoted. (Default: the quoteColumns value)
  • csvOptions/quoteColumns: If true, columns and headers will be quoted (unless quoteHeaders is specified). (Default: false)
  • csvOptions/escape: Alternate escaping value. (Default: '"')
  • csvOptions/includeEndRowDelimiter: Set to true to include a row delimiter at the end of the csv. (Default: false)
  • csvOptions/writeBOM: Set to true if you want the first character written to the stream to be a UTF-8 BOM character. (Default: false)

Sample:

{
  "id": "postgres_default",
  "command": "SELECT * FROM USERS",
  "csvFileExport": "@GV(WORK_DIR)/users.csv",
  "csvOptions": {
    "delimiter": ";",
    "quote": "\""
  }
}
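
If you need to force specific column headers instead of auto-detecting them from the first row, the headers option from the table above accepts an explicit array. A sketch with illustrative column names:

{
  "id": "postgres_default",
  "command": "SELECT * FROM USERS",
  "csvFileExport": "@GV(WORK_DIR)/users.csv",
  "csvOptions": {
    "headers": ["id", "first_name", "last_name", "email"],
    "rowDelimiter": "\r\n",
    "writeBOM": true
  }
}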

JSON

JSON Format

Sample:

{
  "id": "postgres_default",
  "command": "SELECT * FROM USERS",
  "jsonfileExport": "@GV(WORK_DIR)/users.json"
}

PLAIN FILE

Plain File Format

For very large data exports it is recommended to use COPY TO with fileExport instead of csvFileExport: although the CSV export is also built on streams, COPY TO saves the work of converting the result set to CSV.

Sample:

{
  "id": "postgres_default",
  "command": "COPY persons TO STDOUT DELIMITER ';' CSV HEADER QUOTE '\"';",
  "fileExport": "./users.csv"
}

Loading files (COPY FROM)

To load a file you must indicate the path of the file to be loaded in the localInFile parameter, and the COPY [...] FROM statement must read FROM STDIN. For example:

  • localInFile: CSV file path
{
  "id": "postgres_default",
  "command": "COPY persons (first_name,last_name,email) FROM STDIN DELIMITER ';' CSV HEADER QUOTE '\"';",
  "localInFile": "/persons_to_import.csv"
}
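
For that statement to succeed, the file in localInFile has to match the options of the COPY command (semicolon delimiter, header row, double-quote quoting). A hypothetical /persons_to_import.csv could look like:

first_name;last_name;email
John;Doe;john.doe@example.com
Jane;Roe;jane.roe@example.com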

Output (Process values):

Standard

  • PROCESS_EXEC_MSG_OUTPUT: postgres output message.
  • PROCESS_EXEC_ERR_OUTPUT: Error output message.

Query output

  • PROCESS_EXEC_DATA_OUTPUT: postgres query output data.
  • PROCESS_EXEC_DB_COUNTROWS: postgres query count rows.
  • PROCESS_EXEC_DB_FIRSTROW: postgres query first row data.
  • PROCESS_EXEC_DB_FIRSTROW_[FIELD_NAME]: postgres first row field data.
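
These values can be referenced wherever Runnerty interpolates functions, for example with @GV(...), just like @GV(WORK_DIR) in the samples above. As a rough sketch (assuming the output_share mechanism of Runnerty plans, and a result set that actually contains an email column), a process could expose the row count and a first-row field to later processes in the chain:

{
  "id": "postgres_default",
  "command": "SELECT * FROM USERS",
  "output_share": [
    { "key": "USERS", "name": "COUNT", "value": "@GV(PROCESS_EXEC_DB_COUNTROWS)" },
    { "key": "USERS", "name": "FIRST_EMAIL", "value": "@GV(PROCESS_EXEC_DB_FIRSTROW_EMAIL)" }
  ]
}

Later processes could then read these as @GV(USERS_COUNT) and @GV(USERS_FIRST_EMAIL); the exact sharing syntax depends on your Runnerty version, so check the core documentation.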