etl-server v0.3.1

DGP UI

This library and app provide a wrapper around Airflow: a web UI for adding and removing DAGs (pipelines), driven by a configuration that defines the pipeline 'kinds' and the parameters each kind requires.

Screenshots: Pipeline Dashboard, Edit/New Pipeline, Pipeline Status.

Quickstart

  1. Create a folder containing:
  • A configuration.yaml file with the details of your pipeline kinds, e.g.:
{
    "kinds": [
        {
            "name": "kind1",
            "display": "Kind 1",
            "fields": [
                {
                    "name": "param1",
                    "display": "Parameter 1"
                },
                {
                    "name": "param2",
                    "display": "Parameter 2"
                }
            ]
        },
        {
            "name": "kind2",
            "display": "Kind 2",
            "fields": [
                {
                    "name": "param3",
                    "display": "Parameter 3"
                },
                {
                    "name": "param4",
                    "display": "Parameter 4"
                }
            ]
        }
    ],
    "schedules": [
        {
            "name": "monthly",
            "display": "Monthly"
        },
        {
            "name": "daily",
            "display": "Daily"
        }
    ]
}

(If schedules are not specified, a default schedules list will be used).
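Since JSON is a subset of YAML, the example above is already a valid configuration.yaml. For reference, the same configuration in native YAML syntax:

kinds:
  - name: kind1
    display: Kind 1
    fields:
      - name: param1
        display: Parameter 1
      - name: param2
        display: Parameter 2
  - name: kind2
    display: Kind 2
    fields:
      - name: param3
        display: Parameter 3
      - name: param4
        display: Parameter 4
schedules:
  - name: monthly
    display: Monthly
  - name: daily
    display: Daily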

  • The Airflow DAGs Creator - a Python file that reads the pipeline configuration and creates your Airflow DAGs. Sample code:
import datetime
import logging
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils import dates
from etl_server.models import Models

etl_models = Models()

default_args = {
    'owner': 'Airflow',
    'depends_on_past': False,
    'start_date': dates.days_ago(1),
}

for pipeline in etl_models.all_pipelines():
    # Each pipeline looks like this:
    # {
    #   "id": "<identifier>",
    #   "name": "<English Name of Pipeline>",
    #   "kind": "<kind-name>",
    #   "schedule": "<schedule>",
    #   "params": {
    #      "field1": "value1",
    #      .. other fields, based on kind's fields in configuration
    #   }
    # }
    dag_id = pipeline['id']
    logging.info('Initializing DAG %s', dag_id)
    # This sample uses a fixed daily interval; see below for a sketch of
    # mapping pipeline['schedule'] to a real schedule_interval.
    dag = DAG(dag_id,
              default_args=default_args,
              schedule_interval=datetime.timedelta(days=1))
    task = BashOperator(task_id=dag_id,
                        bash_command='echo "%s"; sleep 10; echo done' % pipeline['name'],
                        dag=dag)
    # Airflow discovers DAGs via module-level globals
    globals()[dag_id] = dag
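The sample above ignores each pipeline's schedule field. A minimal sketch of honoring it, reusing the imports and etl_models from the sample and assuming the schedule names from the configuration ('daily', 'monthly') map to Airflow's preset intervals (the SCHEDULE_INTERVALS dict is illustrative, not part of etl-server):

# Illustrative mapping from the configuration's schedule names to
# Airflow preset intervals; not part of etl-server itself.
SCHEDULE_INTERVALS = {
    'daily': '@daily',
    'monthly': '@monthly',
}

for pipeline in etl_models.all_pipelines():
    dag_id = pipeline['id']
    # Fall back to a daily run when the schedule name is unrecognized
    interval = SCHEDULE_INTERVALS.get(pipeline['schedule'], '@daily')
    dag = DAG(dag_id, default_args=default_args, schedule_interval=interval)
    globals()[dag_id] = dag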
  2. Use a docker-compose setup to run the server. An example docker-compose.yaml file:
version: "3"

services:

  db:
    image: postgres:12
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_USER: postgres
      POSTGRES_DB: etls
    expose:
      - 5432
    volumes: 
      - /var/lib/postgresql/data

  server:
    build: .
    image: akariv/airflow-config-ui
    environment:
      DATABASE_URL: postgresql://postgres:postgres@db/etls
      AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql://postgres:postgres@db/etls
    expose:
      - 5000
    ports:
      - 5000:5000
    depends_on: 
      - db
    volumes: 
      - /path/to/local/dags/folder/:/app/dags

After running docker-compose up -d server, open http://localhost:5000 in your browser to see the web UI.

Another option is to create a new Docker image that inherits from akariv/airflow-config-ui and replaces the contents of /app/dags/ with your configuration.yaml file and your DAG Python files.
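A minimal Dockerfile for this approach might look like the following (the file names configuration.yaml and dags_creator.py are illustrative; copy whatever your DAGs folder contains):

FROM akariv/airflow-config-ui
# Replace the bundled DAGs folder contents with your own configuration
# and DAGs creator script
COPY configuration.yaml /app/dags/configuration.yaml
COPY dags_creator.py /app/dags/dags_creator.py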
