
CSV Node

Overview

The library is for reading and managing csv tables like a human, automating the process of reading and writing csv files and serializing csv rows to objects.

Features

  • Read a csv file and serialize the data to javascript/typescript objects;
  • Write csv tables;
  • Automatic cast for numbers and booleans;
  • Aliases for the columns of a csv table;
  • Skip rows of the csv;
  • Limit the number of rows;
  • Map rows;
  • Aggregate functions (max, min, avg and sum) over a column;
  • Filter rows.

Next Features

  • Join csv tables.

Install

npm install csv-node

or

yarn add csv-node

Usage

First, import the CSVReader class from the csv-node module.

import { CSVReader } from "csv-node"
// or
const { CSVReader } = require("csv-node")

The csv-node module exports:

| name | description |
| --- | --- |
| AliasMap | An object for mapping column name aliases |
| FilterFunction | A function for filtering rows of csv files |
| PredicateFunction | A function for applying a predicate to rows of csv files |
| CSVReadOptions | The options for reading a csv |
| CSVWriterOptions | The options for writing a csv |
| CSVReader | The class to read csv files |
| CSVWriter | The class to write csv files |
| CSVNotFound | Error, thrown if the file does not exist |

Basic usage

// names.csv
name,age
Joh,19
Mary,20
Nicoll,21
Ju,18

Let's create a file, index.js for example, with a loadCsv function for your tests.

const { CSVReader } = require("csv-node")

async function loadCsv() {
  // let's go...
}

loadCsv()
  .then()
  .catch(console.error)

Run node index.js in your terminal to test it.

CSVReader

The examples below show how to read a csv; they can be found in the examples folder.

First read

Use the path module to build absolute paths in Node.js.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName)
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: string
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName)
  const data = await reader.read() 
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Joh", "age": "19" },
  { "name": "Mary", "age": "20" },
  { "name": "Nicoll", "age": "21" },
  { "name": "Ju", "age": "18" }
]

Even though age is a number, it is loaded as a string. You can use a map function or enable the castNumbers option of CSVReader to fix this.

Options

The second parameter of the CSVReader constructor is an options object. The available options are:

| name | description | type | required | default |
| --- | --- | --- | --- | --- |
| alias | An object that will rename columns | object | false | {} |
| skipLines | The number of lines to skip | number | false | 0 |
| limit | The maximum number of rows | number | false | Infinity |
| delimiter | Delimiter between columns | string | false | , |
| castNumbers | Automatic cast for numbers | boolean | false | false |
| castBooleans | Automatic cast for booleans | boolean | false | false |
| filter | Filter rows, like Array.filter | FilterFunction | false | none |
| map | Map rows, like Array.map | MapFunction | false | none |

Options usage

Alias

You don't need to rename all headers of the csv table.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    alias: {
      name: 'Name',
      age: 'Age'
    }
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  Name: string
  Age: string
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    alias: {
      name: 'Name',
      age: 'Age'
    }
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "Name": "Joh", "Age": "19" },
  { "Name": "Mary", "Age": "20" },
  { "Name": "Nicoll", "Age": "21" },
  { "Name": "Ju", "Age": "18" }
]

Skip Lines

This option skips the first x lines, like OFFSET in SQL.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    skipLines: 1
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: string
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    skipLines: 1
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Mary", "age": "20" },
  { "name": "Nicoll", "age": "21" },
  { "name": "Ju", "age": "18" }
]

Limit

This option limits the result size, like LIMIT in SQL.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    limit: 2
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: string
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    limit: 2
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Joh", "age": "19" },
  { "name": "Mary", "age": "20" }
]

Delimiter

This is the delimiter between columns, as in the sketch below.
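
A minimal sketch of reading a semicolon-delimited file (the file names_semicolon.csv and its contents are hypothetical, not part of the package examples):

const path = require("path")
const { CSVReader } = require("csv-node")

// names_semicolon.csv (hypothetical):
// name;age
// Joh;19
// Mary;20

async function loadCsv() {
  const reader = new CSVReader(path.resolve(__dirname, "names_semicolon.csv"), {
    delimiter: ";"
  })
  const data = await reader.read()
  console.log(data)
  // expected: [ { name: "Joh", age: "19" }, { name: "Mary", age: "20" } ]
}

loadCsv()
  .then()
  .catch(console.error)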

Filter

Filters the rows of the csv. The callback function is of type FilterFunction; this feature works like Array.filter.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    filter: (data) => data.age < 20
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: string
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    // the `data` is of type SimplePerson
    filter: (data) => Number(data.age) < 20 
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "Name": "Joh", "Age": "19" },
  { "Name": "Ju", "Age": "18" }
]

Map

This option maps each csv row. The callback function is of type MapFunction; this feature works like Array.map.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    map: (data) => `${data.name}-${data.age}`
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[ "Joh-19", "Mary-20", "Nicoll-21", "Ju-18" ]
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: string
}

interface Person {
  name: string
  age: number
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson, Person>(fileName, {
    // data is of type SimplePerson
    map: (data) => ({
      name: data.name,
      age: Number(data.age)
    })
  })
  const data = await reader.read()
  console.log(data) // data is of type Person[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Joh", "age": 19 },
  { "name": "Mary", "age": 20 },
  { "name": "Nicoll", "age": 21 },
  { "name": "Ju", "age": 18 }
]

Cast Numbers

Automatically casts numeric values to numbers.

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    castNumbers: true
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "names.csv")

interface SimplePerson {
  name: string
  age: number
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    castNumbers: true
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Joh", "age": 19 },
  { "name": "Mary", "age": 20 },
  { "name": "Nicoll", "age": 21 },
  { "name": "Ju", "age": 18 }
]

Cast Booleans

Automatically casts boolean values to booleans.
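
The examples below read a todos.csv file; its contents are assumed to look like this (reconstructed from the output shown afterwards):

// todos.csv (assumed contents)
name,completed
Todo 1,true
Todo 2,true
Todo 3,false
Todo 4,true
Todo 5,false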

JS
const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "todos.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    castBooleans: true
  })
  const data = await reader.read()
  console.log(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVReader } from "csv-node"

const fileName = path.resolve(__dirname, "todos.csv")

interface SimplePerson {
  name: string
  completed: boolean
}

async function loadCsv() {
  const reader = new CSVReader<SimplePerson>(fileName, {
    castBooleans: true
  })
  const data = await reader.read()
  console.log(data) // data is of type SimplePerson[]
}

loadCsv()
  .then()
  .catch(console.error)

Output.

[
  { "name": "Todo 1", "completed": true },
  { "name": "Todo 2", "completed": true },
  { "name": "Todo 3", "completed": false },
  { "name": "Todo 4", "completed": true },
  { "name": "Todo 5", "completed": false }
]

The options can be combined; see the sketch after the list below.

The options are applied in the following order:

  1. Alias;
  2. Map;
  3. Skip Lines & Limit;
  4. Filter;
  5. Cast.
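
A minimal sketch combining alias, limit, and castNumbers on names.csv (the expected output, shown as a comment, assumes the call order above):

const path = require("path")
const { CSVReader } = require("csv-node")

const fileName = path.resolve(__dirname, "names.csv")

async function loadCsv() {
  const reader = new CSVReader(fileName, {
    alias: { name: "Name", age: "Age" },
    limit: 2,
    castNumbers: true
  })
  const data = await reader.read()
  console.log(data)
  // expected: [ { Name: "Joh", Age: 19 }, { Name: "Mary", Age: 20 } ]
}

loadCsv()
  .then()
  .catch(console.error)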

The filePath

filePath must be absolute; otherwise csv-node searches for the file starting from the root folder of the Node project.
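
For example (a minimal sketch; the relative path below is hypothetical):

const path = require("path")
const { CSVReader } = require("csv-node")

// absolute path: resolved exactly as given
const absoluteReader = new CSVReader(path.resolve(__dirname, "names.csv"))

// relative path: searched starting from the root folder of the Node project
const relativeReader = new CSVReader("examples/names.csv")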

CSVReader API

The CSVReader class provides the methods and fields below.

Fields

| name | description | type |
| --- | --- | --- |
| headers | The header columns, with aliases applied | string[] |
| nativeHeaders | The real headers of the csv table | string[] |
| data | The data of the csv | T[] |

The fields are only available after calling the read function; nativeHeaders and headers are also available after calling any of the other methods.
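
A minimal sketch of the fields, assuming the names.csv and alias behavior shown above (the values in the comments are expectations, not guaranteed output):

const path = require("path")
const { CSVReader } = require("csv-node")

async function loadCsv() {
  const reader = new CSVReader(path.resolve(__dirname, "names.csv"), {
    alias: { name: "Name" }
  })
  await reader.read()
  console.log(reader.nativeHeaders) // expected: [ "name", "age" ]
  console.log(reader.headers)       // expected: [ "Name", "age" ]
  console.log(reader.data.length)   // expected: 4
}

loadCsv()
  .then()
  .catch(console.error)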

Methods

| name | description | return |
| --- | --- | --- |
| read() | Read the csv data | Promise<T[]> |
| min(column: string) | Return the minimum value of a column | Promise<number \| undefined> |
| sum(column: string) | Return the sum of a column | Promise<number \| undefined> |
| max(column: string) | Return the maximum value of a column | Promise<number \| undefined> |
| avg(column: string) | Return the average value of a column | Promise<number \| undefined> |

For testing, you can use the CSV Test file.

Read

The read function is already explained in Usage.

Min

The min function returns the minimum value of the column passed to min(column: string). You can use the same options as with the read() function.

async function loadCsv() {
  const fileName = path.resolve(__dirname, "file3.csv")
  const reader = new CSVReader(fileName)

  const min = await reader.min("price")
  console.log(min)
}
// 0.03

Max

The max function returns the maximum value of the column passed to max(column: string). You can use the same options as with the read() function.

async function loadCsv() {
  const fileName = path.resolve(__dirname, "file3.csv")
  const reader = new CSVReader(fileName)

  const max = await reader.max("price")
  console.log(max)
}
// 99.99

Avg

The avg function returns the average value of the column passed to avg(column: string). You can use the same options as with the read() function.

async function loadCsv() {
  const fileName = path.resolve(__dirname, "file3.csv")
  const reader = new CSVReader(fileName)

  const avg = await reader.avg("price")
  console.log(avg)
}
// 49.492769999999936

Sum

The sum function returns the sum of the column passed to sum(column: string). You can use the same options as with the read() function.

async function loadCsv() {
  const fileName = path.resolve(__dirname, "file3.csv")
  const reader = new CSVReader(fileName)

  const sum = await reader.sum("price")
  console.log(sum)
}
// 49492.76999999994

The read options can also be used with these methods, as in the sketch below.
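
A minimal sketch of an aggregate combined with a read option (file3.csv is the same file used above; the limit value is arbitrary):

const path = require("path")
const { CSVReader } = require("csv-node")

async function loadCsv() {
  const fileName = path.resolve(__dirname, "file3.csv")
  // only the first 100 rows are considered in the aggregation
  const reader = new CSVReader(fileName, { limit: 100 })

  const min = await reader.min("price")
  console.log(min)
}

loadCsv()
  .then()
  .catch(console.error)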

CSVWriter

Options

The second parameter of the CSVWriter constructor is an options object. The available options are:

| name | description | type | required | default |
| --- | --- | --- | --- | --- |
| headers | An object that describes the columns | object | true | --- |
| delimiter | Delimiter between columns | string | false | , |
| format | Functions used to format columns | object | false | {} |
| defaultValue | An object with default values for empty columns | object | false | {} |

Options usage

Headers

You must provide the headers that will be written to the csv file; you can rename the columns or keep their original names.

JS
const path = require("path")
const { CSVWriter } = require("csv-node")

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0', age: 18 },
  { name: 'David1', age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3', age: 18 },
  { name: 'David4', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path from "path"
import { CSVWriter } from "csv-node"

const fileName = path.resolve(__dirname, "output.csv")

interface Person {
  name: string
  age: number
}

const data: Person[] = [
  { name: 'David0', age: 18 },
  { name: 'David1', age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3', age: 18 },
  { name: 'David4', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter<Person>(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)

Output.

name,age
David0,18
David1,18
David2,18
David3,18
David4,18

Delimiter

The delimiter between columns, as in the sketch below.
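
A minimal sketch writing a semicolon-delimited file (the output in the comment is the expected result):

const path = require("path")
const { CSVWriter } = require("csv-node")

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0', age: 18 },
  { name: 'David1', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    delimiter: ';'
  })
  await writer.write(data)
  // expected output.csv:
  // name;age
  // David0;18
  // David1;18
}

loadCsv()
  .then()
  .catch(console.error)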

Format

This option applies a function to a column before saving. For example, if an object contains Dates, it may be interesting to save only the time.

JS
const path = require("path")
const { CSVWriter } = require("csv-node")

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0', age: 18 },
  { name: 'David1', age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3', age: 18 },
  { name: 'David4', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    format: {
      age: (age) => `${age} years`
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path from "path"
import { CSVWriter } from "csv-node"

const fileName = path.resolve(__dirname, "output.csv")

const data: Person[] = [
  { name: 'David0', age: 18 },
  { name: 'David1', age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3', age: 18 },
  { name: 'David4', age: 18 }
]

interface Person {
  name: string
  age: number
}

async function loadCsv() {
  const writer = new CSVWriter<Person>(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    format: {
      age: (age) => `${age} years`
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)

Output.

name,age
David0,18 years
David1,18 years
David2,18 years
David3,18 years
David4,18 years

Default value

This option adds a fallback value when an object does not contain the column. The default value is NULL, but you can change it.

JS
const path = require("path")
const { CSVWriter } = require("csv-node")

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0' },
  { age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3'},
  { name: 'David4', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    defaultValue: {
      name: 'None',
      age: '0'
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)
TS
import path  from "path"
import { CSVWriter } from "csv-node"

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0' },
  { age: 18 },
  { name: 'David2', age: 18 },
  { name: 'David3'},
  { name: 'David4', age: 18 }
]

interface Person {
  name: string
  age: number
}

async function loadCsv() {
  const writer = new CSVWriter<Partial<Person>>(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    defaultValue: {
      name: 'None',
      age: '0'
    }
  })
  await writer.write(data)
}

loadCsv()
  .then()
  .catch(console.error)

Output.

name,age
David0,0
None,18
David2,18
David3,0
David4,18

The options can be combined, as in the sketch below.
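
A minimal sketch combining headers, format, and defaultValue (the expected output is shown as a comment; whether the default value is also formatted may depend on the implementation):

const path = require("path")
const { CSVWriter } = require("csv-node")

const fileName = path.resolve(__dirname, "output.csv")

const data = [
  { name: 'David0' },
  { name: 'David1', age: 18 }
]

async function loadCsv() {
  const writer = new CSVWriter(fileName, {
    headers: {
      name: 'name',
      age: 'age'
    },
    format: {
      age: (age) => `${age} years`
    },
    defaultValue: {
      age: '0'
    }
  })
  await writer.write(data)
  // expected output.csv:
  // name,age
  // David0,0
  // David1,18 years
}

loadCsv()
  .then()
  .catch(console.error)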

Methods

| name | description | return |
| --- | --- | --- |
| write(data: T[]) | Write the data to the csv file | Promise<void> |