export-from-json-enhanced v1.7.8
Export to plain text, css, html, json, csv, xls, xml files from JSON.
Installation

yarn add export-from-json

or

npm i --save export-from-json

or

pnpm i --save export-from-json

Usage
exportFromJSON supports CommonJS, ECMAScript module and UMD importing.
exportFromJSON receives the options described in the Types section below and uses a front-end downloader as the default processor. In a browser environment the default processor has a content size limitation, so consider the server-side solution for large exports.
In module system
import exportFromJSON from 'export-from-json'
const data = [{ foo: 'foo'}, { bar: 'bar' }]
const fileName = 'download'
const exportType = exportFromJSON.types.csv

exportFromJSON({ data, fileName, exportType })

In browser
Check the codepen example
<script src="https://unpkg.com/export-from-json/dist/umd/index.min.js"></script>
<script>
    const data = [{ foo: 'foo'}, { bar: 'bar' }]
    const fileName = 'download'
    const exportType = 'csv'
    window.exportFromJSON({ data, fileName, exportType })
</script>

In Node.js server
exportFromJSON returns whatever the processor option returns, so it can be used on the server side to provide a converting/downloading service:
const http = require('http')
const exportFromJSON = require('export-from-json')
http.createServer(function (request, response){
    // exportFromJSON also accepts a JSON string as the data option, which is handy when reading it directly from the HTTP request.
    const data = '[{"foo":"foo"},{"bar":"bar"}]'
    const fileName = 'download'
    const exportType = 'txt'
    const result = exportFromJSON({
        data,
        fileName,
        exportType,
        processor (content, type, fileName) {
            switch (type) {
                case 'txt':
                    response.setHeader('Content-Type', 'text/plain')
                    break
                case 'css':
                    response.setHeader('Content-Type', 'text/css')
                    break
                case 'html':
                    response.setHeader('Content-Type', 'text/html')
                    break
                case 'json':
                    response.setHeader('Content-Type', 'application/json')
                    break
                case 'csv':
                    response.setHeader('Content-Type', 'text/csv')
                    break
                case 'xls':
                    response.setHeader('Content-Type', 'application/vnd.ms-excel')
                    break
            }
            response.setHeader('Content-disposition', 'attachment;filename=' + fileName)
            return content
        }
    })
    response.write(result)
    response.end()
}).listen(8080, '127.0.0.1')
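
Because exportFromJSON returns whatever the processor returns, a processor can also be used purely to obtain the generated content without triggering any download. A minimal sketch (the pass-through processor below is just an illustration, not part of the library):

const exportFromJSON = require('export-from-json')

const data = [{ foo: 'foo' }, { bar: 'bar' }]

// A pass-through processor: skip the default front-end downloader and
// simply hand the generated CSV text back to the caller.
const csvText = exportFromJSON({
    data,
    fileName: 'download',
    exportType: 'csv',
    processor: content => content,
})

// csvText now holds the CSV string, e.g. for previewing or further processing.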
Types

Note: JSON refers to a parsable JSON string or a serializable JavaScript object.
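
For instance, for the 'json' export type both forms of data are accepted (a small sketch using only the options documented below):

// data may be a serializable object/array…
exportFromJSON({ data: [{ foo: 'foo' }], fileName: 'download', exportType: 'json' })
// …or a parsable JSON string.
exportFromJSON({ data: '[{"foo":"foo"}]', fileName: 'download', exportType: 'json' })
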
| Option name | Required | Type | Description |
|---|---|---|---|
| data | true | Array<JSON>, JSON or string | If the exportType is 'json', data can be any parsable JSON. If the exportType is 'csv' or 'xls', data can only be an array of parsable JSON. If the exportType is 'txt', 'css' or 'html', data must be a string. |
| fileName | false | string | filename without extension, defaults to 'download' |
| extension | false | string | filename extension, defaults to the exportType |
| fileNameFormatter | false | (name: string) => string | filename formatter, by default the file name is formatted to snake case |
| fields | false | string[] or field name mapper type Record<string, string> | fields filter, also supports mapping field names by passing a name mapper, e.g. { 'bar': 'baz' }, defaults to undefined |
| exportType | false | Enum ExportType | 'txt' (default), 'css', 'html', 'json', 'csv', 'xls', 'xml' |
| processor | false | (content: string, type: ExportType, fileName: string) => any | defaults to a front-end downloader |
| withBOM | false | boolean | Add a BOM (byte order mark) to the CSV file. A BOM is expected by Excel when reading a UTF-8 CSV file. Defaults to false. |
| beforeTableEncode | false | (entries: { fieldName: string, fieldValues: string[] }[]) => { fieldName: string, fieldValues: string[] }[] | A chance to alter the table entries, only works for CSV and XLS files, by default no altering. |
| sanitizeCell | false | (value: string, delimiter: ',' \| ';') => string | Alter the CSV injection sanitization algorithm, only works for CSV data. By default the rules are: 1) fields that contain commas must begin and end with double quotes, 2) fields that contain double quotes must begin and end with double quotes, 3) fields that contain line breaks must begin and end with double quotes (not all programs support values with line breaks), 4) all other fields do not require double quotes, 5) double quotes within values are represented by two contiguous double quotes. |
| delimiter | false | ',' \| ';' | Specify the delimiter between values in the raw CSV data. Defaults to ','. |
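
Several of these options can be combined. The sketch below (option values are illustrative, not defaults) renames a field with a fields mapper, adds a BOM for Excel, and switches the CSV delimiter:

import exportFromJSON from 'export-from-json'

const data = [
    { bar: 'bar 1', baz: 'baz 1' },
    { bar: 'bar 2', baz: 'baz 2' },
]

exportFromJSON({
    data,
    fileName: 'report',
    exportType: 'csv',
    fields: { bar: 'renamedBar' }, // export only 'bar', under the header 'renamedBar'
    withBOM: true,                 // prepend a BOM so Excel reads the UTF-8 CSV correctly
    delimiter: ';',                // use ';' instead of the default ','
})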
Tips

- You can reference these exported types through the mounted static field types, e.g.

exportFromJSON({ data: jsonData, fileName: 'data', exportType: exportFromJSON.types.csv })

- You can transform the data before exporting by passing beforeTableEncode, e.g.

exportFromJSON({
    data: jsonData,
    fileName: 'data',
    exportType: exportFromJSON.types.csv,
    beforeTableEncode: rows => rows.sort((p, c) => p.fieldName.localeCompare(c.fieldName)),
})
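
- You can also replace the CSV sanitization by passing sanitizeCell. The sketch below is one possible implementation (not the library default): it reproduces the documented quoting rules and additionally prefixes cells that a spreadsheet would treat as formulas, e.g.

exportFromJSON({
    data: jsonData,
    fileName: 'data',
    exportType: exportFromJSON.types.csv,
    sanitizeCell (value, delimiter) {
        // Neutralize spreadsheet formula injection ('=', '+', '-', '@' prefixes).
        const safe = /^[=+\-@]/.test(value) ? `'${value}` : value
        // Quote cells containing the delimiter, double quotes or line breaks,
        // doubling embedded quotes, mirroring the documented default rules.
        return new RegExp(`["\\n\\r${delimiter}]`).test(safe)
            ? `"${safe.replace(/"/g, '""')}"`
            : safe
    },
})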