@mundanesoftware/file-uploader v1.1.7

Published 1 year ago • License: MIT

File Uploader

A robust library for uploading large files to cloud storage with:

  • Chunked uploads
  • Retry mechanism with exponential backoff
  • Resumable uploads
  • Dynamic concurrency adjustment based on network conditions
  • Custom logging support

Installation

npm install @mundanesoftware/file-uploader

Initialization

You can configure the Uploader with the following options:

  • maxConcurrentUploads: The maximum number of concurrent chunk uploads.
  • destinationResolver: A function that resolves the destination URL for each file.
  • refreshSasToken: A function that returns a refreshed SAS token for secure uploads.
  • infoLogger: (Optional) Custom function to handle info logs.
  • errorLogger: (Optional) Custom function to handle error logs.

Basic Example

import Uploader from 'file-uploader';

// Initialize the uploader
const uploader = new Uploader({
    maxConcurrentUploads: 5,
    destinationResolver: async (file) => {
        const datasetName = file.dataset || 'default-dataset';
        return `https://myaccount.blob.core.windows.net/${datasetName}`;
    },
    refreshSasToken: async (fileName) => {
        const response = await fetch(`/api/refresh-sas?file=${fileName}`);
        const data = await response.json();
        return data.sasToken;
    },
    infoLogger: console.info,
    errorLogger: console.error,
});

// Handle files
const files = [
    { name: 'file1.txt', dataset: 'dataset1', size: 1024 },
    { name: 'file2.txt', dataset: 'dataset2', size: 2048 },
];

// Start the upload process
uploader.uploadFiles(files)
    .then(() => console.log('All files uploaded successfully!'))
    .catch((err) => console.error('Error uploading files:', err));

Resumable Upload Example

The uploader supports resumable uploads for both network interruptions and user-initiated pauses.

// Pause a file upload
uploader.pauseUpload('file1.txt');

// Resume the paused file upload
uploader.resumeUpload({ name: 'file1.txt', dataset: 'dataset1', size: 1024 });

Event Listeners

You can listen to the following events emitted by the Uploader:

  • fileStart: Fired when a file starts uploading.
  • fileProgress: Fired periodically to indicate the upload progress of a file.
  • fileComplete: Fired when a file finishes uploading.
  • chunkProgress: Fired for individual chunk upload progress.
  • error: Fired when an error occurs.

uploader.on('fileStart', (data) => console.log(`Starting upload for ${data.fileName}`));
uploader.on('fileProgress', (data) => console.log(`${data.fileName} is ${data.progress}% complete`));
uploader.on('fileComplete', (data) => console.log(`${data.fileName} completed successfully`));
uploader.on('error', (error) => console.error('Upload error:', error));

Dynamic Concurrency Adjustment

The uploader automatically adjusts concurrency based on the user's network conditions.

  • Fast Network (4G and above): Increases concurrency for faster uploads.
  • Slow Network (3G, 2G): Reduces concurrency to prevent overload.
  • No Network Information: Defaults to maxConcurrentUploads.
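The adjustment described above can be sketched as a mapping from the Network Information API's `effectiveType` to a concurrency level. This is an illustration only: the exact thresholds and values are assumptions, not the library's actual internals.

```javascript
// Hypothetical sketch of dynamic concurrency adjustment.
// effectiveType comes from navigator.connection.effectiveType in
// browsers that support the Network Information API.
function pickConcurrency(effectiveType, maxConcurrentUploads) {
    switch (effectiveType) {
        case '4g':
            return maxConcurrentUploads;                              // fast: full concurrency
        case '3g':
            return Math.max(1, Math.ceil(maxConcurrentUploads / 2));  // slow: halve it
        case '2g':
        case 'slow-2g':
            return 1;                                                 // very slow: serialize uploads
        default:
            return maxConcurrentUploads;                              // no network info: use the configured max
    }
}
```

In a browser this would be re-evaluated on the connection's `change` event; environments without `navigator.connection` simply fall through to the default branch.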

Logging

You can provide custom logging functions to integrate with external logging systems like Sentry.

const infoLogger = (message, data) => {
    // Custom log handling
    console.log(`[INFO]: ${message}`, data);
};

const errorLogger = (message, error) => {
    // Custom error handling
    console.error(`[ERROR]: ${message}`, error);
};

const uploader = new Uploader({
    maxConcurrentUploads: 3,
    destinationResolver: async (file) => `https://myaccount.blob.core.windows.net/${file.dataset}`,
    refreshSasToken: async (fileName) => 'YOUR_SAS_TOKEN',
    infoLogger,
    errorLogger,
});
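For shipping logs to an external aggregator, a common pattern is to emit structured JSON lines instead of plain strings. The record shape below is an assumption for illustration, not part of the library:

```javascript
// Hypothetical factory for structured loggers. The `write` parameter
// defaults to console.log but can be any sink (file stream, HTTP
// forwarder, Sentry bridge, etc.).
function makeJsonLogger(level, write = console.log) {
    return (message, details) => {
        write(JSON.stringify({
            level,
            message,
            details,
            timestamp: new Date().toISOString(),
        }));
    };
}

// Drop-in replacements for the loggers in the example above.
const infoLogger = makeJsonLogger('info');
const errorLogger = makeJsonLogger('error', console.error);
```

Because the loggers are plain functions of `(message, data)`, swapping the sink requires no changes to the Uploader configuration.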

Advanced Retry Mechanism

The uploader retries failed chunk uploads with exponential backoff:

  • Max Retries: 3 (default, configurable).
  • Backoff Delay: Starts at 500ms and doubles with each attempt.

import axios from 'axios';

async function uploadChunk(chunk, uploadUrl, index, maxRetries = 3, delay = 500) {
    const config = {
        headers: {
            'x-ms-blob-type': 'BlockBlob',
        },
    };

    for (let attempt = 1; attempt <= maxRetries; attempt++) {
        try {
            await axios.put(uploadUrl, chunk, config);
            return; // Success: stop retrying
        } catch (error) {
            if (attempt === maxRetries) {
                throw error; // Give up after the final attempt
            }

            // Exponential backoff: 500 ms, 1000 ms, 2000 ms, ...
            const backoff = delay * Math.pow(2, attempt - 1);
            await new Promise((resolve) => setTimeout(resolve, backoff));
        }
    }
}