tensornet v1.0.21 · License: MIT · Repository: GitHub · Last release: 9 months ago

tensornet


Pure JS classification of base64-encoded images with TensorFlow models.

Installation

npm i tensornet --save

Getting Started

Make sure to have @tensorflow/tfjs-core installed and a valid TensorFlow backend set. You also need to pick between the sync package jpeg-js and the async package sharp.

# pure JS, fully synchronous (blocking) installation
npm i @tensorflow/tfjs-core jpeg-js
# if you plan to use the async, non-blocking path
npm i @tensorflow/tfjs-core sharp

View the classify.test.ts file for an example setup.

import { classify, classifyAsync } from "tensornet";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";

await setBackend("wasm");

const classification = await classify(mybase64); // uses jpeg-js (sync decode)
// or use native sharp for roughly 2x better performance
const classificationA = await classifyAsync(mybase64);
// output example
// [
//   {
//     className: 'Siamese cat, Siamese',
//     probability: 0.9805548787117004
//   }
// ]
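
The example above assumes mybase64 already holds a base64-encoded JPEG. A minimal sketch of producing that string from a local file in Node (the cat.jpg path is only an illustration, and it assumes classify accepts a raw base64 string without a data: URL prefix):

import { readFileSync } from "node:fs";
import { classify } from "tensornet";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";

await setBackend("wasm");

// Read a local JPEG and encode it as base64 (no data: prefix assumed).
const mybase64 = readFileSync("cat.jpg").toString("base64");

const result = await classify(mybase64);
console.log(result[0].className, result[0].probability);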

Why

The benefits of using pure JS to process the image fall into a few areas:

  1. The required size and portability are drastically better, since you do not need cairo or any of the native image dev converters.
  2. Speed is also better, since the calculations are done in-process without needing to bridge any native calls.
  3. Tensors can be used in worker threads, which allows properly using TensorFlow wasm backends in an API service 🥳 (see the worker sketch below).

The TF models are checked in locally.
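
Since the wasm backend runs off the main thread, classification can be moved into a Node worker. This is only a sketch: the worker.ts filename and message shape are made up for illustration, and it assumes tensornet and the wasm backend can be imported inside a worker just as on the main thread.

// worker.ts (hypothetical): classify base64 images sent from the main thread.
import { parentPort } from "node:worker_threads";
import { setBackend } from "@tensorflow/tfjs-core";
import "@tensorflow/tfjs-backend-wasm";
import { classifyAsync } from "tensornet";

const ready = setBackend("wasm"); // initialize the backend once per worker

parentPort?.on("message", async (base64: string) => {
  await ready;
  const result = await classifyAsync(base64);
  parentPort?.postMessage(result);
});

From the main thread you would spawn this file with new Worker(...) from node:worker_threads, postMessage the base64 string, and listen for the classification result, keeping the API service's event loop free.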

Benchmarks

Examples of some tests run on a Mac M1 (64 GB):

Name | chars | size     | sync  | async
jpeg | 26791 | 26.16 KB | 100ms | 50ms
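
A rough way to reproduce these numbers yourself (assuming a local cat.jpg fixture and the wasm backend set up as above; note the first call also pays the model-load cost):

import { readFileSync } from "node:fs";
import { classify, classifyAsync } from "tensornet";

const b64 = readFileSync("cat.jpg").toString("base64");

console.time("sync (jpeg-js)");
await classify(b64);
console.timeEnd("sync (jpeg-js)");

console.time("async (sharp)");
await classifyAsync(b64);
console.timeEnd("async (sharp)");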