0.1.3 • Published 4 years ago

lobe v0.1.3

Weekly downloads: 3
License: MIT
Repository: github
Last release: 4 years ago

Lobe

Deep learning made easy with Lobe.ai:

const selfie = 'data:image/png;base64,iVBORw0KGgoAtJ2sAch...';
const results = await lobe({ selfie, object: 'hand' }, MODEL);
console.log(results);
// { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }

Note: this is a proof of concept; I'd love to work with Lobe!

Getting started

To start using it, first import it in your preferred way and add your key:

// Node.js style
const lobe = require('lobe');
lobe.key = process.env.LOBE_KEY;
lobe(...).then(console.log);
// { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }

// React/ES7 Modules style
import lobe from 'lobe';
lobe.key = 'LOBE_KEY';
lobe(...).then(console.log);
// { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }

<!-- Old school; we follow semver for versioning -->
<script src="https://cdn.jsdelivr.net/npm/lobe@0.1"></script>
<script>
  lobe.key = 'LOBE_KEY';
  lobe(...).then(console.log);
  // { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }
</script>

Now you are ready to start evaluating your images:

const selfie = 'data:image/png;base64,iVBORw0KGgoA...';
const results = await lobe({ selfie, object: 'hand' }, MODEL);
console.log(results);
// { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }

lobe(options, model) => Promise

The main function accepts one or two parameters and returns a promise. The promise resolves with the output if everything works, or rejects with an error otherwise. API:

  • options: a plain object with the inputs for the defined model.
  • model: the identifier of the model you are working with. Optional if already set globally.
  • returns Promise: a promise that resolves to the output if successful.

options

The options will depend on your model, but they are always passed as a plain object with string values:

const selfie = 'data:image/png;base64,iVBORw0KGgoA...';
const results = await lobe({ selfie, object: 'hand' });
console.log(results);
// { emoji: '✌️', confidences: [['✌️', 0.9], ['👍', 0.05], ...] }

Note: for now, your images have to be base64-encoded.

For instance, take the pipe example:

const image = 'data:image/png;base64,iVBORw0KGgoA...';
const results = await lobe({ image });
console.log(results);
// { gallons: 3450, boxes: [["Pipe", [0.51, 0.11, ...]], ["Anchor", [...]]] }

model

This option will identify the dataset that you are working with:

lobe({ selfie, object: 'hand' }, '2342-4545-3234');

It can be set globally if you are going to work with a single model across your whole application:

lobe.model = '2342-4545-3234';
lobe({ selfie, object: 'hand' }).then(...);

The argument takes precedence when the model is set both globally and locally.
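That precedence rule can be sketched as follows (an assumption about the behavior described above, not the library's actual internals):

```javascript
// Hypothetical illustration of the precedence rule: the per-call
// model argument wins, falling back to the global setting.
function resolveModel(argModel, globalModel) {
  return argModel || globalModel;
}
```

So a call like `lobe({ selfie, object: 'hand' }, 'other-model-id')` would use `'other-model-id'` even when `lobe.model` is set.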

return value

The return value is a promise that resolves to the output of the request:

lobe({ selfie, object: 'hand' }).then(out => console.log(out));

It can be used with async/await for the best results:

const result = await lobe({ selfie, object: 'hand' });

Make sure to check for errors:

try {
  const result = await lobe({ selfie, object: 'hand' });
} catch (error) {
  manage_somehow(error);
}
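The same error handling can be written in promise-chain style for code that does not use async/await. Here a rejected promise stands in for a failing `lobe()` call, and `manage_somehow` is the placeholder handler from the snippet above:

```javascript
// Placeholder error handler, as in the try/catch example above.
function manage_somehow(error) {
  console.error('lobe request failed:', error.message);
}

// With the real library this would be:
//   lobe({ selfie, object: 'hand' }).then(console.log).catch(manage_somehow);
Promise.reject(new Error('network down'))
  .then(result => console.log(result))
  .catch(manage_somehow);
```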