@nsfwspy/node v1.0.8

Introduction

NsfwSpy.js is a nudity/pornography image classifier for Node.js, written in TypeScript and based on our parent .NET project, designed to help moderate user-generated content across a wide range of application types. The machine learning model was trained on the MobileNetV2 neural network architecture with 537,000 images (186 GB) from 4 different categories:

| Label | Description | Files |
| --- | --- | --- |
| Pornography | Images that depict sexual acts and nudity. | 108,000 |
| Sexy | Images of people in their underwear and men who are topless. | 76,000 |
| Hentai | Drawings or animations of sexual acts and nudity. | 83,000 |
| Neutral | Images that are not sexual in nature. | 268,000 |

Performance

NsfwSpy isn't perfect, but its accuracy should be good enough to detect approximately 96% of NSFW images, i.e. those classified as pornography, sexy or hentai.

| | Pornography | Sexy | Hentai | Neutral |
| --- | --- | --- | --- | --- |
| Is NSFW (pornography + sexy + hentai >= 0.5) | 95.0% | 97.3% | 93.3% | 3.7% |
| Correctly Predicted Label | 85.0% | 81.0% | 89.8% | 96.4% |
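The "Is NSFW" row above reflects a simple thresholding rule over the per-label scores. A minimal sketch of that rule (the `{ pornography, sexy, hentai, neutral }` result shape is an assumption for illustration; check the library's typings for the actual one):

```javascript
// Decide whether a classification result counts as NSFW, using the
// pornography + sexy + hentai >= 0.5 rule described above.
// The result shape { pornography, sexy, hentai, neutral } is assumed here.
function isNsfw(result) {
  return result.pornography + result.sexy + result.hentai >= 0.5;
}

console.log(isNsfw({ pornography: 0.01, sexy: 0.02, hentai: 0.01, neutral: 0.96 })); // false
console.log(isNsfw({ pornography: 0.60, sexy: 0.25, hentai: 0.05, neutral: 0.10 })); // true
```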

Quick Start

Want to see how NsfwSpy.js performs? Try it now on our test site.

This project is available as an npm package and can be installed with the following command:

npm install @nsfwspy/node

Import NsfwSpy at the top of your JavaScript or TypeScript file:

JavaScript

const { NsfwSpy } = require('@nsfwspy/node');

TypeScript

import { NsfwSpy } from '@nsfwspy/node';

Load the Model

Before using NsfwSpy, the model must be loaded, ideally from your own hosted site or from local files on your system. By default, NsfwSpy uses a publicly hosted model in an S3 bucket, but we cannot guarantee this will remain available forever, so it should not be relied on in production systems.

Hosted files

const nsfwSpy = new NsfwSpy("./model/model.json");

Local files

const nsfwSpy = new NsfwSpy("file://./model/model.json");

Classify an Image File

const filePath = "C:\\Users\\username\\Documents\\flower.jpg";
const nsfwSpy = new NsfwSpy();
await nsfwSpy.load();
const result = await nsfwSpy.classifyImageFile(filePath);
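The result carries a score for each of the four labels. A hedged sketch of picking the most likely label from such a result (the score-object shape is an assumption, not the library's documented API):

```javascript
// Pick the label with the highest score from a classification result.
// The { pornography, sexy, hentai, neutral } score shape is an assumption
// about the result object; consult the library's typings for the real one.
function predictedLabel(result) {
  return Object.entries(result)
    .reduce((best, entry) => (entry[1] > best[1] ? entry : best))[0];
}

console.log(predictedLabel({ pornography: 0.05, sexy: 0.1, hentai: 0.05, neutral: 0.8 })); // "neutral"
```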

Classify an Image from a Byte Array

const fs = require('fs');

const imageBuffer = fs.readFileSync(filePath);
const nsfwSpy = new NsfwSpy();
await nsfwSpy.load();
const result = await nsfwSpy.classifyImageFromByteArray(imageBuffer);

Contact Us

Interested in getting involved in the project? Whether you'd like to add features, provide images to train NsfwSpy with, or something else, feel free to contact us via email at nsfwspy@outlook.com or find us on Twitter at @nsfw_spy.

Notes

Using NsfwSpy? Let us know! We're keen to hear how the technology is being used and improving the safety of applications.

Got a feature request or found something not quite right? Report it here on GitHub and we'll do our best to help.
