n8n-nodes-gpt-tokenizer v0.1.1

Work with BPE Tokens in n8n with the GPT-Tokenizer Node

This community package contains a node for working with BPE Tokens like those OpenAI's GPT models use under the hood. In fact, this node works just fine alongside the OpenAI Node.

You can:

  • Encode a string into BPE Tokens (may be cool for custom training)
  • Decode an array of BPE Tokens back into a string (for funzies?)
  • Determine a string's token length before submitting it to the OpenAI API
  • Calculate costs before submitting to the OpenAI API (see the sketch after this list)
  • Split a text into chunks that exactly match a definable Token Limit

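As a concrete example of the cost-calculation use case, the sketch below uses the gpt-tokenizer package this node depends on to count tokens and multiply by an assumed per-1,000-token rate; the price constant and prompt text are placeholders, not values shipped with this node.

```typescript
import { encode } from 'gpt-tokenizer';

// Assumed price per 1,000 input tokens -- check OpenAI's current pricing.
const PRICE_PER_1K_TOKENS = 0.0015;

const prompt = 'Summarize the following meeting notes ...';

// encode() turns the string into an array of BPE token ids.
const tokens: number[] = encode(prompt);

// Estimated cost = (token count / 1000) * price per 1K tokens.
const estimatedCost = (tokens.length / 1000) * PRICE_PER_1K_TOKENS;

console.log(`${tokens.length} tokens, roughly $${estimatedCost.toFixed(6)}`);
```
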
n8n is a fair-code licensed workflow automation platform.

Supported Operations
Installation
Compatibility
About
Version History

Supported Operations

| Operation | Description | Options |
| --- | --- | --- |
| Encode | Encode a string into BPE Tokens. Returns an array of Tokens. | - |
| Decode | Decode an array of BPE Tokens into a string. Returns a string. | - |
| Count Tokens | Count the tokens a string produces. Returns the number of tokens. | - |
| Check Token Limit | Check whether a given string exceeds a defined Token Limit. Returns a boolean. | Optional: throw an error if the Token Limit is exceeded. |
| Slice to Max Token Limit | Slice the string into blocks that exactly match the provided Token Limit. Returns an array of strings. | - |
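
For reference, the sketch below shows roughly how these operations map onto the documented API of the underlying gpt-tokenizer package; the `sliceByTokenLimit` helper is illustrative only and is not a function exported by this node.

```typescript
import { encode, decode, isWithinTokenLimit } from 'gpt-tokenizer';

const text = 'The quick brown fox jumps over the lazy dog.';

// Encode: string -> array of BPE token ids.
const tokens: number[] = encode(text);

// Decode: array of token ids -> original string.
const roundTripped: string = decode(tokens);

// Count Tokens: the length of the encoded array.
const tokenCount = tokens.length;

// Check Token Limit: isWithinTokenLimit() returns false when the limit
// is exceeded, otherwise the token count.
const exceedsLimit = isWithinTokenLimit(text, 4096) === false;

// Slice to Max Token Limit: cut the token array into blocks of at most
// `limit` tokens and decode each block back into a string.
function sliceByTokenLimit(input: string, limit: number): string[] {
  const ids = encode(input);
  const chunks: string[] = [];
  for (let i = 0; i < ids.length; i += limit) {
    chunks.push(decode(ids.slice(i, i + limit)));
  }
  return chunks;
}

console.log(tokenCount, exceedsLimit, roundTripped, sliceByTokenLimit(text, 5));
```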

Installation

Follow the installation guide in the n8n community nodes documentation. It should also automatically install this dependency: https://www.npmjs.com/package/gpt-tokenizer, which is a port of the original Python BPE tokenizer library.

Compatibility

The latest version of n8n. If you encounter any problems, feel free to open an issue on GitHub.

About

Version History

0.1.1

  • just polishing the npm release

0.1.0

  • initial release