@orama/tokenizers v3.0.1 • Published 8 months ago
Orama Tokenizers
This package provides additional tokenizers for the Orama search engine.
Available tokenizers:
- Chinese (Mandarin, experimental)
- Japanese (experimental)
- Korean (work in progress)
Usage:
import { create } from '@orama/orama'
import { createTokenizer } from '@orama/tokenizers/mandarin'

const db = await create({
  schema: {
    myProperty: 'string',
    anotherProperty: 'number'
  },
  components: {
    tokenizer: await createTokenizer()
  }
})
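
Once a tokenizer is registered, documents are indexed and queried through Orama's standard insert and search functions, and the custom tokenizer is applied to both the indexed text and the search term. The snippet below is a minimal sketch that continues from the db created above; the Mandarin sample text and field values are illustrative and not part of this package.

import { insert, search } from '@orama/orama'

// Index a document containing Mandarin text (sample values)
await insert(db, {
  myProperty: '北京是中国的首都',
  anotherProperty: 1
})

// The search term is segmented with the same tokenizer used at indexing time
const results = await search(db, { term: '首都' })
console.log(results.hits)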