# Orama Tokenizers

`@orama/tokenizers` v3.1.7 • Published 5 months ago
This package provides support for additional tokenizers for the Orama Search Engine.
Available tokenizers:
- Chinese (Mandarin, experimental)
- Japanese (experimental)
- Korean (work in progress)
## Usage

```js
import { create } from '@orama/orama'
import { createTokenizer } from '@orama/tokenizers/mandarin'

const db = await create({
  schema: {
    myProperty: 'string',
    anotherProperty: 'number'
  },
  components: {
    tokenizer: await createTokenizer()
  }
})
```