glsl-tokenizer v2.1.5 • Published 6 years ago
Maps GLSL string data into GLSL tokens, either synchronously or using a streaming API.
```js
var tokenString = require('glsl-tokenizer/string')
var tokenStream = require('glsl-tokenizer/stream')
var fs = require('fs')

// Synchronously:
var tokens = tokenString(fs.readFileSync('some.glsl', 'utf8'))

// Streaming API:
fs.createReadStream('some.glsl')
  .pipe(tokenStream())
  .on('data', function (token) {
    console.log(token.data, token.position, token.type)
  })
```
API
tokens = require('glsl-tokenizer/string')(src, opt)

Returns an array of tokens given the GLSL source string src.

You can specify an opt.version string to select the keywords/builtins for a particular GLSL version, such as '300 es' for WebGL2. Otherwise, GLSL 100 (WebGL1) is assumed.

```js
var tokens = tokenString(src, {
  version: '300 es'
})
```
stream = require('glsl-tokenizer/stream')(opt)

Emits a 'data' event with a token object each time a token is parsed. As above, you can specify opt.version.
Tokens
```js
{
  'type': TOKEN_TYPE,
  'data': "string of constituent data",
  'position': integer position within the GLSL source,
  'line': line number within the GLSL source,
  'column': column number within the GLSL source
}
```
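As a sketch of how these fields might be used, say, to format a diagnostic message, here is a small example. The token objects below are hand-written to match the shape documented above; they were not produced by running the tokenizer:

```js
// Hand-written tokens matching the documented shape (illustrative only).
var tokens = [
  { type: 'keyword', data: 'void', position: 0, line: 1, column: 0 },
  { type: 'whitespace', data: ' ', position: 4, line: 1, column: 4 },
  { type: 'ident', data: 'main', position: 5, line: 1, column: 5 }
]

// Hypothetical helper: describe a token's kind, text, and source location.
function describe (token) {
  return token.type + ' "' + token.data + '" at line ' +
    token.line + ', column ' + token.column
}

console.log(describe(tokens[2])) // ident "main" at line 1, column 5
```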
The available token types are:

block-comment: /* ... */
line-comment: // ... \n
preprocessor: # ... \n
operator: Any operator. If it looks like punctuation, it's an operator.
float: Optionally suffixed with f.
ident: User-defined identifier.
builtin: Builtin function.
eof: Emitted on end; data will === '(eof)'.
integer
whitespace
keyword
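Since every token carries its type and original text, common transformations reduce to filtering and joining the token array. A minimal sketch, using a hand-written token array in the documented shape (not output from the tokenizer) and a hypothetical stripComments helper:

```js
// Hypothetical helper: rebuild source text with comments removed, given an
// array of tokens in the documented shape. The eof token's data is the
// placeholder '(eof)', so it is dropped as well.
function stripComments (tokens) {
  return tokens
    .filter(function (t) {
      return t.type !== 'line-comment' &&
             t.type !== 'block-comment' &&
             t.type !== 'eof'
    })
    .map(function (t) { return t.data })
    .join('')
}

// Illustrative tokens, hand-written rather than produced by the tokenizer.
var tokens = [
  { type: 'block-comment', data: '/* header */' },
  { type: 'keyword', data: 'float' },
  { type: 'whitespace', data: ' ' },
  { type: 'ident', data: 'x' },
  { type: 'operator', data: ';' },
  { type: 'eof', data: '(eof)' }
]

console.log(stripComments(tokens)) // float x;
```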
License
MIT, see LICENSE.md for further information.