Codebase list golang-github-pointlander-peg / d555a31
Import upstream version 1.0.1 Debian Janitor 2 years ago
42 changed file(s) with 6231 addition(s) and 5674 deletion(s).
0 name: Go
1
2 on:
3 push:
4 branches: [ master ]
5 pull_request:
6 branches: [ master ]
7
8 jobs:
9
10 build:
11 name: Build
12 runs-on: ubuntu-latest
13 steps:
14
15 - name: Set up Go 1.x
16 uses: actions/setup-go@v2
17 with:
18 go-version: ^1.13
19 id: go
20
21 - name: Check out code into the Go module directory
22 uses: actions/checkout@v2
23
24 - name: Build and Test
25 run: go run build.go test
00 *.peg.go
1 !bootstrap.peg.go
1 !peg.peg.go
22 *.exe
33 *.6
44 peg
55 calculator/calculator
66 bootstrap/bootstrap
7 cmd/peg-bootstrap/peg-bootstrap
8 cmd/peg-bootstrap/peg*
00 https://medium.com/@octskyward/graal-truffle-134d8f28fb69#.jo3luf4dn
1 http://nez-peg.github.io/
2 https://en.wikipedia.org/wiki/DFA_minimization
3
4 https://news.ycombinator.com/item?id=14589173
5 http://jamey.thesharps.us/2017/06/search-based-compiler-code-generation.html
6
7 https://news.ycombinator.com/item?id=15105119
8 https://en.wikipedia.org/wiki/Tree_transducer
9
10 # Type-Driven Program Synthesis
11 https://news.ycombinator.com/item?id=18251145
12 https://www.youtube.com/watch?v=HnOix9TFy1A
13 http://comcom.csail.mit.edu/comcom/#welcome
14 https://bitbucket.org/nadiapolikarpova/synquid
15
16 # Formality – An efficient programming language and proof assistant
17 https://news.ycombinator.com/item?id=18230148
18 https://github.com/maiavictor/formality
+0
-13
Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 peg: bootstrap.peg.go peg.go main.go
5 go build
6
7 bootstrap.peg.go: bootstrap/main.go peg.go
8 cd bootstrap; go build
9 bootstrap/bootstrap
10
11 clean:
12 rm -f bootstrap/bootstrap peg peg.peg.go
0 # About
1
2 Peg, Parsing Expression Grammar, is an implementation of a Packrat parser
3 generator. A Packrat parser is a recursive descent parser capable of
4 backtracking. The generated parser searches for the correct parse of the
5 input.
6
7 For more information see:
8 * http://en.wikipedia.org/wiki/Parsing_expression_grammar
9 * http://pdos.csail.mit.edu/~baford/packrat/
10
11 This Go implementation is based on:
12 * http://piumarta.com/software/peg/
13
14
15 # Usage
16
17 ```
18 -inline
19 Tells the parser generator to inline parser rules.
20 -switch
21 Reduces the number of rules that have to be tried for some pegs.
22 If statements are replaced with switch statements.
23 ```
24
25
26 # Syntax
27
28 First declare the package name:
0 # PEG, an Implementation of a Packrat Parsing Expression Grammar in Go
1
2 [![GoDoc](https://godoc.org/github.com/pointlander/peg?status.svg)](https://godoc.org/github.com/pointlander/peg)
3 [![Go Report Card](https://goreportcard.com/badge/github.com/pointlander/peg)](https://goreportcard.com/report/github.com/pointlander/peg)
4 [![Coverage](https://gocover.io/_badge/github.com/pointlander/peg)](https://gocover.io/github.com/pointlander/peg)
5
6 A [Parsing Expression Grammar](http://en.wikipedia.org/wiki/Parsing_expression_grammar) (hence `peg`) is a way to create grammars similar in principle to [regular expressions](https://en.wikipedia.org/wiki/Regular_expression) but which allow better code integration. Specifically, `peg` is an implementation of the [Packrat](https://en.wikipedia.org/wiki/Parsing_expression_grammar#Implementing_parsers_from_parsing_expression_grammars) parser generator originally implemented as [peg/leg](https://www.piumarta.com/software/peg/) by [Ian Piumarta](https://www.piumarta.com/cv/) in C. A Packrat parser is a recursive descent parser capable of backtracking and negative look-ahead assertions, which are problematic for regular expression engines.
7
8 ## See Also
9
10 * <http://en.wikipedia.org/wiki/Parsing_expression_grammar>
11 * <http://pdos.csail.mit.edu/~baford/packrat/>
12 * <http://piumarta.com/software/peg/>
13
14 ## Installing
15
16 `go get -u github.com/pointlander/peg`
17
18 ## Building
19
20 ### Using Pre-Generated Files
21
22 `go install`
23
24 ### Generating Files Yourself
25 You should only need to do this if you are contributing to the library, or if something gets messed up.
26
27 `go run build.go` or `go generate`
28
29 With tests:
30
31 `go run build.go test`
32
33 ## Usage
34
35 ```
36 peg [<option>]... <file>
37
38 Usage of peg:
39 -inline
40 parse rule inlining
41 -noast
42 disable AST
43 -output string
44 specify name of output file
45 -print
46 directly dump the syntax tree
47 -strict
48 treat compiler warnings as errors
49 -switch
50 replace if-else if-else like blocks with switch blocks
51 -syntax
52 print out the syntax tree
53 -version
54 print the version and exit
55
56 ```
57
58
59 ## Sample Makefile
60
61 This sample `Makefile` will convert any file ending with `.peg` into a `.go` file with the same name. Adjust as needed.
62
63 ```make
64 .SUFFIXES: .peg .go
65
66 .peg.go:
67 peg -noast -switch -inline -strict -output $@ $<
68
69 all: grammar.go
70 ```
71
72 Use caution when picking your names to avoid overwriting existing `.go` files. Since only one PEG grammar is allowed per Go package (currently), the use of the name `grammar.peg` is suggested as a convention:
73
74 ```
75 grammar.peg
76 grammar.go
77 ```
78
79 ## PEG File Syntax
80
81 First declare the package name and any import(s) required:
82
2983 ```
3084 package <package name>
85
86 import <import name>
3187 ```
3288
3389 Then declare the parser:
90
3491 ```
3592 type <parser name> Peg {
3693 <parser state variables>
3794 }
3895 ```
3996
40 Next declare the rules. The first rule is the entry point into the parser:
97 Next declare the rules. Note that the main rules are described below but are based on the [peg/leg rules](https://www.piumarta.com/software/peg/peg.1.html) which provide additional documentation.
98
99 The first rule is the entry point into the parser:
100
41101 ```
42102 <rule name> <- <rule body>
43103 ```
44104
45 The first rule should probably end with '!.' to indicate no more input follows:
105 The first rule should probably end with `!.` to indicate no more input follows.
106
46107 ```
47108 first <- . !.
48109 ```
49110
50 '.' means any character matches. For zero or more character matches use:
111 This is often set to `END` to make PEG rules more readable:
112
113 ```
114 END <- !.
115 ```
116
117 `.` matches any single character. For zero or more character matches, use:
118
51119 ```
52120 repetition <- .*
53121 ```
54122
55 For one or more character matches use:
123 For one or more character matches, use:
124
56125 ```
57126 oneOrMore <- .+
58127 ```
59128
60 For an optional character match use:
129 For an optional character match, use:
130
61131 ```
62132 optional <- .?
63133 ```
64134
65 If specific characters are to be matched use single quotes:
135 If specific characters are to be matched, use single quotes:
136
66137 ```
67138 specific <- 'a'* 'bc'+ 'de'?
68139 ```
69 will match the string "aaabcbcde".
70
71 For choosing between different inputs use alternates:
140
141 This will match the string `"aaabcbcde"`.
142
143 For choosing between different inputs, use alternates:
144
72145 ```
73146 prioritized <- 'a' 'a'* / 'bc'+ / 'de'?
74147 ```
75 will match "aaaa" or "bcbc" or "de" or "". The matches are attempted in order.
76
77 If the characters are case insensitive use double quotes:
148
149 This will match `"aaaa"` or `"bcbc"` or `"de"` or `""`. The matches are attempted in order.
150
151 If the characters are case insensitive, use double quotes:
152
78153 ```
79154 insensitive <- "abc"
80155 ```
81 will match "abc" or "Abc" or "ABc" etc...
82
83 For matching a set of characters use a character class:
156
157 This will match `"abc"` or `"Abc"` or `"ABc"` and so on.
158
159 For matching a set of characters, use a character class:
160
84161 ```
85162 class <- [a-z]
86163 ```
87 will match "a" or "b" or all the way to "z".
88
89 For an inverse character class start with a tilde:
90 ```
91 inverse <- [~a-z]
92 ```
93 will match anything but "a" or "b" or all the way to "z"
94
95 If the character class is case insensitive use double brackets:
164
165 This will match `"a"` or `"b"` or all the way to `"z"`.
166
167 For an inverse character class, start with a caret:
168
169 ```
170 inverse <- [^a-z]
171 ```
172
173 This will match anything but `"a"` or `"b"` or all the way to `"z"`.
174
175 If the character class is case insensitive, use double brackets:
176
96177 ```
97178 insensitive <- [[A-Z]]
98179 ```
99180
181 (Note that this is not available in regular expression syntax.)
182
100183 Use parentheses for grouping:
184
101185 ```
102186 grouping <- (rule1 / rule2) rule3
103187 ```
104188
105 For looking ahead for a match (predicate) use:
189 For looking ahead for a match (predicate), use:
190
106191 ```
107192 lookAhead <- &rule1 rule2
108193 ```
109194
110 For inverse look ahead use:
195 For inverse look ahead, use:
196
111197 ```
112198 inverse <- !rule1 rule2
113199 ```
114200
115201 Use curly braces for Go code:
202
116203 ```
117204 gocode <- { fmt.Println("hello world") }
118205 ```
119206
120 For string captures use less than greater than:
121 ```
122 capture <- <'capture'> { fmt.Println(buffer[begin:end]) }
123 ```
124 Will print out "capture". The captured string is stored in buffer[begin:end].
125
126
127 # Files
128
129 * bootstrap/main.go: bootstrap syntax tree of peg
130 * peg.go: syntax tree and code generator
131 * main.go: bootstrap main
132 * peg.peg: peg in its own language
133
134
135 # Testing
136
137 There should be no differences between the bootstrap and self compiled:
138
139 ```
140 ./peg -inline -switch peg.peg
141 diff bootstrap.peg.go peg.peg.go
142 ```
143
144
145 # Author
207 For string captures, use less than and greater than:
208
209 ```
210 capture <- <'capture'> { fmt.Println(text) }
211 ```
212
213 This will print out `"capture"`. The captured string is stored in `buffer[begin:end]`.
214
215 ## Testing Complex Grammars
216
217 Testing a grammar usually requires more than typical unit testing: there are many inputs, each with an expected output, and a grammar is often not tied to just one language implementation. Consider maintaining a list of inputs with expected outputs in a structured file format such as JSON or YAML and parsing it for testing, or use one of the available options for Go such as Rob Muhlestein's [`tinout`](https://github.com/robmuh/tinout) package.
218
219 ## Files
220
221 * `bootstrap/main.go` - bootstrap syntax tree of peg
222 * `tree/peg.go` - syntax tree and code generator
223 * `peg.peg` - peg in its own language
224
225 ## Author
146226
147227 Andrew Snodgrass
228
229 ## Projects That Use `peg`
230
231 Here are some projects that use `peg` to provide further examples of PEG grammars:
232
233 * <https://github.com/tj/go-naturaldate> - natural date/time parsing
234 * <https://github.com/robmuh/dtime> - easy date/time formats with duration spans
235
77 "fmt"
88 "os"
99 "runtime"
10
11 "github.com/pointlander/peg/tree"
1012 )
1113
1214 func main() {
1315 runtime.GOMAXPROCS(2)
14 t := New(true, true)
16 t := tree.New(true, true, false)
1517
1618 /*package main
1719
2426 *Tree
2527 }*/
2628 t.AddPackage("main")
29 t.AddImport("github.com/pointlander/peg/tree")
2730 t.AddPeg("Peg")
2831 t.AddState(`
29 *Tree
32 *tree.Tree
3033 `)
3134
3235 addDot := t.AddDot
3336 addName := t.AddName
3437 addCharacter := t.AddCharacter
35 addDoubleCharacter := t.AddDoubleCharacter
36 addHexaCharacter := t.AddHexaCharacter
3738 addAction := t.AddAction
3839
3940 addRule := func(name string, item func()) {
8485 t.AddRange()
8586 }
8687
87 addDoubleRange := func(begin, end string) {
88 addCharacter(begin)
89 addCharacter(end)
90 t.AddDoubleRange()
91 }
92
9388 addStar := func(item func()) {
9489 item()
9590 t.AddStar()
9691 }
9792
98 addPlus := func(item func()) {
99 item()
100 t.AddPlus()
101 }
102
10393 addQuery := func(item func()) {
10494 item()
10595 t.AddQuery()
120110 t.AddPeekFor()
121111 }
122112
123 /* Grammar <- Spacing 'package' MustSpacing Identifier { p.AddPackage(text) }
124 Import*
125 'type' MustSpacing Identifier { p.AddPeg(text) }
126 'Peg' Spacing Action { p.AddState(text) }
127 Definition+ EndOfFile */
113 /* Grammar <- Spacing { hdr; } Action* Definition* !. */
128114 addRule("Grammar", func() {
129115 addSequence(
130116 func() { addName("Spacing") },
131 func() { addString("package") },
132 func() { addName("MustSpacing") },
133 func() { addName("Identifier") },
134 func() { addAction(" p.AddPackage(text) ") },
135 func() { addStar(func() { addName("Import") }) },
136 func() { addString("type") },
137 func() { addName("MustSpacing") },
138 func() { addName("Identifier") },
139 func() { addAction(" p.AddPeg(text) ") },
140 func() { addString("Peg") },
141 func() { addName("Spacing") },
142 func() { addName("Action") },
143 func() { addAction(" p.AddState(text) ") },
144 func() { addPlus(func() { addName("Definition") }) },
145 func() { addName("EndOfFile") },
146 )
147 })
148
149 /* Import <- 'import' Spacing ["] < [a-zA-Z_/.\-]+ > ["] Spacing { p.AddImport(text) } */
150 addRule("Import", func() {
151 addSequence(
152 func() { addString("import") },
153 func() { addName("Spacing") },
154 func() { addCharacter(`"`) },
155 func() {
156 addPush(func() {
157 addPlus(func() {
158 addAlternate(
159 func() { addRange(`a`, `z`) },
160 func() { addRange(`A`, `Z`) },
161 func() { addCharacter(`_`) },
162 func() { addCharacter(`/`) },
163 func() { addCharacter(`.`) },
164 func() { addCharacter(`-`) },
165 )
166 })
167 })
168 },
169 func() { addCharacter(`"`) },
170 func() { addName("Spacing") },
171 func() { addAction(" p.AddImport(text) ") },
117 func() { addAction(`p.AddPackage("main")`) },
118 func() { addAction(`p.AddImport("github.com/pointlander/peg/tree")`) },
119 func() { addAction(`p.AddPeg("Peg")`) },
120 func() { addAction(`p.AddState("*tree.Tree")`) },
121 func() { addStar(func() { addName("Action") }) },
122 func() { addStar(func() { addName("Definition") }) },
123 func() { addPeekNot(func() { addDot() }) },
172124 )
173125 })
174126
197149 )
198150 })
199151
200 /* Expression <- Sequence (Slash Sequence { p.AddAlternate() }
201 )* (Slash { p.AddNil(); p.AddAlternate() }
202 )?
203 / { p.AddNil() } */
152 /* Expression <- Sequence (Slash Sequence { p.AddAlternate() })* */
204153 addRule("Expression", func() {
205 addAlternate(
206 func() {
207 addSequence(
208 func() { addName("Sequence") },
209 func() {
210 addStar(func() {
211 addSequence(
212 func() { addName("Slash") },
213 func() { addName("Sequence") },
214 func() { addAction(" p.AddAlternate() ") },
215 )
216 })
217 },
218 func() {
219 addQuery(func() {
220 addSequence(
221 func() { addName("Slash") },
222 func() { addAction(" p.AddNil(); p.AddAlternate() ") },
223 )
224 })
225 },
226 )
227 },
228 func() { addAction(" p.AddNil() ") },
229 )
230 })
231
232 /* Sequence <- Prefix (Prefix { p.AddSequence() }
233 )* */
154 addSequence(
155 func() { addName("Sequence") },
156 func() {
157 addStar(func() {
158 addSequence(
159 func() { addName("Slash") },
160 func() { addName("Sequence") },
161 func() { addAction(" p.AddAlternate() ") },
162 )
163 })
164 },
165 )
166 })
167
168 /* Sequence <- Prefix (Prefix { p.AddSequence() } )* */
234169 addRule("Sequence", func() {
235170 addSequence(
236171 func() { addName("Prefix") },
245180 )
246181 })
247182
248 /* Prefix <- And Action { p.AddPredicate(text) }
249 / Not Action { p.AddStateChange(text) }
250 / And Suffix { p.AddPeekFor() }
251 / Not Suffix { p.AddPeekNot() }
252 / Suffix */
183 /* Prefix <- '!' Suffix { p.AddPeekNot() } / Suffix */
253184 addRule("Prefix", func() {
254185 addAlternate(
255186 func() {
256187 addSequence(
257 func() { addName("And") },
258 func() { addName("Action") },
259 func() { addAction(" p.AddPredicate(text) ") },
260 )
261 },
262 func() {
263 addSequence(
264 func() { addName("Not") },
265 func() { addName("Action") },
266 func() { addAction(" p.AddStateChange(text) ") },
267 )
268 },
269 func() {
270 addSequence(
271 func() { addName("And") },
272 func() { addName("Suffix") },
273 func() { addAction(" p.AddPeekFor() ") },
274 )
275 },
276 func() {
277 addSequence(
278 func() { addName("Not") },
188 func() { addCharacter(`!`) },
279189 func() { addName("Suffix") },
280190 func() { addAction(" p.AddPeekNot() ") },
281191 )
284194 )
285195 })
286196
287 /* Suffix <- Primary (Question { p.AddQuery() }
288 / Star { p.AddStar() }
289 / Plus { p.AddPlus() }
290 )? */
197 /* Suffix <- Primary ( Question { p.AddQuery() }
198 / Star { p.AddStar() }
199 )? */
291200 addRule("Suffix", func() {
292201 addSequence(
293202 func() { addName("Primary") },
306215 func() { addAction(" p.AddStar() ") },
307216 )
308217 },
309 func() {
310 addSequence(
311 func() { addName("Plus") },
312 func() { addAction(" p.AddPlus() ") },
313 )
314 },
315218 )
316219 })
317220 },
366269 )
367270 })
368271
369 /* Identifier <- < IdentStart IdentCont* > Spacing */
272 /* Identifier <- < Ident Ident* > Spacing */
370273 addRule("Identifier", func() {
371274 addSequence(
372275 func() {
373276 addPush(func() {
374277 addSequence(
375 func() { addName("IdentStart") },
376 func() { addStar(func() { addName("IdentCont") }) },
377 )
378 })
379 },
380 func() { addName("Spacing") },
381 )
382 })
383
384 /* IdentStart <- [[a-z_]] */
385 addRule("IdentStart", func() {
386 addAlternate(
387 func() { addDoubleRange(`a`, `z`) },
388 func() { addCharacter(`_`) },
389 )
390 })
391
392 /* IdentCont <- IdentStart / [0-9] */
393 addRule("IdentCont", func() {
394 addAlternate(
395 func() { addName("IdentStart") },
396 func() { addRange(`0`, `9`) },
397 )
398 })
399
400 /* Literal <- ['] (!['] Char)? (!['] Char { p.AddSequence() }
401 )* ['] Spacing
402 / ["] (!["] DoubleChar)? (!["] DoubleChar { p.AddSequence() }
403 )* ["] Spacing */
278 func() { addName("Ident") },
279 func() { addStar(func() { addName("Ident") }) },
280 )
281 })
282 },
283 func() { addName("Spacing") },
284 )
285 })
286
287 /* Ident <- [A-Za-z] */
288 addRule("Ident", func() {
289 addAlternate(
290 func() { addRange(`A`, `Z`) },
291 func() { addRange(`a`, `z`) },
292 )
293 })
294
295 /* Literal <- ['] !['] Char (!['] Char { p.AddSequence() } )* ['] Spacing */
404296 addRule("Literal", func() {
405 addAlternate(
406 func() {
407 addSequence(
408 func() { addCharacter(`'`) },
409 func() {
410 addQuery(func() {
411 addSequence(
412 func() { addPeekNot(func() { addCharacter(`'`) }) },
413 func() { addName("Char") },
414 )
415 })
416 },
417 func() {
418 addStar(func() {
419 addSequence(
420 func() { addPeekNot(func() { addCharacter(`'`) }) },
421 func() { addName("Char") },
422 func() { addAction(` p.AddSequence() `) },
423 )
424 })
425 },
426 func() { addCharacter(`'`) },
427 func() { addName("Spacing") },
428 )
429 },
430 func() {
431 addSequence(
432 func() { addCharacter(`"`) },
433 func() {
434 addQuery(func() {
435 addSequence(
436 func() { addPeekNot(func() { addCharacter(`"`) }) },
437 func() { addName("DoubleChar") },
438 )
439 })
440 },
441 func() {
442 addStar(func() {
443 addSequence(
444 func() { addPeekNot(func() { addCharacter(`"`) }) },
445 func() { addName("DoubleChar") },
446 func() { addAction(` p.AddSequence() `) },
447 )
448 })
449 },
450 func() { addCharacter(`"`) },
451 func() { addName("Spacing") },
452 )
453 },
454 )
455 })
456
457 /* Class <- ( '[[' ( '^' DoubleRanges { p.AddPeekNot(); p.AddDot(); p.AddSequence() }
458 / DoubleRanges )?
459 ']]'
460 / '[' ( '^' Ranges { p.AddPeekNot(); p.AddDot(); p.AddSequence() }
461 / Ranges )?
462 ']' )
463 Spacing */
297 addSequence(
298 func() { addCharacter(`'`) },
299 func() {
300 addSequence(
301 func() { addPeekNot(func() { addCharacter(`'`) }) },
302 func() { addName("Char") },
303 )
304 },
305 func() {
306 addStar(func() {
307 addSequence(
308 func() { addPeekNot(func() { addCharacter(`'`) }) },
309 func() { addName("Char") },
310 func() { addAction(` p.AddSequence() `) },
311 )
312 })
313 },
314 func() { addCharacter(`'`) },
315 func() { addName("Spacing") },
316 )
317 })
318
319 /* Class <- '[' Range (!']' Range { p.AddAlternate() })* ']' Spacing */
464320 addRule("Class", func() {
465321 addSequence(
466 func() {
467 addAlternate(
468 func() {
469 addSequence(
470 func() { addString(`[[`) },
471 func() {
472 addQuery(func() {
473 addAlternate(
474 func() {
475 addSequence(
476 func() { addCharacter(`^`) },
477 func() { addName("DoubleRanges") },
478 func() { addAction(` p.AddPeekNot(); p.AddDot(); p.AddSequence() `) },
479 )
480 },
481 func() { addName("DoubleRanges") },
482 )
483 })
484 },
485 func() { addString(`]]`) },
486 )
487 },
488 func() {
489 addSequence(
490 func() { addCharacter(`[`) },
491 func() {
492 addQuery(func() {
493 addAlternate(
494 func() {
495 addSequence(
496 func() { addCharacter(`^`) },
497 func() { addName("Ranges") },
498 func() { addAction(` p.AddPeekNot(); p.AddDot(); p.AddSequence() `) },
499 )
500 },
501 func() { addName("Ranges") },
502 )
503 })
504 },
505 func() { addCharacter(`]`) },
506 )
507 },
508 )
509 },
510 func() { addName("Spacing") },
511 )
512 })
513
514 /* Ranges <- !']' Range (!']' Range { p.AddAlternate() }
515 )* */
516 addRule("Ranges", func() {
517 addSequence(
518 func() { addPeekNot(func() { addCharacter(`]`) }) },
322 func() { addCharacter(`[`) },
519323 func() { addName("Range") },
520324 func() {
521325 addStar(func() {
526330 )
527331 })
528332 },
529 )
530 })
531
532 /* DoubleRanges <- !']]' DoubleRange (!']]' DoubleRange { p.AddAlternate() }
533 )* */
534 addRule("DoubleRanges", func() {
535 addSequence(
536 func() { addPeekNot(func() { addString(`]]`) }) },
537 func() { addName("DoubleRange") },
538 func() {
539 addStar(func() {
540 addSequence(
541 func() { addPeekNot(func() { addString(`]]`) }) },
542 func() { addName("DoubleRange") },
543 func() { addAction(" p.AddAlternate() ") },
544 )
545 })
546 },
333 func() { addCharacter(`]`) },
334 func() { addName("Spacing") },
547335 )
548336 })
549337
563351 )
564352 })
565353
566 /* DoubleRange <- Char '-' Char { p.AddDoubleRange() }
567 / DoubleChar */
568 addRule("DoubleRange", func() {
569 addAlternate(
570 func() {
571 addSequence(
572 func() { addName("Char") },
573 func() { addCharacter(`-`) },
574 func() { addName("Char") },
575 func() { addAction(" p.AddDoubleRange() ") },
576 )
577 },
578 func() { addName("DoubleChar") },
579 )
580 })
581
582 /* Char <- Escape
583 / !'\\' <.> { p.AddCharacter(text) } */
354 /* Char <- Escape
355 / '\\' "0x"<[0-9a-f]*> { p.AddHexaCharacter(text) }
356 / '\\\\' { p.AddCharacter("\\") }
357 / !'\\' <.> { p.AddCharacter(text) } */
584358 addRule("Char", func() {
585359 addAlternate(
586 func() { addName("Escape") },
587 func() {
588 addSequence(
589 func() { addPeekNot(func() { addCharacter("\\") }) },
590 func() { addPush(func() { addDot() }) },
591 func() { addAction(` p.AddCharacter(text) `) },
592 )
593 },
594 )
595 })
596
597 /* DoubleChar <- Escape
598 / <[a-zA-Z]> { p.AddDoubleCharacter(text) }
599 / !'\\' <.> { p.AddCharacter(text) } */
600 addRule("DoubleChar", func() {
601 addAlternate(
602 func() { addName("Escape") },
603 func() {
604 addSequence(
360 func() {
361 addSequence(
362 func() { addCharacter("\\") },
363 func() { addCharacter(`0`) },
364 func() { addCharacter(`x`) },
605365 func() {
606366 addPush(func() {
607 addAlternate(
608 func() { addRange(`a`, `z`) },
609 func() { addRange(`A`, `Z`) },
610 )
611 })
612 },
613 func() { addAction(` p.AddDoubleCharacter(text) `) },
614 )
615 },
616 func() {
617 addSequence(
618 func() { addPeekNot(func() { addCharacter("\\") }) },
619 func() { addPush(func() { addDot() }) },
620 func() { addAction(` p.AddCharacter(text) `) },
621 )
622 },
623 )
624 })
625
626 /* Escape <- "\\a" { p.AddCharacter("\a") } # bell
627 / "\\b" { p.AddCharacter("\b") } # bs
628 / "\\e" { p.AddCharacter("\x1B") } # esc
629 / "\\f" { p.AddCharacter("\f") } # ff
630 / "\\n" { p.AddCharacter("\n") } # nl
631 / "\\r" { p.AddCharacter("\r") } # cr
632 / "\\t" { p.AddCharacter("\t") } # ht
633 / "\\v" { p.AddCharacter("\v") } # vt
634 / "\\'" { p.AddCharacter("'") }
635 / '\\"' { p.AddCharacter("\"") }
636 / '\\[' { p.AddCharacter("[") }
637 / '\\]' { p.AddCharacter("]") }
638 / '\\-' { p.AddCharacter("-") }
639 / '\\' "0x"<[0-9a-fA-F]+> { p.AddHexaCharacter(text) }
640 / '\\' <[0-3][0-7][0-7]> { p.AddOctalCharacter(text) }
641 / '\\' <[0-7][0-7]?> { p.AddOctalCharacter(text) }
642 / '\\\\' { p.AddCharacter("\\") } */
643 addRule("Escape", func() {
644 addAlternate(
645 func() {
646 addSequence(
647 func() { addCharacter("\\") },
648 func() { addDoubleCharacter(`a`) },
649 func() { addAction(` p.AddCharacter("\a") `) },
650 )
651 },
652 func() {
653 addSequence(
654 func() { addCharacter("\\") },
655 func() { addDoubleCharacter(`b`) },
656 func() { addAction(` p.AddCharacter("\b") `) },
657 )
658 },
659 func() {
660 addSequence(
661 func() { addCharacter("\\") },
662 func() { addDoubleCharacter(`e`) },
663 func() { addAction(` p.AddCharacter("\x1B") `) },
664 )
665 },
666 func() {
667 addSequence(
668 func() { addCharacter("\\") },
669 func() { addDoubleCharacter(`f`) },
670 func() { addAction(` p.AddCharacter("\f") `) },
671 )
672 },
673 func() {
674 addSequence(
675 func() { addCharacter("\\") },
676 func() { addDoubleCharacter(`n`) },
677 func() { addAction(` p.AddCharacter("\n") `) },
678 )
679 },
680 func() {
681 addSequence(
682 func() { addCharacter("\\") },
683 func() { addDoubleCharacter(`r`) },
684 func() { addAction(` p.AddCharacter("\r") `) },
685 )
686 },
687 func() {
688 addSequence(
689 func() { addCharacter("\\") },
690 func() { addDoubleCharacter(`t`) },
691 func() { addAction(` p.AddCharacter("\t") `) },
692 )
693 },
694 func() {
695 addSequence(
696 func() { addCharacter("\\") },
697 func() { addDoubleCharacter(`v`) },
698 func() { addAction(` p.AddCharacter("\v") `) },
699 )
700 },
701 func() {
702 addSequence(
703 func() { addCharacter("\\") },
704 func() { addCharacter(`'`) },
705 func() { addAction(` p.AddCharacter("'") `) },
706 )
707 },
708 func() {
709 addSequence(
710 func() { addCharacter("\\") },
711 func() { addCharacter(`"`) },
712 func() { addAction(` p.AddCharacter("\"") `) },
713 )
714 },
715 func() {
716 addSequence(
717 func() { addCharacter("\\") },
718 func() { addCharacter(`[`) },
719 func() { addAction(` p.AddCharacter("[") `) },
720 )
721 },
722 func() {
723 addSequence(
724 func() { addCharacter("\\") },
725 func() { addCharacter(`]`) },
726 func() { addAction(` p.AddCharacter("]") `) },
727 )
728 },
729 func() {
730 addSequence(
731 func() { addCharacter("\\") },
732 func() { addCharacter(`-`) },
733 func() { addAction(` p.AddCharacter("-") `) },
734 )
735 },
736 func() {
737 addSequence(
738 func() { addCharacter("\\") },
739 func() {
740 addSequence(
741 func() { addCharacter(`0`) },
742 func() { addDoubleCharacter(`x`) },
743 )
744 },
745 func() {
746 addPush(func() {
747 addPlus(func() {
367 addStar(func() {
748368 addAlternate(
749369 func() { addRange(`0`, `9`) },
750370 func() { addRange(`a`, `f`) },
751 func() { addRange(`A`, `F`) },
752371 )
753372 })
754373 })
759378 func() {
760379 addSequence(
761380 func() { addCharacter("\\") },
762 func() {
763 addPush(func() {
764 addSequence(
765 func() { addRange(`0`, `3`) },
766 func() { addRange(`0`, `7`) },
767 func() { addRange(`0`, `7`) },
768 )
769 })
770 },
771 func() { addAction(` p.AddOctalCharacter(text) `) },
772 )
773 },
774 func() {
775 addSequence(
776 func() { addCharacter("\\") },
777 func() {
778 addPush(func() {
779 addSequence(
780 func() { addRange(`0`, `7`) },
781 func() { addQuery(func() { addRange(`0`, `7`) }) },
782 )
783 })
784 },
785 func() { addAction(` p.AddOctalCharacter(text) `) },
786 )
787 },
788 func() {
789 addSequence(
790 func() { addCharacter("\\") },
791381 func() { addCharacter("\\") },
792382 func() { addAction(` p.AddCharacter("\\") `) },
793383 )
794384 },
795 )
796 })
797
798 /* LeftArrow <- ('<-' / '\0x2190') Spacing */
385 func() {
386 addSequence(
387 func() { addPeekNot(func() { addCharacter("\\") }) },
388 func() { addPush(func() { addDot() }) },
389 func() { addAction(` p.AddCharacter(text) `) },
390 )
391 },
392 )
393 })
394 /* LeftArrow <- '<-' Spacing */
799395 addRule("LeftArrow", func() {
800396 addSequence(
801 func() {
802 addAlternate(
803 func() { addString(`<-`) },
804 func() { addHexaCharacter("2190") },
805 )
806 },
397 func() { addString(`<-`) },
807398 func() { addName("Spacing") },
808399 )
809400 })
816407 )
817408 })
818409
819 /* And <- '&' Spacing */
820 addRule("And", func() {
821 addSequence(
822 func() { addCharacter(`&`) },
823 func() { addName("Spacing") },
824 )
825 })
826
827 /* Not <- '!' Spacing */
828 addRule("Not", func() {
829 addSequence(
830 func() { addCharacter(`!`) },
831 func() { addName("Spacing") },
832 )
833 })
834
835410 /* Question <- '?' Spacing */
836411 addRule("Question", func() {
837412 addSequence(
848423 )
849424 })
850425
851 /* Plus <- '+' Spacing */
852 addRule("Plus", func() {
853 addSequence(
854 func() { addCharacter(`+`) },
855 func() { addName("Spacing") },
856 )
857 })
858
859426 /* Open <- '(' Spacing */
860427 addRule("Open", func() {
861428 addSequence(
880447 )
881448 })
882449
883 /* SpaceComment <- (Space / Comment) */
884 addRule("SpaceComment", func() {
885 addAlternate(
886 func() { addName("Space") },
887 func() { addName("Comment") },
888 )
889 })
890
891 /* Spacing <- SpaceComment* */
892450 addRule("Spacing", func() {
893 addStar(func() { addName("SpaceComment") })
894 })
895
896 /* MustSpacing <- SpaceComment+ */
897 addRule("MustSpacing", func() {
898 addPlus(func() { t.AddName("SpaceComment") })
899 })
900
901 /* Comment <- '#' (!EndOfLine .)* EndOfLine */
451 addStar(func() {
452 addAlternate(
453 func() { addName("Space") },
454 func() { addName("Comment") },
455 )
456 })
457 })
458
459 /* Comment <- '#' (!EndOfLine .)* */
902460 addRule("Comment", func() {
903461 addSequence(
904462 func() { addCharacter(`#`) },
910468 )
911469 })
912470 },
913 func() { addName("EndOfLine") },
914471 )
915472 })
916473
932489 )
933490 })
934491
935 /* EndOfFile <- !. */
936 addRule("EndOfFile", func() {
937 addPeekNot(func() { addDot() })
938 })
939
940 /* Action <- '{' < ActionBody* > '}' Spacing */
492 /* Action <- '{' < (![}].)* > '}' Spacing */
941493 addRule("Action", func() {
942494 addSequence(
943495 func() { addCharacter(`{`) },
944496 func() {
945497 addPush(func() {
946 addStar(func() { addName("ActionBody") })
498 addStar(func() {
499 addSequence(
500 func() {
501 addPeekNot(func() {
502 addCharacter(`}`)
503 })
504 },
505 func() { addDot() },
506 )
507 })
947508 })
948509 },
949510 func() { addCharacter(`}`) },
950511 func() { addName("Spacing") },
951 )
952 })
953
954 /* ActionBody <- [^{}] / '{' ActionBody* '}' */
955 addRule("ActionBody", func() {
956 addAlternate(
957 func() {
958 addSequence(
959 func() {
960 addPeekNot(func() {
961 addAlternate(
962 func() { addCharacter(`{`) },
963 func() { addCharacter(`}`) },
964 )
965 })
966 },
967 func() { addDot() },
968 )
969 },
970 func() {
971 addSequence(
972 func() { addCharacter(`{`) },
973 func() { addStar(func() { addName("ActionBody") }) },
974 func() { addCharacter(`}`) },
975 )
976 },
977512 )
978513 })
979514
994529 })
995530
996531 filename := "bootstrap.peg.go"
997 out, error := os.OpenFile(filename, os.O_RDWR|os.O_CREATE|os.O_TRUNC, 0644)
998 if error != nil {
999 fmt.Printf("%v: %v\n", filename, error)
532 out, err := os.OpenFile(filename, os.O_RDWR|os.O_CREATE|os.O_TRUNC, 0644)
533 if err != nil {
534 fmt.Printf("%v: %v\n", filename, err)
1000535 return
1001536 }
1002537 defer out.Close()
1003 t.Compile(filename, out)
538 t.Compile(filename, os.Args, out)
1004539 }
+0
-1
bootstrap/peg.go
0 ../peg.go
+0
-3040
bootstrap.peg.go
0 package main
1
2 import (
3 "fmt"
4 "math"
5 "sort"
6 "strconv"
7 )
8
9 const endSymbol rune = 1114112
10
11 /* The rule types inferred from the grammar are below. */
12 type pegRule uint8
13
14 const (
15 ruleUnknown pegRule = iota
16 ruleGrammar
17 ruleImport
18 ruleDefinition
19 ruleExpression
20 ruleSequence
21 rulePrefix
22 ruleSuffix
23 rulePrimary
24 ruleIdentifier
25 ruleIdentStart
26 ruleIdentCont
27 ruleLiteral
28 ruleClass
29 ruleRanges
30 ruleDoubleRanges
31 ruleRange
32 ruleDoubleRange
33 ruleChar
34 ruleDoubleChar
35 ruleEscape
36 ruleLeftArrow
37 ruleSlash
38 ruleAnd
39 ruleNot
40 ruleQuestion
41 ruleStar
42 rulePlus
43 ruleOpen
44 ruleClose
45 ruleDot
46 ruleSpaceComment
47 ruleSpacing
48 ruleMustSpacing
49 ruleComment
50 ruleSpace
51 ruleEndOfLine
52 ruleEndOfFile
53 ruleAction
54 ruleActionBody
55 ruleBegin
56 ruleEnd
57 ruleAction0
58 ruleAction1
59 ruleAction2
60 rulePegText
61 ruleAction3
62 ruleAction4
63 ruleAction5
64 ruleAction6
65 ruleAction7
66 ruleAction8
67 ruleAction9
68 ruleAction10
69 ruleAction11
70 ruleAction12
71 ruleAction13
72 ruleAction14
73 ruleAction15
74 ruleAction16
75 ruleAction17
76 ruleAction18
77 ruleAction19
78 ruleAction20
79 ruleAction21
80 ruleAction22
81 ruleAction23
82 ruleAction24
83 ruleAction25
84 ruleAction26
85 ruleAction27
86 ruleAction28
87 ruleAction29
88 ruleAction30
89 ruleAction31
90 ruleAction32
91 ruleAction33
92 ruleAction34
93 ruleAction35
94 ruleAction36
95 ruleAction37
96 ruleAction38
97 ruleAction39
98 ruleAction40
99 ruleAction41
100 ruleAction42
101 ruleAction43
102 ruleAction44
103 ruleAction45
104 ruleAction46
105 ruleAction47
106 ruleAction48
107
108 rulePre
109 ruleIn
110 ruleSuf
111 )
112
113 var rul3s = [...]string{
114 "Unknown",
115 "Grammar",
116 "Import",
117 "Definition",
118 "Expression",
119 "Sequence",
120 "Prefix",
121 "Suffix",
122 "Primary",
123 "Identifier",
124 "IdentStart",
125 "IdentCont",
126 "Literal",
127 "Class",
128 "Ranges",
129 "DoubleRanges",
130 "Range",
131 "DoubleRange",
132 "Char",
133 "DoubleChar",
134 "Escape",
135 "LeftArrow",
136 "Slash",
137 "And",
138 "Not",
139 "Question",
140 "Star",
141 "Plus",
142 "Open",
143 "Close",
144 "Dot",
145 "SpaceComment",
146 "Spacing",
147 "MustSpacing",
148 "Comment",
149 "Space",
150 "EndOfLine",
151 "EndOfFile",
152 "Action",
153 "ActionBody",
154 "Begin",
155 "End",
156 "Action0",
157 "Action1",
158 "Action2",
159 "PegText",
160 "Action3",
161 "Action4",
162 "Action5",
163 "Action6",
164 "Action7",
165 "Action8",
166 "Action9",
167 "Action10",
168 "Action11",
169 "Action12",
170 "Action13",
171 "Action14",
172 "Action15",
173 "Action16",
174 "Action17",
175 "Action18",
176 "Action19",
177 "Action20",
178 "Action21",
179 "Action22",
180 "Action23",
181 "Action24",
182 "Action25",
183 "Action26",
184 "Action27",
185 "Action28",
186 "Action29",
187 "Action30",
188 "Action31",
189 "Action32",
190 "Action33",
191 "Action34",
192 "Action35",
193 "Action36",
194 "Action37",
195 "Action38",
196 "Action39",
197 "Action40",
198 "Action41",
199 "Action42",
200 "Action43",
201 "Action44",
202 "Action45",
203 "Action46",
204 "Action47",
205 "Action48",
206
207 "Pre_",
208 "_In_",
209 "_Suf",
210 }
211
212 type node32 struct {
213 token32
214 up, next *node32
215 }
216
217 func (node *node32) print(depth int, buffer string) {
218 for node != nil {
219 for c := 0; c < depth; c++ {
220 fmt.Printf(" ")
221 }
222 fmt.Printf("\x1B[34m%v\x1B[m %v\n", rul3s[node.pegRule], strconv.Quote(string(([]rune(buffer)[node.begin:node.end]))))
223 if node.up != nil {
224 node.up.print(depth+1, buffer)
225 }
226 node = node.next
227 }
228 }
229
230 func (node *node32) Print(buffer string) {
231 node.print(0, buffer)
232 }
233
234 type element struct {
235 node *node32
236 down *element
237 }
238
239 /* ${@} bit structure for abstract syntax tree */
240 type token32 struct {
241 pegRule
242 begin, end, next uint32
243 }
244
245 func (t *token32) isZero() bool {
246 return t.pegRule == ruleUnknown && t.begin == 0 && t.end == 0 && t.next == 0
247 }
248
249 func (t *token32) isParentOf(u token32) bool {
250 return t.begin <= u.begin && t.end >= u.end && t.next > u.next
251 }
252
253 func (t *token32) getToken32() token32 {
254 return token32{pegRule: t.pegRule, begin: uint32(t.begin), end: uint32(t.end), next: uint32(t.next)}
255 }
256
257 func (t *token32) String() string {
258 return fmt.Sprintf("\x1B[34m%v\x1B[m %v %v %v", rul3s[t.pegRule], t.begin, t.end, t.next)
259 }
260
261 type tokens32 struct {
262 tree []token32
263 ordered [][]token32
264 }
265
266 func (t *tokens32) trim(length int) {
267 t.tree = t.tree[0:length]
268 }
269
270 func (t *tokens32) Print() {
271 for _, token := range t.tree {
272 fmt.Println(token.String())
273 }
274 }
275
276 func (t *tokens32) Order() [][]token32 {
277 if t.ordered != nil {
278 return t.ordered
279 }
280
281 depths := make([]int32, 1, math.MaxInt16)
282 for i, token := range t.tree {
283 if token.pegRule == ruleUnknown {
284 t.tree = t.tree[:i]
285 break
286 }
287 depth := int(token.next)
288 if length := len(depths); depth >= length {
289 depths = depths[:depth+1]
290 }
291 depths[depth]++
292 }
293 depths = append(depths, 0)
294
295 ordered, pool := make([][]token32, len(depths)), make([]token32, len(t.tree)+len(depths))
296 for i, depth := range depths {
297 depth++
298 ordered[i], pool, depths[i] = pool[:depth], pool[depth:], 0
299 }
300
301 for i, token := range t.tree {
302 depth := token.next
303 token.next = uint32(i)
304 ordered[depth][depths[depth]] = token
305 depths[depth]++
306 }
307 t.ordered = ordered
308 return ordered
309 }
310
311 type state32 struct {
312 token32
313 depths []int32
314 leaf bool
315 }
316
317 func (t *tokens32) AST() *node32 {
318 tokens := t.Tokens()
319 stack := &element{node: &node32{token32: <-tokens}}
320 for token := range tokens {
321 if token.begin == token.end {
322 continue
323 }
324 node := &node32{token32: token}
325 for stack != nil && stack.node.begin >= token.begin && stack.node.end <= token.end {
326 stack.node.next = node.up
327 node.up = stack.node
328 stack = stack.down
329 }
330 stack = &element{node: node, down: stack}
331 }
332 return stack.node
333 }
334
335 func (t *tokens32) PreOrder() (<-chan state32, [][]token32) {
336 s, ordered := make(chan state32, 6), t.Order()
337 go func() {
338 var states [8]state32
339 for i := range states {
340 states[i].depths = make([]int32, len(ordered))
341 }
342 depths, state, depth := make([]int32, len(ordered)), 0, 1
343 write := func(t token32, leaf bool) {
344 S := states[state]
345 state, S.pegRule, S.begin, S.end, S.next, S.leaf = (state+1)%8, t.pegRule, t.begin, t.end, uint32(depth), leaf
346 copy(S.depths, depths)
347 s <- S
348 }
349
350 states[state].token32 = ordered[0][0]
351 depths[0]++
352 state++
353 a, b := ordered[depth-1][depths[depth-1]-1], ordered[depth][depths[depth]]
354 depthFirstSearch:
355 for {
356 for {
357 if i := depths[depth]; i > 0 {
358 if c, j := ordered[depth][i-1], depths[depth-1]; a.isParentOf(c) &&
359 (j < 2 || !ordered[depth-1][j-2].isParentOf(c)) {
360 if c.end != b.begin {
361 write(token32{pegRule: ruleIn, begin: c.end, end: b.begin}, true)
362 }
363 break
364 }
365 }
366
367 if a.begin < b.begin {
368 write(token32{pegRule: rulePre, begin: a.begin, end: b.begin}, true)
369 }
370 break
371 }
372
373 next := depth + 1
374 if c := ordered[next][depths[next]]; c.pegRule != ruleUnknown && b.isParentOf(c) {
375 write(b, false)
376 depths[depth]++
377 depth, a, b = next, b, c
378 continue
379 }
380
381 write(b, true)
382 depths[depth]++
383 c, parent := ordered[depth][depths[depth]], true
384 for {
385 if c.pegRule != ruleUnknown && a.isParentOf(c) {
386 b = c
387 continue depthFirstSearch
388 } else if parent && b.end != a.end {
389 write(token32{pegRule: ruleSuf, begin: b.end, end: a.end}, true)
390 }
391
392 depth--
393 if depth > 0 {
394 a, b, c = ordered[depth-1][depths[depth-1]-1], a, ordered[depth][depths[depth]]
395 parent = a.isParentOf(b)
396 continue
397 }
398
399 break depthFirstSearch
400 }
401 }
402
403 close(s)
404 }()
405 return s, ordered
406 }
407
408 func (t *tokens32) PrintSyntax() {
409 tokens, ordered := t.PreOrder()
410 max := -1
411 for token := range tokens {
412 if !token.leaf {
413 fmt.Printf("%v", token.begin)
414 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
415 fmt.Printf(" \x1B[36m%v\x1B[m", rul3s[ordered[i][depths[i]-1].pegRule])
416 }
417 fmt.Printf(" \x1B[36m%v\x1B[m\n", rul3s[token.pegRule])
418 } else if token.begin == token.end {
419 fmt.Printf("%v", token.begin)
420 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
421 fmt.Printf(" \x1B[31m%v\x1B[m", rul3s[ordered[i][depths[i]-1].pegRule])
422 }
423 fmt.Printf(" \x1B[31m%v\x1B[m\n", rul3s[token.pegRule])
424 } else {
425 for c, end := token.begin, token.end; c < end; c++ {
426 if i := int(c); max+1 < i {
427 for j := max; j < i; j++ {
428 fmt.Printf("skip %v %v\n", j, token.String())
429 }
430 max = i
431 } else if i := int(c); i <= max {
432 for j := i; j <= max; j++ {
433 fmt.Printf("dupe %v %v\n", j, token.String())
434 }
435 } else {
436 max = int(c)
437 }
438 fmt.Printf("%v", c)
439 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
440 fmt.Printf(" \x1B[34m%v\x1B[m", rul3s[ordered[i][depths[i]-1].pegRule])
441 }
442 fmt.Printf(" \x1B[34m%v\x1B[m\n", rul3s[token.pegRule])
443 }
444 fmt.Printf("\n")
445 }
446 }
447 }
448
449 func (t *tokens32) PrintSyntaxTree(buffer string) {
450 tokens, _ := t.PreOrder()
451 for token := range tokens {
452 for c := 0; c < int(token.next); c++ {
453 fmt.Printf(" ")
454 }
455 fmt.Printf("\x1B[34m%v\x1B[m %v\n", rul3s[token.pegRule], strconv.Quote(string(([]rune(buffer)[token.begin:token.end]))))
456 }
457 }
458
459 func (t *tokens32) Add(rule pegRule, begin, end, depth uint32, index int) {
460 t.tree[index] = token32{pegRule: rule, begin: uint32(begin), end: uint32(end), next: uint32(depth)}
461 }
462
463 func (t *tokens32) Tokens() <-chan token32 {
464 s := make(chan token32, 16)
465 go func() {
466 for _, v := range t.tree {
467 s <- v.getToken32()
468 }
469 close(s)
470 }()
471 return s
472 }
473
474 func (t *tokens32) Error() []token32 {
475 ordered := t.Order()
476 length := len(ordered)
477 tokens, length := make([]token32, length), length-1
478 for i := range tokens {
479 o := ordered[length-i]
480 if len(o) > 1 {
481 tokens[i] = o[len(o)-2].getToken32()
482 }
483 }
484 return tokens
485 }
486
487 func (t *tokens32) Expand(index int) {
488 tree := t.tree
489 if index >= len(tree) {
490 expanded := make([]token32, 2*len(tree))
491 copy(expanded, tree)
492 t.tree = expanded
493 }
494 }
495
496 type Peg struct {
497 *Tree
498
499 Buffer string
500 buffer []rune
501 rules [92]func() bool
502 Parse func(rule ...int) error
503 Reset func()
504 Pretty bool
505 tokens32
506 }
507
508 type textPosition struct {
509 line, symbol int
510 }
511
512 type textPositionMap map[int]textPosition
513
514 func translatePositions(buffer []rune, positions []int) textPositionMap {
515 length, translations, j, line, symbol := len(positions), make(textPositionMap, len(positions)), 0, 1, 0
516 sort.Ints(positions)
517
518 search:
519 for i, c := range buffer {
520 if c == '\n' {
521 line, symbol = line+1, 0
522 } else {
523 symbol++
524 }
525 if i == positions[j] {
526 translations[positions[j]] = textPosition{line, symbol}
527 for j++; j < length; j++ {
528 if i != positions[j] {
529 continue search
530 }
531 }
532 break search
533 }
534 }
535
536 return translations
537 }
538
539 type parseError struct {
540 p *Peg
541 max token32
542 }
543
544 func (e *parseError) Error() string {
545 tokens, error := []token32{e.max}, "\n"
546 positions, p := make([]int, 2*len(tokens)), 0
547 for _, token := range tokens {
548 positions[p], p = int(token.begin), p+1
549 positions[p], p = int(token.end), p+1
550 }
551 translations := translatePositions(e.p.buffer, positions)
552 format := "parse error near %v (line %v symbol %v - line %v symbol %v):\n%v\n"
553 if e.p.Pretty {
554 format = "parse error near \x1B[34m%v\x1B[m (line %v symbol %v - line %v symbol %v):\n%v\n"
555 }
556 for _, token := range tokens {
557 begin, end := int(token.begin), int(token.end)
558 error += fmt.Sprintf(format,
559 rul3s[token.pegRule],
560 translations[begin].line, translations[begin].symbol,
561 translations[end].line, translations[end].symbol,
562 strconv.Quote(string(e.p.buffer[begin:end])))
563 }
564
565 return error
566 }
567
568 func (p *Peg) PrintSyntaxTree() {
569 p.tokens32.PrintSyntaxTree(p.Buffer)
570 }
571
572 func (p *Peg) Highlighter() {
573 p.PrintSyntax()
574 }
575
576 func (p *Peg) Execute() {
577 buffer, _buffer, text, begin, end := p.Buffer, p.buffer, "", 0, 0
578 for token := range p.Tokens() {
579 switch token.pegRule {
580
581 case rulePegText:
582 begin, end = int(token.begin), int(token.end)
583 text = string(_buffer[begin:end])
584
585 case ruleAction0:
586 p.AddPackage(text)
587 case ruleAction1:
588 p.AddPeg(text)
589 case ruleAction2:
590 p.AddState(text)
591 case ruleAction3:
592 p.AddImport(text)
593 case ruleAction4:
594 p.AddRule(text)
595 case ruleAction5:
596 p.AddExpression()
597 case ruleAction6:
598 p.AddAlternate()
599 case ruleAction7:
600 p.AddNil()
601 p.AddAlternate()
602 case ruleAction8:
603 p.AddNil()
604 case ruleAction9:
605 p.AddSequence()
606 case ruleAction10:
607 p.AddPredicate(text)
608 case ruleAction11:
609 p.AddStateChange(text)
610 case ruleAction12:
611 p.AddPeekFor()
612 case ruleAction13:
613 p.AddPeekNot()
614 case ruleAction14:
615 p.AddQuery()
616 case ruleAction15:
617 p.AddStar()
618 case ruleAction16:
619 p.AddPlus()
620 case ruleAction17:
621 p.AddName(text)
622 case ruleAction18:
623 p.AddDot()
624 case ruleAction19:
625 p.AddAction(text)
626 case ruleAction20:
627 p.AddPush()
628 case ruleAction21:
629 p.AddSequence()
630 case ruleAction22:
631 p.AddSequence()
632 case ruleAction23:
633 p.AddPeekNot()
634 p.AddDot()
635 p.AddSequence()
636 case ruleAction24:
637 p.AddPeekNot()
638 p.AddDot()
639 p.AddSequence()
640 case ruleAction25:
641 p.AddAlternate()
642 case ruleAction26:
643 p.AddAlternate()
644 case ruleAction27:
645 p.AddRange()
646 case ruleAction28:
647 p.AddDoubleRange()
648 case ruleAction29:
649 p.AddCharacter(text)
650 case ruleAction30:
651 p.AddDoubleCharacter(text)
652 case ruleAction31:
653 p.AddCharacter(text)
654 case ruleAction32:
655 p.AddCharacter("\a")
656 case ruleAction33:
657 p.AddCharacter("\b")
658 case ruleAction34:
659 p.AddCharacter("\x1B")
660 case ruleAction35:
661 p.AddCharacter("\f")
662 case ruleAction36:
663 p.AddCharacter("\n")
664 case ruleAction37:
665 p.AddCharacter("\r")
666 case ruleAction38:
667 p.AddCharacter("\t")
668 case ruleAction39:
669 p.AddCharacter("\v")
670 case ruleAction40:
671 p.AddCharacter("'")
672 case ruleAction41:
673 p.AddCharacter("\"")
674 case ruleAction42:
675 p.AddCharacter("[")
676 case ruleAction43:
677 p.AddCharacter("]")
678 case ruleAction44:
679 p.AddCharacter("-")
680 case ruleAction45:
681 p.AddHexaCharacter(text)
682 case ruleAction46:
683 p.AddOctalCharacter(text)
684 case ruleAction47:
685 p.AddOctalCharacter(text)
686 case ruleAction48:
687 p.AddCharacter("\\")
688
689 }
690 }
691 _, _, _, _, _ = buffer, _buffer, text, begin, end
692 }
693
694 func (p *Peg) Init() {
695 p.buffer = []rune(p.Buffer)
696 if len(p.buffer) == 0 || p.buffer[len(p.buffer)-1] != endSymbol {
697 p.buffer = append(p.buffer, endSymbol)
698 }
699
700 tree := tokens32{tree: make([]token32, math.MaxInt16)}
701 var max token32
702 position, depth, tokenIndex, buffer, _rules := uint32(0), uint32(0), 0, p.buffer, p.rules
703
704 p.Parse = func(rule ...int) error {
705 r := 1
706 if len(rule) > 0 {
707 r = rule[0]
708 }
709 matches := p.rules[r]()
710 p.tokens32 = tree
711 if matches {
712 p.trim(tokenIndex)
713 return nil
714 }
715 return &parseError{p, max}
716 }
717
718 p.Reset = func() {
719 position, tokenIndex, depth = 0, 0, 0
720 }
721
722 add := func(rule pegRule, begin uint32) {
723 tree.Expand(tokenIndex)
724 tree.Add(rule, begin, position, depth, tokenIndex)
725 tokenIndex++
726 if begin != position && position > max.end {
727 max = token32{rule, begin, position, depth}
728 }
729 }
730
731 matchDot := func() bool {
732 if buffer[position] != endSymbol {
733 position++
734 return true
735 }
736 return false
737 }
738
739 /*matchChar := func(c byte) bool {
740 if buffer[position] == c {
741 position++
742 return true
743 }
744 return false
745 }*/
746
747 /*matchRange := func(lower byte, upper byte) bool {
748 if c := buffer[position]; c >= lower && c <= upper {
749 position++
750 return true
751 }
752 return false
753 }*/
754
755 _rules = [...]func() bool{
756 nil,
757 /* 0 Grammar <- <(Spacing ('p' 'a' 'c' 'k' 'a' 'g' 'e') MustSpacing Identifier Action0 Import* ('t' 'y' 'p' 'e') MustSpacing Identifier Action1 ('P' 'e' 'g') Spacing Action Action2 Definition+ EndOfFile)> */
758 func() bool {
759 position0, tokenIndex0, depth0 := position, tokenIndex, depth
760 {
761 position1 := position
762 depth++
763 if !_rules[ruleSpacing]() {
764 goto l0
765 }
766 if buffer[position] != rune('p') {
767 goto l0
768 }
769 position++
770 if buffer[position] != rune('a') {
771 goto l0
772 }
773 position++
774 if buffer[position] != rune('c') {
775 goto l0
776 }
777 position++
778 if buffer[position] != rune('k') {
779 goto l0
780 }
781 position++
782 if buffer[position] != rune('a') {
783 goto l0
784 }
785 position++
786 if buffer[position] != rune('g') {
787 goto l0
788 }
789 position++
790 if buffer[position] != rune('e') {
791 goto l0
792 }
793 position++
794 if !_rules[ruleMustSpacing]() {
795 goto l0
796 }
797 if !_rules[ruleIdentifier]() {
798 goto l0
799 }
800 {
801 add(ruleAction0, position)
802 }
803 l3:
804 {
805 position4, tokenIndex4, depth4 := position, tokenIndex, depth
806 {
807 position5 := position
808 depth++
809 if buffer[position] != rune('i') {
810 goto l4
811 }
812 position++
813 if buffer[position] != rune('m') {
814 goto l4
815 }
816 position++
817 if buffer[position] != rune('p') {
818 goto l4
819 }
820 position++
821 if buffer[position] != rune('o') {
822 goto l4
823 }
824 position++
825 if buffer[position] != rune('r') {
826 goto l4
827 }
828 position++
829 if buffer[position] != rune('t') {
830 goto l4
831 }
832 position++
833 if !_rules[ruleSpacing]() {
834 goto l4
835 }
836 if buffer[position] != rune('"') {
837 goto l4
838 }
839 position++
840 {
841 position6 := position
842 depth++
843 {
844 switch buffer[position] {
845 case '-':
846 if buffer[position] != rune('-') {
847 goto l4
848 }
849 position++
850 break
851 case '.':
852 if buffer[position] != rune('.') {
853 goto l4
854 }
855 position++
856 break
857 case '/':
858 if buffer[position] != rune('/') {
859 goto l4
860 }
861 position++
862 break
863 case '_':
864 if buffer[position] != rune('_') {
865 goto l4
866 }
867 position++
868 break
869 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
870 if c := buffer[position]; c < rune('A') || c > rune('Z') {
871 goto l4
872 }
873 position++
874 break
875 default:
876 if c := buffer[position]; c < rune('a') || c > rune('z') {
877 goto l4
878 }
879 position++
880 break
881 }
882 }
883
884 l7:
885 {
886 position8, tokenIndex8, depth8 := position, tokenIndex, depth
887 {
888 switch buffer[position] {
889 case '-':
890 if buffer[position] != rune('-') {
891 goto l8
892 }
893 position++
894 break
895 case '.':
896 if buffer[position] != rune('.') {
897 goto l8
898 }
899 position++
900 break
901 case '/':
902 if buffer[position] != rune('/') {
903 goto l8
904 }
905 position++
906 break
907 case '_':
908 if buffer[position] != rune('_') {
909 goto l8
910 }
911 position++
912 break
913 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
914 if c := buffer[position]; c < rune('A') || c > rune('Z') {
915 goto l8
916 }
917 position++
918 break
919 default:
920 if c := buffer[position]; c < rune('a') || c > rune('z') {
921 goto l8
922 }
923 position++
924 break
925 }
926 }
927
928 goto l7
929 l8:
930 position, tokenIndex, depth = position8, tokenIndex8, depth8
931 }
932 depth--
933 add(rulePegText, position6)
934 }
935 if buffer[position] != rune('"') {
936 goto l4
937 }
938 position++
939 if !_rules[ruleSpacing]() {
940 goto l4
941 }
942 {
943 add(ruleAction3, position)
944 }
945 depth--
946 add(ruleImport, position5)
947 }
948 goto l3
949 l4:
950 position, tokenIndex, depth = position4, tokenIndex4, depth4
951 }
952 if buffer[position] != rune('t') {
953 goto l0
954 }
955 position++
956 if buffer[position] != rune('y') {
957 goto l0
958 }
959 position++
960 if buffer[position] != rune('p') {
961 goto l0
962 }
963 position++
964 if buffer[position] != rune('e') {
965 goto l0
966 }
967 position++
968 if !_rules[ruleMustSpacing]() {
969 goto l0
970 }
971 if !_rules[ruleIdentifier]() {
972 goto l0
973 }
974 {
975 add(ruleAction1, position)
976 }
977 if buffer[position] != rune('P') {
978 goto l0
979 }
980 position++
981 if buffer[position] != rune('e') {
982 goto l0
983 }
984 position++
985 if buffer[position] != rune('g') {
986 goto l0
987 }
988 position++
989 if !_rules[ruleSpacing]() {
990 goto l0
991 }
992 if !_rules[ruleAction]() {
993 goto l0
994 }
995 {
996 add(ruleAction2, position)
997 }
998 {
999 position16 := position
1000 depth++
1001 if !_rules[ruleIdentifier]() {
1002 goto l0
1003 }
1004 {
1005 add(ruleAction4, position)
1006 }
1007 if !_rules[ruleLeftArrow]() {
1008 goto l0
1009 }
1010 if !_rules[ruleExpression]() {
1011 goto l0
1012 }
1013 {
1014 add(ruleAction5, position)
1015 }
1016 {
1017 position19, tokenIndex19, depth19 := position, tokenIndex, depth
1018 {
1019 position20, tokenIndex20, depth20 := position, tokenIndex, depth
1020 if !_rules[ruleIdentifier]() {
1021 goto l21
1022 }
1023 if !_rules[ruleLeftArrow]() {
1024 goto l21
1025 }
1026 goto l20
1027 l21:
1028 position, tokenIndex, depth = position20, tokenIndex20, depth20
1029 {
1030 position22, tokenIndex22, depth22 := position, tokenIndex, depth
1031 if !matchDot() {
1032 goto l22
1033 }
1034 goto l0
1035 l22:
1036 position, tokenIndex, depth = position22, tokenIndex22, depth22
1037 }
1038 }
1039 l20:
1040 position, tokenIndex, depth = position19, tokenIndex19, depth19
1041 }
1042 depth--
1043 add(ruleDefinition, position16)
1044 }
1045 l14:
1046 {
1047 position15, tokenIndex15, depth15 := position, tokenIndex, depth
1048 {
1049 position23 := position
1050 depth++
1051 if !_rules[ruleIdentifier]() {
1052 goto l15
1053 }
1054 {
1055 add(ruleAction4, position)
1056 }
1057 if !_rules[ruleLeftArrow]() {
1058 goto l15
1059 }
1060 if !_rules[ruleExpression]() {
1061 goto l15
1062 }
1063 {
1064 add(ruleAction5, position)
1065 }
1066 {
1067 position26, tokenIndex26, depth26 := position, tokenIndex, depth
1068 {
1069 position27, tokenIndex27, depth27 := position, tokenIndex, depth
1070 if !_rules[ruleIdentifier]() {
1071 goto l28
1072 }
1073 if !_rules[ruleLeftArrow]() {
1074 goto l28
1075 }
1076 goto l27
1077 l28:
1078 position, tokenIndex, depth = position27, tokenIndex27, depth27
1079 {
1080 position29, tokenIndex29, depth29 := position, tokenIndex, depth
1081 if !matchDot() {
1082 goto l29
1083 }
1084 goto l15
1085 l29:
1086 position, tokenIndex, depth = position29, tokenIndex29, depth29
1087 }
1088 }
1089 l27:
1090 position, tokenIndex, depth = position26, tokenIndex26, depth26
1091 }
1092 depth--
1093 add(ruleDefinition, position23)
1094 }
1095 goto l14
1096 l15:
1097 position, tokenIndex, depth = position15, tokenIndex15, depth15
1098 }
1099 {
1100 position30 := position
1101 depth++
1102 {
1103 position31, tokenIndex31, depth31 := position, tokenIndex, depth
1104 if !matchDot() {
1105 goto l31
1106 }
1107 goto l0
1108 l31:
1109 position, tokenIndex, depth = position31, tokenIndex31, depth31
1110 }
1111 depth--
1112 add(ruleEndOfFile, position30)
1113 }
1114 depth--
1115 add(ruleGrammar, position1)
1116 }
1117 return true
1118 l0:
1119 position, tokenIndex, depth = position0, tokenIndex0, depth0
1120 return false
1121 },
1122 /* 1 Import <- <('i' 'm' 'p' 'o' 'r' 't' Spacing '"' <((&('-') '-') | (&('.') '.') | (&('/') '/') | (&('_') '_') | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z') [A-Z]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') [a-z]))+> '"' Spacing Action3)> */
1123 nil,
1124 /* 2 Definition <- <(Identifier Action4 LeftArrow Expression Action5 &((Identifier LeftArrow) / !.))> */
1125 nil,
1126 /* 3 Expression <- <((Sequence (Slash Sequence Action6)* (Slash Action7)?) / Action8)> */
1127 func() bool {
1128 {
1129 position35 := position
1130 depth++
1131 {
1132 position36, tokenIndex36, depth36 := position, tokenIndex, depth
1133 if !_rules[ruleSequence]() {
1134 goto l37
1135 }
1136 l38:
1137 {
1138 position39, tokenIndex39, depth39 := position, tokenIndex, depth
1139 if !_rules[ruleSlash]() {
1140 goto l39
1141 }
1142 if !_rules[ruleSequence]() {
1143 goto l39
1144 }
1145 {
1146 add(ruleAction6, position)
1147 }
1148 goto l38
1149 l39:
1150 position, tokenIndex, depth = position39, tokenIndex39, depth39
1151 }
1152 {
1153 position41, tokenIndex41, depth41 := position, tokenIndex, depth
1154 if !_rules[ruleSlash]() {
1155 goto l41
1156 }
1157 {
1158 add(ruleAction7, position)
1159 }
1160 goto l42
1161 l41:
1162 position, tokenIndex, depth = position41, tokenIndex41, depth41
1163 }
1164 l42:
1165 goto l36
1166 l37:
1167 position, tokenIndex, depth = position36, tokenIndex36, depth36
1168 {
1169 add(ruleAction8, position)
1170 }
1171 }
1172 l36:
1173 depth--
1174 add(ruleExpression, position35)
1175 }
1176 return true
1177 },
1178 /* 4 Sequence <- <(Prefix (Prefix Action9)*)> */
1179 func() bool {
1180 position45, tokenIndex45, depth45 := position, tokenIndex, depth
1181 {
1182 position46 := position
1183 depth++
1184 if !_rules[rulePrefix]() {
1185 goto l45
1186 }
1187 l47:
1188 {
1189 position48, tokenIndex48, depth48 := position, tokenIndex, depth
1190 if !_rules[rulePrefix]() {
1191 goto l48
1192 }
1193 {
1194 add(ruleAction9, position)
1195 }
1196 goto l47
1197 l48:
1198 position, tokenIndex, depth = position48, tokenIndex48, depth48
1199 }
1200 depth--
1201 add(ruleSequence, position46)
1202 }
1203 return true
1204 l45:
1205 position, tokenIndex, depth = position45, tokenIndex45, depth45
1206 return false
1207 },
1208 /* 5 Prefix <- <((And Action Action10) / (Not Action Action11) / ((&('!') (Not Suffix Action13)) | (&('&') (And Suffix Action12)) | (&('"' | '\'' | '(' | '.' | '<' | 'A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z' | '[' | '_' | 'a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z' | '{') Suffix)))> */
1209 func() bool {
1210 position50, tokenIndex50, depth50 := position, tokenIndex, depth
1211 {
1212 position51 := position
1213 depth++
1214 {
1215 position52, tokenIndex52, depth52 := position, tokenIndex, depth
1216 if !_rules[ruleAnd]() {
1217 goto l53
1218 }
1219 if !_rules[ruleAction]() {
1220 goto l53
1221 }
1222 {
1223 add(ruleAction10, position)
1224 }
1225 goto l52
1226 l53:
1227 position, tokenIndex, depth = position52, tokenIndex52, depth52
1228 if !_rules[ruleNot]() {
1229 goto l55
1230 }
1231 if !_rules[ruleAction]() {
1232 goto l55
1233 }
1234 {
1235 add(ruleAction11, position)
1236 }
1237 goto l52
1238 l55:
1239 position, tokenIndex, depth = position52, tokenIndex52, depth52
1240 {
1241 switch buffer[position] {
1242 case '!':
1243 if !_rules[ruleNot]() {
1244 goto l50
1245 }
1246 if !_rules[ruleSuffix]() {
1247 goto l50
1248 }
1249 {
1250 add(ruleAction13, position)
1251 }
1252 break
1253 case '&':
1254 if !_rules[ruleAnd]() {
1255 goto l50
1256 }
1257 if !_rules[ruleSuffix]() {
1258 goto l50
1259 }
1260 {
1261 add(ruleAction12, position)
1262 }
1263 break
1264 default:
1265 if !_rules[ruleSuffix]() {
1266 goto l50
1267 }
1268 break
1269 }
1270 }
1271
1272 }
1273 l52:
1274 depth--
1275 add(rulePrefix, position51)
1276 }
1277 return true
1278 l50:
1279 position, tokenIndex, depth = position50, tokenIndex50, depth50
1280 return false
1281 },
1282 /* 6 Suffix <- <(Primary ((&('+') (Plus Action16)) | (&('*') (Star Action15)) | (&('?') (Question Action14)))?)> */
1283 func() bool {
1284 position60, tokenIndex60, depth60 := position, tokenIndex, depth
1285 {
1286 position61 := position
1287 depth++
1288 {
1289 position62 := position
1290 depth++
1291 {
1292 switch buffer[position] {
1293 case '<':
1294 {
1295 position64 := position
1296 depth++
1297 if buffer[position] != rune('<') {
1298 goto l60
1299 }
1300 position++
1301 if !_rules[ruleSpacing]() {
1302 goto l60
1303 }
1304 depth--
1305 add(ruleBegin, position64)
1306 }
1307 if !_rules[ruleExpression]() {
1308 goto l60
1309 }
1310 {
1311 position65 := position
1312 depth++
1313 if buffer[position] != rune('>') {
1314 goto l60
1315 }
1316 position++
1317 if !_rules[ruleSpacing]() {
1318 goto l60
1319 }
1320 depth--
1321 add(ruleEnd, position65)
1322 }
1323 {
1324 add(ruleAction20, position)
1325 }
1326 break
1327 case '{':
1328 if !_rules[ruleAction]() {
1329 goto l60
1330 }
1331 {
1332 add(ruleAction19, position)
1333 }
1334 break
1335 case '.':
1336 {
1337 position68 := position
1338 depth++
1339 if buffer[position] != rune('.') {
1340 goto l60
1341 }
1342 position++
1343 if !_rules[ruleSpacing]() {
1344 goto l60
1345 }
1346 depth--
1347 add(ruleDot, position68)
1348 }
1349 {
1350 add(ruleAction18, position)
1351 }
1352 break
1353 case '[':
1354 {
1355 position70 := position
1356 depth++
1357 {
1358 position71, tokenIndex71, depth71 := position, tokenIndex, depth
1359 if buffer[position] != rune('[') {
1360 goto l72
1361 }
1362 position++
1363 if buffer[position] != rune('[') {
1364 goto l72
1365 }
1366 position++
1367 {
1368 position73, tokenIndex73, depth73 := position, tokenIndex, depth
1369 {
1370 position75, tokenIndex75, depth75 := position, tokenIndex, depth
1371 if buffer[position] != rune('^') {
1372 goto l76
1373 }
1374 position++
1375 if !_rules[ruleDoubleRanges]() {
1376 goto l76
1377 }
1378 {
1379 add(ruleAction23, position)
1380 }
1381 goto l75
1382 l76:
1383 position, tokenIndex, depth = position75, tokenIndex75, depth75
1384 if !_rules[ruleDoubleRanges]() {
1385 goto l73
1386 }
1387 }
1388 l75:
1389 goto l74
1390 l73:
1391 position, tokenIndex, depth = position73, tokenIndex73, depth73
1392 }
1393 l74:
1394 if buffer[position] != rune(']') {
1395 goto l72
1396 }
1397 position++
1398 if buffer[position] != rune(']') {
1399 goto l72
1400 }
1401 position++
1402 goto l71
1403 l72:
1404 position, tokenIndex, depth = position71, tokenIndex71, depth71
1405 if buffer[position] != rune('[') {
1406 goto l60
1407 }
1408 position++
1409 {
1410 position78, tokenIndex78, depth78 := position, tokenIndex, depth
1411 {
1412 position80, tokenIndex80, depth80 := position, tokenIndex, depth
1413 if buffer[position] != rune('^') {
1414 goto l81
1415 }
1416 position++
1417 if !_rules[ruleRanges]() {
1418 goto l81
1419 }
1420 {
1421 add(ruleAction24, position)
1422 }
1423 goto l80
1424 l81:
1425 position, tokenIndex, depth = position80, tokenIndex80, depth80
1426 if !_rules[ruleRanges]() {
1427 goto l78
1428 }
1429 }
1430 l80:
1431 goto l79
1432 l78:
1433 position, tokenIndex, depth = position78, tokenIndex78, depth78
1434 }
1435 l79:
1436 if buffer[position] != rune(']') {
1437 goto l60
1438 }
1439 position++
1440 }
1441 l71:
1442 if !_rules[ruleSpacing]() {
1443 goto l60
1444 }
1445 depth--
1446 add(ruleClass, position70)
1447 }
1448 break
1449 case '"', '\'':
1450 {
1451 position83 := position
1452 depth++
1453 {
1454 position84, tokenIndex84, depth84 := position, tokenIndex, depth
1455 if buffer[position] != rune('\'') {
1456 goto l85
1457 }
1458 position++
1459 {
1460 position86, tokenIndex86, depth86 := position, tokenIndex, depth
1461 {
1462 position88, tokenIndex88, depth88 := position, tokenIndex, depth
1463 if buffer[position] != rune('\'') {
1464 goto l88
1465 }
1466 position++
1467 goto l86
1468 l88:
1469 position, tokenIndex, depth = position88, tokenIndex88, depth88
1470 }
1471 if !_rules[ruleChar]() {
1472 goto l86
1473 }
1474 goto l87
1475 l86:
1476 position, tokenIndex, depth = position86, tokenIndex86, depth86
1477 }
1478 l87:
1479 l89:
1480 {
1481 position90, tokenIndex90, depth90 := position, tokenIndex, depth
1482 {
1483 position91, tokenIndex91, depth91 := position, tokenIndex, depth
1484 if buffer[position] != rune('\'') {
1485 goto l91
1486 }
1487 position++
1488 goto l90
1489 l91:
1490 position, tokenIndex, depth = position91, tokenIndex91, depth91
1491 }
1492 if !_rules[ruleChar]() {
1493 goto l90
1494 }
1495 {
1496 add(ruleAction21, position)
1497 }
1498 goto l89
1499 l90:
1500 position, tokenIndex, depth = position90, tokenIndex90, depth90
1501 }
1502 if buffer[position] != rune('\'') {
1503 goto l85
1504 }
1505 position++
1506 if !_rules[ruleSpacing]() {
1507 goto l85
1508 }
1509 goto l84
1510 l85:
1511 position, tokenIndex, depth = position84, tokenIndex84, depth84
1512 if buffer[position] != rune('"') {
1513 goto l60
1514 }
1515 position++
1516 {
1517 position93, tokenIndex93, depth93 := position, tokenIndex, depth
1518 {
1519 position95, tokenIndex95, depth95 := position, tokenIndex, depth
1520 if buffer[position] != rune('"') {
1521 goto l95
1522 }
1523 position++
1524 goto l93
1525 l95:
1526 position, tokenIndex, depth = position95, tokenIndex95, depth95
1527 }
1528 if !_rules[ruleDoubleChar]() {
1529 goto l93
1530 }
1531 goto l94
1532 l93:
1533 position, tokenIndex, depth = position93, tokenIndex93, depth93
1534 }
1535 l94:
1536 l96:
1537 {
1538 position97, tokenIndex97, depth97 := position, tokenIndex, depth
1539 {
1540 position98, tokenIndex98, depth98 := position, tokenIndex, depth
1541 if buffer[position] != rune('"') {
1542 goto l98
1543 }
1544 position++
1545 goto l97
1546 l98:
1547 position, tokenIndex, depth = position98, tokenIndex98, depth98
1548 }
1549 if !_rules[ruleDoubleChar]() {
1550 goto l97
1551 }
1552 {
1553 add(ruleAction22, position)
1554 }
1555 goto l96
1556 l97:
1557 position, tokenIndex, depth = position97, tokenIndex97, depth97
1558 }
1559 if buffer[position] != rune('"') {
1560 goto l60
1561 }
1562 position++
1563 if !_rules[ruleSpacing]() {
1564 goto l60
1565 }
1566 }
1567 l84:
1568 depth--
1569 add(ruleLiteral, position83)
1570 }
1571 break
1572 case '(':
1573 {
1574 position100 := position
1575 depth++
1576 if buffer[position] != rune('(') {
1577 goto l60
1578 }
1579 position++
1580 if !_rules[ruleSpacing]() {
1581 goto l60
1582 }
1583 depth--
1584 add(ruleOpen, position100)
1585 }
1586 if !_rules[ruleExpression]() {
1587 goto l60
1588 }
1589 {
1590 position101 := position
1591 depth++
1592 if buffer[position] != rune(')') {
1593 goto l60
1594 }
1595 position++
1596 if !_rules[ruleSpacing]() {
1597 goto l60
1598 }
1599 depth--
1600 add(ruleClose, position101)
1601 }
1602 break
1603 default:
1604 if !_rules[ruleIdentifier]() {
1605 goto l60
1606 }
1607 {
1608 position102, tokenIndex102, depth102 := position, tokenIndex, depth
1609 if !_rules[ruleLeftArrow]() {
1610 goto l102
1611 }
1612 goto l60
1613 l102:
1614 position, tokenIndex, depth = position102, tokenIndex102, depth102
1615 }
1616 {
1617 add(ruleAction17, position)
1618 }
1619 break
1620 }
1621 }
1622
1623 depth--
1624 add(rulePrimary, position62)
1625 }
1626 {
1627 position104, tokenIndex104, depth104 := position, tokenIndex, depth
1628 {
1629 switch buffer[position] {
1630 case '+':
1631 {
1632 position107 := position
1633 depth++
1634 if buffer[position] != rune('+') {
1635 goto l104
1636 }
1637 position++
1638 if !_rules[ruleSpacing]() {
1639 goto l104
1640 }
1641 depth--
1642 add(rulePlus, position107)
1643 }
1644 {
1645 add(ruleAction16, position)
1646 }
1647 break
1648 case '*':
1649 {
1650 position109 := position
1651 depth++
1652 if buffer[position] != rune('*') {
1653 goto l104
1654 }
1655 position++
1656 if !_rules[ruleSpacing]() {
1657 goto l104
1658 }
1659 depth--
1660 add(ruleStar, position109)
1661 }
1662 {
1663 add(ruleAction15, position)
1664 }
1665 break
1666 default:
1667 {
1668 position111 := position
1669 depth++
1670 if buffer[position] != rune('?') {
1671 goto l104
1672 }
1673 position++
1674 if !_rules[ruleSpacing]() {
1675 goto l104
1676 }
1677 depth--
1678 add(ruleQuestion, position111)
1679 }
1680 {
1681 add(ruleAction14, position)
1682 }
1683 break
1684 }
1685 }
1686
1687 goto l105
1688 l104:
1689 position, tokenIndex, depth = position104, tokenIndex104, depth104
1690 }
1691 l105:
1692 depth--
1693 add(ruleSuffix, position61)
1694 }
1695 return true
1696 l60:
1697 position, tokenIndex, depth = position60, tokenIndex60, depth60
1698 return false
1699 },
1700 /* 7 Primary <- <((&('<') (Begin Expression End Action20)) | (&('{') (Action Action19)) | (&('.') (Dot Action18)) | (&('[') Class) | (&('"' | '\'') Literal) | (&('(') (Open Expression Close)) | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z' | '_' | 'a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') (Identifier !LeftArrow Action17)))> */
1701 nil,
1702 /* 8 Identifier <- <(<(IdentStart IdentCont*)> Spacing)> */
1703 func() bool {
1704 position114, tokenIndex114, depth114 := position, tokenIndex, depth
1705 {
1706 position115 := position
1707 depth++
1708 {
1709 position116 := position
1710 depth++
1711 if !_rules[ruleIdentStart]() {
1712 goto l114
1713 }
1714 l117:
1715 {
1716 position118, tokenIndex118, depth118 := position, tokenIndex, depth
1717 {
1718 position119 := position
1719 depth++
1720 {
1721 position120, tokenIndex120, depth120 := position, tokenIndex, depth
1722 if !_rules[ruleIdentStart]() {
1723 goto l121
1724 }
1725 goto l120
1726 l121:
1727 position, tokenIndex, depth = position120, tokenIndex120, depth120
1728 if c := buffer[position]; c < rune('0') || c > rune('9') {
1729 goto l118
1730 }
1731 position++
1732 }
1733 l120:
1734 depth--
1735 add(ruleIdentCont, position119)
1736 }
1737 goto l117
1738 l118:
1739 position, tokenIndex, depth = position118, tokenIndex118, depth118
1740 }
1741 depth--
1742 add(rulePegText, position116)
1743 }
1744 if !_rules[ruleSpacing]() {
1745 goto l114
1746 }
1747 depth--
1748 add(ruleIdentifier, position115)
1749 }
1750 return true
1751 l114:
1752 position, tokenIndex, depth = position114, tokenIndex114, depth114
1753 return false
1754 },
1755 /* 9 IdentStart <- <((&('_') '_') | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z') [A-Z]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') [a-z]))> */
1756 func() bool {
1757 position122, tokenIndex122, depth122 := position, tokenIndex, depth
1758 {
1759 position123 := position
1760 depth++
1761 {
1762 switch buffer[position] {
1763 case '_':
1764 if buffer[position] != rune('_') {
1765 goto l122
1766 }
1767 position++
1768 break
1769 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
1770 if c := buffer[position]; c < rune('A') || c > rune('Z') {
1771 goto l122
1772 }
1773 position++
1774 break
1775 default:
1776 if c := buffer[position]; c < rune('a') || c > rune('z') {
1777 goto l122
1778 }
1779 position++
1780 break
1781 }
1782 }
1783
1784 depth--
1785 add(ruleIdentStart, position123)
1786 }
1787 return true
1788 l122:
1789 position, tokenIndex, depth = position122, tokenIndex122, depth122
1790 return false
1791 },
1792 /* 10 IdentCont <- <(IdentStart / [0-9])> */
1793 nil,
1794 /* 11 Literal <- <(('\'' (!'\'' Char)? (!'\'' Char Action21)* '\'' Spacing) / ('"' (!'"' DoubleChar)? (!'"' DoubleChar Action22)* '"' Spacing))> */
1795 nil,
1796 /* 12 Class <- <((('[' '[' (('^' DoubleRanges Action23) / DoubleRanges)? (']' ']')) / ('[' (('^' Ranges Action24) / Ranges)? ']')) Spacing)> */
1797 nil,
1798 /* 13 Ranges <- <(!']' Range (!']' Range Action25)*)> */
1799 func() bool {
1800 position128, tokenIndex128, depth128 := position, tokenIndex, depth
1801 {
1802 position129 := position
1803 depth++
1804 {
1805 position130, tokenIndex130, depth130 := position, tokenIndex, depth
1806 if buffer[position] != rune(']') {
1807 goto l130
1808 }
1809 position++
1810 goto l128
1811 l130:
1812 position, tokenIndex, depth = position130, tokenIndex130, depth130
1813 }
1814 if !_rules[ruleRange]() {
1815 goto l128
1816 }
1817 l131:
1818 {
1819 position132, tokenIndex132, depth132 := position, tokenIndex, depth
1820 {
1821 position133, tokenIndex133, depth133 := position, tokenIndex, depth
1822 if buffer[position] != rune(']') {
1823 goto l133
1824 }
1825 position++
1826 goto l132
1827 l133:
1828 position, tokenIndex, depth = position133, tokenIndex133, depth133
1829 }
1830 if !_rules[ruleRange]() {
1831 goto l132
1832 }
1833 {
1834 add(ruleAction25, position)
1835 }
1836 goto l131
1837 l132:
1838 position, tokenIndex, depth = position132, tokenIndex132, depth132
1839 }
1840 depth--
1841 add(ruleRanges, position129)
1842 }
1843 return true
1844 l128:
1845 position, tokenIndex, depth = position128, tokenIndex128, depth128
1846 return false
1847 },
1848 /* 14 DoubleRanges <- <(!(']' ']') DoubleRange (!(']' ']') DoubleRange Action26)*)> */
1849 func() bool {
1850 position135, tokenIndex135, depth135 := position, tokenIndex, depth
1851 {
1852 position136 := position
1853 depth++
1854 {
1855 position137, tokenIndex137, depth137 := position, tokenIndex, depth
1856 if buffer[position] != rune(']') {
1857 goto l137
1858 }
1859 position++
1860 if buffer[position] != rune(']') {
1861 goto l137
1862 }
1863 position++
1864 goto l135
1865 l137:
1866 position, tokenIndex, depth = position137, tokenIndex137, depth137
1867 }
1868 if !_rules[ruleDoubleRange]() {
1869 goto l135
1870 }
1871 l138:
1872 {
1873 position139, tokenIndex139, depth139 := position, tokenIndex, depth
1874 {
1875 position140, tokenIndex140, depth140 := position, tokenIndex, depth
1876 if buffer[position] != rune(']') {
1877 goto l140
1878 }
1879 position++
1880 if buffer[position] != rune(']') {
1881 goto l140
1882 }
1883 position++
1884 goto l139
1885 l140:
1886 position, tokenIndex, depth = position140, tokenIndex140, depth140
1887 }
1888 if !_rules[ruleDoubleRange]() {
1889 goto l139
1890 }
1891 {
1892 add(ruleAction26, position)
1893 }
1894 goto l138
1895 l139:
1896 position, tokenIndex, depth = position139, tokenIndex139, depth139
1897 }
1898 depth--
1899 add(ruleDoubleRanges, position136)
1900 }
1901 return true
1902 l135:
1903 position, tokenIndex, depth = position135, tokenIndex135, depth135
1904 return false
1905 },
1906 /* 15 Range <- <((Char '-' Char Action27) / Char)> */
1907 func() bool {
1908 position142, tokenIndex142, depth142 := position, tokenIndex, depth
1909 {
1910 position143 := position
1911 depth++
1912 {
1913 position144, tokenIndex144, depth144 := position, tokenIndex, depth
1914 if !_rules[ruleChar]() {
1915 goto l145
1916 }
1917 if buffer[position] != rune('-') {
1918 goto l145
1919 }
1920 position++
1921 if !_rules[ruleChar]() {
1922 goto l145
1923 }
1924 {
1925 add(ruleAction27, position)
1926 }
1927 goto l144
1928 l145:
1929 position, tokenIndex, depth = position144, tokenIndex144, depth144
1930 if !_rules[ruleChar]() {
1931 goto l142
1932 }
1933 }
1934 l144:
1935 depth--
1936 add(ruleRange, position143)
1937 }
1938 return true
1939 l142:
1940 position, tokenIndex, depth = position142, tokenIndex142, depth142
1941 return false
1942 },
1943 /* 16 DoubleRange <- <((Char '-' Char Action28) / DoubleChar)> */
1944 func() bool {
1945 position147, tokenIndex147, depth147 := position, tokenIndex, depth
1946 {
1947 position148 := position
1948 depth++
1949 {
1950 position149, tokenIndex149, depth149 := position, tokenIndex, depth
1951 if !_rules[ruleChar]() {
1952 goto l150
1953 }
1954 if buffer[position] != rune('-') {
1955 goto l150
1956 }
1957 position++
1958 if !_rules[ruleChar]() {
1959 goto l150
1960 }
1961 {
1962 add(ruleAction28, position)
1963 }
1964 goto l149
1965 l150:
1966 position, tokenIndex, depth = position149, tokenIndex149, depth149
1967 if !_rules[ruleDoubleChar]() {
1968 goto l147
1969 }
1970 }
1971 l149:
1972 depth--
1973 add(ruleDoubleRange, position148)
1974 }
1975 return true
1976 l147:
1977 position, tokenIndex, depth = position147, tokenIndex147, depth147
1978 return false
1979 },
1980 /* 17 Char <- <(Escape / (!'\\' <.> Action29))> */
1981 func() bool {
1982 position152, tokenIndex152, depth152 := position, tokenIndex, depth
1983 {
1984 position153 := position
1985 depth++
1986 {
1987 position154, tokenIndex154, depth154 := position, tokenIndex, depth
1988 if !_rules[ruleEscape]() {
1989 goto l155
1990 }
1991 goto l154
1992 l155:
1993 position, tokenIndex, depth = position154, tokenIndex154, depth154
1994 {
1995 position156, tokenIndex156, depth156 := position, tokenIndex, depth
1996 if buffer[position] != rune('\\') {
1997 goto l156
1998 }
1999 position++
2000 goto l152
2001 l156:
2002 position, tokenIndex, depth = position156, tokenIndex156, depth156
2003 }
2004 {
2005 position157 := position
2006 depth++
2007 if !matchDot() {
2008 goto l152
2009 }
2010 depth--
2011 add(rulePegText, position157)
2012 }
2013 {
2014 add(ruleAction29, position)
2015 }
2016 }
2017 l154:
2018 depth--
2019 add(ruleChar, position153)
2020 }
2021 return true
2022 l152:
2023 position, tokenIndex, depth = position152, tokenIndex152, depth152
2024 return false
2025 },
2026 /* 18 DoubleChar <- <(Escape / (<([a-z] / [A-Z])> Action30) / (!'\\' <.> Action31))> */
2027 func() bool {
2028 position159, tokenIndex159, depth159 := position, tokenIndex, depth
2029 {
2030 position160 := position
2031 depth++
2032 {
2033 position161, tokenIndex161, depth161 := position, tokenIndex, depth
2034 if !_rules[ruleEscape]() {
2035 goto l162
2036 }
2037 goto l161
2038 l162:
2039 position, tokenIndex, depth = position161, tokenIndex161, depth161
2040 {
2041 position164 := position
2042 depth++
2043 {
2044 position165, tokenIndex165, depth165 := position, tokenIndex, depth
2045 if c := buffer[position]; c < rune('a') || c > rune('z') {
2046 goto l166
2047 }
2048 position++
2049 goto l165
2050 l166:
2051 position, tokenIndex, depth = position165, tokenIndex165, depth165
2052 if c := buffer[position]; c < rune('A') || c > rune('Z') {
2053 goto l163
2054 }
2055 position++
2056 }
2057 l165:
2058 depth--
2059 add(rulePegText, position164)
2060 }
2061 {
2062 add(ruleAction30, position)
2063 }
2064 goto l161
2065 l163:
2066 position, tokenIndex, depth = position161, tokenIndex161, depth161
2067 {
2068 position168, tokenIndex168, depth168 := position, tokenIndex, depth
2069 if buffer[position] != rune('\\') {
2070 goto l168
2071 }
2072 position++
2073 goto l159
2074 l168:
2075 position, tokenIndex, depth = position168, tokenIndex168, depth168
2076 }
2077 {
2078 position169 := position
2079 depth++
2080 if !matchDot() {
2081 goto l159
2082 }
2083 depth--
2084 add(rulePegText, position169)
2085 }
2086 {
2087 add(ruleAction31, position)
2088 }
2089 }
2090 l161:
2091 depth--
2092 add(ruleDoubleChar, position160)
2093 }
2094 return true
2095 l159:
2096 position, tokenIndex, depth = position159, tokenIndex159, depth159
2097 return false
2098 },
2099 /* 19 Escape <- <(('\\' ('a' / 'A') Action32) / ('\\' ('b' / 'B') Action33) / ('\\' ('e' / 'E') Action34) / ('\\' ('f' / 'F') Action35) / ('\\' ('n' / 'N') Action36) / ('\\' ('r' / 'R') Action37) / ('\\' ('t' / 'T') Action38) / ('\\' ('v' / 'V') Action39) / ('\\' '\'' Action40) / ('\\' '"' Action41) / ('\\' '[' Action42) / ('\\' ']' Action43) / ('\\' '-' Action44) / ('\\' ('0' ('x' / 'X')) <((&('A' | 'B' | 'C' | 'D' | 'E' | 'F') [A-F]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f') [a-f]) | (&('0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9') [0-9]))+> Action45) / ('\\' <([0-3] [0-7] [0-7])> Action46) / ('\\' <([0-7] [0-7]?)> Action47) / ('\\' '\\' Action48))> */
2100 func() bool {
2101 position171, tokenIndex171, depth171 := position, tokenIndex, depth
2102 {
2103 position172 := position
2104 depth++
2105 {
2106 position173, tokenIndex173, depth173 := position, tokenIndex, depth
2107 if buffer[position] != rune('\\') {
2108 goto l174
2109 }
2110 position++
2111 {
2112 position175, tokenIndex175, depth175 := position, tokenIndex, depth
2113 if buffer[position] != rune('a') {
2114 goto l176
2115 }
2116 position++
2117 goto l175
2118 l176:
2119 position, tokenIndex, depth = position175, tokenIndex175, depth175
2120 if buffer[position] != rune('A') {
2121 goto l174
2122 }
2123 position++
2124 }
2125 l175:
2126 {
2127 add(ruleAction32, position)
2128 }
2129 goto l173
2130 l174:
2131 position, tokenIndex, depth = position173, tokenIndex173, depth173
2132 if buffer[position] != rune('\\') {
2133 goto l178
2134 }
2135 position++
2136 {
2137 position179, tokenIndex179, depth179 := position, tokenIndex, depth
2138 if buffer[position] != rune('b') {
2139 goto l180
2140 }
2141 position++
2142 goto l179
2143 l180:
2144 position, tokenIndex, depth = position179, tokenIndex179, depth179
2145 if buffer[position] != rune('B') {
2146 goto l178
2147 }
2148 position++
2149 }
2150 l179:
2151 {
2152 add(ruleAction33, position)
2153 }
2154 goto l173
2155 l178:
2156 position, tokenIndex, depth = position173, tokenIndex173, depth173
2157 if buffer[position] != rune('\\') {
2158 goto l182
2159 }
2160 position++
2161 {
2162 position183, tokenIndex183, depth183 := position, tokenIndex, depth
2163 if buffer[position] != rune('e') {
2164 goto l184
2165 }
2166 position++
2167 goto l183
2168 l184:
2169 position, tokenIndex, depth = position183, tokenIndex183, depth183
2170 if buffer[position] != rune('E') {
2171 goto l182
2172 }
2173 position++
2174 }
2175 l183:
2176 {
2177 add(ruleAction34, position)
2178 }
2179 goto l173
2180 l182:
2181 position, tokenIndex, depth = position173, tokenIndex173, depth173
2182 if buffer[position] != rune('\\') {
2183 goto l186
2184 }
2185 position++
2186 {
2187 position187, tokenIndex187, depth187 := position, tokenIndex, depth
2188 if buffer[position] != rune('f') {
2189 goto l188
2190 }
2191 position++
2192 goto l187
2193 l188:
2194 position, tokenIndex, depth = position187, tokenIndex187, depth187
2195 if buffer[position] != rune('F') {
2196 goto l186
2197 }
2198 position++
2199 }
2200 l187:
2201 {
2202 add(ruleAction35, position)
2203 }
2204 goto l173
2205 l186:
2206 position, tokenIndex, depth = position173, tokenIndex173, depth173
2207 if buffer[position] != rune('\\') {
2208 goto l190
2209 }
2210 position++
2211 {
2212 position191, tokenIndex191, depth191 := position, tokenIndex, depth
2213 if buffer[position] != rune('n') {
2214 goto l192
2215 }
2216 position++
2217 goto l191
2218 l192:
2219 position, tokenIndex, depth = position191, tokenIndex191, depth191
2220 if buffer[position] != rune('N') {
2221 goto l190
2222 }
2223 position++
2224 }
2225 l191:
2226 {
2227 add(ruleAction36, position)
2228 }
2229 goto l173
2230 l190:
2231 position, tokenIndex, depth = position173, tokenIndex173, depth173
2232 if buffer[position] != rune('\\') {
2233 goto l194
2234 }
2235 position++
2236 {
2237 position195, tokenIndex195, depth195 := position, tokenIndex, depth
2238 if buffer[position] != rune('r') {
2239 goto l196
2240 }
2241 position++
2242 goto l195
2243 l196:
2244 position, tokenIndex, depth = position195, tokenIndex195, depth195
2245 if buffer[position] != rune('R') {
2246 goto l194
2247 }
2248 position++
2249 }
2250 l195:
2251 {
2252 add(ruleAction37, position)
2253 }
2254 goto l173
2255 l194:
2256 position, tokenIndex, depth = position173, tokenIndex173, depth173
2257 if buffer[position] != rune('\\') {
2258 goto l198
2259 }
2260 position++
2261 {
2262 position199, tokenIndex199, depth199 := position, tokenIndex, depth
2263 if buffer[position] != rune('t') {
2264 goto l200
2265 }
2266 position++
2267 goto l199
2268 l200:
2269 position, tokenIndex, depth = position199, tokenIndex199, depth199
2270 if buffer[position] != rune('T') {
2271 goto l198
2272 }
2273 position++
2274 }
2275 l199:
2276 {
2277 add(ruleAction38, position)
2278 }
2279 goto l173
2280 l198:
2281 position, tokenIndex, depth = position173, tokenIndex173, depth173
2282 if buffer[position] != rune('\\') {
2283 goto l202
2284 }
2285 position++
2286 {
2287 position203, tokenIndex203, depth203 := position, tokenIndex, depth
2288 if buffer[position] != rune('v') {
2289 goto l204
2290 }
2291 position++
2292 goto l203
2293 l204:
2294 position, tokenIndex, depth = position203, tokenIndex203, depth203
2295 if buffer[position] != rune('V') {
2296 goto l202
2297 }
2298 position++
2299 }
2300 l203:
2301 {
2302 add(ruleAction39, position)
2303 }
2304 goto l173
2305 l202:
2306 position, tokenIndex, depth = position173, tokenIndex173, depth173
2307 if buffer[position] != rune('\\') {
2308 goto l206
2309 }
2310 position++
2311 if buffer[position] != rune('\'') {
2312 goto l206
2313 }
2314 position++
2315 {
2316 add(ruleAction40, position)
2317 }
2318 goto l173
2319 l206:
2320 position, tokenIndex, depth = position173, tokenIndex173, depth173
2321 if buffer[position] != rune('\\') {
2322 goto l208
2323 }
2324 position++
2325 if buffer[position] != rune('"') {
2326 goto l208
2327 }
2328 position++
2329 {
2330 add(ruleAction41, position)
2331 }
2332 goto l173
2333 l208:
2334 position, tokenIndex, depth = position173, tokenIndex173, depth173
2335 if buffer[position] != rune('\\') {
2336 goto l210
2337 }
2338 position++
2339 if buffer[position] != rune('[') {
2340 goto l210
2341 }
2342 position++
2343 {
2344 add(ruleAction42, position)
2345 }
2346 goto l173
2347 l210:
2348 position, tokenIndex, depth = position173, tokenIndex173, depth173
2349 if buffer[position] != rune('\\') {
2350 goto l212
2351 }
2352 position++
2353 if buffer[position] != rune(']') {
2354 goto l212
2355 }
2356 position++
2357 {
2358 add(ruleAction43, position)
2359 }
2360 goto l173
2361 l212:
2362 position, tokenIndex, depth = position173, tokenIndex173, depth173
2363 if buffer[position] != rune('\\') {
2364 goto l214
2365 }
2366 position++
2367 if buffer[position] != rune('-') {
2368 goto l214
2369 }
2370 position++
2371 {
2372 add(ruleAction44, position)
2373 }
2374 goto l173
2375 l214:
2376 position, tokenIndex, depth = position173, tokenIndex173, depth173
2377 if buffer[position] != rune('\\') {
2378 goto l216
2379 }
2380 position++
2381 if buffer[position] != rune('0') {
2382 goto l216
2383 }
2384 position++
2385 {
2386 position217, tokenIndex217, depth217 := position, tokenIndex, depth
2387 if buffer[position] != rune('x') {
2388 goto l218
2389 }
2390 position++
2391 goto l217
2392 l218:
2393 position, tokenIndex, depth = position217, tokenIndex217, depth217
2394 if buffer[position] != rune('X') {
2395 goto l216
2396 }
2397 position++
2398 }
2399 l217:
2400 {
2401 position219 := position
2402 depth++
2403 {
2404 switch buffer[position] {
2405 case 'A', 'B', 'C', 'D', 'E', 'F':
2406 if c := buffer[position]; c < rune('A') || c > rune('F') {
2407 goto l216
2408 }
2409 position++
2410 break
2411 case 'a', 'b', 'c', 'd', 'e', 'f':
2412 if c := buffer[position]; c < rune('a') || c > rune('f') {
2413 goto l216
2414 }
2415 position++
2416 break
2417 default:
2418 if c := buffer[position]; c < rune('0') || c > rune('9') {
2419 goto l216
2420 }
2421 position++
2422 break
2423 }
2424 }
2425
2426 l220:
2427 {
2428 position221, tokenIndex221, depth221 := position, tokenIndex, depth
2429 {
2430 switch buffer[position] {
2431 case 'A', 'B', 'C', 'D', 'E', 'F':
2432 if c := buffer[position]; c < rune('A') || c > rune('F') {
2433 goto l221
2434 }
2435 position++
2436 break
2437 case 'a', 'b', 'c', 'd', 'e', 'f':
2438 if c := buffer[position]; c < rune('a') || c > rune('f') {
2439 goto l221
2440 }
2441 position++
2442 break
2443 default:
2444 if c := buffer[position]; c < rune('0') || c > rune('9') {
2445 goto l221
2446 }
2447 position++
2448 break
2449 }
2450 }
2451
2452 goto l220
2453 l221:
2454 position, tokenIndex, depth = position221, tokenIndex221, depth221
2455 }
2456 depth--
2457 add(rulePegText, position219)
2458 }
2459 {
2460 add(ruleAction45, position)
2461 }
2462 goto l173
2463 l216:
2464 position, tokenIndex, depth = position173, tokenIndex173, depth173
2465 if buffer[position] != rune('\\') {
2466 goto l225
2467 }
2468 position++
2469 {
2470 position226 := position
2471 depth++
2472 if c := buffer[position]; c < rune('0') || c > rune('3') {
2473 goto l225
2474 }
2475 position++
2476 if c := buffer[position]; c < rune('0') || c > rune('7') {
2477 goto l225
2478 }
2479 position++
2480 if c := buffer[position]; c < rune('0') || c > rune('7') {
2481 goto l225
2482 }
2483 position++
2484 depth--
2485 add(rulePegText, position226)
2486 }
2487 {
2488 add(ruleAction46, position)
2489 }
2490 goto l173
2491 l225:
2492 position, tokenIndex, depth = position173, tokenIndex173, depth173
2493 if buffer[position] != rune('\\') {
2494 goto l228
2495 }
2496 position++
2497 {
2498 position229 := position
2499 depth++
2500 if c := buffer[position]; c < rune('0') || c > rune('7') {
2501 goto l228
2502 }
2503 position++
2504 {
2505 position230, tokenIndex230, depth230 := position, tokenIndex, depth
2506 if c := buffer[position]; c < rune('0') || c > rune('7') {
2507 goto l230
2508 }
2509 position++
2510 goto l231
2511 l230:
2512 position, tokenIndex, depth = position230, tokenIndex230, depth230
2513 }
2514 l231:
2515 depth--
2516 add(rulePegText, position229)
2517 }
2518 {
2519 add(ruleAction47, position)
2520 }
2521 goto l173
2522 l228:
2523 position, tokenIndex, depth = position173, tokenIndex173, depth173
2524 if buffer[position] != rune('\\') {
2525 goto l171
2526 }
2527 position++
2528 if buffer[position] != rune('\\') {
2529 goto l171
2530 }
2531 position++
2532 {
2533 add(ruleAction48, position)
2534 }
2535 }
2536 l173:
2537 depth--
2538 add(ruleEscape, position172)
2539 }
2540 return true
2541 l171:
2542 position, tokenIndex, depth = position171, tokenIndex171, depth171
2543 return false
2544 },
2545 /* 20 LeftArrow <- <((('<' '-') / '←') Spacing)> */
2546 func() bool {
2547 position234, tokenIndex234, depth234 := position, tokenIndex, depth
2548 {
2549 position235 := position
2550 depth++
2551 {
2552 position236, tokenIndex236, depth236 := position, tokenIndex, depth
2553 if buffer[position] != rune('<') {
2554 goto l237
2555 }
2556 position++
2557 if buffer[position] != rune('-') {
2558 goto l237
2559 }
2560 position++
2561 goto l236
2562 l237:
2563 position, tokenIndex, depth = position236, tokenIndex236, depth236
2564 if buffer[position] != rune('←') {
2565 goto l234
2566 }
2567 position++
2568 }
2569 l236:
2570 if !_rules[ruleSpacing]() {
2571 goto l234
2572 }
2573 depth--
2574 add(ruleLeftArrow, position235)
2575 }
2576 return true
2577 l234:
2578 position, tokenIndex, depth = position234, tokenIndex234, depth234
2579 return false
2580 },
2581 /* 21 Slash <- <('/' Spacing)> */
2582 func() bool {
2583 position238, tokenIndex238, depth238 := position, tokenIndex, depth
2584 {
2585 position239 := position
2586 depth++
2587 if buffer[position] != rune('/') {
2588 goto l238
2589 }
2590 position++
2591 if !_rules[ruleSpacing]() {
2592 goto l238
2593 }
2594 depth--
2595 add(ruleSlash, position239)
2596 }
2597 return true
2598 l238:
2599 position, tokenIndex, depth = position238, tokenIndex238, depth238
2600 return false
2601 },
2602 /* 22 And <- <('&' Spacing)> */
2603 func() bool {
2604 position240, tokenIndex240, depth240 := position, tokenIndex, depth
2605 {
2606 position241 := position
2607 depth++
2608 if buffer[position] != rune('&') {
2609 goto l240
2610 }
2611 position++
2612 if !_rules[ruleSpacing]() {
2613 goto l240
2614 }
2615 depth--
2616 add(ruleAnd, position241)
2617 }
2618 return true
2619 l240:
2620 position, tokenIndex, depth = position240, tokenIndex240, depth240
2621 return false
2622 },
2623 /* 23 Not <- <('!' Spacing)> */
2624 func() bool {
2625 position242, tokenIndex242, depth242 := position, tokenIndex, depth
2626 {
2627 position243 := position
2628 depth++
2629 if buffer[position] != rune('!') {
2630 goto l242
2631 }
2632 position++
2633 if !_rules[ruleSpacing]() {
2634 goto l242
2635 }
2636 depth--
2637 add(ruleNot, position243)
2638 }
2639 return true
2640 l242:
2641 position, tokenIndex, depth = position242, tokenIndex242, depth242
2642 return false
2643 },
2644 /* 24 Question <- <('?' Spacing)> */
2645 nil,
2646 /* 25 Star <- <('*' Spacing)> */
2647 nil,
2648 /* 26 Plus <- <('+' Spacing)> */
2649 nil,
2650 /* 27 Open <- <('(' Spacing)> */
2651 nil,
2652 /* 28 Close <- <(')' Spacing)> */
2653 nil,
2654 /* 29 Dot <- <('.' Spacing)> */
2655 nil,
2656 /* 30 SpaceComment <- <(Space / Comment)> */
2657 func() bool {
2658 position250, tokenIndex250, depth250 := position, tokenIndex, depth
2659 {
2660 position251 := position
2661 depth++
2662 {
2663 position252, tokenIndex252, depth252 := position, tokenIndex, depth
2664 {
2665 position254 := position
2666 depth++
2667 {
2668 switch buffer[position] {
2669 case '\t':
2670 if buffer[position] != rune('\t') {
2671 goto l253
2672 }
2673 position++
2674 break
2675 case ' ':
2676 if buffer[position] != rune(' ') {
2677 goto l253
2678 }
2679 position++
2680 break
2681 default:
2682 if !_rules[ruleEndOfLine]() {
2683 goto l253
2684 }
2685 break
2686 }
2687 }
2688
2689 depth--
2690 add(ruleSpace, position254)
2691 }
2692 goto l252
2693 l253:
2694 position, tokenIndex, depth = position252, tokenIndex252, depth252
2695 {
2696 position256 := position
2697 depth++
2698 if buffer[position] != rune('#') {
2699 goto l250
2700 }
2701 position++
2702 l257:
2703 {
2704 position258, tokenIndex258, depth258 := position, tokenIndex, depth
2705 {
2706 position259, tokenIndex259, depth259 := position, tokenIndex, depth
2707 if !_rules[ruleEndOfLine]() {
2708 goto l259
2709 }
2710 goto l258
2711 l259:
2712 position, tokenIndex, depth = position259, tokenIndex259, depth259
2713 }
2714 if !matchDot() {
2715 goto l258
2716 }
2717 goto l257
2718 l258:
2719 position, tokenIndex, depth = position258, tokenIndex258, depth258
2720 }
2721 if !_rules[ruleEndOfLine]() {
2722 goto l250
2723 }
2724 depth--
2725 add(ruleComment, position256)
2726 }
2727 }
2728 l252:
2729 depth--
2730 add(ruleSpaceComment, position251)
2731 }
2732 return true
2733 l250:
2734 position, tokenIndex, depth = position250, tokenIndex250, depth250
2735 return false
2736 },
2737 /* 31 Spacing <- <SpaceComment*> */
2738 func() bool {
2739 {
2740 position261 := position
2741 depth++
2742 l262:
2743 {
2744 position263, tokenIndex263, depth263 := position, tokenIndex, depth
2745 if !_rules[ruleSpaceComment]() {
2746 goto l263
2747 }
2748 goto l262
2749 l263:
2750 position, tokenIndex, depth = position263, tokenIndex263, depth263
2751 }
2752 depth--
2753 add(ruleSpacing, position261)
2754 }
2755 return true
2756 },
2757 /* 32 MustSpacing <- <SpaceComment+> */
2758 func() bool {
2759 position264, tokenIndex264, depth264 := position, tokenIndex, depth
2760 {
2761 position265 := position
2762 depth++
2763 if !_rules[ruleSpaceComment]() {
2764 goto l264
2765 }
2766 l266:
2767 {
2768 position267, tokenIndex267, depth267 := position, tokenIndex, depth
2769 if !_rules[ruleSpaceComment]() {
2770 goto l267
2771 }
2772 goto l266
2773 l267:
2774 position, tokenIndex, depth = position267, tokenIndex267, depth267
2775 }
2776 depth--
2777 add(ruleMustSpacing, position265)
2778 }
2779 return true
2780 l264:
2781 position, tokenIndex, depth = position264, tokenIndex264, depth264
2782 return false
2783 },
2784 /* 33 Comment <- <('#' (!EndOfLine .)* EndOfLine)> */
2785 nil,
2786 /* 34 Space <- <((&('\t') '\t') | (&(' ') ' ') | (&('\n' | '\r') EndOfLine))> */
2787 nil,
2788 /* 35 EndOfLine <- <(('\r' '\n') / '\n' / '\r')> */
2789 func() bool {
2790 position270, tokenIndex270, depth270 := position, tokenIndex, depth
2791 {
2792 position271 := position
2793 depth++
2794 {
2795 position272, tokenIndex272, depth272 := position, tokenIndex, depth
2796 if buffer[position] != rune('\r') {
2797 goto l273
2798 }
2799 position++
2800 if buffer[position] != rune('\n') {
2801 goto l273
2802 }
2803 position++
2804 goto l272
2805 l273:
2806 position, tokenIndex, depth = position272, tokenIndex272, depth272
2807 if buffer[position] != rune('\n') {
2808 goto l274
2809 }
2810 position++
2811 goto l272
2812 l274:
2813 position, tokenIndex, depth = position272, tokenIndex272, depth272
2814 if buffer[position] != rune('\r') {
2815 goto l270
2816 }
2817 position++
2818 }
2819 l272:
2820 depth--
2821 add(ruleEndOfLine, position271)
2822 }
2823 return true
2824 l270:
2825 position, tokenIndex, depth = position270, tokenIndex270, depth270
2826 return false
2827 },
2828 /* 36 EndOfFile <- <!.> */
2829 nil,
2830 /* 37 Action <- <('{' <ActionBody*> '}' Spacing)> */
2831 func() bool {
2832 position276, tokenIndex276, depth276 := position, tokenIndex, depth
2833 {
2834 position277 := position
2835 depth++
2836 if buffer[position] != rune('{') {
2837 goto l276
2838 }
2839 position++
2840 {
2841 position278 := position
2842 depth++
2843 l279:
2844 {
2845 position280, tokenIndex280, depth280 := position, tokenIndex, depth
2846 if !_rules[ruleActionBody]() {
2847 goto l280
2848 }
2849 goto l279
2850 l280:
2851 position, tokenIndex, depth = position280, tokenIndex280, depth280
2852 }
2853 depth--
2854 add(rulePegText, position278)
2855 }
2856 if buffer[position] != rune('}') {
2857 goto l276
2858 }
2859 position++
2860 if !_rules[ruleSpacing]() {
2861 goto l276
2862 }
2863 depth--
2864 add(ruleAction, position277)
2865 }
2866 return true
2867 l276:
2868 position, tokenIndex, depth = position276, tokenIndex276, depth276
2869 return false
2870 },
2871 /* 38 ActionBody <- <((!('{' / '}') .) / ('{' ActionBody* '}'))> */
2872 func() bool {
2873 position281, tokenIndex281, depth281 := position, tokenIndex, depth
2874 {
2875 position282 := position
2876 depth++
2877 {
2878 position283, tokenIndex283, depth283 := position, tokenIndex, depth
2879 {
2880 position285, tokenIndex285, depth285 := position, tokenIndex, depth
2881 {
2882 position286, tokenIndex286, depth286 := position, tokenIndex, depth
2883 if buffer[position] != rune('{') {
2884 goto l287
2885 }
2886 position++
2887 goto l286
2888 l287:
2889 position, tokenIndex, depth = position286, tokenIndex286, depth286
2890 if buffer[position] != rune('}') {
2891 goto l285
2892 }
2893 position++
2894 }
2895 l286:
2896 goto l284
2897 l285:
2898 position, tokenIndex, depth = position285, tokenIndex285, depth285
2899 }
2900 if !matchDot() {
2901 goto l284
2902 }
2903 goto l283
2904 l284:
2905 position, tokenIndex, depth = position283, tokenIndex283, depth283
2906 if buffer[position] != rune('{') {
2907 goto l281
2908 }
2909 position++
2910 l288:
2911 {
2912 position289, tokenIndex289, depth289 := position, tokenIndex, depth
2913 if !_rules[ruleActionBody]() {
2914 goto l289
2915 }
2916 goto l288
2917 l289:
2918 position, tokenIndex, depth = position289, tokenIndex289, depth289
2919 }
2920 if buffer[position] != rune('}') {
2921 goto l281
2922 }
2923 position++
2924 }
2925 l283:
2926 depth--
2927 add(ruleActionBody, position282)
2928 }
2929 return true
2930 l281:
2931 position, tokenIndex, depth = position281, tokenIndex281, depth281
2932 return false
2933 },
2934 /* 39 Begin <- <('<' Spacing)> */
2935 nil,
2936 /* 40 End <- <('>' Spacing)> */
2937 nil,
2938 /* 42 Action0 <- <{ p.AddPackage(text) }> */
2939 nil,
2940 /* 43 Action1 <- <{ p.AddPeg(text) }> */
2941 nil,
2942 /* 44 Action2 <- <{ p.AddState(text) }> */
2943 nil,
2944 nil,
2945 /* 46 Action3 <- <{ p.AddImport(text) }> */
2946 nil,
2947 /* 47 Action4 <- <{ p.AddRule(text) }> */
2948 nil,
2949 /* 48 Action5 <- <{ p.AddExpression() }> */
2950 nil,
2951 /* 49 Action6 <- <{ p.AddAlternate() }> */
2952 nil,
2953 /* 50 Action7 <- <{ p.AddNil(); p.AddAlternate() }> */
2954 nil,
2955 /* 51 Action8 <- <{ p.AddNil() }> */
2956 nil,
2957 /* 52 Action9 <- <{ p.AddSequence() }> */
2958 nil,
2959 /* 53 Action10 <- <{ p.AddPredicate(text) }> */
2960 nil,
2961 /* 54 Action11 <- <{ p.AddStateChange(text) }> */
2962 nil,
2963 /* 55 Action12 <- <{ p.AddPeekFor() }> */
2964 nil,
2965 /* 56 Action13 <- <{ p.AddPeekNot() }> */
2966 nil,
2967 /* 57 Action14 <- <{ p.AddQuery() }> */
2968 nil,
2969 /* 58 Action15 <- <{ p.AddStar() }> */
2970 nil,
2971 /* 59 Action16 <- <{ p.AddPlus() }> */
2972 nil,
2973 /* 60 Action17 <- <{ p.AddName(text) }> */
2974 nil,
2975 /* 61 Action18 <- <{ p.AddDot() }> */
2976 nil,
2977 /* 62 Action19 <- <{ p.AddAction(text) }> */
2978 nil,
2979 /* 63 Action20 <- <{ p.AddPush() }> */
2980 nil,
2981 /* 64 Action21 <- <{ p.AddSequence() }> */
2982 nil,
2983 /* 65 Action22 <- <{ p.AddSequence() }> */
2984 nil,
2985 /* 66 Action23 <- <{ p.AddPeekNot(); p.AddDot(); p.AddSequence() }> */
2986 nil,
2987 /* 67 Action24 <- <{ p.AddPeekNot(); p.AddDot(); p.AddSequence() }> */
2988 nil,
2989 /* 68 Action25 <- <{ p.AddAlternate() }> */
2990 nil,
2991 /* 69 Action26 <- <{ p.AddAlternate() }> */
2992 nil,
2993 /* 70 Action27 <- <{ p.AddRange() }> */
2994 nil,
2995 /* 71 Action28 <- <{ p.AddDoubleRange() }> */
2996 nil,
2997 /* 72 Action29 <- <{ p.AddCharacter(text) }> */
2998 nil,
2999 /* 73 Action30 <- <{ p.AddDoubleCharacter(text) }> */
3000 nil,
3001 /* 74 Action31 <- <{ p.AddCharacter(text) }> */
3002 nil,
3003 /* 75 Action32 <- <{ p.AddCharacter("\a") }> */
3004 nil,
3005 /* 76 Action33 <- <{ p.AddCharacter("\b") }> */
3006 nil,
3007 /* 77 Action34 <- <{ p.AddCharacter("\x1B") }> */
3008 nil,
3009 /* 78 Action35 <- <{ p.AddCharacter("\f") }> */
3010 nil,
3011 /* 79 Action36 <- <{ p.AddCharacter("\n") }> */
3012 nil,
3013 /* 80 Action37 <- <{ p.AddCharacter("\r") }> */
3014 nil,
3015 /* 81 Action38 <- <{ p.AddCharacter("\t") }> */
3016 nil,
3017 /* 82 Action39 <- <{ p.AddCharacter("\v") }> */
3018 nil,
3019 /* 83 Action40 <- <{ p.AddCharacter("'") }> */
3020 nil,
3021 /* 84 Action41 <- <{ p.AddCharacter("\"") }> */
3022 nil,
3023 /* 85 Action42 <- <{ p.AddCharacter("[") }> */
3024 nil,
3025 /* 86 Action43 <- <{ p.AddCharacter("]") }> */
3026 nil,
3027 /* 87 Action44 <- <{ p.AddCharacter("-") }> */
3028 nil,
3029 /* 88 Action45 <- <{ p.AddHexaCharacter(text) }> */
3030 nil,
3031 /* 89 Action46 <- <{ p.AddOctalCharacter(text) }> */
3032 nil,
3033 /* 90 Action47 <- <{ p.AddOctalCharacter(text) }> */
3034 nil,
3035 /* 91 Action48 <- <{ p.AddCharacter("\\") }> */
3036 nil,
3037 }
3038 p.rules = _rules
3039 }
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build ignore
5
6 package main
7
8 import (
9 "flag"
10 "fmt"
11 "io/ioutil"
12 "log"
13 "os"
14 "os/exec"
15 "path/filepath"
16 "reflect"
17 "runtime"
18 "strings"
19 "text/template"
20 "time"
21 )
22
23 func main() {
24 flag.Parse()
25
26 args, target := flag.Args(), "peg"
27 if len(args) > 0 {
28 target = args[0]
29 }
30
31 switch target {
32 case "buildinfo":
33 buildinfo()
34 case "peg":
35 peg()
36 case "clean":
37 clean()
38 case "test":
39 test()
40 case "bench":
41 bench()
42 case "help":
43 fmt.Println("go run build.go [target]")
44 fmt.Println(" peg - build peg from scratch")
45 fmt.Println(" clean - clean up")
46 fmt.Println(" test - run full test")
47 fmt.Println(" bench - run benchmark")
48 fmt.Println(" buildinfo - generate buildinfo.go")
49 }
50 }
51
52 const BuildinfoTemplate = `// Code generated by "build.go buildinfo" DO NOT EDIT.
53 package main
54
55 const (
56 // VERSION is the version of peg
57 VERSION = "{{.Version}}"
58 // BUILDTIME is the build time of peg
59 BUILDTIME = "{{.Buildtime}}"
60 // COMMIT is the commit hash of peg
61 COMMIT = "{{.Commit}}"
62 // IS_TAGGED reports whether this build corresponds to a tagged version
63 IS_TAGGED = {{.IsTagged}}
64 )
65 `
66
67 func buildinfo() {
68 log.SetPrefix("buildinfo:")
69 type info struct {
70 Version string
71 Buildtime string
72 Commit string
73 IsTagged bool
74 }
75 infFile, err := os.Create("buildinfo.go")
76 if err != nil {
77 log.Fatal("create buildinfo.go:", err)
78 }
79 defer infFile.Close()
80 inf := info{
81 Version: "unknown", // show this if we can't get the version
82 }
83 vers, err := exec.Command("git", "tag", "--contains").Output()
84 if err != nil {
85 log.Println("error:", err)
86 } else if len(vers) > 1 { // ignore any single newlines that might exist
87 inf.IsTagged = true
88 inf.Version = strings.TrimSuffix(string(vers), "\n")
89 } else {
90 vers, err = exec.Command("git", "tag", "--merged", "--sort=v:refname").Output()
91 if err != nil {
92 log.Println("error:", err)
93 } else if len(vers) > 1 {
94 tags := strings.Split(string(vers), "\n")
95 inf.Version = tags[len(tags)-1]
96 }
97 }
98
99 cmit, err := exec.Command("git", "rev-parse", "HEAD").Output()
100 if err != nil {
101 log.Println("error:", err)
102 }
103 inf.Commit = strings.TrimSuffix(string(cmit), "\n")
104 // slice the constant to remove the timezone specifier
105 inf.Buildtime = time.Now().UTC().Format(time.RFC3339[0:19])
106
107 err = template.Must(template.New("buildinfo").Parse(BuildinfoTemplate)).Execute(infFile, inf)
108 if err != nil {
109 log.Println("error: template:", err)
110 }
111 log.SetPrefix("")
112 }
113
114 var processed = make(map[string]bool)
115
116 func done(file string, deps ...interface{}) bool {
117 fini := true
118 file = filepath.FromSlash(file)
119 info, err := os.Stat(file)
120 if err != nil {
121 fini = false
122 }
123 for _, dep := range deps {
124 switch dep := dep.(type) {
125 case string:
126 if info == nil {
127 fini = false
128 break
129 }
130 dep = filepath.FromSlash(dep)
131 fileInfo, err := os.Stat(dep)
132 if err != nil {
133 panic(err)
134 }
135
136 if fileInfo.ModTime().After(info.ModTime()) {
137 fini = false
138 }
139 case func() bool:
140 name := runtime.FuncForPC(reflect.ValueOf(dep).Pointer()).Name()
141 if result, ok := processed[name]; ok {
142 fini = fini && result
143 fmt.Printf("%s is done\n", name)
144 break
145 }
146 result := dep()
147 fini = fini && result
148 fmt.Printf("%s\n", name)
149 processed[name] = result
150 }
151 }
152
153 return fini
154 }
155
156 func chdir(dir string) string {
157 dir = filepath.FromSlash(dir)
158 working, err := os.Getwd()
159 if err != nil {
160 panic(err)
161 }
162 err = os.Chdir(dir)
163 if err != nil {
164 panic(err)
165 }
166 fmt.Printf("cd %s\n", dir)
167 return working
168 }
169
170 func command(name, inputFile, outputFile string, arg ...string) {
171 name = filepath.FromSlash(name)
172 inputFile = filepath.FromSlash(inputFile)
173 outputFile = filepath.FromSlash(outputFile)
174 fmt.Print(name)
175 for _, a := range arg {
176 fmt.Printf(" %s", a)
177 }
178
179 cmd := exec.Command(name, arg...)
180
181 if inputFile != "" {
182 fmt.Printf(" < %s", inputFile)
183 input, err := ioutil.ReadFile(inputFile)
184 if err != nil {
185 panic(err)
186 }
187 writer, err := cmd.StdinPipe()
188 if err != nil {
189 panic(err)
190 }
191 go func() {
192 defer writer.Close()
193 _, err := writer.Write(input)
194 if err != nil {
195 panic(err)
196 }
197 }()
198 }
199
200 if outputFile != "" {
201 fmt.Printf(" > %s\n", outputFile)
202 output, err := cmd.Output()
203 if err != nil {
204 panic(err)
205 }
206 err = ioutil.WriteFile(outputFile, output, 0600)
207 if err != nil {
208 panic(err)
209 }
210 } else {
211 output, err := cmd.CombinedOutput()
212 fmt.Printf("\n%s", string(output))
213 if err != nil {
214 panic(err)
215 }
216 }
217 }
218
219 func delete(file string) {
220 file = filepath.FromSlash(file)
221 fmt.Printf("rm -f %s\n", file)
222 os.Remove(file)
223 }
224
225 func deleteFilesWithSuffix(suffix string) {
226 files, err := ioutil.ReadDir(".")
227 if err != nil {
228 panic(err)
229 }
230 for _, file := range files {
231 if strings.HasSuffix(file.Name(), suffix) {
232 delete(file.Name())
233 }
234 }
235 }
236
237 func bootstrap() bool {
238 if done("bootstrap/bootstrap", "bootstrap/main.go", "tree/peg.go") {
239 return true
240 }
241
242 wd := chdir("bootstrap")
243 defer chdir(wd)
244
245 command("go", "", "", "build")
246
247 return false
248 }
249
250 func peg0() bool {
251 if done("cmd/peg-bootstrap/peg0", "cmd/peg-bootstrap/main.go", bootstrap) {
252 return true
253 }
254
255 wd := chdir("cmd/peg-bootstrap/")
256 defer chdir(wd)
257
258 deleteFilesWithSuffix(".peg.go")
259 command("../../bootstrap/bootstrap", "", "")
260 command("go", "", "", "build", "-tags", "bootstrap", "-o", "peg0")
261
262 return false
263 }
264
265 func peg1() bool {
266 if done("cmd/peg-bootstrap/peg1", peg0, "cmd/peg-bootstrap/bootstrap.peg") {
267 return true
268 }
269
270 wd := chdir("cmd/peg-bootstrap/")
271 defer chdir(wd)
272
273 deleteFilesWithSuffix(".peg.go")
274 command("./peg0", "bootstrap.peg", "peg1.peg.go")
275 command("go", "", "", "build", "-tags", "bootstrap", "-o", "peg1")
276
277 return false
278 }
279
280 func peg2() bool {
281 if done("cmd/peg-bootstrap/peg2", peg1, "cmd/peg-bootstrap/peg.bootstrap.peg") {
282 return true
283 }
284
285 wd := chdir("cmd/peg-bootstrap/")
286 defer chdir(wd)
287
288 deleteFilesWithSuffix(".peg.go")
289 command("./peg1", "peg.bootstrap.peg", "peg2.peg.go")
290 command("go", "", "", "build", "-tags", "bootstrap", "-o", "peg2")
291
292 return false
293 }
294
295 func peg3() bool {
296 if done("cmd/peg-bootstrap/peg3", peg2, "peg.peg") {
297 return true
298 }
299
300 wd := chdir("cmd/peg-bootstrap/")
301 defer chdir(wd)
302
303 deleteFilesWithSuffix(".peg.go")
304 command("./peg2", "../../peg.peg", "peg3.peg.go")
305 command("go", "", "", "build", "-tags", "bootstrap", "-o", "peg3")
306
307 return false
308 }
309
310 func peg_bootstrap() bool {
311 if done("cmd/peg-bootstrap/peg-bootstrap", peg3) {
312 return true
313 }
314
315 wd := chdir("cmd/peg-bootstrap/")
316 defer chdir(wd)
317
318 deleteFilesWithSuffix(".peg.go")
319 command("./peg3", "../../peg.peg", "peg-bootstrap.peg.go")
320 command("go", "", "", "build", "-tags", "bootstrap", "-o", "peg-bootstrap")
321
322 return false
323 }
324
325 func peg_peg_go() bool {
326 if done("peg.peg.go", peg_bootstrap) {
327 return true
328 }
329
330 command("cmd/peg-bootstrap/peg-bootstrap", "peg.peg", "peg.peg.go")
331 command("go", "", "", "build")
332 command("./peg", "", "", "-inline", "-switch", "peg.peg")
333
334 return false
335 }
336
337 func peg() bool {
338 if done("peg", peg_peg_go, "main.go") {
339 return true
340 }
341
342 command("go", "", "", "build")
343
344 return false
345 }
346
347 func clean() bool {
348 delete("bootstrap/bootstrap")
349
350 delete("grammars/c/c.peg.go")
351 delete("grammars/calculator/calculator.peg.go")
352 delete("grammars/fexl/fexl.peg.go")
353 delete("grammars/java/java_1_7.peg.go")
354 delete("grammars/long_test/long.peg.go")
355
356 wd := chdir("cmd/peg-bootstrap/")
357 defer chdir(wd)
358
359 deleteFilesWithSuffix(".peg.go")
360 delete("peg0")
361 delete("peg1")
362 delete("peg2")
363 delete("peg3")
364 delete("peg-bootstrap")
365
366 return false
367 }
368
369 func grammars_c() bool {
370 if done("grammars/c/c.peg.go", peg, "grammars/c/c.peg") {
371 return true
372 }
373
374 wd := chdir("grammars/c/")
375 defer chdir(wd)
376
377 command("../../peg", "", "", "-switch", "-inline", "c.peg")
378
379 return false
380 }
381
382 func grammars_calculator() bool {
383 if done("grammars/calculator/calculator.peg.go", peg, "grammars/calculator/calculator.peg") {
384 return true
385 }
386
387 wd := chdir("grammars/calculator/")
388 defer chdir(wd)
389
390 command("../../peg", "", "", "-switch", "-inline", "calculator.peg")
391
392 return false
393 }
394
395 func grammars_calculator_ast() bool {
396 if done("grammars/calculator_ast/calculator.peg.go", peg, "grammars/calculator_ast/calculator.peg") {
397 return true
398 }
399
400 wd := chdir("grammars/calculator_ast/")
401 defer chdir(wd)
402
403 command("../../peg", "", "", "-switch", "-inline", "calculator.peg")
404
405 return false
406 }
407
408 func grammars_fexl() bool {
409 if done("grammars/fexl/fexl.peg.go", peg, "grammars/fexl/fexl.peg") {
410 return true
411 }
412
413 wd := chdir("grammars/fexl/")
414 defer chdir(wd)
415
416 command("../../peg", "", "", "-switch", "-inline", "fexl.peg")
417
418 return false
419 }
420
421 func grammars_java() bool {
422 if done("grammars/java/java_1_7.peg.go", peg, "grammars/java/java_1_7.peg") {
423 return true
424 }
425
426 wd := chdir("grammars/java/")
427 defer chdir(wd)
428
429 command("../../peg", "", "", "-switch", "-inline", "java_1_7.peg")
430
431 return false
432 }
433
434 func grammars_long_test() bool {
435 if done("grammars/long_test/long.peg.go", peg, "grammars/long_test/long.peg") {
436 return true
437 }
438
439 wd := chdir("grammars/long_test/")
440 defer chdir(wd)
441
442 command("../../peg", "", "", "-switch", "-inline", "long.peg")
443
444 return false
445 }
446
447 func test() bool {
448 if done("", grammars_c, grammars_calculator, grammars_calculator_ast,
449 grammars_fexl, grammars_java, grammars_long_test) {
450 return true
451 }
452
453 command("go", "", "", "test", "-short", "-tags", "grammars", "./...")
454
455 return false
456 }
457
458 func bench() bool {
459 peg()
460
461 command("go", "", "", "test", "-benchmem", "-bench", ".")
462
463 return false
464 }
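build.go's done() helper decides whether a target is up to date by comparing file modification times against its string dependencies (and by memoizing its function dependencies). A minimal standalone sketch of just the mtime half of that check — the names here (stale) are hypothetical, not part of build.go:

```go
package main

import (
	"fmt"
	"os"
	"time"
)

// stale reports whether target needs rebuilding because it is
// missing, or older than any of its dependency files — the same
// ModTime comparison done() performs for string dependencies.
func stale(target string, deps ...string) bool {
	ti, err := os.Stat(target)
	if err != nil {
		return true // missing target: must build
	}
	for _, dep := range deps {
		di, err := os.Stat(dep)
		if err != nil {
			return true // missing dependency info: be conservative
		}
		if di.ModTime().After(ti.ModTime()) {
			return true // dependency newer than target
		}
	}
	return false
}

func main() {
	dir, _ := os.MkdirTemp("", "stale")
	defer os.RemoveAll(dir)
	tgt, dep := dir+"/tgt", dir+"/dep"
	os.WriteFile(tgt, nil, 0o644)
	os.WriteFile(dep, nil, 0o644)
	// force dep to be newer than tgt
	future := time.Now().Add(time.Second)
	os.Chtimes(dep, future, future)
	fmt.Println(stale(tgt, dep)) // true: dep is newer
	fmt.Println(stale(tgt))      // false: target exists, no deps
}
```

done() additionally caches function dependencies in the `processed` map so each build stage (peg0 through peg-bootstrap) runs at most once per invocation.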
0 // Code generated by "build.go buildinfo" DO NOT EDIT.
1 package main
2
3 const (
4 // VERSION is the version of peg
5 VERSION = "unknown"
6 // BUILDTIME is the build time of peg
7 BUILDTIME = "2020-08-26T03:40:14"
8 // COMMIT is the commit hash of peg
9 COMMIT = "5cdb3adc061370cdd20392ffe2740cc8db104126"
10 // IS_TAGGED reports whether this build corresponds to a tagged version
11 IS_TAGGED = false
12 )
0 # Core bootstrap PE Grammar for peg language.
1 # Adapted from peg.peg.
2
3 Grammar <- Spacing { p.AddPackage("main") }
4 { p.AddImport("github.com/pointlander/peg/tree") }
5 { p.AddPeg("Peg"); p.AddState("*tree.Tree") }
6 Action* Definition* !.
7
8 Definition <- Identifier { p.AddRule(text) }
9 LeftArrow Expression { p.AddExpression() }
10 Expression <- Sequence (Slash Sequence { p.AddAlternate() } )*
11 Sequence <- Prefix (Prefix { p.AddSequence() } )*
12 Prefix <- '!' Suffix { p.AddPeekNot() } / Suffix
13 Suffix <- Primary (Question { p.AddQuery() }
14 / Star { p.AddStar() } )?
15 Primary <- Identifier !LeftArrow { p.AddName(text) }
16 / Open Expression Close
17 / Literal / Class / Dot { p.AddDot() }
18 / Action { p.AddAction(text) }
19 / Begin Expression End { p.AddPush() }
20
21 Identifier <- < Ident Ident* > Spacing
22 Ident <- [A-Za-z]
23 Literal <- ['] !['] Char (!['] Char { p.AddSequence() } )* ['] Spacing
24 Class <- '[' Range (!']' Range { p.AddAlternate() } )* ']' Spacing
25 Range <- Char '-' Char { p.AddRange() } / Char
26 Char <- '\\0x' <[0-9a-f]*> { p.AddHexaCharacter(text) }
27 / '\\\\' { p.AddCharacter("\\") }
28 / !'\\' <.> { p.AddCharacter(text) }
29
30 LeftArrow <- '<-' Spacing
31 Slash <- '/' Spacing
32 Question <- '?' Spacing
33 Star <- '*' Spacing
34 Open <- '(' Spacing
35 Close <- ')' Spacing
36 Dot <- '.' Spacing
37
38 Spacing <- (Space / Comment)*
39 Comment <- '#' (!EndOfLine .)*
40 Space <- ' ' / '\0x9' / EndOfLine
41 EndOfLine <- '\0xd\0xa' / '\0xa' / '\0xd'
42
43 Action <- '{' < (![}].)* > '}' Spacing
44 Begin <- '<' Spacing
45 End <- '>' Spacing
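The `'\\0x' <[0-9a-f]*>` alternative of Char hands the captured hex digits to p.AddHexaCharacter. Assuming that method decodes the digits into the character they name (an assumption — the tree package is not shown in this diff), the conversion amounts to:

```go
package main

import (
	"fmt"
	"strconv"
)

// hexRune turns the hex digits captured by the grammar's
// '\\0x' <[0-9a-f]*> rule into the character they denote —
// roughly what p.AddHexaCharacter is assumed to do with text.
func hexRune(text string) string {
	n, err := strconv.ParseInt(text, 16, 32) // base 16, fits in a rune
	if err != nil {
		return ""
	}
	return string(rune(n))
}

func main() {
	fmt.Printf("%q\n", hexRune("9"))    // "\t" — matches Space's '\0x9'
	fmt.Printf("%q\n", hexRune("2190")) // "←" — the LeftArrow alternative
}
```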
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build bootstrap
5
6 package main
7
8 import (
9 "io/ioutil"
10 "log"
11 "os"
12
13 "github.com/pointlander/peg/tree"
14 )
15
16 func main() {
17 buffer, err := ioutil.ReadAll(os.Stdin)
18 if err != nil {
19 log.Fatal(err)
20 }
21 p := &Peg{Tree: tree.New(false, false, false), Buffer: string(buffer)}
22 p.Init(Pretty(true), Size(1<<15))
23 if err := p.Parse(); err != nil {
24 log.Fatal(err)
25 }
26 p.Execute()
27 p.Compile("boot.peg.go", os.Args, os.Stdout)
28 }
0 # PE Grammar for bootstrap peg language
1 #
2 # Adapted from peg.peg.
3
4 # Hierarchical syntax
5 Grammar <- Spacing 'package' MustSpacing Identifier { p.AddPackage(text) }
6 Import*
7 'type' MustSpacing Identifier { p.AddPeg(text) }
8 'Peg' Spacing Action { p.AddState(text) }
9 Definition Definition* EndOfFile
10
11 Import <- 'import' Spacing ["] < ([a-zA-Z_/.]/'-')([a-zA-Z_/.]/'-')* > ["] Spacing { p.AddImport(text) }
12
13 Definition <- Identifier { p.AddRule(text) }
14 LeftArrow Expression { p.AddExpression() }
15 Expression <- Sequence (Slash Sequence { p.AddAlternate() }
16 )* (Slash { p.AddNil(); p.AddAlternate() }
17 )?
18 / { p.AddNil() }
19 Sequence <- Prefix (Prefix { p.AddSequence() }
20 )*
21 Prefix <- And Action { p.AddPredicate(text) }
22 / Not Action { p.AddStateChange(text) }
23 / And Suffix { p.AddPeekFor() }
24 / Not Suffix { p.AddPeekNot() }
25 / Suffix
26 Suffix <- Primary (Question { p.AddQuery() }
27 / Star { p.AddStar() }
28 / Plus { p.AddPlus() }
29 )?
30 Primary <- Identifier !LeftArrow { p.AddName(text) }
31 / Open Expression Close
32 / Literal
33 / Class
34 / Dot { p.AddDot() }
35 / Action { p.AddAction(text) }
36 / Begin Expression End { p.AddPush() }
37
38 # Lexical syntax
39
40 Identifier <- < IdentStart IdentCont* > Spacing
41 IdentStart <- [A-Za-z_]
42 IdentCont <- IdentStart / [0-9]
43 Literal <- ['] (!['] Char)? (!['] Char { p.AddSequence() }
44 )* ['] Spacing
45 / ["] (!["] DoubleChar)? (!["] DoubleChar { p.AddSequence() }
46 )* ["] Spacing
47 Class <- ( '[[' ( '^' DoubleRanges { p.AddPeekNot(); p.AddDot(); p.AddSequence() }
48 / DoubleRanges )?
49 ']]'
50 / '[' ( '^' Ranges { p.AddPeekNot(); p.AddDot(); p.AddSequence() }
51 / Ranges )?
52 ']' )
53 Spacing
54 Ranges <- !']' Range (!']' Range { p.AddAlternate() }
55 )*
56 DoubleRanges <- !']]' DoubleRange (!']]' DoubleRange { p.AddAlternate() }
57 )*
58 Range <- Char '-' Char { p.AddRange() }
59 / Char
60 DoubleRange <- Char '-' Char { p.AddDoubleRange() }
61 / DoubleChar
62 Char <- Escape
63 / !'\\' <.> { p.AddCharacter(text) }
64 DoubleChar <- Escape
65 / <[a-zA-Z]> { p.AddDoubleCharacter(text) }
66 / !'\\' <.> { p.AddCharacter(text) }
67 Escape <- '\\' [aA] { p.AddCharacter("\a") } # bell
68 / '\\' [bB] { p.AddCharacter("\b") } # bs
69 / '\\' [eE] { p.AddCharacter("\x1B") } # esc
70 / '\\' [fF] { p.AddCharacter("\f") } # ff
71 / '\\' [nN] { p.AddCharacter("\n") } # nl
72 / '\\' [rR] { p.AddCharacter("\r") } # cr
73 / '\\' [tT] { p.AddCharacter("\t") } # ht
74 / '\\' [vV] { p.AddCharacter("\v") } # vt
75 / '\\' ['] { p.AddCharacter("'") }
76 / '\\"' { p.AddCharacter("\"") }
77 / '\\[' { p.AddCharacter("[") }
78 / '\\]' { p.AddCharacter("]") }
79 / '\\-' { p.AddCharacter("-") }
80 / '\\' '0'[xX] <[0-9a-fA-F][0-9a-fA-F]*> { p.AddHexaCharacter(text) }
81 / '\\' <[0-3][0-7][0-7]> { p.AddOctalCharacter(text) }
82 / '\\' <[0-7][0-7]?> { p.AddOctalCharacter(text) }
83 / '\\\\' { p.AddCharacter("\\") }
84 LeftArrow <- ('<-' / '\0x2190') Spacing
85 Slash <- '/' Spacing
86 And <- '&' Spacing
87 Not <- '!' Spacing
88 Question <- '?' Spacing
89 Star <- '*' Spacing
90 Plus <- '+' Spacing
91 Open <- '(' Spacing
92 Close <- ')' Spacing
93 Dot <- '.' Spacing
94 SpaceComment <- (Space / Comment)
95 Spacing <- SpaceComment*
96 MustSpacing <- SpaceComment Spacing
97 Comment <- '#' (!EndOfLine .)* EndOfLine
98 Space <- ' ' / '\0x9' / EndOfLine
99 EndOfLine <- '\0xd\0xa' / '\0xa' / '\0xd'
100 EndOfFile <- !.
101 Action <- '{' < ActionBody* > '}' Spacing
102 ActionBody <- ![{}]. / '{' ActionBody* '}'
103 Begin <- '<' Spacing
104 End <- '>' Spacing
105
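The ActionBody rule — `![{}]. / '{' ActionBody* '}'` — is what lets an Action swallow Go code containing nested braces. A standalone sketch of the same recursion, using a hypothetical matchAction helper rather than the generated parser:

```go
package main

import "fmt"

// matchAction consumes s starting at an opening '{' and returns
// the index just past the matching '}', mirroring
// Action <- '{' ActionBody* '}' with
// ActionBody <- ![{}]. / '{' ActionBody* '}'.
func matchAction(s string, i int) (int, bool) {
	if i >= len(s) || s[i] != '{' {
		return i, false
	}
	i++ // consume '{'
	for i < len(s) {
		switch s[i] {
		case '}':
			return i + 1, true // matching close brace
		case '{':
			var ok bool
			i, ok = matchAction(s, i) // nested block recurses
			if !ok {
				return i, false
			}
		default:
			i++ // ![{}]. — any other character
		}
	}
	return i, false // unterminated action
}

func main() {
	end, ok := matchAction(`{ if x { y() } }rest`, 0)
	fmt.Println(ok, end) // true 16
}
```

Because only braces are special, an unbalanced brace inside a Go string literal in an action would still confuse this rule — a known limitation of brace counting.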
0 module github.com/pointlander/peg
1
2 require github.com/pointlander/jetset v1.0.1-0.20190518214125-eee7eff80bd4
3
4 go 1.13
0 github.com/pointlander/compress v1.1.0 h1:5fUcQV2qEHvk0OpILH6eltwluN5VnwiYrkc1wjGUHnU=
1 github.com/pointlander/compress v1.1.0/go.mod h1:q5NXNGzqj5uPnVuhGkZfmgHqNUhf15VLi6L9kW0VEc0=
2 github.com/pointlander/compress v1.1.1-0.20190518213731-ff44bd196cc3 h1:hUmXhbljNFtrH5hzV9kiRoddZ5nfPTq3K0Sb2hYYiqE=
3 github.com/pointlander/compress v1.1.1-0.20190518213731-ff44bd196cc3/go.mod h1:q5NXNGzqj5uPnVuhGkZfmgHqNUhf15VLi6L9kW0VEc0=
4 github.com/pointlander/jetset v1.0.0 h1:bNlaNAX7cDPID9SlcogmXlDWq0KcRJSpKwHXaAM3bGQ=
5 github.com/pointlander/jetset v1.0.0/go.mod h1:zY6+WHRPB10uzTajloHtybSicLW1bf6Rz0eSaU9Deng=
6 github.com/pointlander/jetset v1.0.1-0.20190518214125-eee7eff80bd4 h1:RHHRCZeaNyBXdYPMjZNH8/XHDBH38TZzw8izrW7dmBE=
7 github.com/pointlander/jetset v1.0.1-0.20190518214125-eee7eff80bd4/go.mod h1:RdR1j20Aj5pB6+fw6Y9Ur7lMHpegTEjY1vc19hEZL40=
+0
-12
grammars/c/Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 c: c.peg.go main.go
5 go build
6
7 c.peg.go: c.peg
8 ../../peg -switch -inline c.peg
9
10 clean:
11 rm -f c c.peg.go
109109
110110 }
111111
112 TranslationUnit <- Spacing ExternalDeclaration+ EOT
112 TranslationUnit <- Spacing ( ExternalDeclaration / SEMI ) * EOT
113113
114114 ExternalDeclaration <- FunctionDefinition / Declaration
115115
170170
171171 StructOrUnionSpecifier
172172 <- StructOrUnion
173 ( Identifier? LWING StructDeclaration+ RWING
173 ( Identifier? LWING StructDeclaration* RWING
174174 / Identifier
175175 )
176176
177177 StructOrUnion <- STRUCT / UNION
178178
179 StructDeclaration <- SpecifierQualifierList StructDeclaratorList SEMI
179 StructDeclaration <- ( SpecifierQualifierList StructDeclaratorList? )? SEMI
180180
181181 SpecifierQualifierList
182182 <- ( TypeQualifier*
312312 #-------------------------------------------------------------------------
313313
314314 PrimaryExpression
315 <- Identifier
315 <- StringLiteral
316316 / Constant
317 / StringLiteral
317 / Identifier
318318 / LPAR Expression RPAR
319319
320320 PostfixExpression
346346 / TILDA
347347 / BANG
348348
349 CastExpression <- (LPAR TypeName RPAR)* UnaryExpression
349 CastExpression <- (LPAR TypeName RPAR CastExpression) / UnaryExpression
350350
351351 MultiplicativeExpression <- CastExpression ((STAR / DIV / MOD) CastExpression)*
352352
607607 / HexEscape
608608 / UniversalCharacter
609609
610 SimpleEscape <- '\\' ['\"?\\abfnrtv]
610 SimpleEscape <- '\\' ['\"?\\%abfnrtv]
611611 OctalEscape <- '\\' [0-7][0-7]?[0-7]?
612612 HexEscape <- '\\x' HexDigit+
613613
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "fmt"
10 "io/ioutil"
11 "log"
12 "os"
13 "strings"
14 "testing"
15 )
16
17 func parseCBuffer(buffer string) (*C, error) {
18 clang := &C{Buffer: buffer}
19 clang.Init()
20 err := clang.Parse()
21 return clang, err
22 }
23
24 func parseC_4t(t *testing.T, src string) *C {
25 c, err := parseCBuffer(src)
26 if err != nil {
27 t.Fatal(err)
28 }
29 return c
30 }
31
32 func noParseC_4t(t *testing.T, src string) {
33 _, err := parseCBuffer(src)
34 if err == nil {
35 t.Fatal("Parsed what should not have parsed.")
36 }
37 }
38
39 func TestCParsing_Expressions1(t *testing.T) {
40 case1src :=
41 `int a() {
42 (es);
43 1++;
44 1+1;
45 a+1;
46 (a)+1;
47 a->x;
48 return 0;
49 }`
50 parseC_4t(t, case1src)
51 }
52
53 func TestCParsing_Expressions2(t *testing.T) {
54 parseC_4t(t,
55 `int a() {
56 if (a) { return (a); }
57
58 return (0);
59 return a+b;
60 return (a+b);
61 return (a)+0;
62 }`)
63
64 parseC_4t(t, `int a() { return (a)+0; }`)
65 }
66
67 func TestCParsing_Expressions3(t *testing.T) {
68 parseC_4t(t,
69 `int a() {
70 1+(a);
71 (a)++;
72 (es)++;
73 (es)||a;
74 (es)->a;
75 return (a)+(b);
76 return 0+(a);
77 }`)
78 }
79
80 func TestCParsing_Expressions4(t *testing.T) {
81 parseC_4t(t, `int a(){1+(a);}`)
82 }
83 func TestCParsing_Expressions5(t *testing.T) {
84 parseC_4t(t, `int a(){return (int)0;}`)
85 }
86 func TestCParsing_Expressions6(t *testing.T) {
87 parseC_4t(t, `int a(){return (in)0;}`)
88 }
89 func TestCParsing_Expressions7(t *testing.T) {
90 parseC_4t(t, `int a()
91 { return (0); }`)
92 }
93 func TestCParsing_Cast0(t *testing.T) {
94 parseC_4t(t, `int a(){(cast)0;}`)
95 }
96 func TestCParsing_Cast1(t *testing.T) {
97 parseC_4t(t, `int a(){(m*)(rsp);}`)
98 parseC_4t(t, `int a(){(struct m*)(rsp);}`)
99 }
100
101 func TestCParsing_Empty(t *testing.T) {
102 parseC_4t(t, `/** empty is valid. */ `)
103 }
104 func TestCParsing_EmptyStruct(t *testing.T) {
105 parseC_4t(t, `struct empty{};`)
106 parseC_4t(t, `struct {} empty;`)
107 parseC_4t(t, `struct empty {} empty;`)
108 }
109 func TestCParsing_EmptyEmbeddedUnion(t *testing.T) {
110 parseC_4t(t, `struct empty{
111 union {
112 int a;
113 char b;
114 };
115 };`)
116 }
117 func TestCParsing_ExtraSEMI(t *testing.T) {
118 parseC_4t(t, `int func(){}
119 ;
120 struct {} empty;
121 struct {} empty;;
122 int foo() {};
123 int foo() {};;
124 `)
125
126 noParseC_4t(t, `struct empty{}`)
127 }
128 func TestCParsing_ExtraSEMI2(t *testing.T) {
129 parseC_4t(t, `
130 struct a { int b; ; };
131 `)
132
133 noParseC_4t(t, `struct empty{}`)
134 }
135
136 func TestCParsing_Escapes(t *testing.T) {
137 parseC_4t(t, `
138 int f() {
139 printf("%s", "\a\b\f\n\r\t\v");
140 printf("\\");
141 printf("\%");
142 printf("\"");
143 printf('\"'); // <- semantically wrong but syntactically valid.
144 }`)
145 }
146
147 func TestCParsing_Long(t *testing.T) {
148 if testing.Short() {
149 t.Skip("skipping c parsing long test")
150 }
151
152 var walk func(name string)
153 walk = func(name string) {
154 fileInfo, err := os.Stat(name)
155 if err != nil {
156 log.Fatal(err)
157 }
158
159 if fileInfo.Mode()&(os.ModeNamedPipe|os.ModeSocket|os.ModeDevice) != 0 {
160 /* will lock up if opened */
161 } else if fileInfo.IsDir() {
162 fmt.Printf("directory %v\n", name)
163
164 file, err := os.Open(name)
165 if err != nil {
166 log.Fatal(err)
167 }
168
169 files, err := file.Readdir(-1)
170 if err != nil {
171 log.Fatal(err)
172 }
173 file.Close()
174
175 for _, f := range files {
176 if !strings.HasSuffix(name, "/") {
177 name += "/"
178 }
179 walk(name + f.Name())
180 }
181 } else if strings.HasSuffix(name, ".c") {
182 fmt.Printf("parse %v\n", name)
183
184 file, err := os.Open(name)
185 if err != nil {
186 log.Fatal(err)
187 }
188
189 buffer, err := ioutil.ReadAll(file)
190 if err != nil {
191 log.Fatal(err)
192 }
193 file.Close()
194
195 clang := &C{Buffer: string(buffer)}
196 clang.Init()
197 if err := clang.Parse(); err != nil {
198 log.Fatal(err)
199 }
200 }
201 }
202 walk("c/")
203 }
204
205 func TestCParsing_WideString(t *testing.T) {
206 parseC_4t(t, `wchar_t *msg = L"Hello";`)
207 }
+0
-72
grammars/c/main.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "fmt"
8 "io/ioutil"
9 "log"
10 "os"
11 "strings"
12 )
13
14 func main() {
15 if len(os.Args) < 2 {
16 fmt.Printf("%v FILE\n", os.Args[0])
17 os.Exit(1)
18 }
19
20 var walk func(name string)
21 walk = func(name string) {
22 fileInfo, err := os.Stat(name)
23 if err != nil {
24 log.Fatal(err)
25 }
26
27 if fileInfo.Mode() & (os.ModeNamedPipe | os.ModeSocket | os.ModeDevice) != 0 {
28 /* will lock up if opened */
29 } else if fileInfo.IsDir() {
30 fmt.Printf("directory %v\n", name)
31
32 file, err := os.Open(name)
33 if err != nil {
34 log.Fatal(err)
35 }
36
37 files, err := file.Readdir(-1)
38 if err != nil {
39 log.Fatal(err)
40 }
41 file.Close()
42
43 for _, f := range files {
44 if !strings.HasSuffix(name, "/") {
45 name += "/"
46 }
47 walk(name + f.Name())
48 }
49 } else if strings.HasSuffix(name, ".c") {
50 fmt.Printf("parse %v\n", name)
51
52 file, err := os.Open(name)
53 if err != nil {
54 log.Fatal(err)
55 }
56
57 buffer, err := ioutil.ReadAll(file)
58 if err != nil {
59 log.Fatal(err)
60 }
61 file.Close()
62
63 clang := &C{Buffer: string(buffer)}
64 clang.Init()
65 if err := clang.Parse(); err != nil {
66 log.Fatal(err)
67 }
68 }
69 }
70 walk(os.Args[1])
71 }
+0
-12
grammars/calculator/Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 calculator: calculator.peg.go calculator.go main.go
5 go build
6
7 calculator.peg.go: calculator.peg
8 ../../peg -switch -inline calculator.peg
9
10 clean:
11 rm -f calculator calculator.peg.go
00 // Copyright 2010 The Go Authors. All rights reserved.
11 // Use of this source code is governed by a BSD-style
22 // license that can be found in the LICENSE file.
3
4 // +build grammars
35
46 package main
57
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "math/big"
10 "testing"
11 )
12
13 func TestCalculator(t *testing.T) {
14 expression := "( 1 - -3 ) / 3 + 2 * ( 3 + -4 ) + 3 % 2^2"
15 calc := &Calculator{Buffer: expression}
16 calc.Init()
17 calc.Expression.Init(expression)
18 if err := calc.Parse(); err != nil {
19 t.Fatal(err)
20 }
21 calc.Execute()
22 if calc.Evaluate().Cmp(big.NewInt(2)) != 0 {
23 t.Fatal("got incorrect result")
24 }
25 }
+0
-29
grammars/calculator/main.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "fmt"
8 "log"
9 "os"
10 )
11
12 func main() {
13 if len(os.Args) < 2 {
14 name := os.Args[0]
15 fmt.Printf("Usage: %v \"EXPRESSION\"\n", name)
16 fmt.Printf("Example: %v \"( 1 - -3 ) / 3 + 2 * ( 3 + -4 ) + 3 %% 2^2\"\n =2\n", name)
17 os.Exit(1)
18 }
19 expression := os.Args[1]
20 calc := &Calculator{Buffer: expression}
21 calc.Init()
22 calc.Expression.Init(expression)
23 if err := calc.Parse(); err != nil {
24 log.Fatal(err)
25 }
26 calc.Execute()
27 fmt.Printf("= %v\n", calc.Evaluate())
28 }
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "math/big"
10 )
11
12 func (c *Calculator) Eval() *big.Int {
13 return c.Rulee(c.AST())
14 }
15
16 func (c *Calculator) Rulee(node *node32) *big.Int {
17 node = node.up
18 for node != nil {
19 switch node.pegRule {
20 case rulee1:
21 return c.Rulee1(node)
22 }
23 node = node.next
24 }
25 return nil
26 }
27
28 func (c *Calculator) Rulee1(node *node32) *big.Int {
29 node = node.up
30 var a *big.Int
31 for node != nil {
32 switch node.pegRule {
33 case rulee2:
34 a = c.Rulee2(node)
35 case ruleadd:
36 node = node.next
37 b := c.Rulee2(node)
38 a.Add(a, b)
39 case ruleminus:
40 node = node.next
41 b := c.Rulee2(node)
42 a.Sub(a, b)
43 }
44 node = node.next
45 }
46 return a
47 }
48
49 func (c *Calculator) Rulee2(node *node32) *big.Int {
50 node = node.up
51 var a *big.Int
52 for node != nil {
53 switch node.pegRule {
54 case rulee3:
55 a = c.Rulee3(node)
56 case rulemultiply:
57 node = node.next
58 b := c.Rulee3(node)
59 a.Mul(a, b)
60 case ruledivide:
61 node = node.next
62 b := c.Rulee3(node)
63 a.Div(a, b)
64 case rulemodulus:
65 node = node.next
66 b := c.Rulee3(node)
67 a.Mod(a, b)
68 }
69 node = node.next
70 }
71 return a
72 }
73
74 func (c *Calculator) Rulee3(node *node32) *big.Int {
75 node = node.up
76 var a *big.Int
77 for node != nil {
78 switch node.pegRule {
79 case rulee4:
80 a = c.Rulee4(node)
81 case ruleexponentiation:
82 node = node.next
83 b := c.Rulee4(node)
84 a.Exp(a, b, nil)
85 }
86 node = node.next
87 }
88 return a
89 }
90
91 func (c *Calculator) Rulee4(node *node32) *big.Int {
92 node = node.up
93 minus := false
94 for node != nil {
95 switch node.pegRule {
96 case rulevalue:
97 a := c.Rulevalue(node)
98 if minus {
99 a.Neg(a)
100 }
101 return a
102 case ruleminus:
103 minus = true
104 }
105 node = node.next
106 }
107 return nil
108 }
109
110 func (c *Calculator) Rulevalue(node *node32) *big.Int {
111 node = node.up
112 for node != nil {
113 switch node.pegRule {
114 case rulenumber:
115 a := big.NewInt(0)
116 a.SetString(string(c.buffer[node.begin:node.end]), 10)
117 return a
118 case rulesub:
119 return c.Rulesub(node)
120 }
121 node = node.next
122 }
123 return nil
124 }
125
126 func (c *Calculator) Rulesub(node *node32) *big.Int {
127 node = node.up
128 for node != nil {
129 switch node.pegRule {
130 case rulee1:
131 return c.Rulee1(node)
132 }
133 node = node.next
134 }
135 return nil
136 }
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 package main
5
6 type Calculator Peg {
7 }
8
9 e <- sp e1 !.
10 e1 <- e2 ( add e2
11 / minus e2
12 )*
13 e2 <- e3 ( multiply e3
14 / divide e3
15 / modulus e3
16 )*
17 e3 <- e4 ( exponentiation e4
18 )*
19 e4 <- minus value
20 / value
21 value <- number
22 / sub
23 number <- < [0-9]+ > sp
24 sub <- open e1 close
25 add <- '+' sp
26 minus <- '-' sp
27 multiply <- '*' sp
28 divide <- '/' sp
29 modulus <- '%' sp
30 exponentiation <- '^' sp
31 open <- '(' sp
32 close <- ')' sp
33 sp <- ( ' ' / '\t' )*
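As a sanity check on the expected value in `TestCalculator`, the grammar's precedence levels (`e1` add/subtract, `e2` multiply/divide/modulus, `e3` exponentiation) can be applied by hand to the test expression with `math/big`, whose `Div` and `Mod` match the integer semantics the evaluator uses. A minimal sketch:

```go
package main

import (
	"fmt"
	"math/big"
)

// eval computes ( 1 - -3 ) / 3 + 2 * ( 3 + -4 ) + 3 % 2^2 term by term.
func eval() *big.Int {
	a := new(big.Int).Sub(big.NewInt(1), big.NewInt(-3)) // 1 - -3 = 4
	a.Div(a, big.NewInt(3))                              // 4 / 3 = 1 (integer division)

	b := new(big.Int).Add(big.NewInt(3), big.NewInt(-4)) // 3 + -4 = -1
	b.Mul(b, big.NewInt(2))                              // 2 * -1 = -2

	pow := new(big.Int).Exp(big.NewInt(2), big.NewInt(2), nil) // 2^2 = 4
	c := new(big.Int).Mod(big.NewInt(3), pow)                  // 3 % 4 = 3

	sum := new(big.Int).Add(a, b) // 1 + -2 = -1
	return sum.Add(sum, c)        // -1 + 3 = 2
}

func main() {
	fmt.Println(eval()) // 2
}
```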
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "math/big"
10 "testing"
11 )
12
13 func TestCalculator(t *testing.T) {
14 expression := "( 1 - -3 ) / 3 + 2 * ( 3 + -4 ) + 3 % 2^2"
15 calc := &Calculator{Buffer: expression}
16 calc.Init()
17 if err := calc.Parse(); err != nil {
18 t.Fatal(err)
19 }
20 if calc.Eval().Cmp(big.NewInt(2)) != 0 {
21 t.Fatal("got incorrect result")
22 }
23 }
+0
-12
grammars/fexl/Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 fexl: fexl.peg.go main.go
5 go build
6
7 fexl.peg.go: fexl.peg
8 ../../peg -switch -inline fexl.peg
9
10 clean:
11 rm -f fexl fexl.peg.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "io/ioutil"
10 "testing"
11 )
12
13 func TestFexl(t *testing.T) {
14 buffer, err := ioutil.ReadFile("doc/try.fxl")
15 if err != nil {
16 t.Fatal(err)
17 }
18
19 fexl := &Fexl{Buffer: string(buffer)}
20 fexl.Init()
21
22 if err := fexl.Parse(); err != nil {
23 t.Fatal(err)
24 }
25 }
+0
-25
grammars/fexl/main.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "log"
8 "io/ioutil"
9 )
10
11 func main() {
12 buffer, err := ioutil.ReadFile("doc/try.fxl")
13 if err != nil {
14 log.Fatal(err)
15 }
16
17 fexl := &Fexl{Buffer: string(buffer)}
18 fexl.Init()
19
20 if err := fexl.Parse(); err != nil {
21 log.Fatal(err)
22 }
23 fexl.Highlighter()
24 }
+0
-12
grammars/java/Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 java: java_1_7.peg.go main.go
5 go build
6
7 java_1_7.peg.go: java_1_7.peg
8 ../../peg -switch -inline java_1_7.peg
9
10 clean:
11 rm -f java java_1_7.peg.go
177177 / InterfaceMethodDeclaratorRest
178178
179179 InterfaceMethodDeclaratorRest
180 <- FormalParameters Dim* (THROWS ClassTypeList)? SEM
180 <- FormalParameters Dim* (THROWS ClassTypeList)? SEMI
181181
182182 InterfaceGenericMethodDecl
183183 <- TypeParameters (Type / VOID) Identifier InterfaceMethodDeclaratorRest
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "fmt"
10 "io/ioutil"
11 "log"
12 "os"
13 "strings"
14 "testing"
15 )
16
17 var example1 = `public class HelloWorld {
18 public static void main(String[] args) {
19 System.out.println("Hello, World");
20 }
21 }
22 `
23
24 func TestBasic(t *testing.T) {
25 java := &Java{Buffer: example1}
26 java.Init()
27
28 if err := java.Parse(); err != nil {
29 t.Fatal(err)
30 }
31 }
32
33 func TestJava(t *testing.T) {
34 if testing.Short() {
35 t.Skip("skipping java parsing long test")
36 }
37
38 var walk func(name string)
39 walk = func(name string) {
40 fileInfo, err := os.Stat(name)
41 if err != nil {
42 log.Fatal(err)
43 }
44
45 if fileInfo.Mode()&(os.ModeNamedPipe|os.ModeSocket|os.ModeDevice) != 0 {
46 /* will lock up if opened */
47 } else if fileInfo.IsDir() {
48 fmt.Printf("directory %v\n", name)
49
50 file, err := os.Open(name)
51 if err != nil {
52 log.Fatal(err)
53 }
54
55 files, err := file.Readdir(-1)
56 if err != nil {
57 log.Fatal(err)
58 }
59 file.Close()
60
61 for _, f := range files {
62 if !strings.HasSuffix(name, "/") {
63 name += "/"
64 }
65 walk(name + f.Name())
66 }
67 } else if strings.HasSuffix(name, ".java") {
68 fmt.Printf("parse %v\n", name)
69
70 file, err := os.Open(name)
71 if err != nil {
72 log.Fatal(err)
73 }
74
75 buffer, err := ioutil.ReadAll(file)
76 if err != nil {
77 log.Fatal(err)
78 }
79 file.Close()
80
81 java := &Java{Buffer: string(buffer)}
82 java.Init()
83 if err := java.Parse(); err != nil {
84 log.Fatal(err)
85 }
86 }
87 }
88 walk("java/")
89 }
+0
-72
grammars/java/main.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "fmt"
8 "io/ioutil"
9 "log"
10 "os"
11 "strings"
12 )
13
14 func main() {
15 if len(os.Args) < 2 {
16 fmt.Printf("%v FILE\n", os.Args[0])
17 os.Exit(1)
18 }
19
20 var walk func(name string)
21 walk = func(name string) {
22 fileInfo, err := os.Stat(name)
23 if err != nil {
24 log.Fatal(err)
25 }
26
27 if fileInfo.Mode() & (os.ModeNamedPipe | os.ModeSocket | os.ModeDevice) != 0 {
28 /* will lock up if opened */
29 } else if fileInfo.IsDir() {
30 fmt.Printf("directory %v\n", name)
31
32 file, err := os.Open(name)
33 if err != nil {
34 log.Fatal(err)
35 }
36
37 files, err := file.Readdir(-1)
38 if err != nil {
39 log.Fatal(err)
40 }
41 file.Close()
42
43 for _, f := range files {
44 if !strings.HasSuffix(name, "/") {
45 name += "/"
46 }
47 walk(name + f.Name())
48 }
49 } else if strings.HasSuffix(name, ".java") {
50 fmt.Printf("parse %v\n", name)
51
52 file, err := os.Open(name)
53 if err != nil {
54 log.Fatal(err)
55 }
56
57 buffer, err := ioutil.ReadAll(file)
58 if err != nil {
59 log.Fatal(err)
60 }
61 file.Close()
62
63 java := &Java{Buffer: string(buffer)}
64 java.Init()
65 if err := java.Parse(); err != nil {
66 log.Fatal(err)
67 }
68 }
69 }
70 walk(os.Args[1])
71 }
+0
-12
grammars/long_test/Makefile
0 # Copyright 2010 The Go Authors. All rights reserved.
1 # Use of this source code is governed by a BSD-style
2 # license that can be found in the LICENSE file.
3
4 long_test: long.peg.go main.go
5 go build
6
7 long.peg.go: long.peg
8 peg -switch -inline long.peg
9
10 clean:
11 rm -f long_test long.peg.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 // +build grammars
5
6 package main
7
8 import (
9 "testing"
10 )
11
12 func TestLong(t *testing.T) {
13 length := 100000
14 if testing.Short() {
15 length = 100
16 }
17
18 expression := ""
19 long := &Long{Buffer: "\"" + expression + "\""}
20 long.Init()
21 for c := 0; c < length; c++ {
22 if err := long.Parse(); err != nil {
23 t.Fatal(err)
24 }
25 long.Reset()
26 expression = expression + "X"
27 long.Buffer = "\"" + expression + "\""
28 }
29 }
+0
-25
grammars/long_test/main.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "fmt"
8 "log"
9 )
10
11 func main() {
12 expression := ""
13 long := &Long{Buffer: "\"" + expression + "\""}
14 long.Init()
15 for c := 0; c < 100000; c++ {
16 if err := long.Parse(); err != nil {
17 fmt.Printf("%v\n", c)
18 log.Fatal(err)
19 }
20 long.Reset()
21 expression = expression + "X"
22 long.Buffer = "\"" + expression + "\""
23 }
24 }
1010 "log"
1111 "os"
1212 "runtime"
13 "time"
13
14 "github.com/pointlander/peg/tree"
1415 )
1516
17 //go:generate -command build go run build.go
18 //go:generate build buildinfo
19 //go:generate build peg
20
1621 var (
17 inline = flag.Bool("inline", false, "parse rule inlining")
18 _switch = flag.Bool("switch", false, "replace if-else if-else like blocks with switch blocks")
19 syntax = flag.Bool("syntax", false, "print out the syntax tree")
20 highlight = flag.Bool("highlight", false, "test the syntax highlighter")
21 ast = flag.Bool("ast", false, "generate an AST")
22 test = flag.Bool("test", false, "test the PEG parser performance")
23 print = flag.Bool("print", false, "directly dump the syntax tree")
22 inline = flag.Bool("inline", false, "parse rule inlining")
23 _switch = flag.Bool("switch", false, "replace if-else if-else like blocks with switch blocks")
24 print = flag.Bool("print", false, "directly dump the syntax tree")
25 syntax = flag.Bool("syntax", false, "print out the syntax tree")
26 noast = flag.Bool("noast", false, "disable AST")
27 strict = flag.Bool("strict", false, "treat compiler warnings as errors")
28 filename = flag.String("output", "", "specify name of output file")
29 showVersion = flag.Bool("version", false, "print the version and exit")
30 showBuildTime = flag.Bool("time", false, "show the last time `build.go buildinfo` was run")
2431 )
2532
2633 func main() {
2734 runtime.GOMAXPROCS(2)
2835 flag.Parse()
36
37 if *showVersion {
38 if IS_TAGGED {
39 fmt.Println("version:", VERSION)
40 } else {
41 fmt.Printf("version: %s-%s\n", VERSION, COMMIT)
42 }
43 if *showBuildTime {
44 fmt.Println("time:", BUILDTIME)
45 }
46 return
47 }
2948
3049 if flag.NArg() != 1 {
3150 flag.Usage()
3857 log.Fatal(err)
3958 }
4059
41 if *test {
42 iterations, p := 1000, &Peg{Tree: New(*inline, *_switch), Buffer: string(buffer)}
43 p.Init()
44 start := time.Now()
45 for i := 0; i < iterations; i++ {
46 p.Parse()
47 p.Reset()
48 }
49 total := float64(time.Since(start).Nanoseconds()) / float64(1000)
50 fmt.Printf("time: %v us\n", total/float64(iterations))
51 return
52 }
53
54 p := &Peg{Tree: New(*inline, *_switch), Buffer: string(buffer), Pretty: true}
55 p.Init()
60 p := &Peg{Tree: tree.New(*inline, *_switch, *noast), Buffer: string(buffer)}
61 p.Init(Pretty(true), Size(1<<15))
5662 if err := p.Parse(); err != nil {
5763 log.Fatal(err)
5864 }
5965
6066 p.Execute()
6167
62 if *ast {
63 p.AST().Print(p.Buffer)
64 }
6568 if *print {
6669 p.Print()
6770 }
6871 if *syntax {
6972 p.PrintSyntaxTree()
7073 }
71 if *highlight {
72 p.Highlighter()
74
75 if *filename == "" {
76 *filename = file + ".go"
7377 }
74
75 filename := file + ".go"
76 out, error := os.OpenFile(filename, os.O_RDWR|os.O_CREATE|os.O_TRUNC, 0644)
77 if error != nil {
78 fmt.Printf("%v: %v\n", filename, error)
78 out, err := os.OpenFile(*filename, os.O_RDWR|os.O_CREATE|os.O_TRUNC, 0644)
79 if err != nil {
80 fmt.Printf("%v: %v\n", *filename, err)
7981 return
8082 }
8183 defer out.Close()
82 p.Compile(filename, out)
84
85 p.Strict = *strict
86 if err = p.Compile(*filename, os.Args, out); err != nil {
87 log.Fatal(err)
88 }
8389 }
+0
-1593
peg.go
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package main
5
6 import (
7 "bytes"
8 "fmt"
9 "go/parser"
10 "go/printer"
11 "go/token"
12 "io"
13 "math"
14 "os"
15 "strconv"
16 "strings"
17 "text/template"
18
19 "github.com/pointlander/jetset"
20 )
21
22 const pegHeaderTemplate = `package {{.PackageName}}
23
24 import (
25 {{range .Imports}}"{{.}}"
26 {{end}}
27 )
28
29 const endSymbol rune = {{.EndSymbol}}
30
31 /* The rule types inferred from the grammar are below. */
32 type pegRule {{.PegRuleType}}
33
34 const (
35 ruleUnknown pegRule = iota
36 {{range .RuleNames}}rule{{.String}}
37 {{end}}
38 rulePre
39 ruleIn
40 ruleSuf
41 )
42
43 var rul3s = [...]string {
44 "Unknown",
45 {{range .RuleNames}}"{{.String}}",
46 {{end}}
47 "Pre_",
48 "_In_",
49 "_Suf",
50 }
51
52 type node32 struct {
53 token32
54 up, next *node32
55 }
56
57 func (node *node32) print(depth int, buffer string) {
58 for node != nil {
59 for c := 0; c < depth; c++ {
60 fmt.Printf(" ")
61 }
62 fmt.Printf("\x1B[34m%v\x1B[m %v\n", rul3s[node.pegRule], strconv.Quote(string(([]rune(buffer)[node.begin:node.end]))))
63 if node.up != nil {
64 node.up.print(depth + 1, buffer)
65 }
66 node = node.next
67 }
68 }
69
70 func (node *node32) Print(buffer string) {
71 node.print(0, buffer)
72 }
73
74 type element struct {
75 node *node32
76 down *element
77 }
78
79 {{range .Sizes}}
80
81 /* ${@} bit structure for abstract syntax tree */
82 type token{{.}} struct {
83 pegRule
84 begin, end, next uint{{.}}
85 }
86
87 func (t *token{{.}}) isZero() bool {
88 return t.pegRule == ruleUnknown && t.begin == 0 && t.end == 0 && t.next == 0
89 }
90
91 func (t *token{{.}}) isParentOf(u token{{.}}) bool {
92 return t.begin <= u.begin && t.end >= u.end && t.next > u.next
93 }
94
95 func (t *token{{.}}) getToken32() token32 {
96 return token32{pegRule: t.pegRule, begin: uint32(t.begin), end: uint32(t.end), next: uint32(t.next)}
97 }
98
99 func (t *token{{.}}) String() string {
100 return fmt.Sprintf("\x1B[34m%v\x1B[m %v %v %v", rul3s[t.pegRule], t.begin, t.end, t.next)
101 }
102
103 type tokens{{.}} struct {
104 tree []token{{.}}
105 ordered [][]token{{.}}
106 }
107
108 func (t *tokens{{.}}) trim(length int) {
109 t.tree = t.tree[0:length]
110 }
111
112 func (t *tokens{{.}}) Print() {
113 for _, token := range t.tree {
114 fmt.Println(token.String())
115 }
116 }
117
118 func (t *tokens{{.}}) Order() [][]token{{.}} {
119 if t.ordered != nil {
120 return t.ordered
121 }
122
123 depths := make([]int{{.}}, 1, math.MaxInt16)
124 for i, token := range t.tree {
125 if token.pegRule == ruleUnknown {
126 t.tree = t.tree[:i]
127 break
128 }
129 depth := int(token.next)
130 if length := len(depths); depth >= length {
131 depths = depths[:depth + 1]
132 }
133 depths[depth]++
134 }
135 depths = append(depths, 0)
136
137 ordered, pool := make([][]token{{.}}, len(depths)), make([]token{{.}}, len(t.tree) + len(depths))
138 for i, depth := range depths {
139 depth++
140 ordered[i], pool, depths[i] = pool[:depth], pool[depth:], 0
141 }
142
143 for i, token := range t.tree {
144 depth := token.next
145 token.next = uint{{.}}(i)
146 ordered[depth][depths[depth]] = token
147 depths[depth]++
148 }
149 t.ordered = ordered
150 return ordered
151 }
152
153 type state{{.}} struct {
154 token{{.}}
155 depths []int{{.}}
156 leaf bool
157 }
158
159 func (t *tokens{{.}}) AST() *node32 {
160 tokens := t.Tokens()
161 stack := &element{node: &node32{token32:<-tokens}}
162 for token := range tokens {
163 if token.begin == token.end {
164 continue
165 }
166 node := &node32{token32: token}
167 for stack != nil && stack.node.begin >= token.begin && stack.node.end <= token.end {
168 stack.node.next = node.up
169 node.up = stack.node
170 stack = stack.down
171 }
172 stack = &element{node: node, down: stack}
173 }
174 return stack.node
175 }
176
177 func (t *tokens{{.}}) PreOrder() (<-chan state{{.}}, [][]token{{.}}) {
178 s, ordered := make(chan state{{.}}, 6), t.Order()
179 go func() {
180 var states [8]state{{.}}
181 for i := range states {
182 states[i].depths = make([]int{{.}}, len(ordered))
183 }
184 depths, state, depth := make([]int{{.}}, len(ordered)), 0, 1
185 write := func(t token{{.}}, leaf bool) {
186 S := states[state]
187 state, S.pegRule, S.begin, S.end, S.next, S.leaf = (state + 1) % 8, t.pegRule, t.begin, t.end, uint{{.}}(depth), leaf
188 copy(S.depths, depths)
189 s <- S
190 }
191
192 states[state].token{{.}} = ordered[0][0]
193 depths[0]++
194 state++
195 a, b := ordered[depth - 1][depths[depth - 1] - 1], ordered[depth][depths[depth]]
196 depthFirstSearch: for {
197 for {
198 if i := depths[depth]; i > 0 {
199 if c, j := ordered[depth][i - 1], depths[depth - 1]; a.isParentOf(c) &&
200 (j < 2 || !ordered[depth - 1][j - 2].isParentOf(c)) {
201 if c.end != b.begin {
202 write(token{{.}} {pegRule: ruleIn, begin: c.end, end: b.begin}, true)
203 }
204 break
205 }
206 }
207
208 if a.begin < b.begin {
209 write(token{{.}} {pegRule: rulePre, begin: a.begin, end: b.begin}, true)
210 }
211 break
212 }
213
214 next := depth + 1
215 if c := ordered[next][depths[next]]; c.pegRule != ruleUnknown && b.isParentOf(c) {
216 write(b, false)
217 depths[depth]++
218 depth, a, b = next, b, c
219 continue
220 }
221
222 write(b, true)
223 depths[depth]++
224 c, parent := ordered[depth][depths[depth]], true
225 for {
226 if c.pegRule != ruleUnknown && a.isParentOf(c) {
227 b = c
228 continue depthFirstSearch
229 } else if parent && b.end != a.end {
230 write(token{{.}} {pegRule: ruleSuf, begin: b.end, end: a.end}, true)
231 }
232
233 depth--
234 if depth > 0 {
235 a, b, c = ordered[depth - 1][depths[depth - 1] - 1], a, ordered[depth][depths[depth]]
236 parent = a.isParentOf(b)
237 continue
238 }
239
240 break depthFirstSearch
241 }
242 }
243
244 close(s)
245 }()
246 return s, ordered
247 }
248
249 func (t *tokens{{.}}) PrintSyntax() {
250 tokens, ordered := t.PreOrder()
251 max := -1
252 for token := range tokens {
253 if !token.leaf {
254 fmt.Printf("%v", token.begin)
255 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
256 fmt.Printf(" \x1B[36m%v\x1B[m", rul3s[ordered[i][depths[i] - 1].pegRule])
257 }
258 fmt.Printf(" \x1B[36m%v\x1B[m\n", rul3s[token.pegRule])
259 } else if token.begin == token.end {
260 fmt.Printf("%v", token.begin)
261 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
262 fmt.Printf(" \x1B[31m%v\x1B[m", rul3s[ordered[i][depths[i] - 1].pegRule])
263 }
264 fmt.Printf(" \x1B[31m%v\x1B[m\n", rul3s[token.pegRule])
265 } else {
266 for c, end := token.begin, token.end; c < end; c++ {
267 if i := int(c); max + 1 < i {
268 for j := max; j < i; j++ {
269 fmt.Printf("skip %v %v\n", j, token.String())
270 }
271 max = i
272 } else if i := int(c); i <= max {
273 for j := i; j <= max; j++ {
274 fmt.Printf("dupe %v %v\n", j, token.String())
275 }
276 } else {
277 max = int(c)
278 }
279 fmt.Printf("%v", c)
280 for i, leaf, depths := 0, int(token.next), token.depths; i < leaf; i++ {
281 fmt.Printf(" \x1B[34m%v\x1B[m", rul3s[ordered[i][depths[i] - 1].pegRule])
282 }
283 fmt.Printf(" \x1B[34m%v\x1B[m\n", rul3s[token.pegRule])
284 }
285 fmt.Printf("\n")
286 }
287 }
288 }
289
290 func (t *tokens{{.}}) PrintSyntaxTree(buffer string) {
291 tokens, _ := t.PreOrder()
292 for token := range tokens {
293 for c := 0; c < int(token.next); c++ {
294 fmt.Printf(" ")
295 }
296 fmt.Printf("\x1B[34m%v\x1B[m %v\n", rul3s[token.pegRule], strconv.Quote(string(([]rune(buffer)[token.begin:token.end]))))
297 }
298 }
299
300 func (t *tokens{{.}}) Add(rule pegRule, begin, end, depth uint32, index int) {
301 t.tree[index] = token{{.}}{pegRule: rule, begin: uint{{.}}(begin), end: uint{{.}}(end), next: uint{{.}}(depth)}
302 }
303
304 func (t *tokens{{.}}) Tokens() <-chan token32 {
305 s := make(chan token32, 16)
306 go func() {
307 for _, v := range t.tree {
308 s <- v.getToken32()
309 }
310 close(s)
311 }()
312 return s
313 }
314
315 func (t *tokens{{.}}) Error() []token32 {
316 ordered := t.Order()
317 length := len(ordered)
318 tokens, length := make([]token32, length), length - 1
319 for i := range tokens {
320 o := ordered[length - i]
321 if len(o) > 1 {
322 tokens[i] = o[len(o) - 2].getToken32()
323 }
324 }
325 return tokens
326 }
327 {{end}}
328
329 func (t *tokens32) Expand(index int) {
330 tree := t.tree
331 if index >= len(tree) {
332 expanded := make([]token32, 2 * len(tree))
333 copy(expanded, tree)
334 t.tree = expanded
335 }
336 }
337
338 type {{.StructName}} struct {
339 {{.StructVariables}}
340 Buffer string
341 buffer []rune
342 rules [{{.RulesCount}}]func() bool
343 Parse func(rule ...int) error
344 Reset func()
345 Pretty bool
346 tokens32
347 }
348
349 type textPosition struct {
350 line, symbol int
351 }
352
353 type textPositionMap map[int] textPosition
354
355 func translatePositions(buffer []rune, positions []int) textPositionMap {
356 length, translations, j, line, symbol := len(positions), make(textPositionMap, len(positions)), 0, 1, 0
357 sort.Ints(positions)
358
359 search: for i, c := range buffer {
360 if c == '\n' {line, symbol = line + 1, 0} else {symbol++}
361 if i == positions[j] {
362 translations[positions[j]] = textPosition{line, symbol}
363 for j++; j < length; j++ {if i != positions[j] {continue search}}
364 break search
365 }
366 }
367
368 return translations
369 }
370
371 type parseError struct {
372 p *{{.StructName}}
373 max token32
374 }
375
376 func (e *parseError) Error() string {
377 tokens, error := []token32{e.max}, "\n"
378 positions, p := make([]int, 2 * len(tokens)), 0
379 for _, token := range tokens {
380 positions[p], p = int(token.begin), p + 1
381 positions[p], p = int(token.end), p + 1
382 }
383 translations := translatePositions(e.p.buffer, positions)
384 format := "parse error near %v (line %v symbol %v - line %v symbol %v):\n%v\n"
385 if e.p.Pretty {
386 format = "parse error near \x1B[34m%v\x1B[m (line %v symbol %v - line %v symbol %v):\n%v\n"
387 }
388 for _, token := range tokens {
389 begin, end := int(token.begin), int(token.end)
390 error += fmt.Sprintf(format,
391 rul3s[token.pegRule],
392 translations[begin].line, translations[begin].symbol,
393 translations[end].line, translations[end].symbol,
394 strconv.Quote(string(e.p.buffer[begin:end])))
395 }
396
397 return error
398 }
399
400 func (p *{{.StructName}}) PrintSyntaxTree() {
401 p.tokens32.PrintSyntaxTree(p.Buffer)
402 }
403
404 func (p *{{.StructName}}) Highlighter() {
405 p.PrintSyntax()
406 }
407
408 {{if .HasActions}}
409 func (p *{{.StructName}}) Execute() {
410 buffer, _buffer, text, begin, end := p.Buffer, p.buffer, "", 0, 0
411 for token := range p.Tokens() {
412 switch (token.pegRule) {
413 {{if .HasPush}}
414 case rulePegText:
415 begin, end = int(token.begin), int(token.end)
416 text = string(_buffer[begin:end])
417 {{end}}
418 {{range .Actions}}case ruleAction{{.GetId}}:
419 {{.String}}
420 {{end}}
421 }
422 }
423 _, _, _, _, _ = buffer, _buffer, text, begin, end
424 }
425 {{end}}
426
427 func (p *{{.StructName}}) Init() {
428 p.buffer = []rune(p.Buffer)
429 if len(p.buffer) == 0 || p.buffer[len(p.buffer) - 1] != endSymbol {
430 p.buffer = append(p.buffer, endSymbol)
431 }
432
433 tree := tokens32{tree: make([]token32, math.MaxInt16)}
434 var max token32
435 position, depth, tokenIndex, buffer, _rules := uint32(0), uint32(0), 0, p.buffer, p.rules
436
437 p.Parse = func(rule ...int) error {
438 r := 1
439 if len(rule) > 0 {
440 r = rule[0]
441 }
442 matches := p.rules[r]()
443 p.tokens32 = tree
444 if matches {
445 p.trim(tokenIndex)
446 return nil
447 }
448 return &parseError{p, max}
449 }
450
451 p.Reset = func() {
452 position, tokenIndex, depth = 0, 0, 0
453 }
454
455 add := func(rule pegRule, begin uint32) {
456 tree.Expand(tokenIndex)
457 tree.Add(rule, begin, position, depth, tokenIndex)
458 tokenIndex++
459 if begin != position && position > max.end {
460 max = token32{rule, begin, position, depth}
461 }
462 }
463
464 {{if .HasDot}}
465 matchDot := func() bool {
466 if buffer[position] != endSymbol {
467 position++
468 return true
469 }
470 return false
471 }
472 {{end}}
473
474 {{if .HasCharacter}}
475 /*matchChar := func(c byte) bool {
476 if buffer[position] == c {
477 position++
478 return true
479 }
480 return false
481 }*/
482 {{end}}
483
484 {{if .HasString}}
485 matchString := func(s string) bool {
486 i := position
487 for _, c := range s {
488 if buffer[i] != c {
489 return false
490 }
491 i++
492 }
493 position = i
494 return true
495 }
496 {{end}}
497
498 {{if .HasRange}}
499 /*matchRange := func(lower byte, upper byte) bool {
500 if c := buffer[position]; c >= lower && c <= upper {
501 position++
502 return true
503 }
504 return false
505 }*/
506 {{end}}
507
508 _rules = [...]func() bool {
509 nil,`
510
511 type Type uint8
512
513 const (
514 TypeUnknown Type = iota
515 TypeRule
516 TypeName
517 TypeDot
518 TypeCharacter
519 TypeRange
520 TypeString
521 TypePredicate
522 TypeStateChange
523 TypeCommit
524 TypeAction
525 TypePackage
526 TypeImport
527 TypeState
528 TypeAlternate
529 TypeUnorderedAlternate
530 TypeSequence
531 TypePeekFor
532 TypePeekNot
533 TypeQuery
534 TypeStar
535 TypePlus
536 TypePeg
537 TypePush
538 TypeImplicitPush
539 TypeNil
540 TypeLast
541 )
542
543 var TypeMap = [...]string{
544 "TypeUnknown",
545 "TypeRule",
546 "TypeName",
547 "TypeDot",
548 "TypeCharacter",
549 "TypeRange",
550 "TypeString",
551 "TypePredicate",
552 "TypeCommit",
553 "TypeAction",
554 "TypePackage",
555 "TypeImport",
556 "TypeState",
557 "TypeAlternate",
558 "TypeUnorderedAlternate",
559 "TypeSequence",
560 "TypePeekFor",
561 "TypePeekNot",
562 "TypeQuery",
563 "TypeStar",
564 "TypePlus",
565 "TypePeg",
566 "TypePush",
567 "TypeImplicitPush",
568 "TypeNil",
569 "TypeLast"}
570
571 func (t Type) GetType() Type {
572 return t
573 }
574
575 type Node interface {
576 fmt.Stringer
577 debug()
578
579 Escaped() string
580 SetString(s string)
581
582 GetType() Type
583 SetType(t Type)
584
585 GetId() int
586 SetId(id int)
587
588 Init()
589 Front() *node
590 Next() *node
591 PushFront(value *node)
592 PopFront() *node
593 PushBack(value *node)
594 Len() int
595 Copy() *node
596 Slice() []*node
597 }
598
599 type node struct {
600 Type
601 string
602 id int
603
604 front *node
605 back *node
606 length int
607
608 /* use hash table here instead of Copy? */
609 next *node
610 }
611
612 func (n *node) String() string {
613 return n.string
614 }
615
616 func (n *node) debug() {
617 if len(n.string) == 1 {
618 fmt.Printf("%v %v '%v' %d\n", n.id, TypeMap[n.Type], n.string, n.string[0])
619 } else {
620 fmt.Printf("%v %v '%v'\n", n.id, TypeMap[n.Type], n.string)
621 }
622 }
623
624 func (n *node) Escaped() string {
625 return escape(n.string)
626 }
627
628 func (n *node) SetString(s string) {
629 n.string = s
630 }
631
632 func (n *node) SetType(t Type) {
633 n.Type = t
634 }
635
636 func (n *node) GetId() int {
637 return n.id
638 }
639
640 func (n *node) SetId(id int) {
641 n.id = id
642 }
643
644 func (n *node) Init() {
645 n.front = nil
646 n.back = nil
647 n.length = 0
648 }
649
650 func (n *node) Front() *node {
651 return n.front
652 }
653
654 func (n *node) Next() *node {
655 return n.next
656 }
657
658 func (n *node) PushFront(value *node) {
659 if n.back == nil {
660 n.back = value
661 } else {
662 value.next = n.front
663 }
664 n.front = value
665 n.length++
666 }
667
668 func (n *node) PopFront() *node {
669 front := n.front
670
671 switch true {
672 case front == nil:
673 panic("tree is empty")
674 case front == n.back:
675 n.front, n.back = nil, nil
676 default:
677 n.front, front.next = front.next, nil
678 }
679
680 n.length--
681 return front
682 }
683
684 func (n *node) PushBack(value *node) {
685 if n.front == nil {
686 n.front = value
687 } else {
688 n.back.next = value
689 }
690 n.back = value
691 n.length++
692 }
693
694 func (n *node) Len() (c int) {
695 return n.length
696 }
697
698 func (n *node) Copy() *node {
699 return &node{Type: n.Type, string: n.string, id: n.id, front: n.front, back: n.back, length: n.length}
700 }
701
702 func (n *node) Slice() []*node {
703 s := make([]*node, n.length)
704 for element, i := n.Front(), 0; element != nil; element, i = element.Next(), i+1 {
705 s[i] = element
706 }
707 return s
708 }
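The `node` type above doubles as an intrusive singly-linked list: each node holds `front`/`back` pointers to its first and last child and a `next` pointer to its sibling, so `PushFront`, `PopFront`, and `PushBack` manipulate three fields directly instead of wrapping `container/list`. A reduced standalone sketch of the same structure (names are illustrative, not from the package):

```go
package main

import "fmt"

// item is a reduced version of node above: an intrusive singly-linked
// list where the parent tracks its first and last child directly.
type item struct {
	value       string
	front, back *item
	next        *item
	length      int
}

// PushBack appends v as the last child, linking it after the old back.
func (n *item) PushBack(v *item) {
	if n.front == nil {
		n.front = v
	} else {
		n.back.next = v
	}
	n.back = v
	n.length++
}

// PopFront unlinks and returns the first child, panicking when empty.
func (n *item) PopFront() *item {
	front := n.front
	switch {
	case front == nil:
		panic("list is empty")
	case front == n.back:
		n.front, n.back = nil, nil
	default:
		n.front, front.next = front.next, nil
	}
	n.length--
	return front
}

func main() {
	var list item
	list.PushBack(&item{value: "a"})
	list.PushBack(&item{value: "b"})
	fmt.Println(list.PopFront().value, list.length) // a 1
}
```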
709
710 /* A tree data structure into which a PEG can be parsed. */
711 type Tree struct {
712 Rules map[string]Node
713 rulesCount map[string]uint
714 node
715 inline, _switch bool
716
717 RuleNames []Node
718 Sizes [1]int
719 PackageName string
720 Imports []string
721 EndSymbol rune
722 PegRuleType string
723 StructName string
724 StructVariables string
725 RulesCount int
726 Bits int
727 HasActions bool
728 Actions []Node
729 HasPush bool
730 HasCommit bool
731 HasDot bool
732 HasCharacter bool
733 HasString bool
734 HasRange bool
735 }
736
737 func New(inline, _switch bool) *Tree {
738 return &Tree{Rules: make(map[string]Node),
739 Sizes: [1]int{32},
740 rulesCount: make(map[string]uint),
741 inline: inline,
742 _switch: _switch}
743 }
744
745 func (t *Tree) AddRule(name string) {
746 t.PushFront(&node{Type: TypeRule, string: name, id: t.RulesCount})
747 t.RulesCount++
748 }
749
750 func (t *Tree) AddExpression() {
751 expression := t.PopFront()
752 rule := t.PopFront()
753 rule.PushBack(expression)
754 t.PushBack(rule)
755 }
756
757 func (t *Tree) AddName(text string) {
758 t.PushFront(&node{Type: TypeName, string: text})
759 }
760
761 func (t *Tree) AddDot() { t.PushFront(&node{Type: TypeDot, string: "."}) }
762 func (t *Tree) AddCharacter(text string) {
763 t.PushFront(&node{Type: TypeCharacter, string: text})
764 }
765 func (t *Tree) AddDoubleCharacter(text string) {
766 t.PushFront(&node{Type: TypeCharacter, string: strings.ToLower(text)})
767 t.PushFront(&node{Type: TypeCharacter, string: strings.ToUpper(text)})
768 t.AddAlternate()
769 }
770 func (t *Tree) AddHexaCharacter(text string) {
771 hexa, _ := strconv.ParseInt(text, 16, 32)
772 t.PushFront(&node{Type: TypeCharacter, string: string(rune(hexa))})
773 }
774 func (t *Tree) AddOctalCharacter(text string) {
775 octal, _ := strconv.ParseInt(text, 8, 8)
776 t.PushFront(&node{Type: TypeCharacter, string: string(rune(octal))})
777 }
778 func (t *Tree) AddPredicate(text string) { t.PushFront(&node{Type: TypePredicate, string: text}) }
779 func (t *Tree) AddStateChange(text string) { t.PushFront(&node{Type: TypeStateChange, string: text}) }
780 func (t *Tree) AddNil() { t.PushFront(&node{Type: TypeNil, string: "<nil>"}) }
781 func (t *Tree) AddAction(text string) { t.PushFront(&node{Type: TypeAction, string: text}) }
782 func (t *Tree) AddPackage(text string) { t.PushBack(&node{Type: TypePackage, string: text}) }
783 func (t *Tree) AddImport(text string) { t.PushBack(&node{Type: TypeImport, string: text}) }
784 func (t *Tree) AddState(text string) {
785 peg := t.PopFront()
786 peg.PushBack(&node{Type: TypeState, string: text})
787 t.PushBack(peg)
788 }
789
790 func (t *Tree) addList(listType Type) {
791 a := t.PopFront()
792 b := t.PopFront()
793 var l *node
794 if b.GetType() == listType {
795 l = b
796 } else {
797 l = &node{Type: listType}
798 l.PushBack(b)
799 }
800 l.PushBack(a)
801 t.PushFront(l)
802 }
803 func (t *Tree) AddAlternate() { t.addList(TypeAlternate) }
804 func (t *Tree) AddSequence() { t.addList(TypeSequence) }
805 func (t *Tree) AddRange() { t.addList(TypeRange) }
806 func (t *Tree) AddDoubleRange() {
807 a := t.PopFront()
808 b := t.PopFront()
809
810 t.AddCharacter(strings.ToLower(b.String()))
811 t.AddCharacter(strings.ToLower(a.String()))
812 t.addList(TypeRange)
813
814 t.AddCharacter(strings.ToUpper(b.String()))
815 t.AddCharacter(strings.ToUpper(a.String()))
816 t.addList(TypeRange)
817
818 t.AddAlternate()
819 }
820
821 func (t *Tree) addFix(fixType Type) {
822 n := &node{Type: fixType}
823 n.PushBack(t.PopFront())
824 t.PushFront(n)
825 }
826 func (t *Tree) AddPeekFor() { t.addFix(TypePeekFor) }
827 func (t *Tree) AddPeekNot() { t.addFix(TypePeekNot) }
828 func (t *Tree) AddQuery() { t.addFix(TypeQuery) }
829 func (t *Tree) AddStar() { t.addFix(TypeStar) }
830 func (t *Tree) AddPlus() { t.addFix(TypePlus) }
831 func (t *Tree) AddPush() { t.addFix(TypePush) }
832
833 func (t *Tree) AddPeg(text string) { t.PushFront(&node{Type: TypePeg, string: text}) }
834
835 func join(tasks []func()) {
836 length := len(tasks)
837 done := make(chan int, length)
838 for _, task := range tasks {
839 go func(task func()) { task(); done <- 1 }(task)
840 }
841 for d := <-done; d < length; d += <-done {
842 }
843 }
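`join` above fans a slice of tasks out to goroutines and blocks until all of them have signalled completion on a buffered channel. A minimal standalone sketch of the same fan-out/fan-in pattern (the `runAll` name is illustrative):

```go
package main

import "fmt"

// runAll executes every task concurrently and returns once each one has
// signalled completion on the buffered done channel, mirroring join above.
func runAll(tasks []func()) {
	done := make(chan struct{}, len(tasks))
	for _, task := range tasks {
		go func(task func()) {
			task()
			done <- struct{}{}
		}(task)
	}
	// Receive exactly one completion signal per task.
	for range tasks {
		<-done
	}
}

func main() {
	results := make([]int, 3)
	runAll([]func(){
		func() { results[0] = 1 },
		func() { results[1] = 2 },
		func() { results[2] = 3 },
	})
	fmt.Println(results) // [1 2 3]
}
```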
844
845 func escape(c string) string {
846 switch c {
847 case "'":
848 return "\\'"
849 case "\"":
850 return "\""
851 default:
852 c = strconv.Quote(c)
853 return c[1 : len(c)-1]
854 }
855 }
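`escape` above prepares a single character for embedding in a generated single-quoted Go rune literal: `strconv.Quote` escapes the character and wraps it in double quotes, the slice strips those quotes, and `'` is special-cased because `Quote` would leave it unescaped. A standalone copy for illustration:

```go
package main

import (
	"fmt"
	"strconv"
)

// escapeChar mirrors escape above: single quotes are escaped by hand,
// everything else goes through strconv.Quote with the wrapping
// double quotes sliced off.
func escapeChar(c string) string {
	switch c {
	case "'":
		return "\\'"
	case "\"":
		return "\""
	default:
		q := strconv.Quote(c)
		return q[1 : len(q)-1]
	}
}

func main() {
	fmt.Println(escapeChar("'"))  // \'
	fmt.Println(escapeChar("\n")) // \n
	fmt.Println(escapeChar("a"))  // a
}
```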
856
857 func (t *Tree) Compile(file string, out io.Writer) {
858 t.AddImport("fmt")
859 t.AddImport("math")
860 t.AddImport("sort")
861 t.AddImport("strconv")
862 t.EndSymbol = 0x110000
863 t.RulesCount++
864
865 counts := [TypeLast]uint{}
866 {
867 var rule *node
868 var link func(node Node)
869 link = func(n Node) {
870 nodeType := n.GetType()
871 id := counts[nodeType]
872 counts[nodeType]++
873 switch nodeType {
874 case TypeAction:
875 n.SetId(int(id))
876 copy, name := n.Copy(), fmt.Sprintf("Action%v", id)
877 t.Actions = append(t.Actions, copy)
878 n.Init()
879 n.SetType(TypeName)
880 n.SetString(name)
881 n.SetId(t.RulesCount)
882
883 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
884 implicitPush := &node{Type: TypeImplicitPush}
885 emptyRule.PushBack(implicitPush)
886 implicitPush.PushBack(copy)
887 implicitPush.PushBack(emptyRule.Copy())
888 t.PushBack(emptyRule)
889 t.RulesCount++
890
891 t.Rules[name] = emptyRule
892 t.RuleNames = append(t.RuleNames, emptyRule)
893 case TypeName:
894 name := n.String()
895 if _, ok := t.Rules[name]; !ok {
896 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
897 implicitPush := &node{Type: TypeImplicitPush}
898 emptyRule.PushBack(implicitPush)
899 implicitPush.PushBack(&node{Type: TypeNil, string: "<nil>"})
900 implicitPush.PushBack(emptyRule.Copy())
901 t.PushBack(emptyRule)
902 t.RulesCount++
903
904 t.Rules[name] = emptyRule
905 t.RuleNames = append(t.RuleNames, emptyRule)
906 }
907 case TypePush:
908 copy, name := rule.Copy(), "PegText"
909 copy.SetString(name)
910 if _, ok := t.Rules[name]; !ok {
911 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
912 emptyRule.PushBack(&node{Type: TypeNil, string: "<nil>"})
913 t.PushBack(emptyRule)
914 t.RulesCount++
915
916 t.Rules[name] = emptyRule
917 t.RuleNames = append(t.RuleNames, emptyRule)
918 }
919 n.PushBack(copy)
920 fallthrough
921 case TypeImplicitPush:
922 link(n.Front())
923 case TypeRule, TypeAlternate, TypeUnorderedAlternate, TypeSequence,
924 TypePeekFor, TypePeekNot, TypeQuery, TypeStar, TypePlus:
925 for _, node := range n.Slice() {
926 link(node)
927 }
928 }
929 }
930 /* first pass */
931 for _, node := range t.Slice() {
932 switch node.GetType() {
933 case TypePackage:
934 t.PackageName = node.String()
935 case TypeImport:
936 t.Imports = append(t.Imports, node.String())
937 case TypePeg:
938 t.StructName = node.String()
939 t.StructVariables = node.Front().String()
940 case TypeRule:
941 if _, ok := t.Rules[node.String()]; !ok {
942 expression := node.Front()
943 copy := expression.Copy()
944 expression.Init()
945 expression.SetType(TypeImplicitPush)
946 expression.PushBack(copy)
947 expression.PushBack(node.Copy())
948
949 t.Rules[node.String()] = node
950 t.RuleNames = append(t.RuleNames, node)
951 }
952 }
953 }
954 /* second pass */
955 for _, node := range t.Slice() {
956 if node.GetType() == TypeRule {
957 rule = node
958 link(node)
959 }
960 }
961 }
962
963 join([]func(){
964 func() {
965 var countRules func(node Node)
966 ruleReached := make([]bool, t.RulesCount)
967 countRules = func(node Node) {
968 switch node.GetType() {
969 case TypeRule:
970 name, id := node.String(), node.GetId()
971 if count, ok := t.rulesCount[name]; ok {
972 t.rulesCount[name] = count + 1
973 } else {
974 t.rulesCount[name] = 1
975 }
976 if ruleReached[id] {
977 return
978 }
979 ruleReached[id] = true
980 countRules(node.Front())
981 case TypeName:
982 countRules(t.Rules[node.String()])
983 case TypeImplicitPush, TypePush:
984 countRules(node.Front())
985 case TypeAlternate, TypeUnorderedAlternate, TypeSequence,
986 TypePeekFor, TypePeekNot, TypeQuery, TypeStar, TypePlus:
987 for _, element := range node.Slice() {
988 countRules(element)
989 }
990 }
991 }
992 for _, node := range t.Slice() {
993 if node.GetType() == TypeRule {
994 countRules(node)
995 break
996 }
997 }
998 },
999 func() {
1000 var checkRecursion func(node Node) bool
1001 ruleReached := make([]bool, t.RulesCount)
1002 checkRecursion = func(node Node) bool {
1003 switch node.GetType() {
1004 case TypeRule:
1005 id := node.GetId()
1006 if ruleReached[id] {
1007 fmt.Fprintf(os.Stderr, "possible infinite left recursion in rule '%v'\n", node)
1008 return false
1009 }
1010 ruleReached[id] = true
1011 consumes := checkRecursion(node.Front())
1012 ruleReached[id] = false
1013 return consumes
1014 case TypeAlternate:
1015 for _, element := range node.Slice() {
1016 if !checkRecursion(element) {
1017 return false
1018 }
1019 }
1020 return true
1021 case TypeSequence:
1022 for _, element := range node.Slice() {
1023 if checkRecursion(element) {
1024 return true
1025 }
1026 }
1027 case TypeName:
1028 return checkRecursion(t.Rules[node.String()])
1029 case TypePlus, TypePush, TypeImplicitPush:
1030 return checkRecursion(node.Front())
1031 case TypeCharacter, TypeString:
1032 return len(node.String()) > 0
1033 case TypeDot, TypeRange:
1034 return true
1035 }
1036 return false
1037 }
1038 for _, node := range t.Slice() {
1039 if node.GetType() == TypeRule {
1040 checkRecursion(node)
1041 }
1042 }
1043 }})
1044
1045 if t._switch {
1046 var optimizeAlternates func(node Node) (consumes bool, s jetset.Set)
1047 cache, firstPass := make([]struct {
1048 reached, consumes bool
1049 s jetset.Set
1050 }, t.RulesCount), true
1051 optimizeAlternates = func(n Node) (consumes bool, s jetset.Set) {
1052 /*n.debug()*/
1053 switch n.GetType() {
1054 case TypeRule:
1055 cache := &cache[n.GetId()]
1056 if cache.reached {
1057 consumes, s = cache.consumes, cache.s
1058 return
1059 }
1060
1061 cache.reached = true
1062 consumes, s = optimizeAlternates(n.Front())
1063 cache.consumes, cache.s = consumes, s
1064 case TypeName:
1065 consumes, s = optimizeAlternates(t.Rules[n.String()])
1066 case TypeDot:
1067 consumes = true
1068 /* TypeDot set doesn't include the EndSymbol */
1069 s = s.Add(uint64(t.EndSymbol))
1070 s = s.Complement(uint64(t.EndSymbol))
1071 case TypeString, TypeCharacter:
1072 consumes = true
1073 s = s.Add(uint64([]rune(n.String())[0]))
1074 case TypeRange:
1075 consumes = true
1076 element := n.Front()
1077 lower := []rune(element.String())[0]
1078 element = element.Next()
1079 upper := []rune(element.String())[0]
1080 s = s.AddRange(uint64(lower), uint64(upper))
1081 case TypeAlternate:
1082 consumes = true
1083 mconsumes, properties, c :=
1084 consumes, make([]struct {
1085 intersects bool
1086 s jetset.Set
1087 }, n.Len()), 0
1088 for _, element := range n.Slice() {
1089 mconsumes, properties[c].s = optimizeAlternates(element)
1090 consumes = consumes && mconsumes
1091 s = s.Union(properties[c].s)
1092 c++
1093 }
1094
1095 if firstPass {
1096 break
1097 }
1098
1099 intersections := 2
1100 compare:
1101 for ai, a := range properties[0 : len(properties)-1] {
1102 for _, b := range properties[ai+1:] {
1103 if a.s.Intersects(b.s) {
1104 intersections++
1105 properties[ai].intersects = true
1106 continue compare
1107 }
1108 }
1109 }
1110 if intersections >= len(properties) {
1111 break
1112 }
1113
1114 c, unordered, ordered, max :=
1115 0, &node{Type: TypeUnorderedAlternate}, &node{Type: TypeAlternate}, 0
1116 for _, element := range n.Slice() {
1117 if properties[c].intersects {
1118 ordered.PushBack(element.Copy())
1119 } else {
1120 class := &node{Type: TypeUnorderedAlternate}
1121 for d := 0; d < 256; d++ {
1122 if properties[c].s.Has(uint64(d)) {
1123 class.PushBack(&node{Type: TypeCharacter, string: string(rune(d))})
1124 }
1125 }
1126
1127 sequence, predicate, length :=
1128 &node{Type: TypeSequence}, &node{Type: TypePeekFor}, properties[c].s.Len()
1129 if length == 0 {
1130 class.PushBack(&node{Type: TypeNil, string: "<nil>"})
1131 }
1132 predicate.PushBack(class)
1133 sequence.PushBack(predicate)
1134 sequence.PushBack(element.Copy())
1135
1136 if element.GetType() == TypeNil {
1137 unordered.PushBack(sequence)
1138 } else if length > max {
1139 unordered.PushBack(sequence)
1140 max = length
1141 } else {
1142 unordered.PushFront(sequence)
1143 }
1144 }
1145 c++
1146 }
1147 n.Init()
1148 if ordered.Front() == nil {
1149 n.SetType(TypeUnorderedAlternate)
1150 for _, element := range unordered.Slice() {
1151 n.PushBack(element.Copy())
1152 }
1153 } else {
1154 for _, element := range ordered.Slice() {
1155 n.PushBack(element.Copy())
1156 }
1157 n.PushBack(unordered)
1158 }
1159 case TypeSequence:
1160 classes, elements :=
1161 make([]struct {
1162 s jetset.Set
1163 }, n.Len()), n.Slice()
1164
1165 for c, element := range elements {
1166 consumes, classes[c].s = optimizeAlternates(element)
1167 if consumes {
1168 elements, classes = elements[c+1:], classes[:c+1]
1169 break
1170 }
1171 }
1172
1173 for c := len(classes) - 1; c >= 0; c-- {
1174 s = s.Union(classes[c].s)
1175 }
1176
1177 for _, element := range elements {
1178 optimizeAlternates(element)
1179 }
1180 case TypePeekNot, TypePeekFor:
1181 optimizeAlternates(n.Front())
1182 case TypeQuery, TypeStar:
1183 _, s = optimizeAlternates(n.Front())
1184 case TypePlus, TypePush, TypeImplicitPush:
1185 consumes, s = optimizeAlternates(n.Front())
1186 case TypeAction, TypeNil:
1187 //empty
1188 }
1189 return
1190 }
1191 for _, element := range t.Slice() {
1192 if element.GetType() == TypeRule {
1193 optimizeAlternates(element)
1194 break
1195 }
1196 }
1197
1198 for i := range cache {
1199 cache[i].reached = false
1200 }
1201 firstPass = false
1202 for _, element := range t.Slice() {
1203 if element.GetType() == TypeRule {
1204 optimizeAlternates(element)
1205 break
1206 }
1207 }
1208 }
1209
1210 var buffer bytes.Buffer
1211 defer func() {
1212 fileSet := token.NewFileSet()
1213 code, err := parser.ParseFile(fileSet, file, &buffer, parser.ParseComments)
1214 if err != nil {
1215 buffer.WriteTo(out)
1216 fmt.Printf("%v: %v\n", file, err)
1217 return
1218 }
1219 formatter := printer.Config{Mode: printer.TabIndent | printer.UseSpaces, Tabwidth: 8}
1220 err = formatter.Fprint(out, fileSet, code)
1221 if err != nil {
1222 buffer.WriteTo(out)
1223 fmt.Printf("%v: %v\n", file, err)
1224 return
1225 }
1225 }
1226
1227 }()
1228
1229 _print := func(format string, a ...interface{}) { fmt.Fprintf(&buffer, format, a...) }
1230 printSave := func(n uint) { _print("\n position%d, tokenIndex%d, depth%d := position, tokenIndex, depth", n, n, n) }
1231 printRestore := func(n uint) { _print("\n position, tokenIndex, depth = position%d, tokenIndex%d, depth%d", n, n, n) }
1232 printTemplate := func(s string) {
1233 if err := template.Must(template.New("peg").Parse(s)).Execute(&buffer, t); err != nil {
1234 panic(err)
1235 }
1236 }
1237
1238 t.HasActions = counts[TypeAction] > 0
1239 t.HasPush = counts[TypePush] > 0
1240 t.HasCommit = counts[TypeCommit] > 0
1241 t.HasDot = counts[TypeDot] > 0
1242 t.HasCharacter = counts[TypeCharacter] > 0
1243 t.HasString = counts[TypeString] > 0
1244 t.HasRange = counts[TypeRange] > 0
1245
1246 var printRule func(n Node)
1247 var compile func(expression Node, ko uint)
1248 var label uint
1249 labels := make(map[uint]bool)
1250 printBegin := func() { _print("\n {") }
1251 printEnd := func() { _print("\n }") }
1252 printLabel := func(n uint) {
1253 _print("\n")
1254 if labels[n] {
1255 _print(" l%d:\t", n)
1256 }
1257 }
1258 printJump := func(n uint) {
1259 _print("\n goto l%d", n)
1260 labels[n] = true
1261 }
1262 printRule = func(n Node) {
1263 switch n.GetType() {
1264 case TypeRule:
1265 _print("%v <- ", n)
1266 printRule(n.Front())
1267 case TypeDot:
1268 _print(".")
1269 case TypeName:
1270 _print("%v", n)
1271 case TypeCharacter:
1272 _print("'%v'", escape(n.String()))
1273 case TypeString:
1274 s := escape(n.String())
1275 _print("'%v'", s[1:len(s)-1])
1276 case TypeRange:
1277 element := n.Front()
1278 lower := element
1279 element = element.Next()
1280 upper := element
1281 _print("[%v-%v]", escape(lower.String()), escape(upper.String()))
1282 case TypePredicate:
1283 _print("&{%v}", n)
1284 case TypeStateChange:
1285 _print("!{%v}", n)
1286 case TypeAction:
1287 _print("{%v}", n)
1288 case TypeCommit:
1289 _print("commit")
1290 case TypeAlternate:
1291 _print("(")
1292 elements := n.Slice()
1293 printRule(elements[0])
1294 for _, element := range elements[1:] {
1295 _print(" / ")
1296 printRule(element)
1297 }
1298 _print(")")
1299 case TypeUnorderedAlternate:
1300 _print("(")
1301 elements := n.Slice()
1302 printRule(elements[0])
1303 for _, element := range elements[1:] {
1304 _print(" | ")
1305 printRule(element)
1306 }
1307 _print(")")
1308 case TypeSequence:
1309 _print("(")
1310 elements := n.Slice()
1311 printRule(elements[0])
1312 for _, element := range elements[1:] {
1313 _print(" ")
1314 printRule(element)
1315 }
1316 _print(")")
1317 case TypePeekFor:
1318 _print("&")
1319 printRule(n.Front())
1320 case TypePeekNot:
1321 _print("!")
1322 printRule(n.Front())
1323 case TypeQuery:
1324 printRule(n.Front())
1325 _print("?")
1326 case TypeStar:
1327 printRule(n.Front())
1328 _print("*")
1329 case TypePlus:
1330 printRule(n.Front())
1331 _print("+")
1332 case TypePush, TypeImplicitPush:
1333 _print("<")
1334 printRule(n.Front())
1335 _print(">")
1336 case TypeNil:
1337 default:
1338 fmt.Fprintf(os.Stderr, "illegal node type: %v\n", n.GetType())
1339 }
1340 }
1341 compile = func(n Node, ko uint) {
1342 switch n.GetType() {
1343 case TypeRule:
1344 fmt.Fprintf(os.Stderr, "internal error #1 (%v)\n", n)
1345 case TypeDot:
1346 _print("\n if !matchDot() {")
1347 /*print("\n if buffer[position] == endSymbol {")*/
1348 printJump(ko)
1349 /*print("}\nposition++")*/
1350 _print("}")
1351 case TypeName:
1352 name := n.String()
1353 rule := t.Rules[name]
1354 if t.inline && t.rulesCount[name] == 1 {
1355 compile(rule.Front(), ko)
1356 return
1357 }
1358 _print("\n if !_rules[rule%v]() {", name /*rule.GetId()*/)
1359 printJump(ko)
1360 _print("}")
1361 case TypeRange:
1362 element := n.Front()
1363 lower := element
1364 element = element.Next()
1365 upper := element
1366 /*print("\n if !matchRange('%v', '%v') {", escape(lower.String()), escape(upper.String()))*/
1367 _print("\n if c := buffer[position]; c < rune('%v') || c > rune('%v') {", escape(lower.String()), escape(upper.String()))
1368 printJump(ko)
1369 _print("}\nposition++")
1370 case TypeCharacter:
1371 /*print("\n if !matchChar('%v') {", escape(n.String()))*/
1372 _print("\n if buffer[position] != rune('%v') {", escape(n.String()))
1373 printJump(ko)
1374 _print("}\nposition++")
1375 case TypeString:
1376 _print("\n if !matchString(%v) {", strconv.Quote(n.String()))
1377 printJump(ko)
1378 _print("}")
1379 case TypePredicate:
1380 _print("\n if !(%v) {", n)
1381 printJump(ko)
1382 _print("}")
1383 case TypeStateChange:
1384 _print("\n %v", n)
1385 case TypeAction:
1386 case TypeCommit:
1387 case TypePush:
1388 fallthrough
1389 case TypeImplicitPush:
1390 ok, element := label, n.Front()
1391 label++
1392 nodeType, rule := element.GetType(), element.Next()
1393 printBegin()
1394 if nodeType == TypeAction {
1395 _print("\nadd(rule%v, position)", rule)
1396 } else {
1397 _print("\nposition%d := position", ok)
1398 _print("\ndepth++")
1399 compile(element, ko)
1400 _print("\ndepth--")
1401 _print("\nadd(rule%v, position%d)", rule, ok)
1402 }
1403 printEnd()
1404 case TypeAlternate:
1405 ok := label
1406 label++
1407 printBegin()
1408 elements := n.Slice()
1409 printSave(ok)
1410 for _, element := range elements[:len(elements)-1] {
1411 next := label
1412 label++
1413 compile(element, next)
1414 printJump(ok)
1415 printLabel(next)
1416 printRestore(ok)
1417 }
1418 compile(elements[len(elements)-1], ko)
1419 printEnd()
1420 printLabel(ok)
1421 case TypeUnorderedAlternate:
1422 done, ok := ko, label
1423 label++
1424 printBegin()
1425 _print("\n switch buffer[position] {")
1426 elements := n.Slice()
1427 elements, last := elements[:len(elements)-1], elements[len(elements)-1].Front().Next()
1428 for _, element := range elements {
1429 sequence := element.Front()
1430 class := sequence.Front()
1431 sequence = sequence.Next()
1432 _print("\n case")
1433 comma := false
1434 for _, character := range class.Slice() {
1435 if comma {
1436 _print(",")
1437 } else {
1438 comma = true
1439 }
1440 _print(" '%s'", escape(character.String()))
1441 }
1442 _print(":")
1443 compile(sequence, done)
1444 _print("\nbreak")
1445 }
1446 _print("\n default:")
1447 compile(last, done)
1448 _print("\nbreak")
1449 _print("\n }")
1450 printEnd()
1451 printLabel(ok)
1452 case TypeSequence:
1453 for _, element := range n.Slice() {
1454 compile(element, ko)
1455 }
1456 case TypePeekFor:
1457 ok := label
1458 label++
1459 printBegin()
1460 printSave(ok)
1461 compile(n.Front(), ko)
1462 printRestore(ok)
1463 printEnd()
1464 case TypePeekNot:
1465 ok := label
1466 label++
1467 printBegin()
1468 printSave(ok)
1469 compile(n.Front(), ok)
1470 printJump(ko)
1471 printLabel(ok)
1472 printRestore(ok)
1473 printEnd()
1474 case TypeQuery:
1475 qko := label
1476 label++
1477 qok := label
1478 label++
1479 printBegin()
1480 printSave(qko)
1481 compile(n.Front(), qko)
1482 printJump(qok)
1483 printLabel(qko)
1484 printRestore(qko)
1485 printEnd()
1486 printLabel(qok)
1487 case TypeStar:
1488 again := label
1489 label++
1490 out := label
1491 label++
1492 printLabel(again)
1493 printBegin()
1494 printSave(out)
1495 compile(n.Front(), out)
1496 printJump(again)
1497 printLabel(out)
1498 printRestore(out)
1499 printEnd()
1500 case TypePlus:
1501 again := label
1502 label++
1503 out := label
1504 label++
1505 compile(n.Front(), ko)
1506 printLabel(again)
1507 printBegin()
1508 printSave(out)
1509 compile(n.Front(), out)
1510 printJump(again)
1511 printLabel(out)
1512 printRestore(out)
1513 printEnd()
1514 case TypeNil:
1515 default:
1516 fmt.Fprintf(os.Stderr, "illegal node type: %v\n", n.GetType())
1517 }
1518 }
1519
1520 /* let's figure out which jump labels will be used with this dry compile */
1521 printTemp, _print := _print, func(format string, a ...interface{}) {}
1522 for _, element := range t.Slice() {
1523 if element.GetType() != TypeRule {
1524 continue
1525 }
1526 expression := element.Front()
1527 if expression.GetType() == TypeNil {
1528 continue
1529 }
1530 ko := label
1531 label++
1532 if count, ok := t.rulesCount[element.String()]; !ok {
1533 continue
1534 } else if t.inline && count == 1 && ko != 0 {
1535 continue
1536 }
1537 compile(expression, ko)
1538 }
1539 _print, label = printTemp, 0
1540
1541 /* now for the real compile pass */
1542 t.PegRuleType = "uint8"
1543 if length := int64(t.Len()); length > math.MaxUint32 {
1544 t.PegRuleType = "uint64"
1545 } else if length > math.MaxUint16 {
1546 t.PegRuleType = "uint32"
1547 } else if length > math.MaxUint8 {
1548 t.PegRuleType = "uint16"
1549 }
1550 printTemplate(pegHeaderTemplate)
1551 for _, element := range t.Slice() {
1552 if element.GetType() != TypeRule {
1553 continue
1554 }
1555 expression := element.Front()
1556 if implicit := expression.Front(); expression.GetType() == TypeNil || implicit.GetType() == TypeNil {
1557 if element.String() != "PegText" {
1558 fmt.Fprintf(os.Stderr, "rule '%v' used but not defined\n", element)
1559 }
1560 _print("\n nil,")
1561 continue
1562 }
1563 ko := label
1564 label++
1565 _print("\n /* %v ", element.GetId())
1566 printRule(element)
1567 _print(" */")
1568 if count, ok := t.rulesCount[element.String()]; !ok {
1569 fmt.Fprintf(os.Stderr, "rule '%v' defined but not used\n", element)
1570 _print("\n nil,")
1571 continue
1572 } else if t.inline && count == 1 && ko != 0 {
1573 _print("\n nil,")
1574 continue
1575 }
1576 _print("\n func() bool {")
1577 if labels[ko] {
1578 printSave(ko)
1579 }
1580 compile(expression, ko)
1581 //print("\n fmt.Printf(\"%v\\n\")", element.String())
1582 _print("\n return true")
1583 if labels[ko] {
1584 printLabel(ko)
1585 printRestore(ko)
1586 _print("\n return false")
1587 }
1588 _print("\n },")
1589 }
1590 _print("\n }\n p.rules = _rules")
1591 _print("\n}\n")
1592 }
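`Compile` above ends by sizing `PegRuleType` to the narrowest unsigned integer type that can hold the token-tree length, defaulting to `uint8` and widening as needed. The selection logic, extracted as a standalone sketch (the `ruleType` name is illustrative):

```go
package main

import (
	"fmt"
	"math"
)

// ruleType mirrors the PegRuleType selection in Compile above: pick the
// narrowest unsigned type wide enough for length distinct values.
func ruleType(length int64) string {
	switch {
	case length > math.MaxUint32:
		return "uint64"
	case length > math.MaxUint16:
		return "uint32"
	case length > math.MaxUint8:
		return "uint16"
	default:
		return "uint8"
	}
}

func main() {
	fmt.Println(ruleType(200), ruleType(300), ruleType(1<<20)) // uint8 uint16 uint32
}
```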
99
1010 package main
1111
12 import "github.com/pointlander/peg/tree"
13
1214 # parser declaration
1315
1416 type Peg Peg {
15 *Tree
17 *tree.Tree
1618 }
1719
1820 # Hierarchical syntax
2224 'Peg' Spacing Action { p.AddState(text) }
2325 Definition+ EndOfFile
2426
25 Import <- 'import' Spacing ["] < [a-zA-Z_/.\-]+ > ["] Spacing { p.AddImport(text) }
27 Import <- 'import' Spacing (MultiImport / SingleImport) Spacing
28 SingleImport <- ImportName
29 MultiImport <- '(' Spacing (ImportName '\n' Spacing)* Spacing ')'
30
31 ImportName <- ["] < [0-9a-zA-Z_/.\-]+ > ["] { p.AddImport(text) }
2632
2733 Definition <- Identifier { p.AddRule(text) }
2834 LeftArrow Expression { p.AddExpression() } &(Identifier LeftArrow / !.)
109115 SpaceComment <- (Space / Comment)
110116 Spacing <- SpaceComment*
111117 MustSpacing <- SpaceComment+
112 Comment <- '#' (!EndOfLine .)* EndOfLine
118 Comment <- ('#' / '//') (!EndOfLine .)* EndOfLine
113119 Space <- ' ' / '\t' / EndOfLine
114120 EndOfLine <- '\r\n' / '\n' / '\r'
115121 EndOfFile <- !.
0 package main
1
2 // Code generated by ./peg -inline -switch peg.peg DO NOT EDIT.
3
4 import (
5 "fmt"
6 "github.com/pointlander/peg/tree"
7 "io"
8 "os"
9 "sort"
10 "strconv"
11 "strings"
12 )
13
14 const endSymbol rune = 1114112
15
16 /* The rule types inferred from the grammar are below. */
17 type pegRule uint8
18
19 const (
20 ruleUnknown pegRule = iota
21 ruleGrammar
22 ruleImport
23 ruleSingleImport
24 ruleMultiImport
25 ruleImportName
26 ruleDefinition
27 ruleExpression
28 ruleSequence
29 rulePrefix
30 ruleSuffix
31 rulePrimary
32 ruleIdentifier
33 ruleIdentStart
34 ruleIdentCont
35 ruleLiteral
36 ruleClass
37 ruleRanges
38 ruleDoubleRanges
39 ruleRange
40 ruleDoubleRange
41 ruleChar
42 ruleDoubleChar
43 ruleEscape
44 ruleLeftArrow
45 ruleSlash
46 ruleAnd
47 ruleNot
48 ruleQuestion
49 ruleStar
50 rulePlus
51 ruleOpen
52 ruleClose
53 ruleDot
54 ruleSpaceComment
55 ruleSpacing
56 ruleMustSpacing
57 ruleComment
58 ruleSpace
59 ruleEndOfLine
60 ruleEndOfFile
61 ruleAction
62 ruleActionBody
63 ruleBegin
64 ruleEnd
65 ruleAction0
66 ruleAction1
67 ruleAction2
68 rulePegText
69 ruleAction3
70 ruleAction4
71 ruleAction5
72 ruleAction6
73 ruleAction7
74 ruleAction8
75 ruleAction9
76 ruleAction10
77 ruleAction11
78 ruleAction12
79 ruleAction13
80 ruleAction14
81 ruleAction15
82 ruleAction16
83 ruleAction17
84 ruleAction18
85 ruleAction19
86 ruleAction20
87 ruleAction21
88 ruleAction22
89 ruleAction23
90 ruleAction24
91 ruleAction25
92 ruleAction26
93 ruleAction27
94 ruleAction28
95 ruleAction29
96 ruleAction30
97 ruleAction31
98 ruleAction32
99 ruleAction33
100 ruleAction34
101 ruleAction35
102 ruleAction36
103 ruleAction37
104 ruleAction38
105 ruleAction39
106 ruleAction40
107 ruleAction41
108 ruleAction42
109 ruleAction43
110 ruleAction44
111 ruleAction45
112 ruleAction46
113 ruleAction47
114 ruleAction48
115 )
116
117 var rul3s = [...]string{
118 "Unknown",
119 "Grammar",
120 "Import",
121 "SingleImport",
122 "MultiImport",
123 "ImportName",
124 "Definition",
125 "Expression",
126 "Sequence",
127 "Prefix",
128 "Suffix",
129 "Primary",
130 "Identifier",
131 "IdentStart",
132 "IdentCont",
133 "Literal",
134 "Class",
135 "Ranges",
136 "DoubleRanges",
137 "Range",
138 "DoubleRange",
139 "Char",
140 "DoubleChar",
141 "Escape",
142 "LeftArrow",
143 "Slash",
144 "And",
145 "Not",
146 "Question",
147 "Star",
148 "Plus",
149 "Open",
150 "Close",
151 "Dot",
152 "SpaceComment",
153 "Spacing",
154 "MustSpacing",
155 "Comment",
156 "Space",
157 "EndOfLine",
158 "EndOfFile",
159 "Action",
160 "ActionBody",
161 "Begin",
162 "End",
163 "Action0",
164 "Action1",
165 "Action2",
166 "PegText",
167 "Action3",
168 "Action4",
169 "Action5",
170 "Action6",
171 "Action7",
172 "Action8",
173 "Action9",
174 "Action10",
175 "Action11",
176 "Action12",
177 "Action13",
178 "Action14",
179 "Action15",
180 "Action16",
181 "Action17",
182 "Action18",
183 "Action19",
184 "Action20",
185 "Action21",
186 "Action22",
187 "Action23",
188 "Action24",
189 "Action25",
190 "Action26",
191 "Action27",
192 "Action28",
193 "Action29",
194 "Action30",
195 "Action31",
196 "Action32",
197 "Action33",
198 "Action34",
199 "Action35",
200 "Action36",
201 "Action37",
202 "Action38",
203 "Action39",
204 "Action40",
205 "Action41",
206 "Action42",
207 "Action43",
208 "Action44",
209 "Action45",
210 "Action46",
211 "Action47",
212 "Action48",
213 }
214
215 type token32 struct {
216 pegRule
217 begin, end uint32
218 }
219
220 func (t *token32) String() string {
221 return fmt.Sprintf("\x1B[34m%v\x1B[m %v %v", rul3s[t.pegRule], t.begin, t.end)
222 }
223
224 type node32 struct {
225 token32
226 up, next *node32
227 }
228
229 func (node *node32) print(w io.Writer, pretty bool, buffer string) {
230 var print func(node *node32, depth int)
231 print = func(node *node32, depth int) {
232 for node != nil {
233 for c := 0; c < depth; c++ {
234 fmt.Fprintf(w, " ")
235 }
236 rule := rul3s[node.pegRule]
237 quote := strconv.Quote(string(([]rune(buffer)[node.begin:node.end])))
238 if !pretty {
239 fmt.Fprintf(w, "%v %v\n", rule, quote)
240 } else {
241 fmt.Fprintf(w, "\x1B[36m%v\x1B[m %v\n", rule, quote)
242 }
243 if node.up != nil {
244 print(node.up, depth+1)
245 }
246 node = node.next
247 }
248 }
249 print(node, 0)
250 }
251
252 func (node *node32) Print(w io.Writer, buffer string) {
253 node.print(w, false, buffer)
254 }
255
256 func (node *node32) PrettyPrint(w io.Writer, buffer string) {
257 node.print(w, true, buffer)
258 }
259
260 type tokens32 struct {
261 tree []token32
262 }
263
264 func (t *tokens32) Trim(length uint32) {
265 t.tree = t.tree[:length]
266 }
267
268 func (t *tokens32) Print() {
269 for _, token := range t.tree {
270 fmt.Println(token.String())
271 }
272 }
273
274 func (t *tokens32) AST() *node32 {
275 type element struct {
276 node *node32
277 down *element
278 }
279 tokens := t.Tokens()
280 var stack *element
281 for _, token := range tokens {
282 if token.begin == token.end {
283 continue
284 }
285 node := &node32{token32: token}
286 for stack != nil && stack.node.begin >= token.begin && stack.node.end <= token.end {
287 stack.node.next = node.up
288 node.up = stack.node
289 stack = stack.down
290 }
291 stack = &element{node: node, down: stack}
292 }
293 if stack != nil {
294 return stack.node
295 }
296 return nil
297 }
298
299 func (t *tokens32) PrintSyntaxTree(buffer string) {
300 t.AST().Print(os.Stdout, buffer)
301 }
302
303 func (t *tokens32) WriteSyntaxTree(w io.Writer, buffer string) {
304 t.AST().Print(w, buffer)
305 }
306
307 func (t *tokens32) PrettyPrintSyntaxTree(buffer string) {
308 t.AST().PrettyPrint(os.Stdout, buffer)
309 }
310
311 func (t *tokens32) Add(rule pegRule, begin, end, index uint32) {
312 tree, i := t.tree, int(index)
313 if i >= len(tree) {
314 t.tree = append(tree, token32{pegRule: rule, begin: begin, end: end})
315 return
316 }
317 tree[i] = token32{pegRule: rule, begin: begin, end: end}
318 }
319
320 func (t *tokens32) Tokens() []token32 {
321 return t.tree
322 }
323
324 type Peg struct {
325 *tree.Tree
326
327 Buffer string
328 buffer []rune
329 rules [95]func() bool
330 parse func(rule ...int) error
331 reset func()
332 Pretty bool
333 tokens32
334 }
335
336 func (p *Peg) Parse(rule ...int) error {
337 return p.parse(rule...)
338 }
339
340 func (p *Peg) Reset() {
341 p.reset()
342 }
343
344 type textPosition struct {
345 line, symbol int
346 }
347
348 type textPositionMap map[int]textPosition
349
350 func translatePositions(buffer []rune, positions []int) textPositionMap {
351 length, translations, j, line, symbol := len(positions), make(textPositionMap, len(positions)), 0, 1, 0
352 sort.Ints(positions)
353
354 search:
355 for i, c := range buffer {
356 if c == '\n' {
357 line, symbol = line+1, 0
358 } else {
359 symbol++
360 }
361 if i == positions[j] {
362 translations[positions[j]] = textPosition{line, symbol}
363 for j++; j < length; j++ {
364 if i != positions[j] {
365 continue search
366 }
367 }
368 break search
369 }
370 }
371
372 return translations
373 }
374
375 type parseError struct {
376 p *Peg
377 max token32
378 }
379
380 func (e *parseError) Error() string {
381 tokens, err := []token32{e.max}, "\n"
382 positions, p := make([]int, 2*len(tokens)), 0
383 for _, token := range tokens {
384 positions[p], p = int(token.begin), p+1
385 positions[p], p = int(token.end), p+1
386 }
387 translations := translatePositions(e.p.buffer, positions)
388 format := "parse error near %v (line %v symbol %v - line %v symbol %v):\n%v\n"
389 if e.p.Pretty {
390 format = "parse error near \x1B[34m%v\x1B[m (line %v symbol %v - line %v symbol %v):\n%v\n"
391 }
392 for _, token := range tokens {
393 begin, end := int(token.begin), int(token.end)
394 err += fmt.Sprintf(format,
395 rul3s[token.pegRule],
396 translations[begin].line, translations[begin].symbol,
397 translations[end].line, translations[end].symbol,
398 strconv.Quote(string(e.p.buffer[begin:end])))
399 }
400
401 return err
402 }
403
404 func (p *Peg) PrintSyntaxTree() {
405 if p.Pretty {
406 p.tokens32.PrettyPrintSyntaxTree(p.Buffer)
407 } else {
408 p.tokens32.PrintSyntaxTree(p.Buffer)
409 }
410 }
411
412 func (p *Peg) WriteSyntaxTree(w io.Writer) {
413 p.tokens32.WriteSyntaxTree(w, p.Buffer)
414 }
415
416 func (p *Peg) SprintSyntaxTree() string {
417 var bldr strings.Builder
418 p.WriteSyntaxTree(&bldr)
419 return bldr.String()
420 }
421
422 func (p *Peg) Execute() {
423 buffer, _buffer, text, begin, end := p.Buffer, p.buffer, "", 0, 0
424 for _, token := range p.Tokens() {
425 switch token.pegRule {
426
427 case rulePegText:
428 begin, end = int(token.begin), int(token.end)
429 text = string(_buffer[begin:end])
430
431 case ruleAction0:
432 p.AddPackage(text)
433 case ruleAction1:
434 p.AddPeg(text)
435 case ruleAction2:
436 p.AddState(text)
437 case ruleAction3:
438 p.AddImport(text)
439 case ruleAction4:
440 p.AddRule(text)
441 case ruleAction5:
442 p.AddExpression()
443 case ruleAction6:
444 p.AddAlternate()
445 case ruleAction7:
446 p.AddNil()
447 p.AddAlternate()
448 case ruleAction8:
449 p.AddNil()
450 case ruleAction9:
451 p.AddSequence()
452 case ruleAction10:
453 p.AddPredicate(text)
454 case ruleAction11:
455 p.AddStateChange(text)
456 case ruleAction12:
457 p.AddPeekFor()
458 case ruleAction13:
459 p.AddPeekNot()
460 case ruleAction14:
461 p.AddQuery()
462 case ruleAction15:
463 p.AddStar()
464 case ruleAction16:
465 p.AddPlus()
466 case ruleAction17:
467 p.AddName(text)
468 case ruleAction18:
469 p.AddDot()
470 case ruleAction19:
471 p.AddAction(text)
472 case ruleAction20:
473 p.AddPush()
474 case ruleAction21:
475 p.AddSequence()
476 case ruleAction22:
477 p.AddSequence()
478 case ruleAction23:
479 p.AddPeekNot()
480 p.AddDot()
481 p.AddSequence()
482 case ruleAction24:
483 p.AddPeekNot()
484 p.AddDot()
485 p.AddSequence()
486 case ruleAction25:
487 p.AddAlternate()
488 case ruleAction26:
489 p.AddAlternate()
490 case ruleAction27:
491 p.AddRange()
492 case ruleAction28:
493 p.AddDoubleRange()
494 case ruleAction29:
495 p.AddCharacter(text)
496 case ruleAction30:
497 p.AddDoubleCharacter(text)
498 case ruleAction31:
499 p.AddCharacter(text)
500 case ruleAction32:
501 p.AddCharacter("\a")
502 case ruleAction33:
503 p.AddCharacter("\b")
504 case ruleAction34:
505 p.AddCharacter("\x1B")
506 case ruleAction35:
507 p.AddCharacter("\f")
508 case ruleAction36:
509 p.AddCharacter("\n")
510 case ruleAction37:
511 p.AddCharacter("\r")
512 case ruleAction38:
513 p.AddCharacter("\t")
514 case ruleAction39:
515 p.AddCharacter("\v")
516 case ruleAction40:
517 p.AddCharacter("'")
518 case ruleAction41:
519 p.AddCharacter("\"")
520 case ruleAction42:
521 p.AddCharacter("[")
522 case ruleAction43:
523 p.AddCharacter("]")
524 case ruleAction44:
525 p.AddCharacter("-")
526 case ruleAction45:
527 p.AddHexaCharacter(text)
528 case ruleAction46:
529 p.AddOctalCharacter(text)
530 case ruleAction47:
531 p.AddOctalCharacter(text)
532 case ruleAction48:
533 p.AddCharacter("\\")
534
535 }
536 }
537 _, _, _, _, _ = buffer, _buffer, text, begin, end
538 }
539
540 func Pretty(pretty bool) func(*Peg) error {
541 return func(p *Peg) error {
542 p.Pretty = pretty
543 return nil
544 }
545 }
546
547 func Size(size int) func(*Peg) error {
548 return func(p *Peg) error {
549 p.tokens32 = tokens32{tree: make([]token32, 0, size)}
550 return nil
551 }
552 }
553 func (p *Peg) Init(options ...func(*Peg) error) error {
554 var (
555 max token32
556 position, tokenIndex uint32
557 buffer []rune
558 )
559 for _, option := range options {
560 err := option(p)
561 if err != nil {
562 return err
563 }
564 }
565 p.reset = func() {
566 max = token32{}
567 position, tokenIndex = 0, 0
568
569 p.buffer = []rune(p.Buffer)
570 if len(p.buffer) == 0 || p.buffer[len(p.buffer)-1] != endSymbol {
571 p.buffer = append(p.buffer, endSymbol)
572 }
573 buffer = p.buffer
574 }
575 p.reset()
576
577 _rules := p.rules
578 tree := p.tokens32
579 p.parse = func(rule ...int) error {
580 r := 1
581 if len(rule) > 0 {
582 r = rule[0]
583 }
584 matches := p.rules[r]()
585 p.tokens32 = tree
586 if matches {
587 p.Trim(tokenIndex)
588 return nil
589 }
590 return &parseError{p, max}
591 }
592
593 add := func(rule pegRule, begin uint32) {
594 tree.Add(rule, begin, position, tokenIndex)
595 tokenIndex++
596 if begin != position && position > max.end {
597 max = token32{rule, begin, position}
598 }
599 }
600
601 matchDot := func() bool {
602 if buffer[position] != endSymbol {
603 position++
604 return true
605 }
606 return false
607 }
608
609 /*matchChar := func(c byte) bool {
610 if buffer[position] == c {
611 position++
612 return true
613 }
614 return false
615 }*/
616
617 /*matchRange := func(lower byte, upper byte) bool {
618 if c := buffer[position]; c >= lower && c <= upper {
619 position++
620 return true
621 }
622 return false
623 }*/
624
625 _rules = [...]func() bool{
626 nil,
627 /* 0 Grammar <- <(Spacing ('p' 'a' 'c' 'k' 'a' 'g' 'e') MustSpacing Identifier Action0 Import* ('t' 'y' 'p' 'e') MustSpacing Identifier Action1 ('P' 'e' 'g') Spacing Action Action2 Definition+ EndOfFile)> */
628 func() bool {
629 position0, tokenIndex0 := position, tokenIndex
630 {
631 position1 := position
632 if !_rules[ruleSpacing]() {
633 goto l0
634 }
635 if buffer[position] != rune('p') {
636 goto l0
637 }
638 position++
639 if buffer[position] != rune('a') {
640 goto l0
641 }
642 position++
643 if buffer[position] != rune('c') {
644 goto l0
645 }
646 position++
647 if buffer[position] != rune('k') {
648 goto l0
649 }
650 position++
651 if buffer[position] != rune('a') {
652 goto l0
653 }
654 position++
655 if buffer[position] != rune('g') {
656 goto l0
657 }
658 position++
659 if buffer[position] != rune('e') {
660 goto l0
661 }
662 position++
663 if !_rules[ruleMustSpacing]() {
664 goto l0
665 }
666 if !_rules[ruleIdentifier]() {
667 goto l0
668 }
669 {
670 add(ruleAction0, position)
671 }
672 l3:
673 {
674 position4, tokenIndex4 := position, tokenIndex
675 {
676 position5 := position
677 if buffer[position] != rune('i') {
678 goto l4
679 }
680 position++
681 if buffer[position] != rune('m') {
682 goto l4
683 }
684 position++
685 if buffer[position] != rune('p') {
686 goto l4
687 }
688 position++
689 if buffer[position] != rune('o') {
690 goto l4
691 }
692 position++
693 if buffer[position] != rune('r') {
694 goto l4
695 }
696 position++
697 if buffer[position] != rune('t') {
698 goto l4
699 }
700 position++
701 if !_rules[ruleSpacing]() {
702 goto l4
703 }
704 {
705 position6, tokenIndex6 := position, tokenIndex
706 {
707 position8 := position
708 if buffer[position] != rune('(') {
709 goto l7
710 }
711 position++
712 if !_rules[ruleSpacing]() {
713 goto l7
714 }
715 l9:
716 {
717 position10, tokenIndex10 := position, tokenIndex
718 if !_rules[ruleImportName]() {
719 goto l10
720 }
721 if buffer[position] != rune('\n') {
722 goto l10
723 }
724 position++
725 if !_rules[ruleSpacing]() {
726 goto l10
727 }
728 goto l9
729 l10:
730 position, tokenIndex = position10, tokenIndex10
731 }
732 if !_rules[ruleSpacing]() {
733 goto l7
734 }
735 if buffer[position] != rune(')') {
736 goto l7
737 }
738 position++
739 add(ruleMultiImport, position8)
740 }
741 goto l6
742 l7:
743 position, tokenIndex = position6, tokenIndex6
744 {
745 position11 := position
746 if !_rules[ruleImportName]() {
747 goto l4
748 }
749 add(ruleSingleImport, position11)
750 }
751 }
752 l6:
753 if !_rules[ruleSpacing]() {
754 goto l4
755 }
756 add(ruleImport, position5)
757 }
758 goto l3
759 l4:
760 position, tokenIndex = position4, tokenIndex4
761 }
762 if buffer[position] != rune('t') {
763 goto l0
764 }
765 position++
766 if buffer[position] != rune('y') {
767 goto l0
768 }
769 position++
770 if buffer[position] != rune('p') {
771 goto l0
772 }
773 position++
774 if buffer[position] != rune('e') {
775 goto l0
776 }
777 position++
778 if !_rules[ruleMustSpacing]() {
779 goto l0
780 }
781 if !_rules[ruleIdentifier]() {
782 goto l0
783 }
784 {
785 add(ruleAction1, position)
786 }
787 if buffer[position] != rune('P') {
788 goto l0
789 }
790 position++
791 if buffer[position] != rune('e') {
792 goto l0
793 }
794 position++
795 if buffer[position] != rune('g') {
796 goto l0
797 }
798 position++
799 if !_rules[ruleSpacing]() {
800 goto l0
801 }
802 if !_rules[ruleAction]() {
803 goto l0
804 }
805 {
806 add(ruleAction2, position)
807 }
808 {
809 position16 := position
810 if !_rules[ruleIdentifier]() {
811 goto l0
812 }
813 {
814 add(ruleAction4, position)
815 }
816 if !_rules[ruleLeftArrow]() {
817 goto l0
818 }
819 if !_rules[ruleExpression]() {
820 goto l0
821 }
822 {
823 add(ruleAction5, position)
824 }
825 {
826 position19, tokenIndex19 := position, tokenIndex
827 {
828 position20, tokenIndex20 := position, tokenIndex
829 if !_rules[ruleIdentifier]() {
830 goto l21
831 }
832 if !_rules[ruleLeftArrow]() {
833 goto l21
834 }
835 goto l20
836 l21:
837 position, tokenIndex = position20, tokenIndex20
838 {
839 position22, tokenIndex22 := position, tokenIndex
840 if !matchDot() {
841 goto l22
842 }
843 goto l0
844 l22:
845 position, tokenIndex = position22, tokenIndex22
846 }
847 }
848 l20:
849 position, tokenIndex = position19, tokenIndex19
850 }
851 add(ruleDefinition, position16)
852 }
853 l14:
854 {
855 position15, tokenIndex15 := position, tokenIndex
856 {
857 position23 := position
858 if !_rules[ruleIdentifier]() {
859 goto l15
860 }
861 {
862 add(ruleAction4, position)
863 }
864 if !_rules[ruleLeftArrow]() {
865 goto l15
866 }
867 if !_rules[ruleExpression]() {
868 goto l15
869 }
870 {
871 add(ruleAction5, position)
872 }
873 {
874 position26, tokenIndex26 := position, tokenIndex
875 {
876 position27, tokenIndex27 := position, tokenIndex
877 if !_rules[ruleIdentifier]() {
878 goto l28
879 }
880 if !_rules[ruleLeftArrow]() {
881 goto l28
882 }
883 goto l27
884 l28:
885 position, tokenIndex = position27, tokenIndex27
886 {
887 position29, tokenIndex29 := position, tokenIndex
888 if !matchDot() {
889 goto l29
890 }
891 goto l15
892 l29:
893 position, tokenIndex = position29, tokenIndex29
894 }
895 }
896 l27:
897 position, tokenIndex = position26, tokenIndex26
898 }
899 add(ruleDefinition, position23)
900 }
901 goto l14
902 l15:
903 position, tokenIndex = position15, tokenIndex15
904 }
905 {
906 position30 := position
907 {
908 position31, tokenIndex31 := position, tokenIndex
909 if !matchDot() {
910 goto l31
911 }
912 goto l0
913 l31:
914 position, tokenIndex = position31, tokenIndex31
915 }
916 add(ruleEndOfFile, position30)
917 }
918 add(ruleGrammar, position1)
919 }
920 return true
921 l0:
922 position, tokenIndex = position0, tokenIndex0
923 return false
924 },
925 /* 1 Import <- <('i' 'm' 'p' 'o' 'r' 't' Spacing (MultiImport / SingleImport) Spacing)> */
926 nil,
927 /* 2 SingleImport <- <ImportName> */
928 nil,
929 /* 3 MultiImport <- <('(' Spacing (ImportName '\n' Spacing)* Spacing ')')> */
930 nil,
931 /* 4 ImportName <- <('"' <((&('-') '-') | (&('.') '.') | (&('/') '/') | (&('_') '_') | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z') [A-Z]) | (&('0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9') [0-9]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') [a-z]))+> '"' Action3)> */
932 func() bool {
933 position35, tokenIndex35 := position, tokenIndex
934 {
935 position36 := position
936 if buffer[position] != rune('"') {
937 goto l35
938 }
939 position++
940 {
941 position37 := position
942 {
943 switch buffer[position] {
944 case '-':
945 if buffer[position] != rune('-') {
946 goto l35
947 }
948 position++
949 case '.':
950 if buffer[position] != rune('.') {
951 goto l35
952 }
953 position++
954 case '/':
955 if buffer[position] != rune('/') {
956 goto l35
957 }
958 position++
959 case '_':
960 if buffer[position] != rune('_') {
961 goto l35
962 }
963 position++
964 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
965 if c := buffer[position]; c < rune('A') || c > rune('Z') {
966 goto l35
967 }
968 position++
969 case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9':
970 if c := buffer[position]; c < rune('0') || c > rune('9') {
971 goto l35
972 }
973 position++
974 default:
975 if c := buffer[position]; c < rune('a') || c > rune('z') {
976 goto l35
977 }
978 position++
979 }
980 }
981
982 l38:
983 {
984 position39, tokenIndex39 := position, tokenIndex
985 {
986 switch buffer[position] {
987 case '-':
988 if buffer[position] != rune('-') {
989 goto l39
990 }
991 position++
992 case '.':
993 if buffer[position] != rune('.') {
994 goto l39
995 }
996 position++
997 case '/':
998 if buffer[position] != rune('/') {
999 goto l39
1000 }
1001 position++
1002 case '_':
1003 if buffer[position] != rune('_') {
1004 goto l39
1005 }
1006 position++
1007 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
1008 if c := buffer[position]; c < rune('A') || c > rune('Z') {
1009 goto l39
1010 }
1011 position++
1012 case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9':
1013 if c := buffer[position]; c < rune('0') || c > rune('9') {
1014 goto l39
1015 }
1016 position++
1017 default:
1018 if c := buffer[position]; c < rune('a') || c > rune('z') {
1019 goto l39
1020 }
1021 position++
1022 }
1023 }
1024
1025 goto l38
1026 l39:
1027 position, tokenIndex = position39, tokenIndex39
1028 }
1029 add(rulePegText, position37)
1030 }
1031 if buffer[position] != rune('"') {
1032 goto l35
1033 }
1034 position++
1035 {
1036 add(ruleAction3, position)
1037 }
1038 add(ruleImportName, position36)
1039 }
1040 return true
1041 l35:
1042 position, tokenIndex = position35, tokenIndex35
1043 return false
1044 },
1045 /* 5 Definition <- <(Identifier Action4 LeftArrow Expression Action5 &((Identifier LeftArrow) / !.))> */
1046 nil,
1047 /* 6 Expression <- <((Sequence (Slash Sequence Action6)* (Slash Action7)?) / Action8)> */
1048 func() bool {
1049 {
1050 position45 := position
1051 {
1052 position46, tokenIndex46 := position, tokenIndex
1053 if !_rules[ruleSequence]() {
1054 goto l47
1055 }
1056 l48:
1057 {
1058 position49, tokenIndex49 := position, tokenIndex
1059 if !_rules[ruleSlash]() {
1060 goto l49
1061 }
1062 if !_rules[ruleSequence]() {
1063 goto l49
1064 }
1065 {
1066 add(ruleAction6, position)
1067 }
1068 goto l48
1069 l49:
1070 position, tokenIndex = position49, tokenIndex49
1071 }
1072 {
1073 position51, tokenIndex51 := position, tokenIndex
1074 if !_rules[ruleSlash]() {
1075 goto l51
1076 }
1077 {
1078 add(ruleAction7, position)
1079 }
1080 goto l52
1081 l51:
1082 position, tokenIndex = position51, tokenIndex51
1083 }
1084 l52:
1085 goto l46
1086 l47:
1087 position, tokenIndex = position46, tokenIndex46
1088 {
1089 add(ruleAction8, position)
1090 }
1091 }
1092 l46:
1093 add(ruleExpression, position45)
1094 }
1095 return true
1096 },
1097 /* 7 Sequence <- <(Prefix (Prefix Action9)*)> */
1098 func() bool {
1099 position55, tokenIndex55 := position, tokenIndex
1100 {
1101 position56 := position
1102 if !_rules[rulePrefix]() {
1103 goto l55
1104 }
1105 l57:
1106 {
1107 position58, tokenIndex58 := position, tokenIndex
1108 if !_rules[rulePrefix]() {
1109 goto l58
1110 }
1111 {
1112 add(ruleAction9, position)
1113 }
1114 goto l57
1115 l58:
1116 position, tokenIndex = position58, tokenIndex58
1117 }
1118 add(ruleSequence, position56)
1119 }
1120 return true
1121 l55:
1122 position, tokenIndex = position55, tokenIndex55
1123 return false
1124 },
1125 /* 8 Prefix <- <((And Action Action10) / (Not Action Action11) / ((&('!') (Not Suffix Action13)) | (&('&') (And Suffix Action12)) | (&('"' | '\'' | '(' | '.' | '<' | 'A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z' | '[' | '_' | 'a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z' | '{') Suffix)))> */
1126 func() bool {
1127 position60, tokenIndex60 := position, tokenIndex
1128 {
1129 position61 := position
1130 {
1131 position62, tokenIndex62 := position, tokenIndex
1132 if !_rules[ruleAnd]() {
1133 goto l63
1134 }
1135 if !_rules[ruleAction]() {
1136 goto l63
1137 }
1138 {
1139 add(ruleAction10, position)
1140 }
1141 goto l62
1142 l63:
1143 position, tokenIndex = position62, tokenIndex62
1144 if !_rules[ruleNot]() {
1145 goto l65
1146 }
1147 if !_rules[ruleAction]() {
1148 goto l65
1149 }
1150 {
1151 add(ruleAction11, position)
1152 }
1153 goto l62
1154 l65:
1155 position, tokenIndex = position62, tokenIndex62
1156 {
1157 switch buffer[position] {
1158 case '!':
1159 if !_rules[ruleNot]() {
1160 goto l60
1161 }
1162 if !_rules[ruleSuffix]() {
1163 goto l60
1164 }
1165 {
1166 add(ruleAction13, position)
1167 }
1168 case '&':
1169 if !_rules[ruleAnd]() {
1170 goto l60
1171 }
1172 if !_rules[ruleSuffix]() {
1173 goto l60
1174 }
1175 {
1176 add(ruleAction12, position)
1177 }
1178 default:
1179 if !_rules[ruleSuffix]() {
1180 goto l60
1181 }
1182 }
1183 }
1184
1185 }
1186 l62:
1187 add(rulePrefix, position61)
1188 }
1189 return true
1190 l60:
1191 position, tokenIndex = position60, tokenIndex60
1192 return false
1193 },
1194 /* 9 Suffix <- <(Primary ((&('+') (Plus Action16)) | (&('*') (Star Action15)) | (&('?') (Question Action14)))?)> */
1195 func() bool {
1196 position70, tokenIndex70 := position, tokenIndex
1197 {
1198 position71 := position
1199 {
1200 position72 := position
1201 {
1202 switch buffer[position] {
1203 case '<':
1204 {
1205 position74 := position
1206 if buffer[position] != rune('<') {
1207 goto l70
1208 }
1209 position++
1210 if !_rules[ruleSpacing]() {
1211 goto l70
1212 }
1213 add(ruleBegin, position74)
1214 }
1215 if !_rules[ruleExpression]() {
1216 goto l70
1217 }
1218 {
1219 position75 := position
1220 if buffer[position] != rune('>') {
1221 goto l70
1222 }
1223 position++
1224 if !_rules[ruleSpacing]() {
1225 goto l70
1226 }
1227 add(ruleEnd, position75)
1228 }
1229 {
1230 add(ruleAction20, position)
1231 }
1232 case '{':
1233 if !_rules[ruleAction]() {
1234 goto l70
1235 }
1236 {
1237 add(ruleAction19, position)
1238 }
1239 case '.':
1240 {
1241 position78 := position
1242 if buffer[position] != rune('.') {
1243 goto l70
1244 }
1245 position++
1246 if !_rules[ruleSpacing]() {
1247 goto l70
1248 }
1249 add(ruleDot, position78)
1250 }
1251 {
1252 add(ruleAction18, position)
1253 }
1254 case '[':
1255 {
1256 position80 := position
1257 {
1258 position81, tokenIndex81 := position, tokenIndex
1259 if buffer[position] != rune('[') {
1260 goto l82
1261 }
1262 position++
1263 if buffer[position] != rune('[') {
1264 goto l82
1265 }
1266 position++
1267 {
1268 position83, tokenIndex83 := position, tokenIndex
1269 {
1270 position85, tokenIndex85 := position, tokenIndex
1271 if buffer[position] != rune('^') {
1272 goto l86
1273 }
1274 position++
1275 if !_rules[ruleDoubleRanges]() {
1276 goto l86
1277 }
1278 {
1279 add(ruleAction23, position)
1280 }
1281 goto l85
1282 l86:
1283 position, tokenIndex = position85, tokenIndex85
1284 if !_rules[ruleDoubleRanges]() {
1285 goto l83
1286 }
1287 }
1288 l85:
1289 goto l84
1290 l83:
1291 position, tokenIndex = position83, tokenIndex83
1292 }
1293 l84:
1294 if buffer[position] != rune(']') {
1295 goto l82
1296 }
1297 position++
1298 if buffer[position] != rune(']') {
1299 goto l82
1300 }
1301 position++
1302 goto l81
1303 l82:
1304 position, tokenIndex = position81, tokenIndex81
1305 if buffer[position] != rune('[') {
1306 goto l70
1307 }
1308 position++
1309 {
1310 position88, tokenIndex88 := position, tokenIndex
1311 {
1312 position90, tokenIndex90 := position, tokenIndex
1313 if buffer[position] != rune('^') {
1314 goto l91
1315 }
1316 position++
1317 if !_rules[ruleRanges]() {
1318 goto l91
1319 }
1320 {
1321 add(ruleAction24, position)
1322 }
1323 goto l90
1324 l91:
1325 position, tokenIndex = position90, tokenIndex90
1326 if !_rules[ruleRanges]() {
1327 goto l88
1328 }
1329 }
1330 l90:
1331 goto l89
1332 l88:
1333 position, tokenIndex = position88, tokenIndex88
1334 }
1335 l89:
1336 if buffer[position] != rune(']') {
1337 goto l70
1338 }
1339 position++
1340 }
1341 l81:
1342 if !_rules[ruleSpacing]() {
1343 goto l70
1344 }
1345 add(ruleClass, position80)
1346 }
1347 case '"', '\'':
1348 {
1349 position93 := position
1350 {
1351 position94, tokenIndex94 := position, tokenIndex
1352 if buffer[position] != rune('\'') {
1353 goto l95
1354 }
1355 position++
1356 {
1357 position96, tokenIndex96 := position, tokenIndex
1358 {
1359 position98, tokenIndex98 := position, tokenIndex
1360 if buffer[position] != rune('\'') {
1361 goto l98
1362 }
1363 position++
1364 goto l96
1365 l98:
1366 position, tokenIndex = position98, tokenIndex98
1367 }
1368 if !_rules[ruleChar]() {
1369 goto l96
1370 }
1371 goto l97
1372 l96:
1373 position, tokenIndex = position96, tokenIndex96
1374 }
1375 l97:
1376 l99:
1377 {
1378 position100, tokenIndex100 := position, tokenIndex
1379 {
1380 position101, tokenIndex101 := position, tokenIndex
1381 if buffer[position] != rune('\'') {
1382 goto l101
1383 }
1384 position++
1385 goto l100
1386 l101:
1387 position, tokenIndex = position101, tokenIndex101
1388 }
1389 if !_rules[ruleChar]() {
1390 goto l100
1391 }
1392 {
1393 add(ruleAction21, position)
1394 }
1395 goto l99
1396 l100:
1397 position, tokenIndex = position100, tokenIndex100
1398 }
1399 if buffer[position] != rune('\'') {
1400 goto l95
1401 }
1402 position++
1403 if !_rules[ruleSpacing]() {
1404 goto l95
1405 }
1406 goto l94
1407 l95:
1408 position, tokenIndex = position94, tokenIndex94
1409 if buffer[position] != rune('"') {
1410 goto l70
1411 }
1412 position++
1413 {
1414 position103, tokenIndex103 := position, tokenIndex
1415 {
1416 position105, tokenIndex105 := position, tokenIndex
1417 if buffer[position] != rune('"') {
1418 goto l105
1419 }
1420 position++
1421 goto l103
1422 l105:
1423 position, tokenIndex = position105, tokenIndex105
1424 }
1425 if !_rules[ruleDoubleChar]() {
1426 goto l103
1427 }
1428 goto l104
1429 l103:
1430 position, tokenIndex = position103, tokenIndex103
1431 }
1432 l104:
1433 l106:
1434 {
1435 position107, tokenIndex107 := position, tokenIndex
1436 {
1437 position108, tokenIndex108 := position, tokenIndex
1438 if buffer[position] != rune('"') {
1439 goto l108
1440 }
1441 position++
1442 goto l107
1443 l108:
1444 position, tokenIndex = position108, tokenIndex108
1445 }
1446 if !_rules[ruleDoubleChar]() {
1447 goto l107
1448 }
1449 {
1450 add(ruleAction22, position)
1451 }
1452 goto l106
1453 l107:
1454 position, tokenIndex = position107, tokenIndex107
1455 }
1456 if buffer[position] != rune('"') {
1457 goto l70
1458 }
1459 position++
1460 if !_rules[ruleSpacing]() {
1461 goto l70
1462 }
1463 }
1464 l94:
1465 add(ruleLiteral, position93)
1466 }
1467 case '(':
1468 {
1469 position110 := position
1470 if buffer[position] != rune('(') {
1471 goto l70
1472 }
1473 position++
1474 if !_rules[ruleSpacing]() {
1475 goto l70
1476 }
1477 add(ruleOpen, position110)
1478 }
1479 if !_rules[ruleExpression]() {
1480 goto l70
1481 }
1482 {
1483 position111 := position
1484 if buffer[position] != rune(')') {
1485 goto l70
1486 }
1487 position++
1488 if !_rules[ruleSpacing]() {
1489 goto l70
1490 }
1491 add(ruleClose, position111)
1492 }
1493 default:
1494 if !_rules[ruleIdentifier]() {
1495 goto l70
1496 }
1497 {
1498 position112, tokenIndex112 := position, tokenIndex
1499 if !_rules[ruleLeftArrow]() {
1500 goto l112
1501 }
1502 goto l70
1503 l112:
1504 position, tokenIndex = position112, tokenIndex112
1505 }
1506 {
1507 add(ruleAction17, position)
1508 }
1509 }
1510 }
1511
1512 add(rulePrimary, position72)
1513 }
1514 {
1515 position114, tokenIndex114 := position, tokenIndex
1516 {
1517 switch buffer[position] {
1518 case '+':
1519 {
1520 position117 := position
1521 if buffer[position] != rune('+') {
1522 goto l114
1523 }
1524 position++
1525 if !_rules[ruleSpacing]() {
1526 goto l114
1527 }
1528 add(rulePlus, position117)
1529 }
1530 {
1531 add(ruleAction16, position)
1532 }
1533 case '*':
1534 {
1535 position119 := position
1536 if buffer[position] != rune('*') {
1537 goto l114
1538 }
1539 position++
1540 if !_rules[ruleSpacing]() {
1541 goto l114
1542 }
1543 add(ruleStar, position119)
1544 }
1545 {
1546 add(ruleAction15, position)
1547 }
1548 default:
1549 {
1550 position121 := position
1551 if buffer[position] != rune('?') {
1552 goto l114
1553 }
1554 position++
1555 if !_rules[ruleSpacing]() {
1556 goto l114
1557 }
1558 add(ruleQuestion, position121)
1559 }
1560 {
1561 add(ruleAction14, position)
1562 }
1563 }
1564 }
1565
1566 goto l115
1567 l114:
1568 position, tokenIndex = position114, tokenIndex114
1569 }
1570 l115:
1571 add(ruleSuffix, position71)
1572 }
1573 return true
1574 l70:
1575 position, tokenIndex = position70, tokenIndex70
1576 return false
1577 },
1578 /* 10 Primary <- <((&('<') (Begin Expression End Action20)) | (&('{') (Action Action19)) | (&('.') (Dot Action18)) | (&('[') Class) | (&('"' | '\'') Literal) | (&('(') (Open Expression Close)) | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z' | '_' | 'a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') (Identifier !LeftArrow Action17)))> */
1579 nil,
1580 /* 11 Identifier <- <(<(IdentStart IdentCont*)> Spacing)> */
1581 func() bool {
1582 position124, tokenIndex124 := position, tokenIndex
1583 {
1584 position125 := position
1585 {
1586 position126 := position
1587 if !_rules[ruleIdentStart]() {
1588 goto l124
1589 }
1590 l127:
1591 {
1592 position128, tokenIndex128 := position, tokenIndex
1593 {
1594 position129 := position
1595 {
1596 position130, tokenIndex130 := position, tokenIndex
1597 if !_rules[ruleIdentStart]() {
1598 goto l131
1599 }
1600 goto l130
1601 l131:
1602 position, tokenIndex = position130, tokenIndex130
1603 if c := buffer[position]; c < rune('0') || c > rune('9') {
1604 goto l128
1605 }
1606 position++
1607 }
1608 l130:
1609 add(ruleIdentCont, position129)
1610 }
1611 goto l127
1612 l128:
1613 position, tokenIndex = position128, tokenIndex128
1614 }
1615 add(rulePegText, position126)
1616 }
1617 if !_rules[ruleSpacing]() {
1618 goto l124
1619 }
1620 add(ruleIdentifier, position125)
1621 }
1622 return true
1623 l124:
1624 position, tokenIndex = position124, tokenIndex124
1625 return false
1626 },
1627 /* 12 IdentStart <- <((&('_') '_') | (&('A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z') [A-Z]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f' | 'g' | 'h' | 'i' | 'j' | 'k' | 'l' | 'm' | 'n' | 'o' | 'p' | 'q' | 'r' | 's' | 't' | 'u' | 'v' | 'w' | 'x' | 'y' | 'z') [a-z]))> */
1628 func() bool {
1629 position132, tokenIndex132 := position, tokenIndex
1630 {
1631 position133 := position
1632 {
1633 switch buffer[position] {
1634 case '_':
1635 if buffer[position] != rune('_') {
1636 goto l132
1637 }
1638 position++
1639 case 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z':
1640 if c := buffer[position]; c < rune('A') || c > rune('Z') {
1641 goto l132
1642 }
1643 position++
1644 default:
1645 if c := buffer[position]; c < rune('a') || c > rune('z') {
1646 goto l132
1647 }
1648 position++
1649 }
1650 }
1651
1652 add(ruleIdentStart, position133)
1653 }
1654 return true
1655 l132:
1656 position, tokenIndex = position132, tokenIndex132
1657 return false
1658 },
1659 /* 13 IdentCont <- <(IdentStart / [0-9])> */
1660 nil,
1661 /* 14 Literal <- <(('\'' (!'\'' Char)? (!'\'' Char Action21)* '\'' Spacing) / ('"' (!'"' DoubleChar)? (!'"' DoubleChar Action22)* '"' Spacing))> */
1662 nil,
1663 /* 15 Class <- <((('[' '[' (('^' DoubleRanges Action23) / DoubleRanges)? (']' ']')) / ('[' (('^' Ranges Action24) / Ranges)? ']')) Spacing)> */
1664 nil,
1665 /* 16 Ranges <- <(!']' Range (!']' Range Action25)*)> */
1666 func() bool {
1667 position138, tokenIndex138 := position, tokenIndex
1668 {
1669 position139 := position
1670 {
1671 position140, tokenIndex140 := position, tokenIndex
1672 if buffer[position] != rune(']') {
1673 goto l140
1674 }
1675 position++
1676 goto l138
1677 l140:
1678 position, tokenIndex = position140, tokenIndex140
1679 }
1680 if !_rules[ruleRange]() {
1681 goto l138
1682 }
1683 l141:
1684 {
1685 position142, tokenIndex142 := position, tokenIndex
1686 {
1687 position143, tokenIndex143 := position, tokenIndex
1688 if buffer[position] != rune(']') {
1689 goto l143
1690 }
1691 position++
1692 goto l142
1693 l143:
1694 position, tokenIndex = position143, tokenIndex143
1695 }
1696 if !_rules[ruleRange]() {
1697 goto l142
1698 }
1699 {
1700 add(ruleAction25, position)
1701 }
1702 goto l141
1703 l142:
1704 position, tokenIndex = position142, tokenIndex142
1705 }
1706 add(ruleRanges, position139)
1707 }
1708 return true
1709 l138:
1710 position, tokenIndex = position138, tokenIndex138
1711 return false
1712 },
1713 /* 17 DoubleRanges <- <(!(']' ']') DoubleRange (!(']' ']') DoubleRange Action26)*)> */
1714 func() bool {
1715 position145, tokenIndex145 := position, tokenIndex
1716 {
1717 position146 := position
1718 {
1719 position147, tokenIndex147 := position, tokenIndex
1720 if buffer[position] != rune(']') {
1721 goto l147
1722 }
1723 position++
1724 if buffer[position] != rune(']') {
1725 goto l147
1726 }
1727 position++
1728 goto l145
1729 l147:
1730 position, tokenIndex = position147, tokenIndex147
1731 }
1732 if !_rules[ruleDoubleRange]() {
1733 goto l145
1734 }
1735 l148:
1736 {
1737 position149, tokenIndex149 := position, tokenIndex
1738 {
1739 position150, tokenIndex150 := position, tokenIndex
1740 if buffer[position] != rune(']') {
1741 goto l150
1742 }
1743 position++
1744 if buffer[position] != rune(']') {
1745 goto l150
1746 }
1747 position++
1748 goto l149
1749 l150:
1750 position, tokenIndex = position150, tokenIndex150
1751 }
1752 if !_rules[ruleDoubleRange]() {
1753 goto l149
1754 }
1755 {
1756 add(ruleAction26, position)
1757 }
1758 goto l148
1759 l149:
1760 position, tokenIndex = position149, tokenIndex149
1761 }
1762 add(ruleDoubleRanges, position146)
1763 }
1764 return true
1765 l145:
1766 position, tokenIndex = position145, tokenIndex145
1767 return false
1768 },
1769 /* 18 Range <- <((Char '-' Char Action27) / Char)> */
1770 func() bool {
1771 position152, tokenIndex152 := position, tokenIndex
1772 {
1773 position153 := position
1774 {
1775 position154, tokenIndex154 := position, tokenIndex
1776 if !_rules[ruleChar]() {
1777 goto l155
1778 }
1779 if buffer[position] != rune('-') {
1780 goto l155
1781 }
1782 position++
1783 if !_rules[ruleChar]() {
1784 goto l155
1785 }
1786 {
1787 add(ruleAction27, position)
1788 }
1789 goto l154
1790 l155:
1791 position, tokenIndex = position154, tokenIndex154
1792 if !_rules[ruleChar]() {
1793 goto l152
1794 }
1795 }
1796 l154:
1797 add(ruleRange, position153)
1798 }
1799 return true
1800 l152:
1801 position, tokenIndex = position152, tokenIndex152
1802 return false
1803 },
1804 /* 19 DoubleRange <- <((Char '-' Char Action28) / DoubleChar)> */
1805 func() bool {
1806 position157, tokenIndex157 := position, tokenIndex
1807 {
1808 position158 := position
1809 {
1810 position159, tokenIndex159 := position, tokenIndex
1811 if !_rules[ruleChar]() {
1812 goto l160
1813 }
1814 if buffer[position] != rune('-') {
1815 goto l160
1816 }
1817 position++
1818 if !_rules[ruleChar]() {
1819 goto l160
1820 }
1821 {
1822 add(ruleAction28, position)
1823 }
1824 goto l159
1825 l160:
1826 position, tokenIndex = position159, tokenIndex159
1827 if !_rules[ruleDoubleChar]() {
1828 goto l157
1829 }
1830 }
1831 l159:
1832 add(ruleDoubleRange, position158)
1833 }
1834 return true
1835 l157:
1836 position, tokenIndex = position157, tokenIndex157
1837 return false
1838 },
1839 /* 20 Char <- <(Escape / (!'\\' <.> Action29))> */
1840 func() bool {
1841 position162, tokenIndex162 := position, tokenIndex
1842 {
1843 position163 := position
1844 {
1845 position164, tokenIndex164 := position, tokenIndex
1846 if !_rules[ruleEscape]() {
1847 goto l165
1848 }
1849 goto l164
1850 l165:
1851 position, tokenIndex = position164, tokenIndex164
1852 {
1853 position166, tokenIndex166 := position, tokenIndex
1854 if buffer[position] != rune('\\') {
1855 goto l166
1856 }
1857 position++
1858 goto l162
1859 l166:
1860 position, tokenIndex = position166, tokenIndex166
1861 }
1862 {
1863 position167 := position
1864 if !matchDot() {
1865 goto l162
1866 }
1867 add(rulePegText, position167)
1868 }
1869 {
1870 add(ruleAction29, position)
1871 }
1872 }
1873 l164:
1874 add(ruleChar, position163)
1875 }
1876 return true
1877 l162:
1878 position, tokenIndex = position162, tokenIndex162
1879 return false
1880 },
1881 /* 21 DoubleChar <- <(Escape / (<([a-z] / [A-Z])> Action30) / (!'\\' <.> Action31))> */
1882 func() bool {
1883 position169, tokenIndex169 := position, tokenIndex
1884 {
1885 position170 := position
1886 {
1887 position171, tokenIndex171 := position, tokenIndex
1888 if !_rules[ruleEscape]() {
1889 goto l172
1890 }
1891 goto l171
1892 l172:
1893 position, tokenIndex = position171, tokenIndex171
1894 {
1895 position174 := position
1896 {
1897 position175, tokenIndex175 := position, tokenIndex
1898 if c := buffer[position]; c < rune('a') || c > rune('z') {
1899 goto l176
1900 }
1901 position++
1902 goto l175
1903 l176:
1904 position, tokenIndex = position175, tokenIndex175
1905 if c := buffer[position]; c < rune('A') || c > rune('Z') {
1906 goto l173
1907 }
1908 position++
1909 }
1910 l175:
1911 add(rulePegText, position174)
1912 }
1913 {
1914 add(ruleAction30, position)
1915 }
1916 goto l171
1917 l173:
1918 position, tokenIndex = position171, tokenIndex171
1919 {
1920 position178, tokenIndex178 := position, tokenIndex
1921 if buffer[position] != rune('\\') {
1922 goto l178
1923 }
1924 position++
1925 goto l169
1926 l178:
1927 position, tokenIndex = position178, tokenIndex178
1928 }
1929 {
1930 position179 := position
1931 if !matchDot() {
1932 goto l169
1933 }
1934 add(rulePegText, position179)
1935 }
1936 {
1937 add(ruleAction31, position)
1938 }
1939 }
1940 l171:
1941 add(ruleDoubleChar, position170)
1942 }
1943 return true
1944 l169:
1945 position, tokenIndex = position169, tokenIndex169
1946 return false
1947 },
1948 /* 22 Escape <- <(('\\' ('a' / 'A') Action32) / ('\\' ('b' / 'B') Action33) / ('\\' ('e' / 'E') Action34) / ('\\' ('f' / 'F') Action35) / ('\\' ('n' / 'N') Action36) / ('\\' ('r' / 'R') Action37) / ('\\' ('t' / 'T') Action38) / ('\\' ('v' / 'V') Action39) / ('\\' '\'' Action40) / ('\\' '"' Action41) / ('\\' '[' Action42) / ('\\' ']' Action43) / ('\\' '-' Action44) / ('\\' ('0' ('x' / 'X')) <((&('A' | 'B' | 'C' | 'D' | 'E' | 'F') [A-F]) | (&('a' | 'b' | 'c' | 'd' | 'e' | 'f') [a-f]) | (&('0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9') [0-9]))+> Action45) / ('\\' <([0-3] [0-7] [0-7])> Action46) / ('\\' <([0-7] [0-7]?)> Action47) / ('\\' '\\' Action48))> */
1949 func() bool {
1950 position181, tokenIndex181 := position, tokenIndex
1951 {
1952 position182 := position
1953 {
1954 position183, tokenIndex183 := position, tokenIndex
1955 if buffer[position] != rune('\\') {
1956 goto l184
1957 }
1958 position++
1959 {
1960 position185, tokenIndex185 := position, tokenIndex
1961 if buffer[position] != rune('a') {
1962 goto l186
1963 }
1964 position++
1965 goto l185
1966 l186:
1967 position, tokenIndex = position185, tokenIndex185
1968 if buffer[position] != rune('A') {
1969 goto l184
1970 }
1971 position++
1972 }
1973 l185:
1974 {
1975 add(ruleAction32, position)
1976 }
1977 goto l183
1978 l184:
1979 position, tokenIndex = position183, tokenIndex183
1980 if buffer[position] != rune('\\') {
1981 goto l188
1982 }
1983 position++
1984 {
1985 position189, tokenIndex189 := position, tokenIndex
1986 if buffer[position] != rune('b') {
1987 goto l190
1988 }
1989 position++
1990 goto l189
1991 l190:
1992 position, tokenIndex = position189, tokenIndex189
1993 if buffer[position] != rune('B') {
1994 goto l188
1995 }
1996 position++
1997 }
1998 l189:
1999 {
2000 add(ruleAction33, position)
2001 }
2002 goto l183
2003 l188:
2004 position, tokenIndex = position183, tokenIndex183
2005 if buffer[position] != rune('\\') {
2006 goto l192
2007 }
2008 position++
2009 {
2010 position193, tokenIndex193 := position, tokenIndex
2011 if buffer[position] != rune('e') {
2012 goto l194
2013 }
2014 position++
2015 goto l193
2016 l194:
2017 position, tokenIndex = position193, tokenIndex193
2018 if buffer[position] != rune('E') {
2019 goto l192
2020 }
2021 position++
2022 }
2023 l193:
2024 {
2025 add(ruleAction34, position)
2026 }
2027 goto l183
2028 l192:
2029 position, tokenIndex = position183, tokenIndex183
2030 if buffer[position] != rune('\\') {
2031 goto l196
2032 }
2033 position++
2034 {
2035 position197, tokenIndex197 := position, tokenIndex
2036 if buffer[position] != rune('f') {
2037 goto l198
2038 }
2039 position++
2040 goto l197
2041 l198:
2042 position, tokenIndex = position197, tokenIndex197
2043 if buffer[position] != rune('F') {
2044 goto l196
2045 }
2046 position++
2047 }
2048 l197:
2049 {
2050 add(ruleAction35, position)
2051 }
2052 goto l183
2053 l196:
2054 position, tokenIndex = position183, tokenIndex183
2055 if buffer[position] != rune('\\') {
2056 goto l200
2057 }
2058 position++
2059 {
2060 position201, tokenIndex201 := position, tokenIndex
2061 if buffer[position] != rune('n') {
2062 goto l202
2063 }
2064 position++
2065 goto l201
2066 l202:
2067 position, tokenIndex = position201, tokenIndex201
2068 if buffer[position] != rune('N') {
2069 goto l200
2070 }
2071 position++
2072 }
2073 l201:
2074 {
2075 add(ruleAction36, position)
2076 }
2077 goto l183
2078 l200:
2079 position, tokenIndex = position183, tokenIndex183
2080 if buffer[position] != rune('\\') {
2081 goto l204
2082 }
2083 position++
2084 {
2085 position205, tokenIndex205 := position, tokenIndex
2086 if buffer[position] != rune('r') {
2087 goto l206
2088 }
2089 position++
2090 goto l205
2091 l206:
2092 position, tokenIndex = position205, tokenIndex205
2093 if buffer[position] != rune('R') {
2094 goto l204
2095 }
2096 position++
2097 }
2098 l205:
2099 {
2100 add(ruleAction37, position)
2101 }
2102 goto l183
2103 l204:
2104 position, tokenIndex = position183, tokenIndex183
2105 if buffer[position] != rune('\\') {
2106 goto l208
2107 }
2108 position++
2109 {
2110 position209, tokenIndex209 := position, tokenIndex
2111 if buffer[position] != rune('t') {
2112 goto l210
2113 }
2114 position++
2115 goto l209
2116 l210:
2117 position, tokenIndex = position209, tokenIndex209
2118 if buffer[position] != rune('T') {
2119 goto l208
2120 }
2121 position++
2122 }
2123 l209:
2124 {
2125 add(ruleAction38, position)
2126 }
2127 goto l183
2128 l208:
2129 position, tokenIndex = position183, tokenIndex183
2130 if buffer[position] != rune('\\') {
2131 goto l212
2132 }
2133 position++
2134 {
2135 position213, tokenIndex213 := position, tokenIndex
2136 if buffer[position] != rune('v') {
2137 goto l214
2138 }
2139 position++
2140 goto l213
2141 l214:
2142 position, tokenIndex = position213, tokenIndex213
2143 if buffer[position] != rune('V') {
2144 goto l212
2145 }
2146 position++
2147 }
2148 l213:
2149 {
2150 add(ruleAction39, position)
2151 }
2152 goto l183
2153 l212:
2154 position, tokenIndex = position183, tokenIndex183
2155 if buffer[position] != rune('\\') {
2156 goto l216
2157 }
2158 position++
2159 if buffer[position] != rune('\'') {
2160 goto l216
2161 }
2162 position++
2163 {
2164 add(ruleAction40, position)
2165 }
2166 goto l183
2167 l216:
2168 position, tokenIndex = position183, tokenIndex183
2169 if buffer[position] != rune('\\') {
2170 goto l218
2171 }
2172 position++
2173 if buffer[position] != rune('"') {
2174 goto l218
2175 }
2176 position++
2177 {
2178 add(ruleAction41, position)
2179 }
2180 goto l183
2181 l218:
2182 position, tokenIndex = position183, tokenIndex183
2183 if buffer[position] != rune('\\') {
2184 goto l220
2185 }
2186 position++
2187 if buffer[position] != rune('[') {
2188 goto l220
2189 }
2190 position++
2191 {
2192 add(ruleAction42, position)
2193 }
2194 goto l183
2195 l220:
2196 position, tokenIndex = position183, tokenIndex183
2197 if buffer[position] != rune('\\') {
2198 goto l222
2199 }
2200 position++
2201 if buffer[position] != rune(']') {
2202 goto l222
2203 }
2204 position++
2205 {
2206 add(ruleAction43, position)
2207 }
2208 goto l183
2209 l222:
2210 position, tokenIndex = position183, tokenIndex183
2211 if buffer[position] != rune('\\') {
2212 goto l224
2213 }
2214 position++
2215 if buffer[position] != rune('-') {
2216 goto l224
2217 }
2218 position++
2219 {
2220 add(ruleAction44, position)
2221 }
2222 goto l183
2223 l224:
2224 position, tokenIndex = position183, tokenIndex183
2225 if buffer[position] != rune('\\') {
2226 goto l226
2227 }
2228 position++
2229 if buffer[position] != rune('0') {
2230 goto l226
2231 }
2232 position++
2233 {
2234 position227, tokenIndex227 := position, tokenIndex
2235 if buffer[position] != rune('x') {
2236 goto l228
2237 }
2238 position++
2239 goto l227
2240 l228:
2241 position, tokenIndex = position227, tokenIndex227
2242 if buffer[position] != rune('X') {
2243 goto l226
2244 }
2245 position++
2246 }
2247 l227:
2248 {
2249 position229 := position
2250 {
2251 switch buffer[position] {
2252 case 'A', 'B', 'C', 'D', 'E', 'F':
2253 if c := buffer[position]; c < rune('A') || c > rune('F') {
2254 goto l226
2255 }
2256 position++
2257 case 'a', 'b', 'c', 'd', 'e', 'f':
2258 if c := buffer[position]; c < rune('a') || c > rune('f') {
2259 goto l226
2260 }
2261 position++
2262 default:
2263 if c := buffer[position]; c < rune('0') || c > rune('9') {
2264 goto l226
2265 }
2266 position++
2267 }
2268 }
2269
2270 l230:
2271 {
2272 position231, tokenIndex231 := position, tokenIndex
2273 {
2274 switch buffer[position] {
2275 case 'A', 'B', 'C', 'D', 'E', 'F':
2276 if c := buffer[position]; c < rune('A') || c > rune('F') {
2277 goto l231
2278 }
2279 position++
2280 case 'a', 'b', 'c', 'd', 'e', 'f':
2281 if c := buffer[position]; c < rune('a') || c > rune('f') {
2282 goto l231
2283 }
2284 position++
2285 default:
2286 if c := buffer[position]; c < rune('0') || c > rune('9') {
2287 goto l231
2288 }
2289 position++
2290 }
2291 }
2292
2293 goto l230
2294 l231:
2295 position, tokenIndex = position231, tokenIndex231
2296 }
2297 add(rulePegText, position229)
2298 }
2299 {
2300 add(ruleAction45, position)
2301 }
2302 goto l183
2303 l226:
2304 position, tokenIndex = position183, tokenIndex183
2305 if buffer[position] != rune('\\') {
2306 goto l235
2307 }
2308 position++
2309 {
2310 position236 := position
2311 if c := buffer[position]; c < rune('0') || c > rune('3') {
2312 goto l235
2313 }
2314 position++
2315 if c := buffer[position]; c < rune('0') || c > rune('7') {
2316 goto l235
2317 }
2318 position++
2319 if c := buffer[position]; c < rune('0') || c > rune('7') {
2320 goto l235
2321 }
2322 position++
2323 add(rulePegText, position236)
2324 }
2325 {
2326 add(ruleAction46, position)
2327 }
2328 goto l183
2329 l235:
2330 position, tokenIndex = position183, tokenIndex183
2331 if buffer[position] != rune('\\') {
2332 goto l238
2333 }
2334 position++
2335 {
2336 position239 := position
2337 if c := buffer[position]; c < rune('0') || c > rune('7') {
2338 goto l238
2339 }
2340 position++
2341 {
2342 position240, tokenIndex240 := position, tokenIndex
2343 if c := buffer[position]; c < rune('0') || c > rune('7') {
2344 goto l240
2345 }
2346 position++
2347 goto l241
2348 l240:
2349 position, tokenIndex = position240, tokenIndex240
2350 }
2351 l241:
2352 add(rulePegText, position239)
2353 }
2354 {
2355 add(ruleAction47, position)
2356 }
2357 goto l183
2358 l238:
2359 position, tokenIndex = position183, tokenIndex183
2360 if buffer[position] != rune('\\') {
2361 goto l181
2362 }
2363 position++
2364 if buffer[position] != rune('\\') {
2365 goto l181
2366 }
2367 position++
2368 {
2369 add(ruleAction48, position)
2370 }
2371 }
2372 l183:
2373 add(ruleEscape, position182)
2374 }
2375 return true
2376 l181:
2377 position, tokenIndex = position181, tokenIndex181
2378 return false
2379 },
2380 /* 23 LeftArrow <- <((('<' '-') / '←') Spacing)> */
2381 func() bool {
2382 position244, tokenIndex244 := position, tokenIndex
2383 {
2384 position245 := position
2385 {
2386 position246, tokenIndex246 := position, tokenIndex
2387 if buffer[position] != rune('<') {
2388 goto l247
2389 }
2390 position++
2391 if buffer[position] != rune('-') {
2392 goto l247
2393 }
2394 position++
2395 goto l246
2396 l247:
2397 position, tokenIndex = position246, tokenIndex246
2398 if buffer[position] != rune('←') {
2399 goto l244
2400 }
2401 position++
2402 }
2403 l246:
2404 if !_rules[ruleSpacing]() {
2405 goto l244
2406 }
2407 add(ruleLeftArrow, position245)
2408 }
2409 return true
2410 l244:
2411 position, tokenIndex = position244, tokenIndex244
2412 return false
2413 },
2414 /* 24 Slash <- <('/' Spacing)> */
2415 func() bool {
2416 position248, tokenIndex248 := position, tokenIndex
2417 {
2418 position249 := position
2419 if buffer[position] != rune('/') {
2420 goto l248
2421 }
2422 position++
2423 if !_rules[ruleSpacing]() {
2424 goto l248
2425 }
2426 add(ruleSlash, position249)
2427 }
2428 return true
2429 l248:
2430 position, tokenIndex = position248, tokenIndex248
2431 return false
2432 },
2433 /* 25 And <- <('&' Spacing)> */
2434 func() bool {
2435 position250, tokenIndex250 := position, tokenIndex
2436 {
2437 position251 := position
2438 if buffer[position] != rune('&') {
2439 goto l250
2440 }
2441 position++
2442 if !_rules[ruleSpacing]() {
2443 goto l250
2444 }
2445 add(ruleAnd, position251)
2446 }
2447 return true
2448 l250:
2449 position, tokenIndex = position250, tokenIndex250
2450 return false
2451 },
2452 /* 26 Not <- <('!' Spacing)> */
2453 func() bool {
2454 position252, tokenIndex252 := position, tokenIndex
2455 {
2456 position253 := position
2457 if buffer[position] != rune('!') {
2458 goto l252
2459 }
2460 position++
2461 if !_rules[ruleSpacing]() {
2462 goto l252
2463 }
2464 add(ruleNot, position253)
2465 }
2466 return true
2467 l252:
2468 position, tokenIndex = position252, tokenIndex252
2469 return false
2470 },
2471 /* 27 Question <- <('?' Spacing)> */
2472 nil,
2473 /* 28 Star <- <('*' Spacing)> */
2474 nil,
2475 /* 29 Plus <- <('+' Spacing)> */
2476 nil,
2477 /* 30 Open <- <('(' Spacing)> */
2478 nil,
2479 /* 31 Close <- <(')' Spacing)> */
2480 nil,
2481 /* 32 Dot <- <('.' Spacing)> */
2482 nil,
2483 /* 33 SpaceComment <- <(Space / Comment)> */
2484 func() bool {
2485 position260, tokenIndex260 := position, tokenIndex
2486 {
2487 position261 := position
2488 {
2489 position262, tokenIndex262 := position, tokenIndex
2490 {
2491 position264 := position
2492 {
2493 switch buffer[position] {
2494 case '\t':
2495 if buffer[position] != rune('\t') {
2496 goto l263
2497 }
2498 position++
2499 case ' ':
2500 if buffer[position] != rune(' ') {
2501 goto l263
2502 }
2503 position++
2504 default:
2505 if !_rules[ruleEndOfLine]() {
2506 goto l263
2507 }
2508 }
2509 }
2510
2511 add(ruleSpace, position264)
2512 }
2513 goto l262
2514 l263:
2515 position, tokenIndex = position262, tokenIndex262
2516 {
2517 position266 := position
2518 {
2519 position267, tokenIndex267 := position, tokenIndex
2520 if buffer[position] != rune('#') {
2521 goto l268
2522 }
2523 position++
2524 goto l267
2525 l268:
2526 position, tokenIndex = position267, tokenIndex267
2527 if buffer[position] != rune('/') {
2528 goto l260
2529 }
2530 position++
2531 if buffer[position] != rune('/') {
2532 goto l260
2533 }
2534 position++
2535 }
2536 l267:
2537 l269:
2538 {
2539 position270, tokenIndex270 := position, tokenIndex
2540 {
2541 position271, tokenIndex271 := position, tokenIndex
2542 if !_rules[ruleEndOfLine]() {
2543 goto l271
2544 }
2545 goto l270
2546 l271:
2547 position, tokenIndex = position271, tokenIndex271
2548 }
2549 if !matchDot() {
2550 goto l270
2551 }
2552 goto l269
2553 l270:
2554 position, tokenIndex = position270, tokenIndex270
2555 }
2556 if !_rules[ruleEndOfLine]() {
2557 goto l260
2558 }
2559 add(ruleComment, position266)
2560 }
2561 }
2562 l262:
2563 add(ruleSpaceComment, position261)
2564 }
2565 return true
2566 l260:
2567 position, tokenIndex = position260, tokenIndex260
2568 return false
2569 },
2570 /* 34 Spacing <- <SpaceComment*> */
2571 func() bool {
2572 {
2573 position273 := position
2574 l274:
2575 {
2576 position275, tokenIndex275 := position, tokenIndex
2577 if !_rules[ruleSpaceComment]() {
2578 goto l275
2579 }
2580 goto l274
2581 l275:
2582 position, tokenIndex = position275, tokenIndex275
2583 }
2584 add(ruleSpacing, position273)
2585 }
2586 return true
2587 },
2588 /* 35 MustSpacing <- <SpaceComment+> */
2589 func() bool {
2590 position276, tokenIndex276 := position, tokenIndex
2591 {
2592 position277 := position
2593 if !_rules[ruleSpaceComment]() {
2594 goto l276
2595 }
2596 l278:
2597 {
2598 position279, tokenIndex279 := position, tokenIndex
2599 if !_rules[ruleSpaceComment]() {
2600 goto l279
2601 }
2602 goto l278
2603 l279:
2604 position, tokenIndex = position279, tokenIndex279
2605 }
2606 add(ruleMustSpacing, position277)
2607 }
2608 return true
2609 l276:
2610 position, tokenIndex = position276, tokenIndex276
2611 return false
2612 },
2613 /* 36 Comment <- <(('#' / ('/' '/')) (!EndOfLine .)* EndOfLine)> */
2614 nil,
2615 /* 37 Space <- <((&('\t') '\t') | (&(' ') ' ') | (&('\n' | '\r') EndOfLine))> */
2616 nil,
2617 /* 38 EndOfLine <- <(('\r' '\n') / '\n' / '\r')> */
2618 func() bool {
2619 position282, tokenIndex282 := position, tokenIndex
2620 {
2621 position283 := position
2622 {
2623 position284, tokenIndex284 := position, tokenIndex
2624 if buffer[position] != rune('\r') {
2625 goto l285
2626 }
2627 position++
2628 if buffer[position] != rune('\n') {
2629 goto l285
2630 }
2631 position++
2632 goto l284
2633 l285:
2634 position, tokenIndex = position284, tokenIndex284
2635 if buffer[position] != rune('\n') {
2636 goto l286
2637 }
2638 position++
2639 goto l284
2640 l286:
2641 position, tokenIndex = position284, tokenIndex284
2642 if buffer[position] != rune('\r') {
2643 goto l282
2644 }
2645 position++
2646 }
2647 l284:
2648 add(ruleEndOfLine, position283)
2649 }
2650 return true
2651 l282:
2652 position, tokenIndex = position282, tokenIndex282
2653 return false
2654 },
2655 /* 39 EndOfFile <- <!.> */
2656 nil,
2657 /* 40 Action <- <('{' <ActionBody*> '}' Spacing)> */
2658 func() bool {
2659 position288, tokenIndex288 := position, tokenIndex
2660 {
2661 position289 := position
2662 if buffer[position] != rune('{') {
2663 goto l288
2664 }
2665 position++
2666 {
2667 position290 := position
2668 l291:
2669 {
2670 position292, tokenIndex292 := position, tokenIndex
2671 if !_rules[ruleActionBody]() {
2672 goto l292
2673 }
2674 goto l291
2675 l292:
2676 position, tokenIndex = position292, tokenIndex292
2677 }
2678 add(rulePegText, position290)
2679 }
2680 if buffer[position] != rune('}') {
2681 goto l288
2682 }
2683 position++
2684 if !_rules[ruleSpacing]() {
2685 goto l288
2686 }
2687 add(ruleAction, position289)
2688 }
2689 return true
2690 l288:
2691 position, tokenIndex = position288, tokenIndex288
2692 return false
2693 },
2694 /* 41 ActionBody <- <((!('{' / '}') .) / ('{' ActionBody* '}'))> */
2695 func() bool {
2696 position293, tokenIndex293 := position, tokenIndex
2697 {
2698 position294 := position
2699 {
2700 position295, tokenIndex295 := position, tokenIndex
2701 {
2702 position297, tokenIndex297 := position, tokenIndex
2703 {
2704 position298, tokenIndex298 := position, tokenIndex
2705 if buffer[position] != rune('{') {
2706 goto l299
2707 }
2708 position++
2709 goto l298
2710 l299:
2711 position, tokenIndex = position298, tokenIndex298
2712 if buffer[position] != rune('}') {
2713 goto l297
2714 }
2715 position++
2716 }
2717 l298:
2718 goto l296
2719 l297:
2720 position, tokenIndex = position297, tokenIndex297
2721 }
2722 if !matchDot() {
2723 goto l296
2724 }
2725 goto l295
2726 l296:
2727 position, tokenIndex = position295, tokenIndex295
2728 if buffer[position] != rune('{') {
2729 goto l293
2730 }
2731 position++
2732 l300:
2733 {
2734 position301, tokenIndex301 := position, tokenIndex
2735 if !_rules[ruleActionBody]() {
2736 goto l301
2737 }
2738 goto l300
2739 l301:
2740 position, tokenIndex = position301, tokenIndex301
2741 }
2742 if buffer[position] != rune('}') {
2743 goto l293
2744 }
2745 position++
2746 }
2747 l295:
2748 add(ruleActionBody, position294)
2749 }
2750 return true
2751 l293:
2752 position, tokenIndex = position293, tokenIndex293
2753 return false
2754 },
2755 /* 42 Begin <- <('<' Spacing)> */
2756 nil,
2757 /* 43 End <- <('>' Spacing)> */
2758 nil,
2759 /* 45 Action0 <- <{ p.AddPackage(text) }> */
2760 nil,
2761 /* 46 Action1 <- <{ p.AddPeg(text) }> */
2762 nil,
2763 /* 47 Action2 <- <{ p.AddState(text) }> */
2764 nil,
2765 nil,
2766 /* 49 Action3 <- <{ p.AddImport(text) }> */
2767 nil,
2768 /* 50 Action4 <- <{ p.AddRule(text) }> */
2769 nil,
2770 /* 51 Action5 <- <{ p.AddExpression() }> */
2771 nil,
2772 /* 52 Action6 <- <{ p.AddAlternate() }> */
2773 nil,
2774 /* 53 Action7 <- <{ p.AddNil(); p.AddAlternate() }> */
2775 nil,
2776 /* 54 Action8 <- <{ p.AddNil() }> */
2777 nil,
2778 /* 55 Action9 <- <{ p.AddSequence() }> */
2779 nil,
2780 /* 56 Action10 <- <{ p.AddPredicate(text) }> */
2781 nil,
2782 /* 57 Action11 <- <{ p.AddStateChange(text) }> */
2783 nil,
2784 /* 58 Action12 <- <{ p.AddPeekFor() }> */
2785 nil,
2786 /* 59 Action13 <- <{ p.AddPeekNot() }> */
2787 nil,
2788 /* 60 Action14 <- <{ p.AddQuery() }> */
2789 nil,
2790 /* 61 Action15 <- <{ p.AddStar() }> */
2791 nil,
2792 /* 62 Action16 <- <{ p.AddPlus() }> */
2793 nil,
2794 /* 63 Action17 <- <{ p.AddName(text) }> */
2795 nil,
2796 /* 64 Action18 <- <{ p.AddDot() }> */
2797 nil,
2798 /* 65 Action19 <- <{ p.AddAction(text) }> */
2799 nil,
2800 /* 66 Action20 <- <{ p.AddPush() }> */
2801 nil,
2802 /* 67 Action21 <- <{ p.AddSequence() }> */
2803 nil,
2804 /* 68 Action22 <- <{ p.AddSequence() }> */
2805 nil,
2806 /* 69 Action23 <- <{ p.AddPeekNot(); p.AddDot(); p.AddSequence() }> */
2807 nil,
2808 /* 70 Action24 <- <{ p.AddPeekNot(); p.AddDot(); p.AddSequence() }> */
2809 nil,
2810 /* 71 Action25 <- <{ p.AddAlternate() }> */
2811 nil,
2812 /* 72 Action26 <- <{ p.AddAlternate() }> */
2813 nil,
2814 /* 73 Action27 <- <{ p.AddRange() }> */
2815 nil,
2816 /* 74 Action28 <- <{ p.AddDoubleRange() }> */
2817 nil,
2818 /* 75 Action29 <- <{ p.AddCharacter(text) }> */
2819 nil,
2820 /* 76 Action30 <- <{ p.AddDoubleCharacter(text) }> */
2821 nil,
2822 /* 77 Action31 <- <{ p.AddCharacter(text) }> */
2823 nil,
2824 /* 78 Action32 <- <{ p.AddCharacter("\a") }> */
2825 nil,
2826 /* 79 Action33 <- <{ p.AddCharacter("\b") }> */
2827 nil,
2828 /* 80 Action34 <- <{ p.AddCharacter("\x1B") }> */
2829 nil,
2830 /* 81 Action35 <- <{ p.AddCharacter("\f") }> */
2831 nil,
2832 /* 82 Action36 <- <{ p.AddCharacter("\n") }> */
2833 nil,
2834 /* 83 Action37 <- <{ p.AddCharacter("\r") }> */
2835 nil,
2836 /* 84 Action38 <- <{ p.AddCharacter("\t") }> */
2837 nil,
2838 /* 85 Action39 <- <{ p.AddCharacter("\v") }> */
2839 nil,
2840 /* 86 Action40 <- <{ p.AddCharacter("'") }> */
2841 nil,
2842 /* 87 Action41 <- <{ p.AddCharacter("\"") }> */
2843 nil,
2844 /* 88 Action42 <- <{ p.AddCharacter("[") }> */
2845 nil,
2846 /* 89 Action43 <- <{ p.AddCharacter("]") }> */
2847 nil,
2848 /* 90 Action44 <- <{ p.AddCharacter("-") }> */
2849 nil,
2850 /* 91 Action45 <- <{ p.AddHexaCharacter(text) }> */
2851 nil,
2852 /* 92 Action46 <- <{ p.AddOctalCharacter(text) }> */
2853 nil,
2854 /* 93 Action47 <- <{ p.AddOctalCharacter(text) }> */
2855 nil,
2856 /* 94 Action48 <- <{ p.AddCharacter("\\") }> */
2857 nil,
2858 }
2859 p.rules = _rules
2860 return nil
2861 }
22 import (
33 "bytes"
44 "io/ioutil"
5 "os"
56 "testing"
7
8 "github.com/pointlander/peg/tree"
69 )
710
811 func TestCorrect(t *testing.T) {
1013 type T Peg {}
1114 Grammar <- !.
1215 `
13 p := &Peg{Tree: New(false, false), Buffer: buffer}
16 p := &Peg{Tree: tree.New(false, false, false), Buffer: buffer}
1417 p.Init()
1518 err := p.Parse()
19 if err != nil {
20 t.Error(err)
21 }
22
23 p = &Peg{Tree: tree.New(false, false, false), Buffer: buffer}
24 p.Init(Size(1<<15))
25 err = p.Parse()
1626 if err != nil {
1727 t.Error(err)
1828 }
2333 type T Peg {}
2434 Grammar <- !.
2535 `
26 p := &Peg{Tree: New(false, false), Buffer: buffer}
27 p.Init()
36 p := &Peg{Tree: tree.New(false, false, false), Buffer: buffer}
37 p.Init(Size(1<<15))
2838 err := p.Parse()
2939 if err == nil {
3040 t.Error("packagenospace was parsed without error")
3747 typenospace Peg {}
3848 Grammar <- !.
3949 `
40 p := &Peg{Tree: New(false, false), Buffer: buffer}
41 p.Init()
50 p := &Peg{Tree: tree.New(false, false, false), Buffer: buffer}
51 p.Init(Size(1<<15))
4252 err := p.Parse()
4353 if err == nil {
4454 t.Error("typenospace was parsed without error")
5161 t.Error(err)
5262 }
5363
54 p := &Peg{Tree: New(true, true), Buffer: string(buffer)}
55 p.Init()
56 if err := p.Parse(); err != nil {
64 p := &Peg{Tree: tree.New(true, true, false), Buffer: string(buffer)}
65 p.Init(Size(1<<15))
66 if err = p.Parse(); err != nil {
5767 t.Error(err)
5868 }
5969
6070 p.Execute()
6171
6272 out := &bytes.Buffer{}
63 p.Compile("peg.peg.go", out)
64
65 bootstrap, err := ioutil.ReadFile("bootstrap.peg.go")
73 p.Compile("peg.peg.go", []string{"./peg", "-inline", "-switch", "peg.peg"}, out)
74
75 bootstrap, err := ioutil.ReadFile("peg.peg.go")
6676 if err != nil {
6777 t.Error(err)
6878 }
6979
7080 if len(out.Bytes()) != len(bootstrap) {
71 t.Error("code generated from peg.peg is not the same as bootstrap.peg.go")
81 t.Error("code generated from peg.peg is not the same as peg.peg.go")
7282 return
7383 }
7484
7585 for i, v := range out.Bytes() {
7686 if v != bootstrap[i] {
77 t.Error("code generated from peg.peg is not the same as bootstrap.peg.go")
87 t.Error("code generated from peg.peg is not the same as peg.peg.go")
7888 return
7989 }
8090 }
8191 }
8292
93 func TestStrict(t *testing.T) {
94 tt := []string{
95 // rule used but not defined
96 `
97 package main
98 type test Peg {}
99 Begin <- begin !.
100 `,
101 // rule defined but not used
102 `
103 package main
104 type test Peg {}
105 Begin <- .
106 unused <- 'unused'
107 `,
108 // left recursive rule
109 `package main
110 type test Peg {}
111 Begin <- Begin 'x'
112 `,
113 }
114
115 for i, buffer := range tt {
116 p := &Peg{Tree: tree.New(false, false, false), Buffer: buffer}
117 p.Init(Size(1<<15))
118 if err := p.Parse(); err != nil {
119 t.Fatal(err)
120 }
121 p.Execute()
122
123 f, err := ioutil.TempFile("", "peg")
124 if err != nil {
125 t.Fatal(err)
126 }
127 defer func() {
128 os.Remove(f.Name())
129 f.Close()
130 }()
131 out := &bytes.Buffer{}
132 p.Strict = true
133 if err = p.Compile(f.Name(), []string{"peg"}, out); err == nil {
134 t.Fatalf("#%d: expected warning error", i)
135 }
136 p.Strict = false
137 if err = p.Compile(f.Name(), []string{"peg"}, out); err != nil {
138 t.Fatalf("#%d: unexpected error (%v)", i, err)
139 }
140 }
141 }
142
143 var files = [...]string{
144 "peg.peg",
145 "grammars/c/c.peg",
146 "grammars/calculator/calculator.peg",
147 "grammars/fexl/fexl.peg",
148 "grammars/java/java_1_7.peg",
149 }
150
151 func BenchmarkInitOnly(b *testing.B) {
152 pegs := []string{}
153 for _, file := range files {
154 input, err := ioutil.ReadFile(file)
155 if err != nil {
156 b.Error(err)
157 }
158 pegs = append(pegs, string(input))
159 }
160
161 b.ResetTimer()
162 for i := 0; i < b.N; i++ {
163 for _, peg := range pegs {
164 p := &Peg{Tree: tree.New(true, true, false), Buffer: peg}
165 p.Init(Size(1<<15))
166 }
167 }
168 }
169
83170 func BenchmarkParse(b *testing.B) {
84 files := [...]string{
85 "peg.peg",
86 "grammars/c/c.peg",
87 "grammars/calculator/calculator.peg",
88 "grammars/fexl/fexl.peg",
89 "grammars/java/java_1_7.peg",
90 }
91171 pegs := make([]*Peg, len(files))
92172 for i, file := range files {
93173 input, err := ioutil.ReadFile(file)
95175 b.Error(err)
96176 }
97177
98 p := &Peg{Tree: New(true, true), Buffer: string(input)}
99 p.Init()
178 p := &Peg{Tree: tree.New(true, true, false), Buffer: string(input)}
179 p.Init(Size(1<<15))
100180 pegs[i] = p
101181 }
102182
103183 b.ResetTimer()
104184 for i := 0; i < b.N; i++ {
105185 for _, peg := range pegs {
186 if err := peg.Parse(); err != nil {
187 b.Error(err)
188 }
189 b.StopTimer()
106190 peg.Reset()
107 if err := peg.Parse(); err != nil {
108 b.Error(err)
109 }
110 }
111 }
112 }
191 b.StartTimer()
192 }
193 }
194 }
195
196 func BenchmarkResetAndParse(b *testing.B) {
197 pegs := make([]*Peg, len(files))
198 for i, file := range files {
199 input, err := ioutil.ReadFile(file)
200 if err != nil {
201 b.Error(err)
202 }
203
204 p := &Peg{Tree: tree.New(true, true, false), Buffer: string(input)}
205 p.Init(Size(1<<15))
206 pegs[i] = p
207 }
208
209 b.ResetTimer()
210 for i := 0; i < b.N; i++ {
211 for _, peg := range pegs {
212 if err := peg.Parse(); err != nil {
213 b.Error(err)
214 }
215 peg.Reset()
216 }
217 }
218 }
219
220 func BenchmarkInitAndParse(b *testing.B) {
221 strs := []string{}
222 for _, file := range files {
223 input, err := ioutil.ReadFile(file)
224 if err != nil {
225 b.Error(err)
226 }
227 strs = append(strs, string(input))
228 }
229
230 b.ResetTimer()
231 for i := 0; i < b.N; i++ {
232 for _, str := range strs {
233 peg := &Peg{Tree: tree.New(true, true, false), Buffer: str}
234 peg.Init(Size(1<<15))
235 if err := peg.Parse(); err != nil {
236 b.Error(err)
237 }
238 }
239 }
240 }
241
242 func BenchmarkInitResetAndParse(b *testing.B) {
243 strs := []string{}
244 for _, file := range files {
245 input, err := ioutil.ReadFile(file)
246 if err != nil {
247 b.Error(err)
248 }
249 strs = append(strs, string(input))
250 }
251
252 b.ResetTimer()
253 for i := 0; i < b.N; i++ {
254 for _, str := range strs {
255 peg := &Peg{Tree: tree.New(true, true, false), Buffer: str}
256 peg.Init(Size(1<<15))
257 if err := peg.Parse(); err != nil {
258 b.Error(err)
259 }
260 peg.Reset()
261 }
262 }
263 }
0 // Copyright 2010 The Go Authors. All rights reserved.
1 // Use of this source code is governed by a BSD-style
2 // license that can be found in the LICENSE file.
3
4 package tree
5
6 import (
7 "bytes"
8 "fmt"
9 "go/parser"
10 "go/printer"
11 "go/token"
12 "io"
13 "math"
14 "os"
15 "sort"
16 "strconv"
17 "strings"
18 "text/template"
19
20 "github.com/pointlander/jetset"
21 )
22
23 const pegHeaderTemplate = `package {{.PackageName}}
24
25 // Code generated by {{.Generator}} DO NOT EDIT.
26
27
28 import (
29 {{range .Imports}}"{{.}}"
30 {{end}}
31 )
32
33 const endSymbol rune = {{.EndSymbol}}
34
35 /* The rule types inferred from the grammar are below. */
36 type pegRule {{.PegRuleType}}
37
38 const (
39 ruleUnknown pegRule = iota
40 {{range .RuleNames}}rule{{.String}}
41 {{end}}
42 )
43
44 var rul3s = [...]string {
45 "Unknown",
46 {{range .RuleNames}}"{{.String}}",
47 {{end}}
48 }
49
50 type token32 struct {
51 pegRule
52 begin, end uint32
53 }
54
55 func (t *token32) String() string {
56 return fmt.Sprintf("\x1B[34m%v\x1B[m %v %v", rul3s[t.pegRule], t.begin, t.end)
57 }
58
59 {{if .Ast}}
60 type node32 struct {
61 token32
62 up, next *node32
63 }
64
65 func (node *node32) print(w io.Writer, pretty bool, buffer string) {
66 var print func(node *node32, depth int)
67 print = func(node *node32, depth int) {
68 for node != nil {
69 for c := 0; c < depth; c++ {
70 fmt.Fprintf(w, " ")
71 }
72 rule := rul3s[node.pegRule]
73 quote := strconv.Quote(string(([]rune(buffer)[node.begin:node.end])))
74 if !pretty {
75 fmt.Fprintf(w, "%v %v\n", rule, quote)
76 } else {
77 fmt.Fprintf(w, "\x1B[36m%v\x1B[m %v\n", rule, quote)
78 }
79 if node.up != nil {
80 print(node.up, depth + 1)
81 }
82 node = node.next
83 }
84 }
85 print(node, 0)
86 }
87
88 func (node *node32) Print(w io.Writer, buffer string) {
89 node.print(w, false, buffer)
90 }
91
92 func (node *node32) PrettyPrint(w io.Writer, buffer string) {
93 node.print(w, true, buffer)
94 }
95
96 type tokens32 struct {
97 tree []token32
98 }
99
100 func (t *tokens32) Trim(length uint32) {
101 t.tree = t.tree[:length]
102 }
103
104 func (t *tokens32) Print() {
105 for _, token := range t.tree {
106 fmt.Println(token.String())
107 }
108 }
109
110 func (t *tokens32) AST() *node32 {
111 type element struct {
112 node *node32
113 down *element
114 }
115 tokens := t.Tokens()
116 var stack *element
117 for _, token := range tokens {
118 if token.begin == token.end {
119 continue
120 }
121 node := &node32{token32: token}
122 for stack != nil && stack.node.begin >= token.begin && stack.node.end <= token.end {
123 stack.node.next = node.up
124 node.up = stack.node
125 stack = stack.down
126 }
127 stack = &element{node: node, down: stack}
128 }
129 if stack != nil {
130 return stack.node
131 }
132 return nil
133 }
134
135 func (t *tokens32) PrintSyntaxTree(buffer string) {
136 t.AST().Print(os.Stdout, buffer)
137 }
138
139 func (t *tokens32) WriteSyntaxTree(w io.Writer, buffer string) {
140 t.AST().Print(w, buffer)
141 }
142
143 func (t *tokens32) PrettyPrintSyntaxTree(buffer string) {
144 t.AST().PrettyPrint(os.Stdout, buffer)
145 }
146
147 func (t *tokens32) Add(rule pegRule, begin, end, index uint32) {
148 tree, i := t.tree, int(index)
149 if i >= len(tree) {
150 t.tree = append(tree, token32{pegRule: rule, begin: begin, end: end})
151 return
152 }
153 tree[i] = token32{pegRule: rule, begin: begin, end: end}
154 }
155
156 func (t *tokens32) Tokens() []token32 {
157 return t.tree
158 }
159 {{end}}
160
161 type {{.StructName}} struct {
162 {{.StructVariables}}
163 Buffer string
164 buffer []rune
165 rules [{{.RulesCount}}]func() bool
166 parse func(rule ...int) error
167 reset func()
168 Pretty bool
169 {{if .Ast -}}
170 tokens32
171 {{end -}}
172 }
173
174 func (p *{{.StructName}}) Parse(rule ...int) error {
175 return p.parse(rule...)
176 }
177
178 func (p *{{.StructName}}) Reset() {
179 p.reset()
180 }
181
182 type textPosition struct {
183 line, symbol int
184 }
185
186 type textPositionMap map[int] textPosition
187
188 func translatePositions(buffer []rune, positions []int) textPositionMap {
189 length, translations, j, line, symbol := len(positions), make(textPositionMap, len(positions)), 0, 1, 0
190 sort.Ints(positions)
191
192 search: for i, c := range buffer {
193 if c == '\n' {line, symbol = line + 1, 0} else {symbol++}
194 if i == positions[j] {
195 translations[positions[j]] = textPosition{line, symbol}
196 for j++; j < length; j++ {if i != positions[j] {continue search}}
197 break search
198 }
199 }
200
201 return translations
202 }
203
204 type parseError struct {
205 p *{{.StructName}}
206 max token32
207 }
208
209 func (e *parseError) Error() string {
210 tokens, err := []token32{e.max}, "\n"
211 positions, p := make([]int, 2 * len(tokens)), 0
212 for _, token := range tokens {
213 positions[p], p = int(token.begin), p + 1
214 positions[p], p = int(token.end), p + 1
215 }
216 translations := translatePositions(e.p.buffer, positions)
217 format := "parse error near %v (line %v symbol %v - line %v symbol %v):\n%v\n"
218 if e.p.Pretty {
219 format = "parse error near \x1B[34m%v\x1B[m (line %v symbol %v - line %v symbol %v):\n%v\n"
220 }
221 for _, token := range tokens {
222 begin, end := int(token.begin), int(token.end)
223 err += fmt.Sprintf(format,
224 rul3s[token.pegRule],
225 translations[begin].line, translations[begin].symbol,
226 translations[end].line, translations[end].symbol,
227 strconv.Quote(string(e.p.buffer[begin:end])))
228 }
229
230 return err
231 }
232
233 {{if .Ast}}
234 func (p *{{.StructName}}) PrintSyntaxTree() {
235 if p.Pretty {
236 p.tokens32.PrettyPrintSyntaxTree(p.Buffer)
237 } else {
238 p.tokens32.PrintSyntaxTree(p.Buffer)
239 }
240 }
241
242 func (p *{{.StructName}}) WriteSyntaxTree(w io.Writer) {
243 p.tokens32.WriteSyntaxTree(w, p.Buffer)
244 }
245
246 func (p *{{.StructName}}) SprintSyntaxTree() string {
247 var bldr strings.Builder
248 p.WriteSyntaxTree(&bldr)
249 return bldr.String()
250 }
251
252 {{if .HasActions}}
253 func (p *{{.StructName}}) Execute() {
254 buffer, _buffer, text, begin, end := p.Buffer, p.buffer, "", 0, 0
255 for _, token := range p.Tokens() {
256 switch (token.pegRule) {
257 {{if .HasPush}}
258 case rulePegText:
259 begin, end = int(token.begin), int(token.end)
260 text = string(_buffer[begin:end])
261 {{end}}
262 {{range .Actions}}case ruleAction{{.GetId}}:
263 {{.String}}
264 {{end}}
265 }
266 }
267 _, _, _, _, _ = buffer, _buffer, text, begin, end
268 }
269 {{end}}
270 {{end}}
271
272 func Pretty(pretty bool) func(*{{.StructName}}) error {
273 return func(p *{{.StructName}}) error {
274 p.Pretty = pretty
275 return nil
276 }
277 }
278
279 {{if .Ast -}}
280 func Size(size int) func(*{{.StructName}}) error {
281 return func(p *{{.StructName}}) error {
282 p.tokens32 = tokens32{tree: make([]token32, 0, size)}
283 return nil
284 }
285 }
286 {{end -}}
287
288 func (p *{{.StructName}}) Init(options ...func(*{{.StructName}}) error) error {
289 var (
290 max token32
291 position, tokenIndex uint32
292 buffer []rune
293 {{if not .Ast -}}
294 {{if .HasPush -}}
295 text string
296 {{end -}}
297 {{end -}}
298 )
299 for _, option := range options {
300 err := option(p)
301 if err != nil {
302 return err
303 }
304 }
305 p.reset = func() {
306 max = token32{}
307 position, tokenIndex = 0, 0
308
309 p.buffer = []rune(p.Buffer)
310 if len(p.buffer) == 0 || p.buffer[len(p.buffer) - 1] != endSymbol {
311 p.buffer = append(p.buffer, endSymbol)
312 }
313 buffer = p.buffer
314 }
315 p.reset()
316
317 _rules := p.rules
318 {{if .Ast -}}
319 tree := p.tokens32
320 {{end -}}
321 p.parse = func(rule ...int) error {
322 r := 1
323 if len(rule) > 0 {
324 r = rule[0]
325 }
326 matches := p.rules[r]()
327 {{if .Ast -}}
328 p.tokens32 = tree
329 {{end -}}
330 if matches {
331 {{if .Ast -}}
332 p.Trim(tokenIndex)
333 {{end -}}
334 return nil
335 }
336 return &parseError{p, max}
337 }
338
339 add := func(rule pegRule, begin uint32) {
340 {{if .Ast -}}
341 tree.Add(rule, begin, position, tokenIndex)
342 {{end -}}
343 tokenIndex++
344 if begin != position && position > max.end {
345 max = token32{rule, begin, position}
346 }
347 }
348
349 {{if .HasDot}}
350 matchDot := func() bool {
351 if buffer[position] != endSymbol {
352 position++
353 return true
354 }
355 return false
356 }
357 {{end}}
358
359 {{if .HasCharacter}}
360 /*matchChar := func(c byte) bool {
361 if buffer[position] == c {
362 position++
363 return true
364 }
365 return false
366 }*/
367 {{end}}
368
369 {{if .HasString}}
370 matchString := func(s string) bool {
371 i := position
372 for _, c := range s {
373 if buffer[i] != c {
374 return false
375 }
376 i++
377 }
378 position = i
379 return true
380 }
381 {{end}}
382
383 {{if .HasRange}}
384 /*matchRange := func(lower byte, upper byte) bool {
385 if c := buffer[position]; c >= lower && c <= upper {
386 position++
387 return true
388 }
389 return false
390 }*/
391 {{end}}
392
393 _rules = [...]func() bool {
394 nil,`
395
396 type Type uint8
397
398 const (
399 TypeUnknown Type = iota
400 TypeRule
401 TypeName
402 TypeDot
403 TypeCharacter
404 TypeRange
405 TypeString
406 TypePredicate
407 TypeStateChange
408 TypeCommit
409 TypeAction
410 TypePackage
411 TypeImport
412 TypeState
413 TypeAlternate
414 TypeUnorderedAlternate
415 TypeSequence
416 TypePeekFor
417 TypePeekNot
418 TypeQuery
419 TypeStar
420 TypePlus
421 TypePeg
422 TypePush
423 TypeImplicitPush
424 TypeNil
425 TypeLast
426 )
427
428 var TypeMap = [...]string{
429 "TypeUnknown",
430 "TypeRule",
431 "TypeName",
432 "TypeDot",
433 "TypeCharacter",
434 "TypeRange",
435 "TypeString",
436 "TypePredicate",
437 "TypeStateChange",
438 "TypeCommit",
439 "TypeAction",
440 "TypePackage",
441 "TypeImport",
442 "TypeState",
443 "TypeAlternate",
444 "TypeUnorderedAlternate",
445 "TypeSequence",
446 "TypePeekFor",
447 "TypePeekNot",
448 "TypeQuery",
449 "TypeStar",
450 "TypePlus",
451 "TypePeg",
452 "TypePush",
453 "TypeImplicitPush",
454 "TypeNil",
455 "TypeLast"}
456
457 func (t Type) GetType() Type {
458 return t
459 }
460
461 type Node interface {
462 fmt.Stringer
463 debug()
464
465 Escaped() string
466 SetString(s string)
467
468 GetType() Type
469 SetType(t Type)
470
471 GetId() int
472 SetId(id int)
473
474 Init()
475 Front() *node
476 Next() *node
477 PushFront(value *node)
478 PopFront() *node
479 PushBack(value *node)
480 Len() int
481 Copy() *node
482 Slice() []*node
483 }
484
485 type node struct {
486 Type
487 string
488 id int
489
490 front *node
491 back *node
492 length int
493
494 /* use hash table here instead of Copy? */
495 next *node
496 }
497
498 func (n *node) String() string {
499 return n.string
500 }
501
502 func (n *node) debug() {
503 if len(n.string) == 1 {
504 fmt.Printf("%v %v '%v' %d\n", n.id, TypeMap[n.Type], n.string, n.string[0])
505 } else {
506 fmt.Printf("%v %v '%v'\n", n.id, TypeMap[n.Type], n.string)
507 }
508 }
509
510 func (n *node) Escaped() string {
511 return escape(n.string)
512 }
513
514 func (n *node) SetString(s string) {
515 n.string = s
516 }
517
518 func (n *node) SetType(t Type) {
519 n.Type = t
520 }
521
522 func (n *node) GetId() int {
523 return n.id
524 }
525
526 func (n *node) SetId(id int) {
527 n.id = id
528 }
529
530 func (n *node) Init() {
531 n.front = nil
532 n.back = nil
533 n.length = 0
534 }
535
536 func (n *node) Front() *node {
537 return n.front
538 }
539
540 func (n *node) Next() *node {
541 return n.next
542 }
543
544 func (n *node) PushFront(value *node) {
545 if n.back == nil {
546 n.back = value
547 } else {
548 value.next = n.front
549 }
550 n.front = value
551 n.length++
552 }
553
554 func (n *node) PopFront() *node {
555 front := n.front
556
557 switch {
558 case front == nil:
559 panic("tree is empty")
560 case front == n.back:
561 n.front, n.back = nil, nil
562 default:
563 n.front, front.next = front.next, nil
564 }
565
566 n.length--
567 return front
568 }
569
570 func (n *node) PushBack(value *node) {
571 if n.front == nil {
572 n.front = value
573 } else {
574 n.back.next = value
575 }
576 n.back = value
577 n.length++
578 }
579
580 func (n *node) Len() int {
581 return n.length
582 }
583
584 func (n *node) Copy() *node {
585 return &node{Type: n.Type, string: n.string, id: n.id, front: n.front, back: n.back, length: n.length}
586 }
587
588 func (n *node) Slice() []*node {
589 s := make([]*node, n.length)
590 for element, i := n.Front(), 0; element != nil; element, i = element.Next(), i+1 {
591 s[i] = element
592 }
593 return s
594 }
595
596 /* A tree data structure into which a PEG can be parsed. */
597 type Tree struct {
598 Rules map[string]Node
599 rulesCount map[string]uint
600 node
601 inline, _switch, Ast bool
602 Strict bool
603
604 Generator string
605 RuleNames []Node
606 PackageName string
607 Imports []string
608 EndSymbol rune
609 PegRuleType string
610 StructName string
611 StructVariables string
612 RulesCount int
613 Bits int
614 HasActions bool
615 Actions []Node
616 HasPush bool
617 HasCommit bool
618 HasDot bool
619 HasCharacter bool
620 HasString bool
621 HasRange bool
622 }
623
624 func New(inline, _switch, noast bool) *Tree {
625 return &Tree{
626 Rules: make(map[string]Node),
627 rulesCount: make(map[string]uint),
628 inline: inline,
629 _switch: _switch,
630 Ast: !noast,
631 }
632 }
633
634 func (t *Tree) AddRule(name string) {
635 t.PushFront(&node{Type: TypeRule, string: name, id: t.RulesCount})
636 t.RulesCount++
637 }
638
639 func (t *Tree) AddExpression() {
640 expression := t.PopFront()
641 rule := t.PopFront()
642 rule.PushBack(expression)
643 t.PushBack(rule)
644 }
645
646 func (t *Tree) AddName(text string) {
647 t.PushFront(&node{Type: TypeName, string: text})
648 }
649
650 func (t *Tree) AddDot() { t.PushFront(&node{Type: TypeDot, string: "."}) }
651 func (t *Tree) AddCharacter(text string) {
652 t.PushFront(&node{Type: TypeCharacter, string: text})
653 }
654 func (t *Tree) AddDoubleCharacter(text string) {
655 t.PushFront(&node{Type: TypeCharacter, string: strings.ToLower(text)})
656 t.PushFront(&node{Type: TypeCharacter, string: strings.ToUpper(text)})
657 t.AddAlternate()
658 }
659 func (t *Tree) AddHexaCharacter(text string) {
660 hexa, _ := strconv.ParseInt(text, 16, 32)
661 t.PushFront(&node{Type: TypeCharacter, string: string(rune(hexa))})
662 }
663 func (t *Tree) AddOctalCharacter(text string) {
664 octal, _ := strconv.ParseInt(text, 8, 8)
665 t.PushFront(&node{Type: TypeCharacter, string: string(rune(octal))})
666 }
667 func (t *Tree) AddPredicate(text string) { t.PushFront(&node{Type: TypePredicate, string: text}) }
668 func (t *Tree) AddStateChange(text string) { t.PushFront(&node{Type: TypeStateChange, string: text}) }
669 func (t *Tree) AddNil() { t.PushFront(&node{Type: TypeNil, string: "<nil>"}) }
670 func (t *Tree) AddAction(text string) { t.PushFront(&node{Type: TypeAction, string: text}) }
671 func (t *Tree) AddPackage(text string) { t.PushBack(&node{Type: TypePackage, string: text}) }
672 func (t *Tree) AddImport(text string) { t.PushBack(&node{Type: TypeImport, string: text}) }
673 func (t *Tree) AddState(text string) {
674 peg := t.PopFront()
675 peg.PushBack(&node{Type: TypeState, string: text})
676 t.PushBack(peg)
677 }
678
679 func (t *Tree) addList(listType Type) {
680 a := t.PopFront()
681 b := t.PopFront()
682 var l *node
683 if b.GetType() == listType {
684 l = b
685 } else {
686 l = &node{Type: listType}
687 l.PushBack(b)
688 }
689 l.PushBack(a)
690 t.PushFront(l)
691 }
692 func (t *Tree) AddAlternate() { t.addList(TypeAlternate) }
693 func (t *Tree) AddSequence() { t.addList(TypeSequence) }
694 func (t *Tree) AddRange() { t.addList(TypeRange) }
695 func (t *Tree) AddDoubleRange() {
696 a := t.PopFront()
697 b := t.PopFront()
698
699 t.AddCharacter(strings.ToLower(b.String()))
700 t.AddCharacter(strings.ToLower(a.String()))
701 t.addList(TypeRange)
702
703 t.AddCharacter(strings.ToUpper(b.String()))
704 t.AddCharacter(strings.ToUpper(a.String()))
705 t.addList(TypeRange)
706
707 t.AddAlternate()
708 }
709
710 func (t *Tree) addFix(fixType Type) {
711 n := &node{Type: fixType}
712 n.PushBack(t.PopFront())
713 t.PushFront(n)
714 }
715 func (t *Tree) AddPeekFor() { t.addFix(TypePeekFor) }
716 func (t *Tree) AddPeekNot() { t.addFix(TypePeekNot) }
717 func (t *Tree) AddQuery() { t.addFix(TypeQuery) }
718 func (t *Tree) AddStar() { t.addFix(TypeStar) }
719 func (t *Tree) AddPlus() { t.addFix(TypePlus) }
720 func (t *Tree) AddPush() { t.addFix(TypePush) }
721
722 func (t *Tree) AddPeg(text string) { t.PushFront(&node{Type: TypePeg, string: text}) }
723
724 func join(tasks []func()) {
725 length := len(tasks)
726 done := make(chan int, length)
727 for _, task := range tasks {
728 go func(task func()) { task(); done <- 1 }(task)
729 }
730 for d := <-done; d < length; d += <-done {
731 }
732 }
733
734 func escape(c string) string {
735 switch c {
736 case "'":
737 return "\\'"
738 case "\"":
739 return "\""
740 default:
741 c = strconv.Quote(c)
742 return c[1 : len(c)-1]
743 }
744 }
745
746 func (t *Tree) Compile(file string, args []string, out io.Writer) (err error) {
747 t.AddImport("fmt")
748 if t.Ast {
749 t.AddImport("io")
750 t.AddImport("os")
751 t.AddImport("strings")
752 }
753 t.AddImport("sort")
754 t.AddImport("strconv")
755 t.EndSymbol = 0x110000
756 t.RulesCount++
757
758 t.Generator = strings.Join(args, " ")
759
760 var werr error
761 warn := func(e error) {
762 if werr == nil {
763 werr = fmt.Errorf("warning: %s", e)
764 } else {
765 werr = fmt.Errorf("%s\nwarning: %s", werr, e)
766 }
767 }
768
769 counts := [TypeLast]uint{}
770 countsByRule := make([]*[TypeLast]uint, t.RulesCount)
771 {
772 var rule *node
773 var link func(countsForRule *[TypeLast]uint, node Node)
774 link = func(countsForRule *[TypeLast]uint, n Node) {
775 nodeType := n.GetType()
776 id := counts[nodeType]
777 counts[nodeType]++
778 countsForRule[nodeType]++
779 switch nodeType {
780 case TypeAction:
781 n.SetId(int(id))
782 copy, name := n.Copy(), fmt.Sprintf("Action%v", id)
783 t.Actions = append(t.Actions, copy)
784 n.Init()
785 n.SetType(TypeName)
786 n.SetString(name)
787 n.SetId(t.RulesCount)
788
789 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
790 implicitPush := &node{Type: TypeImplicitPush}
791 emptyRule.PushBack(implicitPush)
792 implicitPush.PushBack(copy)
793 implicitPush.PushBack(emptyRule.Copy())
794 t.PushBack(emptyRule)
795 t.RulesCount++
796
797 t.Rules[name] = emptyRule
798 t.RuleNames = append(t.RuleNames, emptyRule)
799 countsByRule = append(countsByRule, &[TypeLast]uint{})
800 case TypeName:
801 name := n.String()
802 if _, ok := t.Rules[name]; !ok {
803 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
804 implicitPush := &node{Type: TypeImplicitPush}
805 emptyRule.PushBack(implicitPush)
806 implicitPush.PushBack(&node{Type: TypeNil, string: "<nil>"})
807 implicitPush.PushBack(emptyRule.Copy())
808 t.PushBack(emptyRule)
809 t.RulesCount++
810
811 t.Rules[name] = emptyRule
812 t.RuleNames = append(t.RuleNames, emptyRule)
813 countsByRule = append(countsByRule, &[TypeLast]uint{})
814 }
815 case TypePush:
816 copy, name := rule.Copy(), "PegText"
817 copy.SetString(name)
818 if _, ok := t.Rules[name]; !ok {
819 emptyRule := &node{Type: TypeRule, string: name, id: t.RulesCount}
820 emptyRule.PushBack(&node{Type: TypeNil, string: "<nil>"})
821 t.PushBack(emptyRule)
822 t.RulesCount++
823
824 t.Rules[name] = emptyRule
825 t.RuleNames = append(t.RuleNames, emptyRule)
826 countsByRule = append(countsByRule, &[TypeLast]uint{})
827 }
828 n.PushBack(copy)
829 fallthrough
830 case TypeImplicitPush:
831 link(countsForRule, n.Front())
832 case TypeRule, TypeAlternate, TypeUnorderedAlternate, TypeSequence,
833 TypePeekFor, TypePeekNot, TypeQuery, TypeStar, TypePlus:
834 for _, node := range n.Slice() {
835 link(countsForRule, node)
836 }
837 }
838 }
839 /* first pass */
840 for _, node := range t.Slice() {
841 switch node.GetType() {
842 case TypePackage:
843 t.PackageName = node.String()
844 case TypeImport:
845 t.Imports = append(t.Imports, node.String())
846 case TypePeg:
847 t.StructName = node.String()
848 t.StructVariables = node.Front().String()
849 case TypeRule:
850 if _, ok := t.Rules[node.String()]; !ok {
851 expression := node.Front()
852 copy := expression.Copy()
853 expression.Init()
854 expression.SetType(TypeImplicitPush)
855 expression.PushBack(copy)
856 expression.PushBack(node.Copy())
857
858 t.Rules[node.String()] = node
859 t.RuleNames = append(t.RuleNames, node)
860 }
861 }
862 }
863 /* sort imports to satisfy gofmt */
864 sort.Strings(t.Imports)
865
866 /* second pass */
867 for _, node := range t.Slice() {
868 if node.GetType() == TypeRule {
869 rule = node
870 counts := [TypeLast]uint{}
871 countsByRule[node.GetId()] = &counts
872 link(&counts, node)
873 }
874 }
875 }
876
877 usage := [TypeLast]uint{}
878 join([]func(){
879 func() {
880 var countRules func(node Node)
881 ruleReached := make([]bool, t.RulesCount)
882 countRules = func(node Node) {
883 switch node.GetType() {
884 case TypeRule:
885 name, id := node.String(), node.GetId()
886 if count, ok := t.rulesCount[name]; ok {
887 t.rulesCount[name] = count + 1
888 } else {
889 t.rulesCount[name] = 1
890 }
891 if ruleReached[id] {
892 return
893 }
894 ruleReached[id] = true
895 countRules(node.Front())
896 case TypeName:
897 countRules(t.Rules[node.String()])
898 case TypeImplicitPush, TypePush:
899 countRules(node.Front())
900 case TypeAlternate, TypeUnorderedAlternate, TypeSequence,
901 TypePeekFor, TypePeekNot, TypeQuery, TypeStar, TypePlus:
902 for _, element := range node.Slice() {
903 countRules(element)
904 }
905 }
906 }
907 for _, node := range t.Slice() {
908 if node.GetType() == TypeRule {
909 countRules(node)
910 break
911 }
912 }
913 for id, reached := range ruleReached {
914 if reached {
915 for i, count := range countsByRule[id] {
916 usage[i] += count
917 }
918 }
919 }
920 },
921 func() {
922 var checkRecursion func(node Node) bool
923 ruleReached := make([]bool, t.RulesCount)
924 checkRecursion = func(node Node) bool {
925 switch node.GetType() {
926 case TypeRule:
927 id := node.GetId()
928 if ruleReached[id] {
929 warn(fmt.Errorf("possible infinite left recursion in rule '%v'", node))
930 return false
931 }
932 ruleReached[id] = true
933 consumes := checkRecursion(node.Front())
934 ruleReached[id] = false
935 return consumes
936 case TypeAlternate:
937 for _, element := range node.Slice() {
938 if !checkRecursion(element) {
939 return false
940 }
941 }
942 return true
943 case TypeSequence:
944 for _, element := range node.Slice() {
945 if checkRecursion(element) {
946 return true
947 }
948 }
949 case TypeName:
950 return checkRecursion(t.Rules[node.String()])
951 case TypePlus, TypePush, TypeImplicitPush:
952 return checkRecursion(node.Front())
953 case TypeCharacter, TypeString:
954 return len(node.String()) > 0
955 case TypeDot, TypeRange:
956 return true
957 }
958 return false
959 }
960 for _, node := range t.Slice() {
961 if node.GetType() == TypeRule {
962 checkRecursion(node)
963 }
964 }
965 }})
966
967 if t._switch {
968 var optimizeAlternates func(node Node) (consumes bool, s jetset.Set)
969 cache, firstPass := make([]struct {
970 reached, consumes bool
971 s jetset.Set
972 }, t.RulesCount), true
973 optimizeAlternates = func(n Node) (consumes bool, s jetset.Set) {
974 /*n.debug()*/
975 switch n.GetType() {
976 case TypeRule:
977 cache := &cache[n.GetId()]
978 if cache.reached {
979 consumes, s = cache.consumes, cache.s
980 return
981 }
982
983 cache.reached = true
984 consumes, s = optimizeAlternates(n.Front())
985 cache.consumes, cache.s = consumes, s
986 case TypeName:
987 consumes, s = optimizeAlternates(t.Rules[n.String()])
988 case TypeDot:
989 consumes = true
990 /* TypeDot set doesn't include the EndSymbol */
991 s = s.Add(uint64(t.EndSymbol))
992 s = s.Complement(uint64(t.EndSymbol))
993 case TypeString, TypeCharacter:
994 consumes = true
995 s = s.Add(uint64([]rune(n.String())[0]))
996 case TypeRange:
997 consumes = true
998 element := n.Front()
999 lower := []rune(element.String())[0]
1000 element = element.Next()
1001 upper := []rune(element.String())[0]
1002 s = s.AddRange(uint64(lower), uint64(upper))
1003 case TypeAlternate:
1004 consumes = true
1005 mconsumes, properties, c :=
1006 consumes, make([]struct {
1007 intersects bool
1008 s jetset.Set
1009 }, n.Len()), 0
1010 for _, element := range n.Slice() {
1011 mconsumes, properties[c].s = optimizeAlternates(element)
1012 consumes = consumes && mconsumes
1013 s = s.Union(properties[c].s)
1014 c++
1015 }
1016
1017 if firstPass {
1018 break
1019 }
1020
1021 intersections := 2
1022 compare:
1023 for ai, a := range properties[0 : len(properties)-1] {
1024 for _, b := range properties[ai+1:] {
1025 if a.s.Intersects(b.s) {
1026 intersections++
1027 properties[ai].intersects = true
1028 continue compare
1029 }
1030 }
1031 }
1032 if intersections >= len(properties) {
1033 break
1034 }
1035
1036 c, unordered, ordered, max :=
1037 0, &node{Type: TypeUnorderedAlternate}, &node{Type: TypeAlternate}, 0
1038 for _, element := range n.Slice() {
1039 if properties[c].intersects {
1040 ordered.PushBack(element.Copy())
1041 } else {
1042 class := &node{Type: TypeUnorderedAlternate}
1043 for d := 0; d < 256; d++ {
1044 if properties[c].s.Has(uint64(d)) {
1045 class.PushBack(&node{Type: TypeCharacter, string: string(rune(d))})
1046 }
1047 }
1048
1049 sequence, predicate, length :=
1050 &node{Type: TypeSequence}, &node{Type: TypePeekFor}, properties[c].s.Len()
1051 if length == 0 {
1052 class.PushBack(&node{Type: TypeNil, string: "<nil>"})
1053 }
1054 predicate.PushBack(class)
1055 sequence.PushBack(predicate)
1056 sequence.PushBack(element.Copy())
1057
1058 if element.GetType() == TypeNil {
1059 unordered.PushBack(sequence)
1060 } else if length > max {
1061 unordered.PushBack(sequence)
1062 max = length
1063 } else {
1064 unordered.PushFront(sequence)
1065 }
1066 }
1067 c++
1068 }
1069 n.Init()
1070 if ordered.Front() == nil {
1071 n.SetType(TypeUnorderedAlternate)
1072 for _, element := range unordered.Slice() {
1073 n.PushBack(element.Copy())
1074 }
1075 } else {
1076 for _, element := range ordered.Slice() {
1077 n.PushBack(element.Copy())
1078 }
1079 n.PushBack(unordered)
1080 }
1081 case TypeSequence:
1082 classes, elements :=
1083 make([]struct {
1084 s jetset.Set
1085 }, n.Len()), n.Slice()
1086
1087 for c, element := range elements {
1088 consumes, classes[c].s = optimizeAlternates(element)
1089 if consumes {
1090 elements, classes = elements[c+1:], classes[:c+1]
1091 break
1092 }
1093 }
1094
1095 for c := len(classes) - 1; c >= 0; c-- {
1096 s = s.Union(classes[c].s)
1097 }
1098
1099 for _, element := range elements {
1100 optimizeAlternates(element)
1101 }
1102 case TypePeekNot, TypePeekFor:
1103 optimizeAlternates(n.Front())
1104 case TypeQuery, TypeStar:
1105 _, s = optimizeAlternates(n.Front())
1106 case TypePlus, TypePush, TypeImplicitPush:
1107 consumes, s = optimizeAlternates(n.Front())
1108 case TypeAction, TypeNil:
1109 // empty
1110 }
1111 return
1112 }
1113 for _, element := range t.Slice() {
1114 if element.GetType() == TypeRule {
1115 optimizeAlternates(element)
1116 break
1117 }
1118 }
1119
1120 for i := range cache {
1121 cache[i].reached = false
1122 }
1123 firstPass = false
1124 for _, element := range t.Slice() {
1125 if element.GetType() == TypeRule {
1126 optimizeAlternates(element)
1127 break
1128 }
1129 }
1130 }
1131
1132 var buffer bytes.Buffer
1133 defer func() {
1134 if t.Strict && werr != nil && err == nil {
1135 // Treat warnings as errors.
1136 err = werr
1137 }
1138 if !t.Strict && werr != nil {
1139 // Display warnings.
1140 fmt.Fprintln(os.Stderr, werr)
1141 }
1142 if err != nil {
1143 return
1144 }
1145 fileSet := token.NewFileSet()
1146 code, err := parser.ParseFile(fileSet, file, &buffer, parser.ParseComments)
1147 if err != nil {
1148 buffer.WriteTo(out)
1149 err = fmt.Errorf("%v: %v", file, err)
1150 return
1151 }
1152 formatter := printer.Config{Mode: printer.TabIndent | printer.UseSpaces, Tabwidth: 8}
1153 err = formatter.Fprint(out, fileSet, code)
1154 if err != nil {
1155 buffer.WriteTo(out)
1156 err = fmt.Errorf("%v: %v", file, err)
1157 return
1158 }
1159
1160 }()
1161
1162 _print := func(format string, a ...interface{}) { fmt.Fprintf(&buffer, format, a...) }
1163 printSave := func(n uint) { _print("\n position%d, tokenIndex%d := position, tokenIndex", n, n) }
1164 printRestore := func(n uint) { _print("\n position, tokenIndex = position%d, tokenIndex%d", n, n) }
1165 printTemplate := func(s string) error {
1166 return template.Must(template.New("peg").Parse(s)).Execute(&buffer, t)
1167 }
1168
1169 t.HasActions = usage[TypeAction] > 0
1170 t.HasPush = usage[TypePush] > 0
1171 t.HasCommit = usage[TypeCommit] > 0
1172 t.HasDot = usage[TypeDot] > 0
1173 t.HasCharacter = usage[TypeCharacter] > 0
1174 t.HasString = usage[TypeString] > 0
1175 t.HasRange = usage[TypeRange] > 0
1176
1177 var printRule func(n Node)
1178 var compile func(expression Node, ko uint) (labelLast bool)
1179 var label uint
1180 labels := make(map[uint]bool)
1181 printBegin := func() { _print("\n {") }
1182 printEnd := func() { _print("\n }") }
1183 printLabel := func(n uint) bool {
1184 _print("\n")
1185 if labels[n] {
1186 _print(" l%d:\t", n)
1187 return true
1188 }
1189 return false
1190 }
1191 printJump := func(n uint) {
1192 _print("\n goto l%d", n)
1193 labels[n] = true
1194 }
1195 printRule = func(n Node) {
1196 switch n.GetType() {
1197 case TypeRule:
1198 _print("%v <- ", n)
1199 printRule(n.Front())
1200 case TypeDot:
1201 _print(".")
1202 case TypeName:
1203 _print("%v", n)
1204 case TypeCharacter:
1205 _print("'%v'", escape(n.String()))
1206 case TypeString:
1207 s := escape(n.String())
1208 _print("'%v'", s[1:len(s)-1])
1209 case TypeRange:
1210 element := n.Front()
1211 lower := element
1212 element = element.Next()
1213 upper := element
1214 _print("[%v-%v]", escape(lower.String()), escape(upper.String()))
1215 case TypePredicate:
1216 _print("&{%v}", n)
1217 case TypeStateChange:
1218 _print("!{%v}", n)
1219 case TypeAction:
1220 _print("{%v}", n)
1221 case TypeCommit:
1222 _print("commit")
1223 case TypeAlternate:
1224 _print("(")
1225 elements := n.Slice()
1226 printRule(elements[0])
1227 for _, element := range elements[1:] {
1228 _print(" / ")
1229 printRule(element)
1230 }
1231 _print(")")
1232 case TypeUnorderedAlternate:
1233 _print("(")
1234 elements := n.Slice()
1235 printRule(elements[0])
1236 for _, element := range elements[1:] {
1237 _print(" | ")
1238 printRule(element)
1239 }
1240 _print(")")
1241 case TypeSequence:
1242 _print("(")
1243 elements := n.Slice()
1244 printRule(elements[0])
1245 for _, element := range elements[1:] {
1246 _print(" ")
1247 printRule(element)
1248 }
1249 _print(")")
1250 case TypePeekFor:
1251 _print("&")
1252 printRule(n.Front())
1253 case TypePeekNot:
1254 _print("!")
1255 printRule(n.Front())
1256 case TypeQuery:
1257 printRule(n.Front())
1258 _print("?")
1259 case TypeStar:
1260 printRule(n.Front())
1261 _print("*")
1262 case TypePlus:
1263 printRule(n.Front())
1264 _print("+")
1265 case TypePush, TypeImplicitPush:
1266 _print("<")
1267 printRule(n.Front())
1268 _print(">")
1269 case TypeNil:
1270 default:
1271 warn(fmt.Errorf("illegal node type: %v", n.GetType()))
1272 }
1273 }
	compile = func(n Node, ko uint) (labelLast bool) {
		switch n.GetType() {
		case TypeRule:
			warn(fmt.Errorf("internal error #1 (%v)", n))
		case TypeDot:
			_print("\n if !matchDot() {")
			/*print("\n if buffer[position] == endSymbol {")*/
			printJump(ko)
			/*print("}\nposition++")*/
			_print("}")
		case TypeName:
			name := n.String()
			rule := t.Rules[name]
			if t.inline && t.rulesCount[name] == 1 {
				compile(rule.Front(), ko)
				return
			}
			_print("\n if !_rules[rule%v]() {", name /*rule.GetId()*/)
			printJump(ko)
			_print("}")
		case TypeRange:
			element := n.Front()
			lower := element
			element = element.Next()
			upper := element
			/*print("\n if !matchRange('%v', '%v') {", escape(lower.String()), escape(upper.String()))*/
			_print("\n if c := buffer[position]; c < rune('%v') || c > rune('%v') {", escape(lower.String()), escape(upper.String()))
			printJump(ko)
			_print("}\nposition++")
		case TypeCharacter:
			/*print("\n if !matchChar('%v') {", escape(n.String()))*/
			_print("\n if buffer[position] != rune('%v') {", escape(n.String()))
			printJump(ko)
			_print("}\nposition++")
		case TypeString:
			_print("\n if !matchString(%v) {", strconv.Quote(n.String()))
			printJump(ko)
			_print("}")
		case TypePredicate:
			_print("\n if !(%v) {", n)
			printJump(ko)
			_print("}")
		case TypeStateChange:
			_print("\n %v", n)
		case TypeAction:
		case TypeCommit:
		case TypePush:
			fallthrough
		case TypeImplicitPush:
			ok, element := label, n.Front()
			label++
			nodeType, rule := element.GetType(), element.Next()
			printBegin()
			if nodeType == TypeAction {
				if t.Ast {
					_print("\nadd(rule%v, position)", rule)
				} else {
					// There is no AST support, so inline the rule code
					_print("\n%v", element)
				}
			} else {
				_print("\nposition%d := position", ok)
				compile(element, ko)
				if n.GetType() == TypePush && !t.Ast {
					// This is TypePush and there is no AST support,
					// so inline capture to text right here
					_print("\nbegin := position%d", ok)
					_print("\nend := position")
					_print("\ntext = string(buffer[begin:end])")
				} else {
					_print("\nadd(rule%v, position%d)", rule, ok)
				}
			}
			printEnd()
		case TypeAlternate:
			ok := label
			label++
			printBegin()
			elements := n.Slice()
			printSave(ok)
			for _, element := range elements[:len(elements)-1] {
				next := label
				label++
				compile(element, next)
				printJump(ok)
				printLabel(next)
				printRestore(ok)
			}
			compile(elements[len(elements)-1], ko)
			printEnd()
			labelLast = printLabel(ok)
		case TypeUnorderedAlternate:
			done, ok := ko, label
			label++
			printBegin()
			_print("\n switch buffer[position] {")
			elements := n.Slice()
			elements, last := elements[:len(elements)-1], elements[len(elements)-1].Front().Next()
			for _, element := range elements {
				sequence := element.Front()
				class := sequence.Front()
				sequence = sequence.Next()
				_print("\n case")
				comma := false
				for _, character := range class.Slice() {
					if comma {
						_print(",")
					} else {
						comma = true
					}
					_print(" '%s'", escape(character.String()))
				}
				_print(":")
				if compile(sequence, done) {
					_print("\nbreak")
				}
			}
			_print("\n default:")
			if compile(last, done) {
				_print("\nbreak")
			}
			_print("\n }")
			printEnd()
			labelLast = printLabel(ok)
		case TypeSequence:
			for _, element := range n.Slice() {
				labelLast = compile(element, ko)
			}
		case TypePeekFor:
			ok := label
			label++
			printBegin()
			printSave(ok)
			compile(n.Front(), ko)
			printRestore(ok)
			printEnd()
		case TypePeekNot:
			ok := label
			label++
			printBegin()
			printSave(ok)
			compile(n.Front(), ok)
			printJump(ko)
			printLabel(ok)
			printRestore(ok)
			printEnd()
		case TypeQuery:
			qko := label
			label++
			qok := label
			label++
			printBegin()
			printSave(qko)
			compile(n.Front(), qko)
			printJump(qok)
			printLabel(qko)
			printRestore(qko)
			printEnd()
			labelLast = printLabel(qok)
		case TypeStar:
			again := label
			label++
			out := label
			label++
			printLabel(again)
			printBegin()
			printSave(out)
			compile(n.Front(), out)
			printJump(again)
			printLabel(out)
			printRestore(out)
			printEnd()
		case TypePlus:
			again := label
			label++
			out := label
			label++
			compile(n.Front(), ko)
			printLabel(again)
			printBegin()
			printSave(out)
			compile(n.Front(), out)
			printJump(again)
			printLabel(out)
			printRestore(out)
			printEnd()
		case TypeNil:
		default:
			warn(fmt.Errorf("illegal node type: %v", n.GetType()))
		}
		return labelLast
	}

	/* let's figure out which jump labels are going to be used with this dry compile */
	printTemp, _print := _print, func(format string, a ...interface{}) {}
	for _, element := range t.Slice() {
		if element.GetType() != TypeRule {
			continue
		}
		expression := element.Front()
		if expression.GetType() == TypeNil {
			continue
		}
		ko := label
		label++
		if count, ok := t.rulesCount[element.String()]; !ok {
			continue
		} else if t.inline && count == 1 && ko != 0 {
			continue
		}
		compile(expression, ko)
	}
	_print, label = printTemp, 0

	/* now for the real compile pass: pick the smallest integer type that can hold every rule id, then emit each rule's parse function */
	t.PegRuleType = "uint8"
	if length := int64(t.Len()); length > math.MaxUint32 {
		t.PegRuleType = "uint64"
	} else if length > math.MaxUint16 {
		t.PegRuleType = "uint32"
	} else if length > math.MaxUint8 {
		t.PegRuleType = "uint16"
	}
	if err = printTemplate(pegHeaderTemplate); err != nil {
		return err
	}
	for _, element := range t.Slice() {
		if element.GetType() != TypeRule {
			continue
		}
		expression := element.Front()
		if implicit := expression.Front(); expression.GetType() == TypeNil || implicit.GetType() == TypeNil {
			if element.String() != "PegText" {
				warn(fmt.Errorf("rule '%v' used but not defined", element))
			}
			_print("\n nil,")
			continue
		}
		ko := label
		label++
		_print("\n /* %v ", element.GetId())
		printRule(element)
		_print(" */")
		if count, ok := t.rulesCount[element.String()]; !ok {
			warn(fmt.Errorf("rule '%v' defined but not used", element))
			_print("\n nil,")
			continue
		} else if t.inline && count == 1 && ko != 0 {
			_print("\n nil,")
			continue
		}
		_print("\n func() bool {")
		if labels[ko] {
			printSave(ko)
		}
		compile(expression, ko)
		//print("\n fmt.Printf(\"%v\\n\")", element.String())
		_print("\n return true")
		if labels[ko] {
			printLabel(ko)
			printRestore(ko)
			_print("\n return false")
		}
		_print("\n },")
	}
	_print("\n }\n p.rules = _rules")
	_print("\n return nil")
	_print("\n}\n")
	return nil
}