Move parse back from tdewolff/minify/parse to tdewolff/parse
Taco de Wolff
2 years ago
0 | # DEPRECATED | |
0 | # Parse [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/parse/v2?tab=doc) | |
1 | 1 | |
2 | Use https://github.com/tdewolff/minify/tree/master/parse instead. | |
2 | This package contains several lexers and parsers written in [Go][1]. All subpackages are built to be streaming and high performance, and to follow the official (latest) specifications. | |
3 | ||
4 | The lexers are implemented using `buffer.Lexer` in https://github.com/tdewolff/parse/buffer and the parsers work on top of the lexers. Some subpackages have hashes defined (using [Hasher](https://github.com/tdewolff/hasher)) that speed up common byte-slice comparisons. | |
5 | ||
6 | ## Buffer | |
7 | ### Reader | |
8 | Reader is a wrapper around a `[]byte` that implements the `io.Reader` interface. It is comparable to `bytes.Reader` but has slightly different semantics (and a slightly smaller memory footprint). | |
9 | ||
10 | ### Writer | |
11 | Writer is a buffer that implements the `io.Writer` interface and expands the buffer as needed. The reset functionality allows for better memory reuse. After calling `Reset`, it will overwrite the current buffer and thus reduce allocations. | |
12 | ||
13 | ### Lexer | |
14 | Lexer is a read buffer specifically designed for building lexers. It keeps track of two positions: a start and an end position. The start position is the beginning of the token currently being parsed; the end position is moved forward until a valid token is found. Calling `Shift` will collapse the positions to the end and return the parsed `[]byte`. | |
15 | ||
16 | The end position can be moved with `Move(int)`, which also accepts negative integers. One can also save a position with `Pos() int` before trying to parse a token and, if parsing fails, rewind with `Rewind(int)`, passing the previously saved position. | |
17 | ||
18 | `Peek(int) byte` will peek forward (relative to the end position) and return the byte at that location. `PeekRune(int) (rune, int)` returns the UTF-8 rune and its length at the given **byte** position. Upon an error `Peek` will return `0`; the **user must peek at every character** and not skip any, otherwise it may skip a `0` and panic on out-of-bounds indexing. | |
19 | ||
20 | `Lexeme() []byte` will return the currently selected bytes, `Skip()` will collapse the selection. `Shift() []byte` is a combination of `Lexeme() []byte` and `Skip()`. | |
21 | ||
22 | When the passed `io.Reader` returns an error, `Err() error` will return that error, even if the end of the buffer has not yet been reached. | |
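The start/end mechanics described above can be sketched with a self-contained toy lexer (an illustration of the semantics only, not the actual `buffer.Lexer` implementation; real code would wrap an `io.Reader` and handle refills):

```go
package main

import "fmt"

// lexer is an illustrative sketch of buffer.Lexer's two positions:
// start marks the beginning of the current token, end is moved forward.
type lexer struct {
	buf        []byte
	start, end int
}

// Peek returns the byte at offset i relative to the end position, or 0
// when out of bounds (mirroring the 0-on-error behavior described above).
func (l *lexer) Peek(i int) byte {
	if l.end+i >= len(l.buf) {
		return 0
	}
	return l.buf[l.end+i]
}

// Move advances (or, for negative n, rewinds) the end position.
func (l *lexer) Move(n int) { l.end += n }

// Shift collapses the start position to the end and returns the token.
func (l *lexer) Shift() []byte {
	b := l.buf[l.start:l.end]
	l.start = l.end
	return b
}

func main() {
	l := &lexer{buf: []byte("abc def")}
	// Consume one word: peek every character, never skipping any,
	// so an out-of-bounds 0 is always observed.
	for l.Peek(0) != ' ' && l.Peek(0) != 0 {
		l.Move(1)
	}
	fmt.Printf("%s\n", l.Shift()) // "abc"
}
```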
23 | ||
24 | ### StreamLexer | |
25 | StreamLexer behaves like Lexer but uses a buffer pool to read in chunks from `io.Reader`, retaining old buffers in memory while they are still in use and re-using them otherwise. Calling `Free(n int)` frees up `n` bytes from the internal buffer(s). It holds an array of buffers to accommodate keeping everything in memory. Calling `ShiftLen() int` returns the number of bytes that have been shifted since the previous call to `ShiftLen`, which can be used to specify how many bytes need to be freed up from the buffer. If you don't need to keep returned byte slices around, call `Free(ShiftLen())` after every `Shift` call. | |
26 | ||
27 | ## Strconv | |
28 | This package contains string conversion functions much like the standard library's `strconv` package, but specifically tailored to the performance needs of the `minify` package. | |
29 | ||
30 | For example, the floating-point to string conversion is approximately twice as fast as the standard library's, but not as precise. | |
31 | ||
32 | ## CSS | |
33 | This package is a CSS3 lexer and parser. Both follow the specification at [CSS Syntax Module Level 3](http://www.w3.org/TR/css-syntax-3/). The lexer takes an io.Reader and converts it into tokens until the EOF. The parser returns a parse tree of the full io.Reader input stream, but the low-level `Next` function can be used for stream parsing to return grammar units until the EOF. | |
34 | ||
35 | [See README here](https://github.com/tdewolff/minify/tree/master/parse/css). | |
36 | ||
37 | ## HTML | |
38 | This package is an HTML5 lexer. It follows the specification at [The HTML syntax](http://www.w3.org/TR/html5/syntax.html). The lexer takes an io.Reader and converts it into tokens until the EOF. | |
39 | ||
40 | [See README here](https://github.com/tdewolff/minify/tree/master/parse/html). | |
41 | ||
42 | ## JS | |
43 | This package is a JS lexer (ECMA-262, edition 6.0). It follows the specification at [ECMAScript Language Specification](http://www.ecma-international.org/ecma-262/6.0/). The lexer takes an io.Reader and converts it into tokens until the EOF. | |
44 | ||
45 | [See README here](https://github.com/tdewolff/minify/tree/master/parse/js). | |
46 | ||
47 | ## JSON | |
48 | This package is a JSON parser (ECMA-404). It follows the specification at [JSON](http://json.org/). The parser takes an io.Reader and converts it into tokens until the EOF. | |
49 | ||
50 | [See README here](https://github.com/tdewolff/minify/tree/master/parse/json). | |
51 | ||
52 | ## SVG | |
53 | This package contains common hashes for SVG 1.1 tags and attributes. | |
54 | ||
55 | ## XML | |
56 | This package is an XML 1.0 lexer. It follows the specification at [Extensible Markup Language (XML) 1.0 (Fifth Edition)](http://www.w3.org/TR/xml/). The lexer takes an io.Reader and converts it into tokens until the EOF. | |
57 | ||
58 | [See README here](https://github.com/tdewolff/minify/tree/master/parse/xml). | |
59 | ||
60 | ## License | |
61 | Released under the [MIT license](LICENSE.md). | |
62 | ||
63 | [1]: http://golang.org/ "Go Language" |
0 | /* | |
1 | Package buffer contains buffer and wrapper types for byte slices. It is useful for writing lexers or other high-performance byte slice handling. | |
2 | ||
3 | The `Reader` and `Writer` types implement the `io.Reader` and `io.Writer` respectively and provide a thinner and faster interface than `bytes.Buffer`. | |
4 | The `Lexer` type is useful for building lexers because it keeps track of the start and end position of a byte selection, and shifts the bytes whenever a valid token is found. | |
5 | The `StreamLexer` does the same, but keeps a buffer pool so that it reads a limited amount at a time, allowing to parse from streaming sources. | |
6 | */ | |
0 | // Package buffer contains buffer and wrapper types for byte slices. It is useful for writing lexers or other high-performance byte slice handling. | |
1 | // The `Reader` and `Writer` types implement the `io.Reader` and `io.Writer` respectively and provide a thinner and faster interface than `bytes.Buffer`. | |
2 | // The `Lexer` type is useful for building lexers because it keeps track of the start and end position of a byte selection, and shifts the bytes whenever a valid token is found. | |
3 | // The `StreamLexer` does the same, but keeps a buffer pool so that it reads a limited amount at a time, allowing to parse from streaming sources. | |
7 | 4 | package buffer |
8 | 5 | |
9 | 6 | // defaultBufSize specifies the default initial length of internal buffers. |
0 | # CSS [![GoDoc](http://godoc.org/github.com/tdewolff/parse/css?status.svg)](http://godoc.org/github.com/tdewolff/parse/css) | |
0 | # CSS [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/minify/v2/parse/css?tab=doc) | |
1 | 1 | |
2 | 2 | This package is a CSS3 lexer and parser written in [Go][1]. Both follow the specification at [CSS Syntax Module Level 3](http://www.w3.org/TR/css-syntax-3/). The lexer takes an io.Reader and converts it into tokens until the EOF. The parser returns a parse tree of the full io.Reader input stream, but the low-level `Next` function can be used for stream parsing to return grammar units until the EOF.
3 | 3 |
0 | # HTML [![GoDoc](http://godoc.org/github.com/tdewolff/parse/html?status.svg)](http://godoc.org/github.com/tdewolff/parse/html) | |
0 | # HTML [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/minify/v2/parse/html?tab=doc) | |
1 | 1 | |
2 | 2 | This package is an HTML5 lexer written in [Go][1]. It follows the specification at [The HTML syntax](http://www.w3.org/TR/html5/syntax.html). The lexer takes an io.Reader and converts it into tokens until the EOF. |
3 | 3 |
325 | 325 | l.text = parse.ToLower(l.r.Lexeme()[1:]) |
326 | 326 | if h := ToHash(l.text); h == Textarea || h == Title || h == Style || h == Xmp || h == Iframe || h == Script || h == Plaintext || h == Svg || h == Math { |
327 | 327 | if h == Svg || h == Math { |
328 | data := l.shiftXml(h) | |
328 | data := l.shiftXML(h) | |
329 | 329 | if l.err != nil { |
330 | 330 | return ErrorToken, nil |
331 | 331 | } |
424 | 424 | return parse.ToLower(l.r.Shift()) |
425 | 425 | } |
426 | 426 | |
427 | // shiftXml parses the content of a svg or math tag according to the XML 1.1 specifications, including the tag itself. | |
427 | // shiftXML parses the content of a svg or math tag according to the XML 1.1 specifications, including the tag itself. | |
428 | 428 | // So far we have already parsed `<svg` or `<math`. |
429 | func (l *Lexer) shiftXml(rawTag Hash) []byte { | |
429 | func (l *Lexer) shiftXML(rawTag Hash) []byte { | |
430 | 430 | inQuote := false |
431 | 431 | for { |
432 | 432 | c := l.r.Peek(0) |
32 | 32 | if len(b) > 1 && (b[0] == '"' || b[0] == '\'') && b[0] == b[len(b)-1] { |
33 | 33 | b = b[1 : len(b)-1] |
34 | 34 | } |
35 | val := EscapeAttrVal(&buf, orig, []byte(b), false) | |
35 | val := EscapeAttrVal(&buf, orig, b, false) | |
36 | 36 | test.String(t, string(val), tt.expected) |
37 | 37 | }) |
38 | 38 | } |
54 | 54 | if len(b) > 1 && (b[0] == '"' || b[0] == '\'') && b[0] == b[len(b)-1] { |
55 | 55 | b = b[1 : len(b)-1] |
56 | 56 | } |
57 | val := EscapeAttrVal(&buf, orig, []byte(b), true) | |
57 | val := EscapeAttrVal(&buf, orig, b, true) | |
58 | 58 | test.String(t, string(val), tt.expected) |
59 | 59 | }) |
60 | 60 | } |
0 | # JS [![GoDoc](http://godoc.org/github.com/tdewolff/parse/js?status.svg)](http://godoc.org/github.com/tdewolff/parse/js) | |
0 | # JS [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/minify/v2/parse/js?tab=doc) | |
1 | 1 | |
2 | 2 | This package is a JS lexer (ECMAScript 2020) written in [Go][1]. It follows the specification at [ECMAScript 2020 Language Specification](https://tc39.es/ecma262/). The lexer takes an io.Reader and converts it into tokens until the EOF. |
3 | 3 |
61 | 61 | // Var is a variable, where Decl is the type of declaration and can be var|function for function scoped variables, let|const|class for block scoped variables. |
62 | 62 | type Var struct { |
63 | 63 | Data []byte |
64 | Link *Var // is set when merging variable uses, as in: {a} {var a} where the first lins to the second | |
64 | Link *Var // is set when merging variable uses, as in: {a} {var a} where the first links to the second, only used for undeclared variables | |
65 | 65 | Uses uint16 |
66 | 66 | Decl DeclType |
67 | 67 | } |
306 | 306 | s.Undeclared = s.Undeclared[:0] |
307 | 307 | } |
308 | 308 | |
309 | // Unscope moves all declared variables of the current scope to the parent scope. Undeclared variables are already in the parent scope. | |
310 | func (s *Scope) Unscope() { | |
311 | for _, vorig := range s.Declared { | |
312 | // no need to evaluate vorig.Link as vorig.Data stays the same, and Link is always nil in Declared | |
313 | // vorig.Uses will be at least 1 | |
314 | s.Parent.Declared = append(s.Parent.Declared, vorig) | |
315 | } | |
316 | s.Declared = s.Declared[:0] | |
317 | s.Undeclared = s.Undeclared[:0] | |
318 | } | |
319 | ||
309 | 320 | //////////////////////////////////////////////////////////////// |
310 | 321 | |
311 | 322 | // IStmt is a dummy interface for statements. |
403 | 414 | Init IExpr // can be nil |
404 | 415 | Cond IExpr // can be nil |
405 | 416 | Post IExpr // can be nil |
406 | Body BlockStmt | |
417 | Body *BlockStmt | |
407 | 418 | } |
408 | 419 | |
409 | 420 | func (n ForStmt) String() string { |
426 | 437 | type ForInStmt struct { |
427 | 438 | Init IExpr |
428 | 439 | Value IExpr |
429 | Body BlockStmt | |
440 | Body *BlockStmt | |
430 | 441 | } |
431 | 442 | |
432 | 443 | func (n ForInStmt) String() string { |
438 | 449 | Await bool |
439 | 450 | Init IExpr |
440 | 451 | Value IExpr |
441 | Body BlockStmt | |
452 | Body *BlockStmt | |
442 | 453 | } |
443 | 454 | |
444 | 455 | func (n ForOfStmt) String() string { |
460 | 471 | type SwitchStmt struct { |
461 | 472 | Init IExpr |
462 | 473 | List []CaseClause |
474 | Scope | |
463 | 475 | } |
464 | 476 | |
465 | 477 | func (n SwitchStmt) String() string { |
535 | 547 | |
536 | 548 | // TryStmt is a try statement. |
537 | 549 | type TryStmt struct { |
538 | Body BlockStmt | |
550 | Body *BlockStmt | |
539 | 551 | Binding IBinding // can be nil |
540 | 552 | Catch *BlockStmt // can be nil |
541 | 553 | Finally *BlockStmt // can be nil |
906 | 918 | Name *Var // can be nil |
907 | 919 | Extends IExpr // can be nil |
908 | 920 | Definitions []FieldDefinition |
909 | Methods []MethodDecl | |
921 | Methods []*MethodDecl | |
910 | 922 | } |
911 | 923 | |
912 | 924 | func (n ClassDecl) String() string { |
210 | 210 | |
211 | 211 | switch tt := p.tt; tt { |
212 | 212 | case OpenBraceToken: |
213 | blockStmt := p.parseBlockStmt("block statement") | |
214 | stmt = &blockStmt | |
213 | stmt = p.parseBlockStmt("block statement") | |
215 | 214 | case ConstToken, VarToken: |
216 | 215 | if !allowDeclaration && tt == ConstToken { |
217 | 216 | p.fail("statement") |
325 | 324 | return |
326 | 325 | } |
327 | 326 | |
328 | body := BlockStmt{} | |
327 | body := &BlockStmt{} | |
329 | 328 | parent := p.enterScope(&body.Scope, false) |
330 | 329 | |
331 | 330 | var init IExpr |
426 | 425 | return |
427 | 426 | } |
428 | 427 | |
429 | clauses := []CaseClause{} | |
428 | switchStmt := &SwitchStmt{Init: init} | |
429 | parent := p.enterScope(&switchStmt.Scope, false) | |
430 | 430 | for { |
431 | 431 | if p.tt == ErrorToken { |
432 | 432 | p.fail("switch statement") |
455 | 455 | for p.tt != CaseToken && p.tt != DefaultToken && p.tt != CloseBraceToken && p.tt != ErrorToken { |
456 | 456 | stmts = append(stmts, p.parseStmt(true)) |
457 | 457 | } |
458 | clauses = append(clauses, CaseClause{clause, list, stmts}) | |
459 | } | |
460 | stmt = &SwitchStmt{init, clauses} | |
458 | switchStmt.List = append(switchStmt.List, CaseClause{clause, list, stmts}) | |
459 | } | |
460 | p.exitScope(parent) | |
461 | stmt = switchStmt | |
461 | 462 | case FunctionToken: |
462 | 463 | if !allowDeclaration { |
463 | 464 | p.fail("statement") |
464 | 465 | return |
465 | 466 | } |
466 | funcDecl := p.parseFuncDecl() | |
467 | stmt = &funcDecl | |
467 | stmt = p.parseFuncDecl() | |
468 | 468 | case AsyncToken: // async function |
469 | 469 | if !allowDeclaration { |
470 | 470 | p.fail("statement") |
473 | 473 | async := p.data |
474 | 474 | p.next() |
475 | 475 | if p.tt == FunctionToken && !p.prevLT { |
476 | funcDecl := p.parseAsyncFuncDecl() | |
477 | stmt = &funcDecl | |
476 | stmt = p.parseAsyncFuncDecl() | |
478 | 477 | } else { |
479 | 478 | // expression |
480 | 479 | stmt = &ExprStmt{p.parseAsyncExpression(OpExpr, async)} |
488 | 487 | p.fail("statement") |
489 | 488 | return |
490 | 489 | } |
491 | classDecl := p.parseClassDecl() | |
492 | stmt = &classDecl | |
490 | stmt = p.parseClassDecl() | |
493 | 491 | case ThrowToken: |
494 | 492 | p.next() |
495 | 493 | var value IExpr |
521 | 519 | } |
522 | 520 | if p.tt == FinallyToken { |
523 | 521 | p.next() |
524 | blockStmt := p.parseBlockStmt("try-finally statement") | |
525 | finally = &blockStmt | |
522 | finally = p.parseBlockStmt("try-finally statement") | |
526 | 523 | } |
527 | 524 | stmt = &TryStmt{body, binding, catch, finally} |
528 | 525 | case DebuggerToken: |
586 | 583 | return |
587 | 584 | } |
588 | 585 | |
589 | func (p *Parser) parseBlockStmt(in string) (blockStmt BlockStmt) { | |
586 | func (p *Parser) parseBlockStmt(in string) (blockStmt *BlockStmt) { | |
587 | blockStmt = &BlockStmt{} | |
590 | 588 | parent := p.enterScope(&blockStmt.Scope, false) |
591 | 589 | blockStmt.List = p.parseStmtList(in) |
592 | 590 | p.exitScope(parent) |
732 | 730 | varDecl := p.parseVarDecl(tt) |
733 | 731 | exportStmt.Decl = &varDecl |
734 | 732 | } else if p.tt == FunctionToken { |
735 | funcDecl := p.parseFuncDecl() | |
736 | exportStmt.Decl = &funcDecl | |
733 | exportStmt.Decl = p.parseFuncDecl() | |
737 | 734 | } else if p.tt == AsyncToken { // async function |
738 | 735 | p.next() |
739 | 736 | if p.tt != FunctionToken || p.prevLT { |
740 | 737 | p.fail("export statement", FunctionToken) |
741 | 738 | return |
742 | 739 | } |
743 | funcDecl := p.parseAsyncFuncDecl() | |
744 | exportStmt.Decl = &funcDecl | |
740 | exportStmt.Decl = p.parseAsyncFuncDecl() | |
745 | 741 | } else if p.tt == ClassToken { |
746 | classDecl := p.parseClassDecl() | |
747 | exportStmt.Decl = &classDecl | |
742 | exportStmt.Decl = p.parseClassDecl() | |
748 | 743 | } else if p.tt == DefaultToken { |
749 | 744 | exportStmt.Default = true |
750 | 745 | p.next() |
751 | 746 | if p.tt == FunctionToken { |
752 | funcDecl := p.parseFuncExpr() | |
753 | exportStmt.Decl = &funcDecl | |
747 | exportStmt.Decl = p.parseFuncExpr() | |
754 | 748 | } else if p.tt == AsyncToken { // async function or async arrow function |
755 | 749 | async := p.data |
756 | 750 | p.next() |
757 | 751 | if p.tt == FunctionToken && !p.prevLT { |
758 | funcDecl := p.parseAsyncFuncExpr() | |
759 | exportStmt.Decl = &funcDecl | |
752 | exportStmt.Decl = p.parseAsyncFuncExpr() | |
760 | 753 | } else { |
761 | 754 | // expression |
762 | 755 | exportStmt.Decl = p.parseAsyncExpression(OpExpr, async) |
763 | 756 | } |
764 | 757 | } else if p.tt == ClassToken { |
765 | classDecl := p.parseClassExpr() | |
766 | exportStmt.Decl = &classDecl | |
758 | exportStmt.Decl = p.parseClassExpr() | |
767 | 759 | } else { |
768 | 760 | exportStmt.Decl = p.parseExpression(OpAssign) |
769 | 761 | } |
840 | 832 | return |
841 | 833 | } |
842 | 834 | |
843 | func (p *Parser) parseFuncDecl() (funcDecl FuncDecl) { | |
835 | func (p *Parser) parseFuncDecl() (funcDecl *FuncDecl) { | |
844 | 836 | return p.parseAnyFunc(false, false) |
845 | 837 | } |
846 | 838 | |
847 | func (p *Parser) parseAsyncFuncDecl() (funcDecl FuncDecl) { | |
839 | func (p *Parser) parseAsyncFuncDecl() (funcDecl *FuncDecl) { | |
848 | 840 | return p.parseAnyFunc(true, false) |
849 | 841 | } |
850 | 842 | |
851 | func (p *Parser) parseFuncExpr() (funcDecl FuncDecl) { | |
843 | func (p *Parser) parseFuncExpr() (funcDecl *FuncDecl) { | |
852 | 844 | return p.parseAnyFunc(false, true) |
853 | 845 | } |
854 | 846 | |
855 | func (p *Parser) parseAsyncFuncExpr() (funcDecl FuncDecl) { | |
847 | func (p *Parser) parseAsyncFuncExpr() (funcDecl *FuncDecl) { | |
856 | 848 | return p.parseAnyFunc(true, true) |
857 | 849 | } |
858 | 850 | |
859 | func (p *Parser) parseAnyFunc(async, inExpr bool) (funcDecl FuncDecl) { | |
851 | func (p *Parser) parseAnyFunc(async, inExpr bool) (funcDecl *FuncDecl) { | |
860 | 852 | // assume we're at function |
861 | 853 | p.next() |
854 | funcDecl = &FuncDecl{} | |
862 | 855 | funcDecl.Async = async |
863 | 856 | funcDecl.Generator = p.tt == MulToken |
864 | 857 | if funcDecl.Generator { |
899 | 892 | return |
900 | 893 | } |
901 | 894 | |
902 | func (p *Parser) parseClassDecl() (classDecl ClassDecl) { | |
895 | func (p *Parser) parseClassDecl() (classDecl *ClassDecl) { | |
903 | 896 | return p.parseAnyClass(false) |
904 | 897 | } |
905 | 898 | |
906 | func (p *Parser) parseClassExpr() (classDecl ClassDecl) { | |
899 | func (p *Parser) parseClassExpr() (classDecl *ClassDecl) { | |
907 | 900 | return p.parseAnyClass(true) |
908 | 901 | } |
909 | 902 | |
910 | func (p *Parser) parseAnyClass(inExpr bool) (classDecl ClassDecl) { | |
903 | func (p *Parser) parseAnyClass(inExpr bool) (classDecl *ClassDecl) { | |
911 | 904 | // assume we're at class |
912 | 905 | p.next() |
906 | classDecl = &ClassDecl{} | |
913 | 907 | if IsIdentifier(p.tt) || p.tt == YieldToken || p.tt == AwaitToken { |
914 | 908 | if !inExpr { |
915 | 909 | var ok bool |
948 | 942 | } |
949 | 943 | |
950 | 944 | method, definition := p.parseClassElement() |
951 | if method.Name.IsSet() { | |
945 | if method != nil { | |
952 | 946 | classDecl.Methods = append(classDecl.Methods, method) |
953 | 947 | } else { |
954 | 948 | classDecl.Definitions = append(classDecl.Definitions, definition) |
957 | 951 | return |
958 | 952 | } |
959 | 953 | |
960 | func (p *Parser) parseClassElement() (method MethodDecl, definition FieldDefinition) { | |
954 | func (p *Parser) parseClassElement() (method *MethodDecl, definition FieldDefinition) { | |
955 | method = &MethodDecl{} | |
961 | 956 | var data []byte |
962 | 957 | if p.tt == StaticToken { |
963 | 958 | method.Static = true |
1019 | 1014 | p.next() |
1020 | 1015 | definition.Init = p.parseExpression(OpAssign) |
1021 | 1016 | } |
1022 | method = MethodDecl{} | |
1017 | method = nil | |
1023 | 1018 | return |
1024 | 1019 | } |
1025 | 1020 | |
1391 | 1386 | return |
1392 | 1387 | } |
1393 | 1388 | |
1394 | func (p *Parser) parseAsyncArrowFunc() (arrowFunc ArrowFunc) { | |
1389 | func (p *Parser) parseAsyncArrowFunc() (arrowFunc *ArrowFunc) { | |
1395 | 1390 | // expect we're at Identifier or Yield or ( |
1391 | arrowFunc = &ArrowFunc{} | |
1396 | 1392 | parent := p.enterScope(&arrowFunc.Body.Scope, true) |
1397 | 1393 | parentAsync, parentGenerator := p.async, p.generator |
1398 | 1394 | p.async, p.generator = true, false |
1403 | 1399 | arrowFunc.Params.List = []BindingElement{{Binding: ref}} |
1404 | 1400 | } else { |
1405 | 1401 | arrowFunc.Params = p.parseFuncParams("arrow function") |
1402 | ||
1403 | // could be CallExpression of: async(params) | |
1404 | if p.tt != ArrowToken { | |
1405 | } | |
1406 | 1406 | } |
1407 | 1407 | |
1408 | 1408 | arrowFunc.Async = true |
1413 | 1413 | return |
1414 | 1414 | } |
1415 | 1415 | |
1416 | func (p *Parser) parseIdentifierArrowFunc(v *Var) (arrowFunc ArrowFunc) { | |
1416 | func (p *Parser) parseIdentifierArrowFunc(v *Var) (arrowFunc *ArrowFunc) { | |
1417 | 1417 | // expect we're at => |
1418 | arrowFunc = &ArrowFunc{} | |
1418 | 1419 | parent := p.enterScope(&arrowFunc.Body.Scope, true) |
1419 | 1420 | parentAsync, parentGenerator := p.async, p.generator |
1420 | 1421 | p.async, p.generator = false, false |
1475 | 1476 | precLeft := OpPrimary |
1476 | 1477 | if !p.prevLT && p.tt == FunctionToken { |
1477 | 1478 | // primary expression |
1478 | funcDecl := p.parseAsyncFuncExpr() | |
1479 | left = &funcDecl | |
1479 | left = p.parseAsyncFuncExpr() | |
1480 | 1480 | } else if !p.prevLT && prec <= OpAssign && (p.tt == OpenParenToken || IsIdentifier(p.tt) || !p.generator && p.tt == YieldToken || p.tt == AwaitToken) { |
1481 | 1481 | // async arrow function expression |
1482 | 1482 | if p.tt == AwaitToken { |
1483 | 1483 | p.fail("arrow function") |
1484 | 1484 | return nil |
1485 | } | |
1486 | arrowFunc := p.parseAsyncArrowFunc() | |
1487 | left = &arrowFunc | |
1485 | } else if p.tt == OpenParenToken { | |
1486 | return p.parseParenthesizedExpressionOrArrowFunc(prec, async) | |
1487 | } | |
1488 | left = p.parseAsyncArrowFunc() | |
1488 | 1489 | precLeft = OpAssign |
1489 | 1490 | } else { |
1490 | 1491 | left = p.scope.Use(async) |
1491 | 1492 | } |
1492 | left = p.parseExpressionSuffix(left, prec, precLeft) | |
1493 | return left | |
1493 | return p.parseExpressionSuffix(left, prec, precLeft) | |
1494 | 1494 | } |
1495 | 1495 | |
1496 | 1496 | // parseExpression parses an expression that has a precedence of prec or higher. |
1557 | 1557 | } |
1558 | 1558 | break |
1559 | 1559 | } |
1560 | suffix := p.parseParenthesizedExpressionOrArrowFunc(prec) | |
1560 | suffix := p.parseParenthesizedExpressionOrArrowFunc(prec, nil) | |
1561 | 1561 | p.exprLevel-- |
1562 | 1562 | return suffix |
1563 | 1563 | case NotToken, BitNotToken, TypeofToken, VoidToken, DeleteToken: |
1702 | 1702 | case ClassToken: |
1703 | 1703 | parentInFor := p.inFor |
1704 | 1704 | p.inFor = false |
1705 | classDecl := p.parseClassExpr() | |
1706 | left = &classDecl | |
1705 | left = p.parseClassExpr() | |
1707 | 1706 | p.inFor = parentInFor |
1708 | 1707 | case FunctionToken: |
1709 | 1708 | parentInFor := p.inFor |
1710 | 1709 | p.inFor = false |
1711 | funcDecl := p.parseFuncExpr() | |
1712 | left = &funcDecl | |
1710 | left = p.parseFuncExpr() | |
1713 | 1711 | p.inFor = parentInFor |
1714 | 1712 | case TemplateToken, TemplateStartToken: |
1715 | 1713 | parentInFor := p.inFor |
2027 | 2025 | return nil |
2028 | 2026 | } |
2029 | 2027 | |
2030 | arrowFunc := p.parseIdentifierArrowFunc(v) | |
2031 | left = &arrowFunc | |
2028 | left = p.parseIdentifierArrowFunc(v) | |
2032 | 2029 | precLeft = OpAssign |
2033 | 2030 | default: |
2034 | 2031 | return left |
2061 | 2058 | return p.parseExpression(OpAssign) |
2062 | 2059 | } |
2063 | 2060 | |
2064 | func (p *Parser) parseParenthesizedExpressionOrArrowFunc(prec OpPrec) IExpr { | |
2061 | func (p *Parser) parseParenthesizedExpressionOrArrowFunc(prec OpPrec, async []byte) IExpr { | |
2065 | 2062 | var left IExpr |
2066 | 2063 | precLeft := OpPrimary |
2067 | 2064 | |
2068 | 2065 | // expect to be at ( |
2069 | 2066 | p.next() |
2070 | 2067 | |
2071 | arrowFunc := ArrowFunc{} | |
2068 | isAsync := async != nil | |
2069 | arrowFunc := &ArrowFunc{} | |
2072 | 2070 | parent := p.enterScope(&arrowFunc.Body.Scope, true) |
2073 | 2071 | parentAssumeArrowFunc, parentInFor := p.assumeArrowFunc, p.inFor |
2074 | 2072 | p.assumeArrowFunc, p.inFor = true, false |
2075 | 2073 | |
2076 | // parse a parenthesized expression but assume we might be parsing an arrow function. If this is really an arrow function, parsing as a parenthesized expression cannot fail as AssignmentExpression, ArrayLiteral, and ObjectLiteral are supersets of SingleNameBinding, ArrayBindingPattern, and ObjectBindingPattern respectively. Any identifier that would be a BindingIdentifier in case of an arrow function, will be added as such. If finally this is not an arrow function, we will demote those variables an undeclared and merge them with the parent scope. | |
2074 | // parse a parenthesized expression but assume we might be parsing an (async) arrow function. If this is really an arrow function, parsing as a parenthesized expression cannot fail as AssignmentExpression, ArrayLiteral, and ObjectLiteral are supersets of SingleNameBinding, ArrayBindingPattern, and ObjectBindingPattern respectively. Any identifier that would be a BindingIdentifier in case of an arrow function, will be added as such. If finally this is not an arrow function, we will demote those variables as undeclared and merge them with the parent scope. | |
2077 | 2075 | |
2078 | 2076 | var list []IExpr |
2079 | 2077 | var rest IExpr |
2080 | 2078 | for p.tt != CloseParenToken && p.tt != ErrorToken { |
2081 | 2079 | if p.tt == EllipsisToken && p.assumeArrowFunc { |
2082 | 2080 | p.next() |
2083 | if p.isIdentifierReference(p.tt) { | |
2081 | if isAsync { | |
2082 | rest = p.parseAssignmentExpression() | |
2083 | if p.tt == CommaToken { | |
2084 | p.next() | |
2085 | } | |
2086 | } else if p.isIdentifierReference(p.tt) { | |
2084 | 2087 | rest, _ = p.scope.Declare(ArgumentDecl, p.data) // cannot fail |
2085 | 2088 | p.next() |
2086 | 2089 | } else if p.tt == OpenBracketToken { |
2112 | 2115 | |
2113 | 2116 | if isArrowFunc { |
2114 | 2117 | parentAsync, parentGenerator := p.async, p.generator |
2115 | p.async, p.generator = false, false | |
2118 | p.async, p.generator = isAsync, false | |
2116 | 2119 | |
2117 | 2120 | // arrow function |
2118 | 2121 | arrowFunc.Params = Params{List: make([]BindingElement, len(list))} |
2119 | 2122 | for i, item := range list { |
2120 | 2123 | arrowFunc.Params.List[i] = p.exprToBindingElement(item) // can not fail when assumArrowFunc is set |
2121 | 2124 | } |
2125 | arrowFunc.Async = isAsync | |
2122 | 2126 | arrowFunc.Params.Rest = p.exprToBinding(rest) |
2123 | 2127 | arrowFunc.Body.List = p.parseArrowFuncBody() |
2124 | 2128 | |
2125 | 2129 | p.async, p.generator = parentAsync, parentGenerator |
2126 | 2130 | p.exitScope(parent) |
2127 | 2131 | |
2128 | left = &arrowFunc | |
2132 | left = arrowFunc | |
2129 | 2133 | precLeft = OpAssign |
2130 | } else if len(list) == 0 || rest != nil { | |
2134 | } else if len(list) == 0 || !isAsync && rest != nil || isAsync && OpCall < prec { | |
2131 | 2135 | p.fail("arrow function", ArrowToken) |
2132 | 2136 | return nil |
2133 | 2137 | } else { |
2137 | 2141 | // Here we move all declared ArgumentDecls (in case of an arrow function) to its parent scope as undeclared variables (identifiers used in a parenthesized expression). |
2138 | 2142 | arrowFunc.Body.Scope.UndeclareScope() |
2139 | 2143 | |
2140 | // parenthesized expression | |
2141 | left = list[0] | |
2142 | for _, item := range list[1:] { | |
2143 | left = &BinaryExpr{CommaToken, left, item} | |
2144 | } | |
2145 | left = &GroupExpr{left} | |
2144 | if isAsync { | |
2145 | // call expression | |
2146 | args := Args{} | |
2147 | for _, item := range list { | |
2148 | args.List = append(args.List, Arg{Value: item, Rest: false}) | |
2149 | } | |
2150 | if rest != nil { | |
2151 | args.List = append(args.List, Arg{Value: rest, Rest: true}) | |
2152 | } | |
2153 | left = p.scope.Use(async) | |
2154 | left = &CallExpr{left, args} | |
2155 | precLeft = OpCall | |
2156 | } else { | |
2157 | // parenthesized expression | |
2158 | left = list[0] | |
2159 | for _, item := range list[1:] { | |
2160 | left = &BinaryExpr{CommaToken, left, item} | |
2161 | } | |
2162 | left = &GroupExpr{left} | |
2163 | } | |
2146 | 2164 | } |
2147 | 2165 | return p.parseExpressionSuffix(left, prec, precLeft) |
2148 | 2166 | } |
146 | 146 | {"async\n= a", "Stmt(async=a)"}, |
147 | 147 | {"async a => b", "Stmt(async Params(Binding(a)) => Stmt({ Stmt(return b) }))"}, |
148 | 148 | {"async (a) => b", "Stmt(async Params(Binding(a)) => Stmt({ Stmt(return b) }))"}, |
149 | {"async(a)", "Stmt(async(a))"}, | |
150 | {"async(a=6, ...b)", "Stmt(async((a=6), ...b))"}, | |
151 | {"async(function(){})", "Stmt(async(Decl(function Params() Stmt({ }))))"}, | |
149 | 152 | {"async\nawait => b", "Stmt(async) Stmt(Params(Binding(await)) => Stmt({ Stmt(return b) }))"}, |
150 | 153 | {"a + async\nb", "Stmt(a+async) Stmt(b)"}, |
151 | 154 | {"a + async\nfunction f(){}", "Stmt(a+async) Decl(function f Params() Stmt({ }))"}, |
490 | 493 | {"x={a", "unexpected EOF in object literal"}, |
491 | 494 | {"x=a[b", "expected ] instead of EOF in index expression"}, |
492 | 495 | {"x=async a", "expected => instead of EOF in arrow function"}, |
493 | {"x=async (a", "unexpected EOF in arrow function"}, | |
494 | {"x=async (a,", "unexpected EOF in arrow function"}, | |
496 | {"x=async (a", "unexpected EOF in expression"}, | |
497 | {"x=async (a,", "unexpected EOF in expression"}, | |
495 | 498 | {"x=async function", "expected Identifier or ( instead of EOF in function declaration"}, |
496 | 499 | {"x=async function *", "expected Identifier or ( instead of EOF in function declaration"}, |
497 | 500 | {"x=async function a", "expected ( instead of EOF in function declaration"}, |
548 | 551 | {"function*a(){ (yield=5) => yield }", "unexpected = in expression"}, |
549 | 552 | {"function*a(){ (...yield) => yield }", "unexpected yield in arrow function"}, |
550 | 553 | {"x = await\n=> a++", "unexpected => in expression"}, |
551 | {"x=async (await,", "unexpected await in binding"}, | |
554 | {"x=async (await,", "unexpected EOF in expression"}, | |
552 | 555 | {"async function a() { class a extends await", "unexpected await in expression"}, |
553 | 556 | {"async function a() { await: var a", "unexpected : in expression"}, |
554 | 557 | {"async function a() { let await", "unexpected await in binding"}, |
841 | 844 | case *ThrowStmt: |
842 | 845 | sv.AddExpr(stmt.Value) |
843 | 846 | case *ForStmt: |
844 | sv.AddStmt(&stmt.Body) | |
847 | sv.AddStmt(stmt.Body) | |
845 | 848 | case *ForInStmt: |
846 | sv.AddStmt(&stmt.Body) | |
849 | sv.AddStmt(stmt.Body) | |
847 | 850 | case *ForOfStmt: |
848 | sv.AddStmt(&stmt.Body) | |
851 | sv.AddStmt(stmt.Body) | |
849 | 852 | case *IfStmt: |
850 | 853 | sv.AddStmt(stmt.Body) |
851 | 854 | if stmt.Else != nil { |
853 | 856 | } |
854 | 857 | case *TryStmt: |
855 | 858 | if 0 < len(stmt.Body.List) { |
856 | sv.AddStmt(&stmt.Body) | |
859 | sv.AddStmt(stmt.Body) | |
857 | 860 | } |
858 | 861 | if stmt.Catch != nil { |
859 | 862 | sv.AddStmt(stmt.Catch) |
0 | # JSON [![GoDoc](http://godoc.org/github.com/tdewolff/parse/json?status.svg)](http://godoc.org/github.com/tdewolff/parse/json) | |
0 | # JSON [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/parse/v2/json?tab=doc) | 
1 | 1 | |
2 | 2 | This package is a JSON lexer (ECMA-404) written in [Go][1]. It follows the specification at [JSON](http://json.org/). The lexer takes an `io.Reader` and converts it into tokens until EOF.
3 | 3 | 
4 | 4 | To run the tests, install `go-fuzz`: |
5 | 5 | |
6 | 6 | ``` |
7 | GO111MODULE=off go get -u github.com/dvyukov/go-fuzz/go-fuzz github.com/dvyukov/go-fuzz/go-fuzz-build | |
7 | go get -u github.com/dvyukov/go-fuzz/go-fuzz github.com/dvyukov/go-fuzz/go-fuzz-build | |
8 | 8 | |
9 | 9 | cd $GOPATH/src/github.com/tdewolff/parse/tests/number
10 | 10 | |
11 | 11 | go-fuzz-build |
12 | go-fuzz | |
12 | go-fuzz -bin fuzz-fuzz.zip | |
13 | 13 | ``` |
14 | 14 | |
15 | 15 | If the restart rate is not close to `1/10000`, something is probably wrong. If the fuzzer stops finding new corpus for a while, restart it.
0 | module github.com/tdewolff/parse/tests/css-token | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20200318091601-be3528f3a813 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.4.3 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20200318091601-be3528f3a813 h1:NgO45/5mBLRVfiXerEFzH6ikcZ7DNRPS639xFg3ENzU= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20200318091601-be3528f3a813/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.4.3 h1:k24zHgTRGm7LkvbTEreuavyZTf0k8a/lIenggv62OiU= | |
3 | github.com/tdewolff/parse/v2 v2.4.3/go.mod h1:WzaJpRSbwq++EIQHYIRTpbYKNA3gn9it1Ik++q4zyho= | |
4 | github.com/tdewolff/test v1.0.6 h1:76mzYJQ83Op284kMT+63iCNCI7NEERsIN8dLM+RiKr4= | |
5 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | module github.com/tdewolff/parse/tests/data-uri | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.3.10 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 h1:3WV5aRRj1ELP3RcLlBp/v0WJTuy47OQMkL9GIQq8QEE= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.3.10 h1:ipN/RjAeVaX7d3yQ+pKVeXKlTZOOEoWVQoC5UOrhPEY= | |
3 | github.com/tdewolff/parse/v2 v2.3.10/go.mod h1:pclWRpgD95an4pJvzjbp1A+bl6e7R9DemblveSm/Zo4= | |
4 | github.com/tdewolff/test v1.0.4 h1:ih38SXuQJ32Hng5EtSW32xqEsVeMnPp6nNNRPhBBDE8= | |
5 | github.com/tdewolff/test v1.0.4/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= | |
6 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | module github.com/tdewolff/parse/tests/dimension | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.3.10 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 h1:3WV5aRRj1ELP3RcLlBp/v0WJTuy47OQMkL9GIQq8QEE= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.3.10 h1:ipN/RjAeVaX7d3yQ+pKVeXKlTZOOEoWVQoC5UOrhPEY= | |
3 | github.com/tdewolff/parse/v2 v2.3.10/go.mod h1:pclWRpgD95an4pJvzjbp1A+bl6e7R9DemblveSm/Zo4= | |
4 | github.com/tdewolff/test v1.0.4 h1:ih38SXuQJ32Hng5EtSW32xqEsVeMnPp6nNNRPhBBDE8= | |
5 | github.com/tdewolff/test v1.0.4/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= | |
6 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | module github.com/tdewolff/parse/tests/mediatype | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.3.10 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 h1:3WV5aRRj1ELP3RcLlBp/v0WJTuy47OQMkL9GIQq8QEE= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.3.10 h1:ipN/RjAeVaX7d3yQ+pKVeXKlTZOOEoWVQoC5UOrhPEY= | |
3 | github.com/tdewolff/parse/v2 v2.3.10/go.mod h1:pclWRpgD95an4pJvzjbp1A+bl6e7R9DemblveSm/Zo4= | |
4 | github.com/tdewolff/test v1.0.4 h1:ih38SXuQJ32Hng5EtSW32xqEsVeMnPp6nNNRPhBBDE8= | |
5 | github.com/tdewolff/test v1.0.4/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= | |
6 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | module github.com/tdewolff/parse/tests/number | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.3.10 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 h1:3WV5aRRj1ELP3RcLlBp/v0WJTuy47OQMkL9GIQq8QEE= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.3.10 h1:ipN/RjAeVaX7d3yQ+pKVeXKlTZOOEoWVQoC5UOrhPEY= | |
3 | github.com/tdewolff/parse/v2 v2.3.10/go.mod h1:pclWRpgD95an4pJvzjbp1A+bl6e7R9DemblveSm/Zo4= | |
4 | github.com/tdewolff/test v1.0.4 h1:ih38SXuQJ32Hng5EtSW32xqEsVeMnPp6nNNRPhBBDE8= | |
5 | github.com/tdewolff/test v1.0.4/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= | |
6 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | module github.com/tdewolff/parse/tests/replace-entities | |
1 | ||
2 | go 1.13 | |
3 | ||
4 | replace github.com/tdewolff/parse/v2 => ../../../parse | |
5 | ||
6 | require ( | |
7 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 // indirect | |
8 | github.com/tdewolff/parse/v2 v2.3.10 | |
9 | ) |
0 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681 h1:3WV5aRRj1ELP3RcLlBp/v0WJTuy47OQMkL9GIQq8QEE= | |
1 | github.com/dvyukov/go-fuzz v0.0.0-20191022152526-8cb203812681/go.mod h1:11Gm+ccJnvAhCNLlf5+cS9KjtbaD5I5zaZpFMsTHWTw= | |
2 | github.com/tdewolff/parse/v2 v2.3.10 h1:ipN/RjAeVaX7d3yQ+pKVeXKlTZOOEoWVQoC5UOrhPEY= | |
3 | github.com/tdewolff/parse/v2 v2.3.10/go.mod h1:pclWRpgD95an4pJvzjbp1A+bl6e7R9DemblveSm/Zo4= | |
4 | github.com/tdewolff/test v1.0.4 h1:ih38SXuQJ32Hng5EtSW32xqEsVeMnPp6nNNRPhBBDE8= | |
5 | github.com/tdewolff/test v1.0.4/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= | |
6 | github.com/tdewolff/test v1.0.6/go.mod h1:6DAvZliBAAnD7rhVgwaM7DE5/d9NMOAJ09SqYqeK4QE= |
0 | # XML [![GoDoc](http://godoc.org/github.com/tdewolff/parse/xml?status.svg)](http://godoc.org/github.com/tdewolff/parse/xml) | |
0 | # XML [![API reference](https://img.shields.io/badge/godoc-reference-5272B4)](https://pkg.go.dev/github.com/tdewolff/parse/v2/xml?tab=doc) | |
1 | 1 | |
2 | 2 | This package is an XML lexer written in [Go][1]. It follows the specification at [Extensible Markup Language (XML) 1.0 (Fifth Edition)](http://www.w3.org/TR/REC-xml/). The lexer takes an `io.Reader` and converts it into tokens until EOF.
3 | 3 | 