HTML API reference

This package is an HTML5 lexer written in Go. It follows the HTML syntax specification (WHATWG HTML standard). The lexer takes an io.Reader and converts it into tokens until EOF.

Installation

Run the following command

go get -u github.com/tdewolff/parse/v2/html

or add the following import and run your project with go get:

import "github.com/tdewolff/parse/v2/html"

Lexer

Usage

The following initializes a new Lexer with io.Reader r:

l := html.NewLexer(parse.NewInput(r))

To tokenize until EOF or an error occurs, use:

for {
	tt, data := l.Next()
	switch tt {
	case html.ErrorToken:
		// error or EOF set in l.Err()
		return
	case html.StartTagToken:
		// ...
		for {
			ttAttr, dataAttr := l.Next()
			if ttAttr != html.AttributeToken {
				break
			}
			// ...
		}
	// ...
	}
}

All tokens:

ErrorToken TokenType = iota // extra token when errors occur
CommentToken
DoctypeToken
StartTagToken
StartTagCloseToken
StartTagVoidToken
EndTagToken
AttributeToken
TextToken

Examples

package main

import (
	"fmt"
	"io"
	"os"

	"github.com/tdewolff/parse/v2"
	"github.com/tdewolff/parse/v2/html"
)

// Tokenize HTML from stdin.
func main() {
	l := html.NewLexer(parse.NewInput(os.Stdin))
	for {
		tt, data := l.Next()
		switch tt {
		case html.ErrorToken:
			if l.Err() != io.EOF {
				fmt.Println("Error on line", l.Line(), ":", l.Err())
			}
			return
		case html.StartTagToken:
			fmt.Println("Tag", string(data))
			for {
				ttAttr, dataAttr := l.Next()
				if ttAttr != html.AttributeToken {
					break
				}

				key := dataAttr
				val := l.AttrVal()
				fmt.Println("Attribute", string(key), "=", string(val))
			}
		// ...
		}
	}
}

License

Released under the MIT license.