Concept information

Preferred term

tokenisation  

Definition

  • process of breaking a stream of text up into words, phrases, symbols, or other meaningful elements called tokens
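The definition above can be illustrated with a minimal sketch in plain Python. It splits a stream of text into word and punctuation tokens using the standard `re` module; the particular regex is an illustrative choice, not part of this vocabulary entry:

```python
import re

def tokenise(text: str) -> list[str]:
    # Break a stream of text into meaningful elements (tokens):
    # runs of word characters, or single punctuation symbols.
    # A deliberately simple scheme for illustration only.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenise("Hello, world!"))
```

Real tokenisers (for lexical analysis or NLP pipelines) use richer rules, e.g. for contractions, hyphenation, or multi-word expressions.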

Broader concept

  • NLP

Source

  • Wikipedia contributors, "Tokenization (lexical analysis)," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Tokenization_(lexical_analysis)&oldid=637328355 (accessed February 28, 2016).

Notation

  • 48.01

URI

https://vocabs.acdh.oeaw.ac.at/dhataxonomy/Concept48.01

Created 10/1/15, last modified 6/20/16