tokenization
English
Noun
tokenization (countable and uncountable, plural tokenizations)
- The act or process of tokenizing.
- Something tokenized.
  *This was an unlikely tokenization of the input string.*
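
In the computing sense, a tokenization of an input string is the result of splitting it into tokens. The sketch below is illustrative only and assumes a simple whitespace-and-punctuation rule; the `tokenize` helper and its regular expression are not a standard definition, and real tokenizers may use other rules.

```python
import re

def tokenize(text):
    """Return a list of word tokens extracted from `text`.

    A minimal sketch: tokens are taken to be maximal runs of
    word characters, so punctuation and whitespace are dropped.
    """
    return re.findall(r"\w+", text)

print(tokenize("The cat sat on the mat."))
# ['The', 'cat', 'sat', 'on', 'the', 'mat']
```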