Tokenizer
Provides a consistent `Transform` interface to tokenizers operating on `DataFrame`s and folders.

`Tokenizer(tok, rules=None, counter=None, lengths=None, mode=None, sep=" ")`
| Parameter | Default | Description |
|-----------|---------|-------------|
| `tok`     |         | tokenizer   |
| `rules`   | `None`  | rules       |
| `counter` | `None`  | counter     |
| `lengths` | `None`  | lengths     |
| `mode`    | `None`  | mode        |
| `sep`     | `" "`   | separator   |
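As a rough illustration of the interface described above, the sketch below shows a Tokenizer-like wrapper that applies preprocessing `rules` before a base tokenizer and joins tokens back with `sep` when decoding. This is a hypothetical, self-contained sketch, not the library's actual implementation; the class names, `encodes`, and `decodes` here are illustrative assumptions.

```python
class SimpleTokenizer:
    """Minimal base tokenizer: splits each text on whitespace."""
    def __call__(self, texts):
        for t in texts:
            yield t.split()


class Tokenizer:
    """Hypothetical sketch: applies `rules` (str -> str functions),
    then the wrapped tokenizer `tok`; `sep` joins tokens on decode."""
    def __init__(self, tok, rules=None, sep=" "):
        self.tok = tok
        self.rules = rules or []
        self.sep = sep

    def encodes(self, text):
        # Apply each preprocessing rule in order, then tokenize.
        for rule in self.rules:
            text = rule(text)
        return next(iter(self.tok([text])))

    def decodes(self, tokens):
        # Reverse the transform by joining tokens with the separator.
        return self.sep.join(tokens)


tkz = Tokenizer(SimpleTokenizer(), rules=[str.lower])
toks = tkz.encodes("Hello World")   # -> ['hello', 'world']
print(tkz.decodes(toks))            # -> hello world
```

In this sketch, `rules` run before tokenization so that the base tokenizer only ever sees normalized text, mirroring the preprocessing-then-tokenize order the signature suggests.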