Question
The tokens are passed through a Lucene ____________ to produce NGrams of the desired length.
Answer: Option B
Explanation: The tools in which the collocation identification algorithm is embedded either consume tokenized text as input or allow you to specify an implementation of the Lucene Analyzer class to perform tokenization, in order to form the NGrams.
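To make the idea concrete, here is a minimal conceptual sketch (in plain Python, not the Lucene API) of what an n-gram pass over a token stream does: after the Analyzer has tokenized the text, each run of n consecutive tokens becomes one n-gram. The function name `ngrams` is illustrative, not a Lucene identifier.

```python
def ngrams(tokens, n):
    """Slide a window of size n over the token stream and join each
    window into a single n-gram, analogous to an n-gram token filter."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = ["the", "best", "of", "times"]
print(ngrams(tokens, 2))  # bigrams: ['the best', 'best of', 'of times']
```

In a real Lucene pipeline, the Analyzer produces the token stream and a filter stage emits the n-grams; this sketch only mirrors that windowing step.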