What does a tri-gram model look at to predict the next word?


Multiple Choice

Explanation:
A tri-gram model predicts the next word from the two words immediately before it. The assumption is that the upcoming word depends only on a short, fixed window of context, so the model estimates P(w_n | w_{n-2}, w_{n-1}). This is why the correct choice is the previous two words. Looking at only the previous word would make it a bigram model, while predicting two future words or conditioning on all preceding words falls outside the fixed-context setup that defines n-gram models. In practice, this two-word context captures common short-range patterns, but it misses longer-range dependencies and suffers from data sparsity unless smoothing is applied.
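To make the idea concrete, here is a minimal sketch of a count-based tri-gram model in Python. The function names and toy corpus are illustrative, not from any particular library; a real system would add smoothing or backoff for unseen contexts.

```python
from collections import Counter, defaultdict

def train_trigram(tokens):
    """Count how often each word follows each (w1, w2) context."""
    counts = defaultdict(Counter)
    for w1, w2, w3 in zip(tokens, tokens[1:], tokens[2:]):
        counts[(w1, w2)][w3] += 1
    return counts

def predict_next(counts, w1, w2):
    """Return the most frequent word seen after (w1, w2)."""
    following = counts.get((w1, w2))
    if not following:
        return None  # unseen context: real models back off or smooth here
    return following.most_common(1)[0][0]

# Toy corpus: "the cat sat" occurs more often than "the cat ran"
corpus = "the cat sat on the mat the cat sat down the cat ran".split()
model = train_trigram(corpus)
print(predict_next(model, "the", "cat"))  # → sat
```

Note that the prediction conditions on exactly two preceding words, which is the defining property of the tri-gram model discussed above.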
