Skip-gram
The Skip-gram model is a technique used in natural language processing to learn word embeddings. It works by predicting the words that appear within a fixed-size context window around a target word in a sentence. For example, given the target word "cat," the model learns to predict words like "furry," "meow," or "pet" that often appear nearby.
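A minimal sketch of how Skip-gram training pairs are formed: each target word is paired with every word inside its context window, and the model is trained to predict those context words. The window size of 2 and the toy sentence below are illustrative assumptions, not fixed parts of the model.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (target, context) pairs for every word in the sentence."""
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the target word itself
                yield target, tokens[j]

sentence = "the furry cat sat on the mat".split()
for target, context in skipgram_pairs(sentence, window=2):
    print(target, "->", context)
```

Each printed pair is one training example: the model sees the target word as input and is asked to assign high probability to the context word.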
This approach is part of the Word2Vec framework developed at Google. By training on large text corpora, the Skip-gram model captures semantic relationships between words, producing vector representations that reflect their meanings. These embeddings are useful in various applications, such as text classification and sentiment analysis.
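The sketch below shows how Skip-gram embeddings might be trained in practice, assuming the gensim library is available (sg=1 selects the Skip-gram variant rather than CBOW). The tiny corpus and the parameter values are illustrative; real applications train on much larger text collections.

```python
from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens.
corpus = [
    "the furry cat sat on the mat".split(),
    "my pet cat likes to meow".split(),
    "the dog chased the cat".split(),
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = Skip-gram, 0 = CBOW
    epochs=50,
)

# The learned vector for "cat" and its nearest neighbours in this corpus.
print(model.wv["cat"][:5])
print(model.wv.most_similar("cat", topn=3))
```

Downstream tasks then reuse these vectors, for instance as input features to a text classifier or sentiment model.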