Word2vec is a word embedding method that learns characteristics of words from very large collections of documents. Word embedding is a technique that converts words into numerical vectors representing their meaning, usage, or context. Word2vec learns a dense, continuous vector for each word from the contexts in which that word appears in a large text corpus. The resulting vectors capture semantic and syntactic similarities and relationships among words, such as synonyms, antonyms, analogies, and associations.
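The idea of learning a vector per word from its surrounding context can be sketched with a minimal skip-gram model. This is an illustrative toy, not a reference implementation: the corpus, embedding dimension, window size, learning rate, and epoch count below are all arbitrary assumptions, and a full softmax is used instead of the negative-sampling or hierarchical-softmax tricks real Word2vec implementations rely on for speed.

```python
import numpy as np

# Toy corpus and vocabulary (assumed for illustration only).
corpus = "the quick brown fox jumps over the lazy dog the quick fox".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (target word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context word) embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Skip-gram training: for each word, predict the words in its window.
for epoch in range(50):
    for pos, word in enumerate(corpus):
        t = idx[word]
        for off in range(-window, window + 1):
            c_pos = pos + off
            if off == 0 or c_pos < 0 or c_pos >= len(corpus):
                continue
            c = idx[corpus[c_pos]]
            # Forward pass: score every vocabulary word against the target.
            probs = softmax(W_out @ W_in[t])
            # Cross-entropy gradient: subtract 1 at the true context word.
            probs[c] -= 1.0
            grad_in = W_out.T @ probs          # gradient w.r.t. W_in[t]
            W_out -= lr * np.outer(probs, W_in[t])
            W_in[t] -= lr * grad_in

def cosine(a, b):
    """Cosine similarity, the usual way Word2vec vectors are compared."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Each word now has a dense D-dimensional vector in W_in.
sim = cosine(W_in[idx["quick"]], W_in[idx["fox"]])
```

After training, each row of `W_in` is the dense representation the paragraph describes, and similarity between words reduces to cosine similarity between their rows.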