A Word Embedding based Approach for Word Sense Disambiguation
Abstract
Word Sense Disambiguation (WSD) is a core task in natural language processing (NLP), and it is usually addressed with supervised learning methods. In this paper, we present a new supervised approach to WSD based on word embeddings, which represent words or concepts in a low-dimensional continuous space and can capture semantic information from massive amounts of text. We show how such representations of words and senses can be effectively applied to WSD, as they encode rich semantic information.
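As an illustrative sketch of the underlying idea (not the paper's actual method), a sense can be selected by comparing a context vector against per-sense embeddings via cosine similarity; the sense names, vectors, and dimensionality below are hypothetical toy values.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional "sense embeddings"; real embeddings have hundreds of
# dimensions and are learned from large corpora.
sense_embeddings = {
    "bank/finance": [0.9, 0.1, 0.0],
    "bank/river":   [0.1, 0.8, 0.3],
}
# A context vector, e.g. the average of the embeddings of surrounding words.
context_vector = [0.85, 0.15, 0.05]

# Pick the sense whose embedding is closest to the context in cosine terms.
best_sense = max(sense_embeddings,
                 key=lambda s: cosine(context_vector, sense_embeddings[s]))
print(best_sense)  # prints "bank/finance"
```

In practice the context vector and the sense representations would come from embeddings trained on large text collections, which is what lets this simple geometric comparison reflect semantic relatedness.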