Authors: SAVAŞ YILDIRIM
Abstract: Corpus-driven approaches can automatically discover is-a relations between word pairs in a corpus. This problem is also called hypernym extraction. Formerly, lexico-syntactic patterns were used to extract hypernym relations; language-specific syntactic rules were manually crafted to build the patterns. More recent studies have applied distributional approaches to word semantics, extracting semantic relations based on the idea that similar words share similar contexts. Early distributional approaches used one-hot bag-of-words (BOW) encoding. The dimensionality problem of BOW has been addressed by various neural network approaches, which represent words as short, dense vectors, or word embeddings. In this study, we used the word embedding representation and employed an optimized projection algorithm to solve the hypernym problem. The supervised architecture learns a mapping function so that the embeddings (or vectors) of word pairs that are in a hypernym relation can be projected to each other. In the training phase, the architecture first learns the embeddings of words and then the projection function from a list of word pairs. In the test phase, the projection function maps the embedding of a given word to the point closest to its hypernym. We utilized deep learning optimization methods to optimize the model and improved performance by tuning hyperparameters. We discussed our results based on many cross-validation experiments. We also addressed a problem-specific loss function, monitored hyperparameters, and evaluated the results with respect to different settings. Finally, we showed that our approach outperforms baseline functions and previous studies for the Turkish language.
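For illustration, the sketch below captures the projection idea in minimal form: given pre-trained embeddings, fit a map W so that a hyponym's vector, once projected, lands near its hypernym's vector, then answer test queries by nearest-neighbor search over the vocabulary. This is a simplified stand-in, assuming toy embeddings and a plain linear map fitted by least squares; the paper instead trains the projection with deep learning optimization methods and a problem-specific loss, so all names and data below are hypothetical.

```python
# Minimal sketch of projection learning for hypernym discovery.
# Hypothetical vocabulary and random stand-in embeddings; in practice the
# embeddings come from a model such as word2vec trained on a corpus.
import numpy as np

rng = np.random.default_rng(0)

dim = 50
vocab = ["dog", "cat", "animal", "apple", "fruit", "car", "vehicle"]
emb = {w: rng.standard_normal(dim) for w in vocab}

# Training pairs of (hyponym, hypernym).
train_pairs = [("dog", "animal"), ("cat", "animal"), ("apple", "fruit")]

X = np.stack([emb[hypo] for hypo, _ in train_pairs])   # hyponym vectors
Y = np.stack([emb[hyper] for _, hyper in train_pairs])  # hypernym vectors

# Fit a linear map W minimizing ||XW - Y||^2 (closed-form least squares here;
# the paper optimizes the projection with deep-learning methods instead).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict_hypernym(word: str) -> str:
    """Project the word's embedding and return the nearest other word."""
    projected = emb[word] @ W

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    candidates = [w for w in vocab if w != word]
    return max(candidates, key=lambda w: cos(projected, emb[w]))

# With real embeddings, the projected vector for "car" should land near
# "vehicle"; with these random toy vectors the output is not meaningful.
print(predict_hypernym("car"))
```

In the test phase this corresponds to the paper's setup: the projection maps a query word's embedding to a point in the embedding space, and the hypernym is retrieved as the closest word to that point.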
Keywords: Projection learning, word embeddings, hypernym relation, deep learning