Improving Task specific Word embedding

dc.contributor.advisor: 손경아
dc.contributor.author: 김민수
dc.date.accessioned: 2019-04-01T16:41:22Z
dc.date.available: 2019-04-01T16:41:22Z
dc.date.issued: 2019-02
dc.identifier.other: 28597
dc.identifier.uri: https://dspace.ajou.ac.kr/handle/2018.oak/15064
dc.description: Master's thesis -- Graduate School, Ajou University: Department of Computer Engineering, 2019. 2
dc.description.tableofcontents:
Chapter 1. Introduction
  Section 1
    1. Background
    2. Research contents and method
    3. Expected Effectiveness
    4. Components of paper
Chapter 2. The Word Embeddings
  Section 1. Context-based Word embedding
    1. Count-based Word embedding
    2. Word embedding using neural network
  Section 2. Bilingual Word embedding
  Section 3. Hierarchical word embedding
Chapter 3. Improving Hierarchical word embedding using semantic meaning of word
  1. Research Objective
  2. Method
  3. Evaluation
Chapter 4. Conclusion
Reference
dc.language.iso: eng
dc.publisher: The Graduate School, Ajou University
dc.rights: Ajou University theses are protected by copyright.
dc.title: Improving Task specific Word embedding
dc.title.alternative: Minsoo Kim
dc.type: Thesis
dc.contributor.affiliation: Graduate School, Ajou University
dc.contributor.alternativeName: Minsoo Kim
dc.contributor.department: Department of Computer Engineering, Graduate School
dc.date.awarded: 2019. 2
dc.description.degree: Master
dc.identifier.localId: 905265
dc.identifier.uci: I804:41038-000000028597
dc.identifier.url: http://dcoll.ajou.ac.kr:9080/dcollection/common/orgView/000000028597
dc.description.alternativeAbstract:
Using neural network models, it has become possible to represent words as vectors of a fixed dimension. Existing word-embedding models focus on capturing the semantic and syntactic features of words from textual data such as books and news articles. GloVe [5], Word2Vec [6], and CBOW [8] are representative word-embedding models, and more recent models such as fastText [7] further improve on them. Relational data such as graphs and networks also play an important role in artificial intelligence, and network- and graph-embedding models such as latent space embeddings [9], DeepWalk [4], and Node2Vec [3] have likewise been studied widely and applied in many settings. Although text embedding has become the standard way to represent the role of a unit of text in context, text-mining problems were previously solved with pre-neural machine learning models or with large lexical databases such as WordNet [14][15]. Recently there have been efforts to embed such databases with graph-embedding techniques: although this type of data does not match the expression intent of word embedding, the value of an existing lexical database should not be wasted. In particular, databases that store word semantics in a hierarchical form have been represented through graph-embedding techniques such as DeepWalk [4] and Node2Vec [3], and Poincare embedding has recently shown that the hierarchical relationships of words can be expressed with sufficient performance even in low-dimensional vectors. In this thesis, we embed the hierarchical word data of WordNet [14] using Poincare embedding, and we add edges reflecting sibling relations between words to the WordNet hierarchy so that the representation corresponds more closely to the semantic meaning of each word.
Converging the model to the ideal objective function proved difficult, but it can be improved through adjustments to the data structure and through more complex neural network models. Through this study, we expect that previously unresolved natural language processing problems can be solved via vector representations of the hierarchical structure and sibling relations of words.
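The augmentation the abstract describes can be sketched in pure Python. The toy hierarchy, function names, and the particular sibling-edge construction below are illustrative assumptions rather than the thesis's actual code or data; only the distance function follows the standard Poincare-embedding formula, d(u, v) = arcosh(1 + 2||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2))).

```python
import math

# Hypothetical toy fragment of a hypernym hierarchy (parent -> children);
# a stand-in for WordNet data, not the thesis's dataset.
hierarchy = {
    "animal": ["dog", "cat", "bird"],
    "dog": ["poodle", "beagle"],
}

def hierarchy_edges(tree):
    """Parent-child edges, as used by plain Poincare embedding."""
    return [(p, c) for p, children in tree.items() for c in children]

def sibling_edges(tree):
    """Extra edges connecting children of the same parent -- one possible
    reading of the sibling-relation augmentation in the abstract."""
    edges = []
    for children in tree.values():
        for i in range(len(children)):
            for j in range(i + 1, len(children)):
                edges.append((children[i], children[j]))
    return edges

def poincare_distance(u, v):
    """Distance between two points inside the unit Poincare ball."""
    diff2 = sum((a - b) ** 2 for a, b in zip(u, v))
    nu2 = sum(a * a for a in u)  # squared norm of u, must be < 1
    nv2 = sum(b * b for b in v)  # squared norm of v, must be < 1
    return math.acosh(1.0 + 2.0 * diff2 / ((1.0 - nu2) * (1.0 - nv2)))
```

On this toy tree, the plain hierarchy yields five edges, while the augmentation adds four sibling edges such as ("cat", "bird"); in an actual training run the combined edge list would be fed to a Poincare-embedding optimizer (for example, a burn-in phase followed by Riemannian SGD, as in the original Poincare-embedding work).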
dc.title.subtitle: Improving Hierarchical Word embedding using semantic of word
Appears in Collections:
Graduate School of Ajou University > Department of Computer Engineering > 3. Theses(Master)
Files in This Item:
There are no files associated with this item.

