Exploration of CNN-based and RNN-based neural networks for Text Classification

DC Field / Value

dc.contributor.advisor: Tae-Sun Chung
dc.contributor.author: SUN JING
dc.date.accessioned: 2018-11-08T08:27:54Z
dc.date.available: 2018-11-08T08:27:54Z
dc.date.issued: 2018-08
dc.identifier.other: 28113
dc.identifier.uri: https://dspace.ajou.ac.kr/handle/2018.oak/14008
dc.description: Thesis (Master's) -- Graduate School, Ajou University: Department of Computer Engineering, 2018. 8
dc.description.tableofcontents:
CHAPTER 1. Introduction 1
CHAPTER 2. Related works 5
2.1 Traditional approaches for text classification 5
2.1.1 Naive Bayes Classifier 5
2.1.2 Support Vector Machines 6
2.1.3 K-Nearest Neighbor 7
2.2 Convolutional neural networks for text classification 7
2.3 Recurrent neural networks for text classification 9
CHAPTER 3. Model Architectures 12
3.1 Gated Convolutional Neural Networks 12
3.1.1 Convolutional Neural Networks for Text Classification 12
3.1.2 Residual Block in Neural Networks 14
3.1.3 Batch Normalization 15
3.1.4 Gated Convolutional Neural Networks 16
3.2 Hierarchical Attention Recurrent Neural Network 20
3.2.1 Recurrent Neural Network 20
3.2.2 Long Short-Term Memory 21
3.2.3 Gated Recurrent Unit 23
3.2.4 Bi-directional RNNs and Deep (Bi-directional) RNNs 26
3.2.5 Attention Mechanism 28
3.2.6 Hierarchical Attention Recurrent Neural Network 31
CHAPTER 4. Experiments and Analysis 34
4.1 Datasets 34
4.2 Results and Analysis 35
4.2.1 Gated Convolutional Neural Networks 35
4.2.2 Hierarchical Attention Neural Networks 40
CHAPTER 5. Conclusion 43
REFERENCES 44
dc.language.iso: eng
dc.publisher: The Graduate School, Ajou University
dc.rights: Ajou University theses are protected by copyright.
dc.title: Exploration of CNN-based and RNN-based neural networks for Text Classification
dc.type: Thesis
dc.contributor.affiliation: Graduate School, Ajou University
dc.contributor.department: Department of Computer Engineering, Graduate School
dc.date.awarded: 2018. 8
dc.description.degree: Master
dc.identifier.localId: 887572
dc.identifier.uci: I804:41038-000000028113
dc.identifier.url: http://dcoll.ajou.ac.kr:9080/dcollection/common/orgView/000000028113
dc.subject.keyword: Natural language processing
dc.subject.keyword: Convolutional neural networks
dc.subject.keyword: Recurrent neural networks
dc.subject.keyword: Text classification
dc.description.alternativeAbstract: Natural language processing (NLP) is an important research field in artificial intelligence comprising linguistics, computer science, and mathematics. Text classification, the process of assigning a piece of text to one or more classes, is a fundamental task underlying many NLP applications. Nowadays, the most widely used approaches to text classification are based on neural networks, namely convolutional neural networks (CNN) and recurrent neural networks (RNN). A CNN is a class of deep, feed-forward neural networks that uses a variation of the multilayer perceptron designed to reduce computation. An RNN is designed to exploit sequential information: its current output is influenced not only by the current input but also by previous inputs. This thesis explores both CNN-based and RNN-based methods for text classification. For the CNN-based model, a gating mechanism is introduced to better capture the information in the input, and residual connections and batch normalization are used to gain further improvement. For the RNN-based method, we explore the different impacts of four advanced RNN units, namely unidirectional long short-term memory (LSTM), bidirectional LSTM, unidirectional gated recurrent unit (GRU), and bidirectional GRU, each combined with a hierarchical attention mechanism for text classification.
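The abstract names two architectural ideas: a gated convolutional block with residual connections and batch normalization, and a hierarchical attention mechanism applied over recurrent (LSTM/GRU) outputs. The sketch below is a minimal illustrative PyTorch rendering of those two ideas, not the thesis's actual code; all class names, channel sizes, and hidden dimensions are assumptions chosen for illustration.

import torch
import torch.nn as nn

# Illustrative sketch (not the thesis's implementation): a gated convolutional
# block combining the three ingredients named for the CNN-based model -- a
# gating mechanism, a residual connection, and batch normalization. The channel
# count and kernel size are assumed values.
class GatedConvBlock(nn.Module):
    def __init__(self, channels: int = 256, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2
        # Two parallel convolutions over the token sequence: one produces
        # candidate features, the other produces sigmoid gate values.
        self.conv = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.gate = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn = nn.BatchNorm1d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, sequence_length), e.g. embedded text transposed
        # to channels-first for Conv1d.
        gated = self.conv(x) * torch.sigmoid(self.gate(x))  # gating mechanism
        return self.bn(gated) + x  # batch normalization + residual connection

# Illustrative sketch of one attention layer of a hierarchical attention
# network: it scores every time step of a (bi)LSTM/GRU output sequence and
# returns their weighted sum. Applying it once over word-level states and
# again over sentence-level states yields the hierarchical structure described
# above. The hidden dimension is an assumed value.
class AttentionPooling(nn.Module):
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, steps, hidden_dim) -- recurrent outputs for words or sentences
        u = torch.tanh(self.proj(h))                   # (batch, steps, hidden_dim)
        alpha = torch.softmax(self.context(u), dim=1)  # attention weights over the steps
        return (alpha * h).sum(dim=1)                  # (batch, hidden_dim)

As a usage note, GatedConvBlock(256)(torch.randn(8, 256, 50)) returns a tensor of the same shape, while AttentionPooling(128)(torch.randn(8, 30, 128)) collapses the 30 time steps into a single 128-dimensional vector per example.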
Appears in Collections:
Graduate School of Ajou University > Department of Computer Engineering > 3. Theses(Master)

