Exploration of CNN-based and RNN-based neural networks for Text Classification
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Tae-Sun Chung | - |
dc.contributor.author | SUN JING | - |
dc.date.accessioned | 2018-11-08T08:27:54Z | - |
dc.date.available | 2018-11-08T08:27:54Z | - |
dc.date.issued | 2018-08 | - |
dc.identifier.other | 28113 | - |
dc.identifier.uri | https://dspace.ajou.ac.kr/handle/2018.oak/14008 | - |
dc.description | Thesis (Master's) -- Graduate School, Ajou University: Department of Computer Engineering, 2018. 8 | - |
dc.description.tableofcontents | CHAPTER 1. Introduction 1 CHAPTER 2. Related works 5 2.1 Traditional approaches for text classification 5 2.1.1 Naive Bayes Classifier 5 2.1.2 Support Vector Machines 6 2.1.3 K-Nearest Neighbor 7 2.2 Convolutional neural networks for text classification 7 2.3 Recurrent neural networks for text classification 9 CHAPTER 3. Model Architectures 12 3.1 Gated Convolutional Neural Networks 12 3.1.1 Convolutional Neural Networks for Text Classification 12 3.1.2 Residual Block in Neural Networks 14 3.1.3 Batch Normalization 15 3.1.4 Gated Convolutional Neural Networks 16 3.2 Hierarchical Attention Recurrent Neural Network 20 3.2.1 Recurrent Neural Network 20 3.2.2 Long Short-term Memory 21 3.2.3 Gated Recurrent Unit 23 3.2.4 Bi-directional RNNs and Deep (Bi-directional) RNNs 26 3.2.5 Attention Mechanism 28 3.2.6 Hierarchical Attention Recurrent Neural Network 31 CHAPTER 4. Experiments and Analysis 34 4.1 Datasets 34 4.2 Results and Analysis 35 4.2.1 Gated Convolutional Neural Networks 35 4.2.2 Hierarchical Attention Neural Networks 40 CHAPTER 5. Conclusion 43 REFERENCES 44 | - |
dc.language.iso | eng | - |
dc.publisher | The Graduate School, Ajou University | - |
dc.rights | Ajou University theses are protected by copyright. | - |
dc.title | Exploration of CNN-based and RNN-based neural networks for Text Classification | - |
dc.type | Thesis | - |
dc.contributor.affiliation | Graduate School, Ajou University | - |
dc.contributor.department | Graduate School, Department of Computer Engineering | - |
dc.date.awarded | 2018. 8 | - |
dc.description.degree | Master | - |
dc.identifier.localId | 887572 | - |
dc.identifier.uci | I804:41038-000000028113 | - |
dc.identifier.url | http://dcoll.ajou.ac.kr:9080/dcollection/common/orgView/000000028113 | - |
dc.subject.keyword | Natural language processing | - |
dc.subject.keyword | Convolutional neural networks | - |
dc.subject.keyword | Recurrent neural networks | - |
dc.subject.keyword | Text classification | - |
dc.description.alternativeAbstract | Natural language processing (NLP) is an important research field in artificial intelligence, comprising linguistics, computer science, and mathematics. Text classification, the process of assigning a piece of text to one or more classes, is a basic task underlying many NLP applications. Nowadays, the most widely used approaches for text classification are based on neural networks, namely convolutional neural networks (CNN) and recurrent neural networks (RNN). A CNN is a class of deep, feed-forward neural networks that uses a variation of multilayer perceptrons in order to minimize computation. An RNN is designed to exploit sequential information: its current output is influenced not only by the current input but also by previous inputs. This thesis explores both CNN-based and RNN-based methods for text classification. For the CNN-based model, a gating mechanism is introduced to better capture information from the input, and residual connections and batch normalization are also used to gain further improvement. For the RNN-based method, we explore the different impacts of four kinds of advanced RNN units combined with a hierarchical attention mechanism: unidirectional long short-term memory (LSTM), bidirectional LSTM, unidirectional gated recurrent unit (GRU), and bidirectional GRU. | - |
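The gating mechanism mentioned in the abstract is commonly realized as a gated linear unit (GLU), where one linear projection of the input is modulated element-wise by a sigmoid "gate" computed from a second projection. The following is a minimal NumPy sketch of that standard formulation, not the thesis's exact implementation; all shapes and variable names here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Numerically standard logistic function; outputs lie in (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def glu(X, W, b, V, c):
    # Gated linear unit: a linear projection (X @ W + b) is scaled
    # element-wise by a sigmoid gate over a second projection,
    # letting the network control how much of each feature passes.
    return (X @ W + b) * sigmoid(X @ V + c)

# Illustrative shapes: 4 token positions, embedding dim 8, output dim 16.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 16))
V = rng.standard_normal((8, 16))
b = np.zeros(16)
c = np.zeros(16)

H = glu(X, W, b, V, c)
print(H.shape)  # (4, 16)
```

Because the gate is strictly between 0 and 1, each output magnitude is bounded by the corresponding ungated linear activation; in a gated CNN the projections would be 1-D convolutions over the token sequence rather than the dense products used in this sketch.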
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.