Journal: | Computación y Sistemas |
Database: | PERIÓDICA |
System number: | 000423291 |
ISSN: | 1405-5546 |
Authors: | Zhou, Yujun1; Xu, Jiaming1; Cao, Jie1; Xu, Bo1; Li, Changliang1; Xu, Bo1 |
Institutions: | 1 Institute of Automation, Beijing, China; 2 University of Chinese Academy of Sciences, Beijing, China; 3 Jiangsu Jinling Science and Technology Group Co., Ltd, Nanjing, Jiangsu, China |
Year: | 2017 |
Period: | Oct-Dec |
Volume: | 21 |
Issue: | 4 |
Country: | Mexico |
Language: | English |
Document type: | Article |
Approach: | Applied, descriptive |
English abstract: | To improve the classification performance for Chinese short text with automatic semantic feature selection, in this paper we propose Hybrid Attention Networks (HANs), which combine word- and character-level selective attention. The model first applies an RNN and a CNN to extract the semantic features of the texts. It then captures class-related attentive representations from the word- and character-level features. Finally, all of the features are concatenated and fed into the output layer for classification. Experimental results on 32-class and 5-class datasets show that our model outperforms multiple baselines by combining not only the word- and character-level features of the texts but also class-related semantic features obtained through the attention mechanism. |
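The abstract outlines the HAN pipeline: RNN- and CNN-based feature extraction at the word and character levels, an attention step that pools each level into a class-related representation, and concatenation of all features before the output layer. Below is a minimal Python (PyTorch) sketch of such a pipeline; the BiGRU/CNN layer sizes, the additive attention form, and the max-pooling step are illustrative assumptions, not the authors' exact configuration.

# Minimal sketch of a word/character hybrid attention classifier (PyTorch).
# Sizes and the additive attention form are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Attention(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.query = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                # h: (batch, seq_len, dim)
        scores = self.query(torch.tanh(self.proj(h)))    # (batch, seq_len, 1)
        weights = F.softmax(scores, dim=1)               # attention weights over time
        return (weights * h).sum(dim=1)                  # (batch, dim)

class HybridAttentionNet(nn.Module):
    def __init__(self, word_vocab, char_vocab, num_classes,
                 emb_dim=128, rnn_dim=128, cnn_filters=128):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, emb_dim)
        self.char_emb = nn.Embedding(char_vocab, emb_dim)
        # RNN (BiGRU) and CNN feature extractors, one pair per level
        self.word_rnn = nn.GRU(emb_dim, rnn_dim, batch_first=True, bidirectional=True)
        self.char_rnn = nn.GRU(emb_dim, rnn_dim, batch_first=True, bidirectional=True)
        self.word_cnn = nn.Conv1d(emb_dim, cnn_filters, kernel_size=3, padding=1)
        self.char_cnn = nn.Conv1d(emb_dim, cnn_filters, kernel_size=3, padding=1)
        # Word- and character-level attention over the recurrent states
        self.word_attn = Attention(2 * rnn_dim)
        self.char_attn = Attention(2 * rnn_dim)
        self.out = nn.Linear(2 * (2 * rnn_dim) + 2 * cnn_filters, num_classes)

    def _level(self, ids, emb, rnn, cnn, attn):
        x = emb(ids)                                     # (batch, seq, emb_dim)
        h, _ = rnn(x)                                    # (batch, seq, 2*rnn_dim)
        attended = attn(h)                               # attentive representation
        conv = F.relu(cnn(x.transpose(1, 2)))            # (batch, filters, seq)
        pooled = conv.max(dim=2).values                  # global max pooling
        return attended, pooled

    def forward(self, word_ids, char_ids):
        w_att, w_cnn = self._level(word_ids, self.word_emb, self.word_rnn,
                                   self.word_cnn, self.word_attn)
        c_att, c_cnn = self._level(char_ids, self.char_emb, self.char_rnn,
                                   self.char_cnn, self.char_attn)
        # Concatenate all word- and character-level features before classification
        features = torch.cat([w_att, w_cnn, c_att, c_cnn], dim=1)
        return self.out(features)

# Usage: logits for a batch of 2 texts on a hypothetical 32-class task
model = HybridAttentionNet(word_vocab=5000, char_vocab=3000, num_classes=32)
logits = model(torch.randint(0, 5000, (2, 20)), torch.randint(0, 3000, (2, 40)))
print(logits.shape)  # torch.Size([2, 32])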
Disciplines: | Computer science, Literature and linguistics |
Keywords (Spanish): | Lingüística aplicada, Clasificación de textos, Redes neuronales recurrentes, Red neuronal convolucional |
Keywords (English): | Applied linguistics, Text classification, Convolutional neural network, Recurrent neural network |
Full text: | Full text (HTML); Full text (PDF) |