Journal: | Computación y Sistemas |
Database: | |
System number: | 000560699 |
ISSN: | 1405-5546 |
Authors: | Uribe, Diego¹; Cuan, Enrique¹ |
Institutions: | ¹Instituto Tecnológico de la Laguna, Coahuila, Mexico |
Year: | 2022 |
Period: | Apr-Jun |
Volume: | 26 |
Issue: | 2 |
Pages: | 921-938 |
Country: | Mexico |
Language: | English |
English abstract: | Typical deep learning models defined in terms of multiple layers rest on the assumption that a hierarchical model yields a better representation than a shallow one. Nevertheless, increasing the depth of the model by adding layers can cause the model to get lost or stuck during optimization. This paper investigates the impact of linguistic complexity characteristics of text on a deep learning model defined in terms of a stacked architecture. Since the optimal number of stacked recurrent neural layers is specific to each application, we examine the optimal number of stacked recurrent layers corresponding to each linguistic characteristic. Finally, we also analyze the computational cost incurred by increasing the depth of a stacked recurrent architecture implemented for a linguistic characteristic. |
Disciplines: | Computer science |
Keywords (Spanish): | Inteligencia artificial |
Keyword: | Artificial intelligence |
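The abstract's central idea, stacking recurrent layers so that each layer consumes the hidden-state sequence emitted by the layer below, can be illustrated with a minimal sketch. This is a hedged, pure-Python toy with randomly initialized weights, not the authors' implementation; the names `rnn_layer` and `stacked_rnn` and all dimensions are illustrative assumptions.

```python
import math
import random

def rnn_layer(inputs, hidden_size, seed):
    """One vanilla tanh RNN layer applied over a sequence of vectors."""
    rng = random.Random(seed)
    in_size = len(inputs[0])
    # Randomly initialized weights (illustration only; no training happens here).
    W_in = [[rng.uniform(-0.1, 0.1) for _ in range(in_size)]
            for _ in range(hidden_size)]
    W_h = [[rng.uniform(-0.1, 0.1) for _ in range(hidden_size)]
           for _ in range(hidden_size)]
    h = [0.0] * hidden_size
    outputs = []
    for x in inputs:
        # h_t = tanh(W_in @ x_t + W_h @ h_{t-1})
        h = [math.tanh(sum(W_in[i][j] * x[j] for j in range(in_size)) +
                       sum(W_h[i][j] * h[j] for j in range(hidden_size)))
             for i in range(hidden_size)]
        outputs.append(h)
    return outputs

def stacked_rnn(inputs, hidden_size, depth):
    """Stack `depth` recurrent layers: layer k reads the full hidden-state
    sequence produced by layer k-1, which is the architecture whose optimal
    depth the paper studies per linguistic characteristic."""
    seq = inputs
    for layer in range(depth):
        seq = rnn_layer(seq, hidden_size, seed=layer)
    return seq

# Toy sequence: 5 time steps of 3-dimensional input vectors.
seq = [[0.1 * t, 0.2, -0.1 * t] for t in range(5)]
out = stacked_rnn(seq, hidden_size=4, depth=3)
print(len(out), len(out[0]))  # 5 time steps, 4-dimensional hidden states
```

Varying `depth` here mirrors the paper's experiment of searching for the optimal number of stacked layers; the sketch also makes the abstract's cost remark concrete, since each added layer repeats a full pass over the sequence.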