
Supervised level-wise pretraining for sequential data classification

Abstract: Recurrent Neural Networks (RNNs) can be seriously affected by their initial parameter assignment, which may result in poor generalization performance on new, unseen data. To tackle this crucial issue in the context of RNN-based classification, we propose a new supervised layer-wise pretraining strategy to initialize network parameters. The proposed approach leverages a data-aware strategy that sets up a taxonomy of classification problems automatically derived from the model's behavior. To the best of our knowledge, despite the great interest in RNN-based classification, this is the first data-aware strategy dealing with the initialization of such models. The proposed strategy has been tested on five benchmarks from three different domains, i.e., Text Classification, Speech Recognition and Remote Sensing. Results underline the benefit of our approach and point out that data-aware strategies positively support the initialization of RNN-based classification models.
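
As a concrete illustration of the general idea (not the authors' exact procedure), the sketch below shows a generic supervised level-wise pretraining loop for a stacked GRU classifier in PyTorch. The StackedGRUClassifier model, the pretrain_levelwise helper, and all hyper-parameters are hypothetical choices made for this example; in particular, the paper's data-aware taxonomy of classification problems is not reproduced here.

import torch
import torch.nn as nn


class StackedGRUClassifier(nn.Module):
    """Stacked GRU whose levels can be trained one at a time."""

    def __init__(self, input_dim, hidden_dim, num_classes, num_levels):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.GRU(input_dim if i == 0 else hidden_dim, hidden_dim,
                   batch_first=True)
            for i in range(num_levels)
        )
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, depth=None):
        # Run the sequence through the first `depth` levels (all by default)
        # and classify from the last hidden state.
        depth = len(self.layers) if depth is None else depth
        for gru in self.layers[:depth]:
            x, h = gru(x)
        return self.head(h[-1])


def pretrain_levelwise(model, loader, epochs_per_level=2, lr=1e-3):
    # Supervised pretraining: optimize level k (plus the shared classification
    # head) against the true labels, with earlier levels frozen, before
    # moving on to level k + 1.
    criterion = nn.CrossEntropyLoss()
    for depth in range(1, len(model.layers) + 1):
        for i, gru in enumerate(model.layers):
            gru.requires_grad_(i == depth - 1)   # train only the new level
        params = (list(model.layers[depth - 1].parameters())
                  + list(model.head.parameters()))
        opt = torch.optim.Adam(params, lr=lr)
        for _ in range(epochs_per_level):
            for x, y in loader:                  # x: (batch, seq, features)
                opt.zero_grad()
                loss = criterion(model(x, depth=depth), y)
                loss.backward()
                opt.step()

After such a level-wise pass, the resulting parameters would serve as the initialization for standard end-to-end fine-tuning of the full network.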
Document type: Conference paper

https://hal.inrae.fr/hal-03031531
Submitted by: Isabelle Nault
Submitted on: Monday, November 30, 2020 - 3:04:28 PM
Last modified on: Monday, December 7, 2020 - 2:48:39 PM

Citation

Dino Ienco, Roberto Interdonato, Raffaele Gaetano. Supervised level-wise pretraining for sequential data classification. 27th International Conference on Neural Information Processing (ICONIP 2020), Nov 2020, Bangkok, Thailand. pp.449-457, ⟨10.1007/978-3-030-63823-8_52⟩. ⟨hal-03031531⟩
