Conference paper, 2020

Supervised level-wise pretraining for sequential data classification

Abstract

Recurrent Neural Networks (RNNs) can be seriously affected by the initial assignment of their parameters, which may result in poor generalization on new, unseen data. To tackle this crucial issue in the context of RNN-based classification, we propose a new supervised layer-wise pretraining strategy to initialize network parameters. The proposed approach leverages a data-aware strategy that sets up a taxonomy of classification problems automatically derived from the model's behavior. To the best of our knowledge, despite the great interest in RNN-based classification, this is the first data-aware strategy dealing with the initialization of such models. The proposed strategy was tested on five benchmarks from three different domains: text classification, speech recognition, and remote sensing. The results underline the benefit of our approach and show that data-aware strategies effectively support the initialization of RNN-based classification models.
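The generic skeleton of level-wise pretraining can be sketched in a few lines: an RNN stack is grown one layer per level, with the already-pretrained lower layers carried over to initialize the next, deeper model. The NumPy sketch below is purely illustrative and is not the paper's method: the data-aware taxonomy of sub-classification problems is the paper's actual contribution and is not reproduced here, the supervised fine-tuning step is elided, and all function names, dimensions, and initialization choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_rnn_layer(input_dim, hidden_dim, rng):
    """Random initialization of one Elman RNN layer (illustrative scaling)."""
    s = np.sqrt(1.0 / hidden_dim)
    return {
        "W_x": rng.uniform(-s, s, (hidden_dim, input_dim)),  # input-to-hidden
        "W_h": rng.uniform(-s, s, (hidden_dim, hidden_dim)), # hidden-to-hidden
        "b":   np.zeros(hidden_dim),
    }

def rnn_forward(layer, xs):
    """Run one RNN layer over a sequence xs of shape (T, D); return (T, H)."""
    h = np.zeros_like(layer["b"])
    out = []
    for x in xs:
        h = np.tanh(layer["W_x"] @ x + layer["W_h"] @ h + layer["b"])
        out.append(h)
    return np.stack(out)

# Level-wise growth: at each level, add a fresh top layer while keeping the
# lower layers pretrained at previous levels as the initialization.
input_dim, hidden_dim, n_levels = 8, 16, 3
stack = []
for level in range(n_levels):
    in_dim = input_dim if level == 0 else hidden_dim
    stack.append(init_rnn_layer(in_dim, hidden_dim, rng))
    # ... supervised fine-tuning of `stack` on the level's classification
    # sub-problem would happen here (elided in this sketch) ...

# Forward pass of a toy length-5 sequence through the full pretrained stack.
xs = rng.normal(size=(5, input_dim))
h = xs
for layer in stack:
    h = rnn_forward(layer, h)
print(h.shape)  # (5, 16)
```

The key design point this sketch mirrors is that each level's model is not trained from a random start: its lower layers inherit the parameters already learned at the previous level, so only the newly added layer begins from random initialization.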
File not deposited

Dates and versions

hal-03031531, version 1 (30-11-2020)

Cite

Dino Ienco, Roberto Interdonato, Raffaele Gaetano. Supervised level-wise pretraining for sequential data classification. ICONIP 2020 - 27th International Conference on Neural Information Processing, Nov 2020, Bangkok, Thailand. pp.449-457, ⟨10.1007/978-3-030-63823-8_52⟩. ⟨hal-03031531⟩
