Fast estimation for robust supervised classification with mixture model
Abstract
Label noise is known to negatively impact the performance of classification algorithms. In this paper, we develop a model robust to label noise that uses both labelled and unlabelled samples. In particular, we propose a novel algorithm to optimize the model parameters that scales efficiently with the number of training samples. Our contribution relies on a consensus formulation of the original objective function that is highly parallelizable. The optimization is performed within the Alternating Direction Method of Multipliers (ADMM) framework. Experimental results on synthetic datasets show an improvement of several orders of magnitude in processing time, with no loss in accuracy. Our method also proves well suited to handling real data with significant label noise.
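The abstract's key computational idea is a consensus formulation optimized with ADMM, where per-block updates are independent and hence parallelizable. The sketch below illustrates generic consensus ADMM on a toy least-squares problem split across data blocks; it is not the paper's mixture-model objective, and the block count, penalty `rho`, and iteration budget are illustrative assumptions.

```python
import numpy as np

# Illustrative consensus ADMM on a toy least-squares problem.
# The data are split into blocks; each block's update is independent,
# which is what makes the consensus formulation parallelizable.
# (rho and n_iter are assumed values, not from the paper.)

rng = np.random.default_rng(0)
n_blocks, n_per, d = 4, 50, 5
A_blocks = [rng.standard_normal((n_per, d)) for _ in range(n_blocks)]
x_true = rng.standard_normal(d)
b_blocks = [A @ x_true + 0.01 * rng.standard_normal(n_per) for A in A_blocks]

rho, n_iter = 1.0, 100
z = np.zeros(d)                                  # global consensus variable
xs = [np.zeros(d) for _ in range(n_blocks)]      # local variables
us = [np.zeros(d) for _ in range(n_blocks)]      # scaled dual variables

for _ in range(n_iter):
    # Local x-updates: closed form for each block's regularized least squares.
    # These loops are embarrassingly parallel across blocks.
    for i, (A, b) in enumerate(zip(A_blocks, b_blocks)):
        xs[i] = np.linalg.solve(A.T @ A + rho * np.eye(d),
                                A.T @ b + rho * (z - us[i]))
    # Global averaging (consensus) step.
    z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
    # Dual ascent on the consensus constraints x_i = z.
    for i in range(n_blocks):
        us[i] += xs[i] - z

# The consensus iterate should match the full-data least-squares solution.
A_full = np.vstack(A_blocks)
b_full = np.concatenate(b_blocks)
x_ls = np.linalg.lstsq(A_full, b_full, rcond=None)[0]
print(np.allclose(z, x_ls, atol=1e-3))
```

Each iteration touches every block once, but the per-block solves can run on separate workers, which is the source of the scalability claimed in the abstract.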