Binary Feature Selection with Conditional Mutual Information
Abstract
In the context of classification, we propose to use conditional mutual information to select a family of binary features which are individually discriminating and weakly dependent. We show that on an image classification task, despite its simplicity, a naive Bayesian classifier based on features selected with this Conditional Mutual Information Maximization (CMIM) criterion performs as well as a classifier built with AdaBoost. We also show that this classification method is more robust than boosting when trained on a noisy data set.
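The greedy selection scheme behind the CMIM criterion can be sketched as follows: the first feature maximizes the mutual information with the class, and each subsequent feature maximizes the minimum, over the already selected features, of the conditional mutual information with the class. The snippet below is a minimal illustrative sketch, assuming binary features in a NumPy array `X` of shape (samples, features) and binary labels `y`; the function names (`cmim_select`, etc.) are ours and not from the paper, and this naive version does not include the fast lazy-evaluation implementation the paper is known for.

```python
import numpy as np

def entropy(counts):
    """Entropy (in bits) of an empirical distribution given as counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mutual_information(a, b):
    """Empirical I(A; B) for two binary variables."""
    joint = np.histogram2d(a, b, bins=2)[0]
    return (entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
            - entropy(joint.ravel()))

def conditional_mutual_information(a, b, c):
    """Empirical I(A; B | C) = sum_c P(C=c) I(A; B | C=c), binary variables."""
    cmi = 0.0
    for value in (0, 1):
        mask = (c == value)
        if mask.any():
            cmi += mask.mean() * mutual_information(a[mask], b[mask])
    return cmi

def cmim_select(X, y, k):
    """Greedily pick k features maximizing min_l I(Y; X_n | X_selected_l)."""
    n_features = X.shape[1]
    # Running score of each candidate: the minimum conditional mutual
    # information with respect to the features selected so far,
    # initialized to the plain mutual information I(Y; X_n).
    score = np.array([mutual_information(y, X[:, n]) for n in range(n_features)])
    selected = []
    for _ in range(k):
        best = int(np.argmax(score))
        selected.append(best)
        score[best] = -np.inf  # never select the same feature twice
        for n in range(n_features):
            if np.isfinite(score[n]):
                score[n] = min(score[n],
                               conditional_mutual_information(y, X[:, n], X[:, best]))
    return selected
```

Because each new feature must carry information about the class even when conditioned on every feature already chosen, the selected family tends to be both individually discriminating and weakly dependent, which is what makes the subsequent naive Bayesian combination effective.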
Domains
Other [cs.OH]