Improved Multi-Label Propagation for Small Data with Multi-Objective Optimization
Abstract
This paper focuses on multi-label learning from small amounts of labelled data. We demonstrate
that the binary-relevance extension of the interpolated label propagation algorithm, the harmonic
function, is a competitive learning method with respect to many widely used evaluation measures.
This is achieved by a new transition matrix that better captures the structure underlying the
classification task, coupled with data-dependent thresholding strategies. Furthermore, we show
that, when labels are dependent, the outputs of a competitive learning model can be used as part
of the input to the harmonic function to improve its performance. Finally, since we use multiple
measures to thoroughly evaluate the algorithm, we propose the game-theoretic method of Kalai and
Smorodinsky to produce a single compromise solution across all measures. This method can be
applied to any learning model, irrespective of the number of evaluation metrics used.
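To make the setting concrete, the following minimal sketch illustrates binary-relevance harmonic-function label propagation in the style of Zhu et al.'s harmonic solution. It is not the paper's exact method (in particular, the proposed transition matrix and thresholding strategies are not reproduced here); the function names, the per-label frequency-matching threshold, and the use of a generic affinity matrix `W` are illustrative assumptions.

```python
import numpy as np

def harmonic_function(W, Y_l, labeled_idx, unlabeled_idx):
    """Binary-relevance harmonic-function label propagation (illustrative sketch).

    W             : (n, n) symmetric affinity / transition matrix over all points
    Y_l           : (n_l, q) binary label matrix of the labelled points
    labeled_idx   : indices of the labelled points in W
    unlabeled_idx : indices of the unlabelled points in W
    Returns an (n_u, q) matrix of real-valued label scores.
    """
    D = np.diag(W.sum(axis=1))
    L = D - W                                        # graph Laplacian
    L_uu = L[np.ix_(unlabeled_idx, unlabeled_idx)]
    W_ul = W[np.ix_(unlabeled_idx, labeled_idx)]
    # Harmonic solution f_u = L_uu^{-1} W_ul Y_l, one column per label
    return np.linalg.solve(L_uu, W_ul @ Y_l)

def frequency_matching_thresholds(F_u, Y_l):
    """One possible data-dependent thresholding rule (assumption, not the
    paper's): pick, per label, the threshold that makes the predicted label
    frequency on the unlabelled set match its frequency on the labelled set."""
    freq = Y_l.mean(axis=0)
    return np.array([np.quantile(F_u[:, j], 1.0 - freq[j])
                     for j in range(F_u.shape[1])])
```

The abstract also proposes selecting a single compromise solution across several evaluation measures via Kalai and Smorodinsky's bargaining solution. The snippet below is only a rough discrete approximation of that idea, under the assumption that candidates are scored on metrics where higher is better; it is not the authors' exact procedure.

```python
def ks_compromise(scores):
    """Rows = candidate models/configurations, columns = evaluation metrics
    (higher is better). Normalise each metric between its worst (disagreement)
    and best (ideal) observed value, then return the candidate whose smallest
    normalised gain is largest, in the spirit of the Kalai-Smorodinsky solution."""
    scores = np.asarray(scores, dtype=float)
    nadir, ideal = scores.min(axis=0), scores.max(axis=0)
    gains = (scores - nadir) / np.where(ideal > nadir, ideal - nadir, 1.0)
    return int(np.argmax(gains.min(axis=1)))
```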