||Conference room, DISI, 3rd floor
||Some novel developments in dimensionality reduction for classification
||Department of Computer Science, University of Sheffield, UK
||Common dimensionality reduction techniques such as PCA and its generalisations address the problem of finding a lower-dimensional representation of data based on variance considerations. However, the most varying directions need not be the most interesting: for example, if a high-dimensional data set is known to contain clusters, the best dimensionality reduction will extract features that best discriminate between the clusters, rather than those that capture the most variance. We exploit this idea and introduce a latent variable model that, at maximum likelihood, extracts optimal discriminative features (in the sense of Fisher's discriminant) without access to label information. We then extend the framework to address the semi-supervised setting and discuss possible non-linear extensions.
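The contrast the abstract draws between variance-driven PCA and discriminative projections can be illustrated with a small NumPy sketch. Everything here is illustrative and not taken from the talk: the two-cluster data is synthetic, and Fisher's direction is computed *with* labels purely to show the target that the talk's unsupervised model aims to recover without them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two clusters separated along axis 0 (gap of 4), but with much larger
# variance along axis 1 (std 10).  PCA's top component follows the
# high-variance axis; Fisher's discriminant follows the cluster gap.
n = 500
c0 = rng.normal([-2.0, 0.0], [1.0, 10.0], size=(n, 2))
c1 = rng.normal([+2.0, 0.0], [1.0, 10.0], size=(n, 2))
X = np.vstack([c0, c1])

# PCA direction: leading eigenvector of the total covariance matrix.
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
pca_dir = eigvecs[:, -1]  # eigenvector with the largest eigenvalue

# Fisher's discriminant: w proportional to S_w^{-1} (m1 - m0),
# where S_w is the within-class scatter (here: sum of class covariances).
m0, m1 = c0.mean(axis=0), c1.mean(axis=0)
Sw = np.cov(c0, rowvar=False) + np.cov(c1, rowvar=False)
fisher_dir = np.linalg.solve(Sw, m1 - m0)
fisher_dir /= np.linalg.norm(fisher_dir)

def separation(w):
    """Gap between projected class means over pooled within-class std."""
    p0, p1 = c0 @ w, c1 @ w
    return abs(p1.mean() - p0.mean()) / np.sqrt(0.5 * (p0.var() + p1.var()))

print(f"PCA separation:    {separation(pca_dir):.2f}")
print(f"Fisher separation: {separation(fisher_dir):.2f}")
```

On this data the PCA direction aligns with the high-variance but uninformative axis and yields near-zero class separation, while the Fisher direction aligns with the gap between clusters, which is exactly the kind of feature the abstract argues a cluster-aware dimensionality reduction should find.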