Semi-Supervised Classifications via Elastic and Robust Embedding
Yun Liu, Yiming Guo, Hua Wang, Feiping Nie, Heng Huang
AAAI - 2017
Transductive semi-supervised learning can only predict labels for the unlabeled data that appear in the training set; it cannot predict labels for test data never seen during training. To handle this out-of-sample problem, many inductive methods constrain the predicted label matrix to be exactly equal to the output of a linear model. In practice, this constraint can be too rigid to capture the manifold structure of the data. In this paper, we relax this rigid constraint and propose an elastic constraint on the predicted label matrix so that the manifold structure can be better explored. Moreover, since unlabeled data are often abundant in practice and usually contain some outliers, we use a non-squared loss instead of the traditional squared loss to learn a robust model. The derived problem, although convex, contains many non-smooth terms, which makes it very challenging to solve. In this paper, we propose an efficient optimization algorithm for a more general problem, based on which we find the optimal solution to the derived problem.
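As a rough illustration of the idea described in the abstract (not the paper's exact formulation), assume a data matrix X = [x_1, ..., x_n], a predicted soft-label matrix F with rows f_i, a graph Laplacian L encoding the manifold structure, a linear model (W, b), and a hypothetical trade-off parameter \alpha. An objective of this general shape would read:

\min_{F,\,W,\,b}\ \operatorname{tr}\!\left(F^{\top} L F\right) \;+\; \alpha \sum_{i=1}^{n} \left\| f_i - W^{\top} x_i - b \right\|_2 , \qquad \text{s.t. } f_i = y_i \ \text{for every labeled point } x_i .

Here the rigid inductive constraint F = X^{\top} W + \mathbf{1} b^{\top} is replaced by a penalized residual (the elastic constraint), and each residual enters through its non-squared \ell_2 norm, which limits the influence of outlying unlabeled points; the resulting problem is convex but non-smooth.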
Links
- View publications from Hua Wang
- View publications presented in AAAI
- View publications researching Embeddings
- View publications researching Optimization
- View publications researching Robust Learning Models
Cite this paper
MLA
Liu, Yun, et al. "Semi-supervised classifications via elastic and robust embedding." Thirty-First AAAI Conference on Artificial Intelligence. 2017.
BibTeX
@inproceedings{liu2017semi,
  title={Semi-supervised classifications via elastic and robust embedding},
  author={Liu, Yun and Guo, Yiming and Wang, Hua and Nie, Feiping and Huang, Heng},
  booktitle={Thirty-First AAAI Conference on Artificial Intelligence},
  year={2017}
}