Efficient Non-Parametric Function Induction in Semi-Supervised Learning

Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics

Published by Society for Artificial Intelligence and Statistics

There has recently been increased interest in semi-supervised learning, because of the many datasets with large numbers of unlabeled examples and only a few labeled ones. This paper follows up on previously proposed non-parametric algorithms that provide an estimated continuous label for the given unlabeled examples. First, it extends them to function induction algorithms that minimize a regularization criterion applied to an out-of-sample example, and that happen to have the form of Parzen windows regressors. This allows test labels to be predicted without solving again a linear system of dimension n (the number of unlabeled plus labeled training examples), which can cost O(n^3). Second, this function induction procedure gives rise to an efficient approximation of the training process, reducing the linear system to be solved to m << n unknowns, using only a subset of m examples. An improvement of O(n^2/m^2) in time can thus be obtained. Comparative experiments are presented, showing the good performance of the induction formula and the approximation algorithm.
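The exact induction formula is derived later in the paper; as a rough illustration of the Parzen-windows form mentioned above, the following minimal sketch predicts the label of an out-of-sample point as a kernel-weighted average of the estimated training labels, so no n x n linear system needs to be re-solved at test time. The helper names (rbf_kernel, induce_labels) and the choice of a Gaussian kernel with bandwidth sigma are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) weights between each row of a and each row of b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def induce_labels(x_new, x_train, y_hat, sigma=1.0):
    """Parzen-windows-style induction: label an out-of-sample point by a
    normalized kernel-weighted average of the estimated labels y_hat
    obtained on the (labeled + unlabeled) training examples."""
    W = rbf_kernel(x_new, x_train, sigma)   # (n_new, n_train) weight matrix
    return (W @ y_hat) / W.sum(axis=1)      # weighted average per new point
```

Under such a scheme, each test prediction costs O(n) kernel evaluations instead of re-solving the training system; the paper's approximation additionally restricts the unknowns to a subset of m << n training points.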