TITLE

Smooth Harmonic Transductive Learning

AUTHOR(S)
Ying Xie; Bin Luo; Rongbin Xu; Sibao Chen
PUB. DATE
December 2013
SOURCE
Journal of Computers;Dec2013, Vol. 8 Issue 12, p3079
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
In this paper, we present a novel semi-supervised smooth harmonic transductive learning algorithm that admits a closed-form solution. Our method introduces unlabeled class information into the learning process and exploits the similar configurations shared by the label distribution of the data. After investigating the properties of the smooth harmonic function, based on spectral clustering, in the classification task, we design an adaptive thresholding method for smooth harmonic transductive learning based on classification error. The proposed adaptive thresholding method flexibly selects the most suitable thresholds. Extensive experiments on benchmark data sets show that our closed-form smooth harmonic transductive learning framework achieves clear improvements over two baseline methods.
ACCESSION #
93332605
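
The abstract's "closed-form solution" refers to the harmonic-function formulation of graph-based transductive learning: labels on unlabeled points are obtained by solving a linear system in the graph Laplacian rather than by iterative optimization. As a hedged illustration (not the authors' exact algorithm, which adds adaptive thresholding on top), here is a minimal sketch of the classical closed-form harmonic solution of Zhu et al. (2003) that such methods build on; the function name and matrix layout are assumptions for this sketch:

```python
import numpy as np

def harmonic_solution(W, labels):
    """Closed-form harmonic label propagation (Zhu et al., 2003 style).

    W:      (n, n) symmetric affinity matrix; by convention the first
            l points are the labeled ones.
    labels: (l, c) one-hot label matrix for the labeled points.
    Returns an (n - l, c) matrix of soft label scores for the
    unlabeled points.
    """
    l = labels.shape[0]
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # combinatorial graph Laplacian
    L_uu = L[l:, l:]             # unlabeled-unlabeled block
    W_ul = W[l:, :l]             # unlabeled-labeled block
    # Harmonic property: f_u = L_uu^{-1} W_ul f_l, i.e. each unlabeled
    # score is a weighted average of its neighbors' scores.
    return np.linalg.solve(L_uu, W_ul @ labels)
```

With one-hot labels the resulting rows are convex combinations of the class indicators, so each row sums to one and can be thresholded into hard labels; the adaptive thresholding described in the abstract would replace a fixed argmax/threshold at that final step.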
Related Articles

  • Covariate Shift Adaptation by Importance Weighted Cross Validation. Sugiyama, Masashi; Krauledat, Matthias; Müller, Klaus-Robert // Journal of Machine Learning Research;5/1/2007, Vol. 8 Issue 5, p985 

    A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points that will be given in the future test phase. However, this assumption is not satisfied, for example, when the outside of the training region is...

  • UNSUPERVISED LEARNING WITH EXPECTED MAXIMIZATION ALGORITHM. Ruxanda, Gheorghe; Smeureanu, Ion // Economic Computation & Economic Cybernetics Studies & Research;Jan2012, Vol. 46 Issue 1, p17 

    The article discusses the theoretical and numerical aspects of expected maximization (EM) algorithm used in estimating the mixture probability distribution parameters, incomplete data and unsupervised statistical learning. It offers an analysis of the probabilistic context of algorithm and the...

  • Supervised, semi-supervised and unsupervised inference of gene regulatory networks. Maetschke, Stefan R.; Madhamshettiwar, Piyush B.; Davis, Melissa J.; Ragan, Mark A. // Briefings in Bioinformatics;Mar2014, Vol. 15 Issue 2, p195 

    Inference of gene regulatory network from expression data is a challenging task. Many methods have been developed to this purpose but a comprehensive evaluation that covers unsupervised, semi-supervised and supervised methods, and provides guidelines for their practical application, is...

  • Generalization Error Bounds in Semi-supervised Classification Under the Cluster Assumption. Rigollet, Philippe // Journal of Machine Learning Research;7/1/2007, Vol. 8 Issue 7, p1369 

    We consider semi-supervised classification when part of the available data is unlabeled. These unlabeled data can be useful for the classification problem when we make an assumption relating the behavior of the regression function to that of the marginal distribution. Seeger (2000) proposed the...

  • Classifier calibration using splined empirical probabilities in clinical risk prediction. Gaudoin, René; Montana, Giovanni; Jones, Simon; Aylin, Paul; Bottle, Alex // Health Care Management Science;Jun2015, Vol. 18 Issue 2, p156 

    The aims of supervised machine learning (ML) applications fall into three broad categories: classification, ranking, and calibration/probability estimation. Many ML methods and evaluation techniques relate to the first two. Nevertheless, there are many applications where having an accurate...

  • Regularization-Free Principal Curve Estimation. Gerber, Samuel; Whitaker, Ross // Journal of Machine Learning Research;May2013, Vol. 14 Issue 5, p1285 

    Principal curves and manifolds provide a framework to formulate manifold learning within a statistical context. Principal curves define the notion of a curve passing through the middle of a distribution. While the intuition is clear, the formal definition leads to some technical and practical...

  • Distribution-Dependent Sample Complexity of Large Margin Learning. Sabato, Sivan; Srebro, Nathan; Tishby, Naftali // Journal of Machine Learning Research;Jul2013, Vol. 14 Issue 7, p2119 

    We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L2 regularization: We introduce the margin-adapted dimension, which is a simple function of the second order statistics of the data distribution, and show distribution-specific...

  • Semi-Supervised Learning by Local Behavioral Searching Strategy. Chun Zhang; Junan Yang; Jiyang Zhang; Dongsheng Li; Aixia Yong // Applied Mathematics & Information Sciences;2014, Vol. 8 Issue 4, p1781 

    Semi-supervised learning has attracted a significant amount of attention in pattern recognition and machine learning. Among these methods, a very popular type is semi-supervised support vector machines. However, parameter selection in heat kernel function during the learning process is...

  • A hybrid classification scheme for mining multisource geospatial data. Vatsavai, Ranga; Bhaduri, Budhendra // GeoInformatica;Mar2011, Vol. 15 Issue 1, p29 

    Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. ML classifier relies exclusively on spectral characteristics of thematic classes whose statistical distributions (class conditional probability densities)...
