
MCPLDA

Maximum Contrastive Pessimistic Likelihood Estimation for Linear Discriminant Analysis


Description

Maximum Contrastive Pessimistic Likelihood (MCPL) estimation (Loog 2016) attempts to find a semi-supervised solution that has a higher likelihood than the supervised solution on the labeled and unlabeled data, even for the worst possible labeling of the unlabeled data. It does so by seeking a saddle point of a maximin problem: the max is over the parameters of the semi-supervised solution, the min is over the labeling of the unlabeled data, and the objective is the difference in likelihood between the semi-supervised and the supervised solution, measured on the labeled and unlabeled data. The implementation is a translation of the MATLAB code of Loog (2016).
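In symbols, the objective described above can be sketched as follows (the notation is ours, for illustration only, and not taken from the package):

  \hat{\theta}_{\mathrm{semi}} \;=\; \arg\max_{\theta}\; \min_{q}\; \Big[\, L(\theta \mid X, y, X_u, q) \;-\; L(\hat{\theta}_{\mathrm{sup}} \mid X, y, X_u, q) \,\Big]

where \hat{\theta}_{\mathrm{sup}} is the supervised estimate based on the labeled data alone, q ranges over (soft) labelings of the unlabeled data X_u, and L is the (log-)likelihood evaluated on the labeled and unlabeled data combined.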

Usage

MCPLDA(X, y, X_u, x_center = FALSE, scale = FALSE, max_iter = 1000)

Arguments

X

matrix; Design matrix for labeled data

y

factor or integer vector; Label vector

X_u

matrix; Design matrix for unlabeled data

x_center

logical; Should the features be centered? (default: FALSE)

scale

logical; Should the features be normalized? (default: FALSE)

max_iter

integer; Maximum number of iterations (default: 1000)
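
Examples

A minimal, illustrative sketch (not taken from the package itself): simulated two-class Gaussian data are split into a small labeled set and a larger unlabeled set, MCPLDA is fit on both, and the fitted classifier is evaluated on the unlabeled points. The data-generation code and the use of predict() on the result are assumptions, the latter being typical for RSSL classifiers.

library(RSSL)

set.seed(1)

# Simulate two Gaussian classes in two dimensions (illustrative data only)
n <- 200
X_all <- rbind(matrix(rnorm(n, mean = -1), ncol = 2),
               matrix(rnorm(n, mean =  1), ncol = 2))
y_all <- factor(rep(c("A", "B"), each = n / 2))

# Keep only a few labeled examples; treat the rest as unlabeled
labeled <- sample(nrow(X_all), 20)
X   <- X_all[labeled, , drop = FALSE]
y   <- y_all[labeled]
X_u <- X_all[-labeled, , drop = FALSE]

# Fit contrastive pessimistic LDA on labeled and unlabeled data
g <- MCPLDA(X, y, X_u, max_iter = 1000)

# Compare predictions on the unlabeled points with their true labels
# (assumes the fitted object supports predict(), as other RSSL classifiers do)
table(predict(g, X_u), y_all[-labeled])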

References

Loog, M., 2016. Contrastive Pessimistic Likelihood Estimation for Semi-Supervised Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(3), pp.462-475.

See Also

The RSSL package: Implementations of Semi-Supervised Learning Approaches for Classification, version 0.9.3, GPL (>= 2). Author: Jesse Krijthe [aut, cre].
