Kernelized Least Squares Classifier
Use least squares regression as a classification technique, using a numeric encoding of the classes as regression targets. Note that this method minimizes the quadratic loss, not the truncated quadratic loss.
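The underlying computation can be sketched as kernel ridge regression on a numeric encoding of the class labels. The snippet below is a minimal sketch, not the package implementation (which additionally handles centering, scaling and the intercept term); the 0/1 encoding, the 0.5 decision threshold and the sigma and lambda values are assumptions chosen for illustration only.

library(kernlab)

# Two-class subset of iris: versicolor vs virginica
X <- as.matrix(iris[51:150, 1:2])
y <- as.numeric(iris$Species[51:150] == "versicolor")  # assumed 0/1 encoding

k <- rbfdot(sigma = 0.1)
K <- kernelMatrix(k, X)   # n x n Gram matrix of the training data
lambda <- 0.01

# Kernel ridge regression: alpha solves (K + lambda * I) alpha = y
alpha <- solve(K + lambda * diag(nrow(K)), y)

# Decision values; replace the first X by new data to predict new points
decision <- kernelMatrix(k, X, X) %*% alpha
pred <- ifelse(decision > 0.5, "versicolor", "virginica")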
Usage:

KernelLeastSquaresClassifier(X, y, lambda = 0, kernel = vanilladot(),
  x_center = TRUE, scale = TRUE, y_scale = TRUE)
Arguments:

X         Design matrix; an intercept term is added within the function.
y         Vector or factor with class assignments.
lambda    Regularization parameter of the l2 penalty in regularized least squares.
kernel    kernlab kernel function.
x_center  If TRUE, center the independent variables (features).
scale     If TRUE, apply a z-transform to the design matrix X before running the regression.
y_scale   If TRUE, center the target vector.
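In addition to the matrix interface shown in the usage above, a formula interface is available, as used in the examples below. A minimal sketch of the two call styles on the iris data (the lambda value is an arbitrary choice for illustration):

library(RSSL)

# Matrix interface: design matrix plus a factor of labels
g1 <- KernelLeastSquaresClassifier(as.matrix(iris[, 1:2]), iris$Species,
                                   lambda = 0.01)

# Formula interface: response and predictors taken from a data.frame
g2 <- KernelLeastSquaresClassifier(Species ~ Sepal.Length + Sepal.Width,
                                   iris, lambda = 0.01)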
Value:

S4 object of class LeastSquaresClassifier with the following slots:

theta       weight vector
classnames  the names of the classes
modelform   formula object of the model used in regression
scaling     a scaling object containing the parameters of the z-transforms applied to the data
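The slots can be inspected with the @ operator once a model is trained; a short sketch using the documented slot names:

library(RSSL)
g <- KernelLeastSquaresClassifier(Species ~ ., iris, lambda = 0.01)
g@classnames  # names of the classes seen during training
g@modelform   # formula used to build the design matrix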
See Also:

Other RSSL classifiers: EMLeastSquaresClassifier, EMLinearDiscriminantClassifier, GRFClassifier, ICLeastSquaresClassifier, ICLinearDiscriminantClassifier, LaplacianKernelLeastSquaresClassifier(), LaplacianSVM, LeastSquaresClassifier, LinearDiscriminantClassifier, LinearSVM, LinearTSVM(), LogisticLossClassifier, LogisticRegression, MCLinearDiscriminantClassifier, MCNearestMeanClassifier, MCPLDA, MajorityClassClassifier, NearestMeanClassifier, QuadraticDiscriminantClassifier, S4VM, SVM, SelfLearning, TSVM, USMLeastSquaresClassifier, WellSVM, svmlin()
Examples:

library(RSSL)
library(ggplot2)
library(dplyr)

# Two class problem
df <- generateCrescentMoon(200)

class_lin <- KernelLeastSquaresClassifier(Class~., df,
                                          kernel = kernlab::vanilladot(), lambda = 1)
class_rbf1 <- KernelLeastSquaresClassifier(Class~., df,
                                           kernel = kernlab::rbfdot(), lambda = 1)
class_rbf5 <- KernelLeastSquaresClassifier(Class~., df,
                                           kernel = kernlab::rbfdot(5), lambda = 1)
class_rbf10 <- KernelLeastSquaresClassifier(Class~., df,
                                            kernel = kernlab::rbfdot(10), lambda = 1)

df %>%
  ggplot(aes(x = X1, y = X2, color = Class, shape = Class)) +
  geom_point() +
  coord_equal() +
  stat_classifier(aes(linetype = ..classifier..),
                  classifiers = list("Linear" = class_lin,
                                     "RBF sigma=1" = class_rbf1,
                                     "RBF sigma=5" = class_rbf5,
                                     "RBF sigma=10" = class_rbf10),
                  color = "black")

# Second Example
dmat <- model.matrix(Species~.-1, iris[51:150,])
tvec <- droplevels(iris$Species[51:150])
testdata <- data.frame(tvec, dmat[,1:2])
colnames(testdata) <- c("Class", "X1", "X2")

precision <- 100
xgrid <- seq(min(dmat[,1]), max(dmat[,1]), length.out = precision)
ygrid <- seq(min(dmat[,2]), max(dmat[,2]), length.out = precision)
gridmat <- expand.grid(xgrid, ygrid)

g_kernel <- KernelLeastSquaresClassifier(dmat[,1:2], tvec,
                                         kernel = kernlab::rbfdot(0.01),
                                         lambda = 0.000001, scale = TRUE)

plotframe <- cbind(gridmat, decisionvalues(g_kernel, gridmat))
colnames(plotframe) <- c("x", "y", "Output")

ggplot(plotframe, aes(x = x, y = y)) +
  geom_tile(aes(fill = Output)) +
  scale_fill_gradient(low = "yellow", high = "red", limits = c(0, 1)) +
  geom_point(aes(x = X1, y = X2, shape = Class), data = testdata, size = 3) +
  stat_classifier(classifiers = list(g_kernel))

# Multiclass problem
dmat <- model.matrix(Species~.-1, iris)
tvec <- iris$Species
testdata <- data.frame(tvec, dmat[,1:2])
colnames(testdata) <- c("Class", "X1", "X2")

precision <- 100
xgrid <- seq(min(dmat[,1]), max(dmat[,1]), length.out = precision)
ygrid <- seq(min(dmat[,2]), max(dmat[,2]), length.out = precision)
gridmat <- expand.grid(xgrid, ygrid)

g_kernel <- KernelLeastSquaresClassifier(dmat[,1:2], tvec,
                                         kernel = kernlab::rbfdot(0.1),
                                         lambda = 0.00001, scale = TRUE,
                                         x_center = TRUE)

plotframe <- cbind(gridmat,
                   maxind = apply(decisionvalues(g_kernel, gridmat), 1, which.max))

ggplot(plotframe, aes(x = Var1, y = Var2)) +
  geom_tile(aes(fill = factor(maxind, labels = levels(tvec)))) +
  geom_point(aes(x = X1, y = X2, shape = Class), data = testdata,
             size = 4, alpha = 0.5)
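Predicted class labels for new observations can be obtained with predict(), alongside the raw outputs from decisionvalues() used above. A minimal sketch continuing from the multiclass fit; reusing the training data here is purely for illustration:

preds <- predict(g_kernel, dmat[, 1:2])
mean(preds == tvec)  # proportion of training points classified correctly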