Adaboost.M1 algorithm
Implements Freund and Schapire's Adaboost.M1 algorithm
adaboost(formula, data, nIter, ...)
formula   Formula describing the model to be fitted
data      Input data frame containing the variables in the formula
nIter     Number of weak classifiers (boosting iterations) to train
...       Other optional arguments; not currently implemented
This implements the Adaboost.M1 algorithm for a binary classification task. The target variable must be a factor with exactly two levels. The final classifier is a linear combination of weak decision tree classifiers.
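For illustration, the following is a minimal sketch of the Adaboost.M1 training loop, not the package's internal implementation. It assumes decision stumps from the rpart package as the weak learners, and the function name adaboost_sketch is hypothetical.

library(rpart)

# Minimal illustrative sketch of the Adaboost.M1 loop (not the package's code).
# Decision stumps from 'rpart' serve as the weak learners.
adaboost_sketch <- function(formula, data, nIter) {
  y <- model.frame(formula, data)[[1L]]          # two-level factor response
  n <- nrow(data)
  w <- rep(1 / n, n)                             # start with uniform weights
  learners <- vector("list", nIter)
  alphas   <- numeric(nIter)
  for (m in seq_len(nIter)) {
    fit  <- rpart(formula, data, weights = w, method = "class",
                  control = rpart.control(maxdepth = 1))
    pred <- predict(fit, data, type = "class")
    miss <- as.numeric(pred != y)
    err  <- sum(w * miss) / sum(w)               # weighted training error
    if (err >= 0.5) break                        # weak-learner assumption fails
    alpha <- log((1 - err) / max(err, 1e-10))    # classifier (vote) weight
    w <- w * exp(alpha * miss)                   # up-weight misclassified rows
    w <- w / sum(w)
    learners[[m]] <- fit
    alphas[m]     <- alpha
  }
  list(learners = learners, alphas = alphas)     # weighted ensemble
}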
An object of class adaboost.
Freund, Y. and Schapire, R.E. (1996). "Experiments with a new boosting algorithm." In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156. Morgan Kaufmann.
fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = c(rep(0, 100), rep(1, 100)))
fakedata$Y <- factor(fakedata$Y)   # target must be a two-level factor
test_adaboost <- adaboost(Y ~ X, data = fakedata, nIter = 10)
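Assuming the fitted object supports the usual S3 predict() generic for class adaboost (the argument names and return structure here are assumptions, not confirmed by this page), predictions might then be obtained as:

pred <- predict(test_adaboost, newdata = fakedata)  # assumed S3 method; see the package's predict documentation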