Confusion matrix and overall accuracy of predicted binary response
Takes in the actual binary response, the predicted probabilities and a cutoff value, and returns the confusion matrix and the overall accuracy.
accuracy(y, yhat, cutoff)
y        actual binary response variable
yhat     predicted probabilities corresponding to the actual binary response
cutoff   threshold value in the range 0 to 1
When we predict a binary response, the first thing we want to check is the accuracy of the model for a particular cutoff value. This function does just that: it provides the confusion matrix (counts and percentages) and the overall accuracy. Overall accuracy is calculated as (TP + TN)/(P + N).
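To make the computation concrete, below is a minimal sketch of how the confusion matrix and overall accuracy can be derived from y, yhat and cutoff. This is an illustration, not the package's implementation; the element names accuracyNum, accuracyPer and overallAcc follow the example further down, and classifying values at or above the cutoff as 1 is an assumption.

# Minimal sketch (assumed logic, not the package source): classify at the
# cutoff, cross-tabulate actual vs predicted, and compute (TP + TN)/(P + N)
accuracySketch <- function(y, yhat, cutoff) {
  predicted <- ifelse(yhat >= cutoff, 1, 0)     # ">= cutoff" is an assumption
  accuracyNum <- table(Actual = y, Predicted = predicted)
  accuracyPer <- prop.table(accuracyNum) * 100  # cell percentages
  overallAcc <- mean(y == predicted)            # (TP + TN)/(P + N)
  list(accuracyNum = accuracyNum,
       accuracyPer = accuracyPer,
       overallAcc = overallAcc)
}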
The output is a list from which the individual elements can be picked as shown in the example.
A three-element list: the confusion matrix as a table of counts, the confusion matrix as a table of percentages, and the overall accuracy value.
Akash Jain
# A 'data.frame' with y and yhat
df <- data.frame(y = c(1, 0, 1, 1, 0),
                 yhat = c(0.86, 0.23, 0.65, 0.92, 0.37))

# Accuracy tables and overall accuracy figures
ltAccuracy <- accuracy(y = df[, 'y'], yhat = df[, 'yhat'], cutoff = 0.7)
accuracyNumber <- ltAccuracy$accuracyNum
accuracyPercentage <- ltAccuracy$accuracyPer
overallAccuracy <- ltAccuracy$overallAcc
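As a quick check of the formula on this example: with a cutoff of 0.7 the predicted classes are 1, 0, 0, 1, 0, giving TP = 2, TN = 2, FP = 0 and FN = 1, so the overall accuracy is (2 + 2)/5 = 0.8 (assuming probabilities at or above the cutoff are classified as 1).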