Confusion Matrix

Ex: we have a classifier. For the point selected, we check which box of the confusion matrix it falls into, and in the end we obtain the full matrix of counts (figures omitted).

Say that we have an alarm system. The top-right box is where a 'false alarm' (a false positive) would fall.

We can imagine that there is an asymmetry in how much we care about each box: a false alarm on one hand, or the opposite case on the other, where there is a burglar and the system does not detect it (a miss, or false negative).

We can therefore move our decision threshold one way or the other, trading one kind of error for the other, as sketched below.
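
A minimal sketch of that trade-off (the probabilities, labels, and threshold values below are made up for illustration): lowering the threshold misses fewer burglars but raises more false alarms.

```python
import numpy as np

# Hypothetical predicted probabilities that a burglar is present,
# together with the true labels (1 = burglar, 0 = no burglar).
probs  = np.array([0.95, 0.80, 0.60, 0.40, 0.30, 0.10])
actual = np.array([1,    1,    0,    1,    0,    0])

for threshold in (0.5, 0.25):
    predicted = (probs >= threshold).astype(int)
    false_alarms = np.sum((predicted == 1) & (actual == 0))  # false positives
    misses       = np.sum((predicted == 0) & (actual == 1))  # false negatives
    print(f"threshold={threshold}: false alarms={false_alarms}, missed burglars={misses}")
```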

Another example:

In our confusion matrix, we are happy when the diagonal holds the largest values, because every off-diagonal value (the upper and lower triangles) is a misclassification.

The terms are:

Recall = the probability that the algorithm will correctly identify Hugo Chavez, given that the person actually is Hugo Chavez. In our case, recall = 10/16.

Precision: suppose our algorithm says it is Hugo Chavez; what are the chances that it really is him?

Recall: True Positive / (True Positive + False Negative). Out of all the items that are truly positive, how many were correctly classified as positive. Or simply, how many positive items were 'recalled' from the dataset.

Precision: True Positive / (True Positive + False Positive). Out of all the items labeled as positive, how many truly belong to the positive class.
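
These definitions translate directly into two small helper functions (a sketch; the false-positive count below is made up, since only recall is given for the Hugo Chavez example):

```python
def recall(tp, fn):
    # Out of all truly positive items, the fraction correctly classified as positive.
    return tp / (tp + fn)

def precision(tp, fp):
    # Out of all items labeled positive, the fraction that are truly positive.
    return tp / (tp + fp)

# Hugo Chavez example from above: 10 true positives, 6 false negatives.
print(recall(tp=10, fn=6))      # 0.625, i.e. 10/16
# With a hypothetical 2 false positives:
print(precision(tp=10, fp=2))   # 0.833...
```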

Another example:

Equations:
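
Restating the two definitions above in symbols (TP = true positives, FN = false negatives, FP = false positives):

$$\text{Recall} = \frac{TP}{TP + FN}, \qquad \text{Precision} = \frac{TP}{TP + FP}$$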

Confusion matrix example:

The predicted class runs across the columns, and the actual class down the rows. Therefore:

|          | Predicted 0 | Predicted 1 |
| -------- | ----------- | ----------- |
| Actual 0 | 23          | 1           |
| Actual 1 | 14          | 2           |

  • Therefore, there are 23 non-admitted that we predicted to be non-admitted.

  • There are 14 admitted that we predicted to be non-admitted.

  • There is 1 non-admitted that we predicted to be admitted.

  • There are 2 admitted that we predicted to be admitted.
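
As a sketch, this matrix could be reproduced with scikit-learn's `confusion_matrix` (the label arrays below are constructed only to match the counts above, not the original data; 0 = non-admitted, 1 = admitted):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, precision_score, recall_score

# Labels built so the counts match the table above (not the original dataset):
# 23 actual-0/predicted-0, 1 actual-0/predicted-1,
# 14 actual-1/predicted-0, 2 actual-1/predicted-1.
actual    = np.array([0] * 23 + [0] * 1 + [1] * 14 + [1] * 2)
predicted = np.array([0] * 23 + [1] * 1 + [0] * 14 + [1] * 2)

# Rows are actual classes, columns are predicted classes.
print(confusion_matrix(actual, predicted))
# [[23  1]
#  [14  2]]

print(recall_score(actual, predicted))     # 2 / (2 + 14) = 0.125
print(precision_score(actual, predicted))  # 2 / (2 + 1) ≈ 0.667
```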
