McNulty // w4d4

Winter 2015

Planned schedule and activities

9:00 am: Good morning

9:15 am: Discussion on the SQL setup

9:30 am: McNulty work

10:30 am: Classification Errors and Performance Metrics

11:15 am: McNulty work

12:00 pm: Lunch

1:30 pm: McNulty work

5:00 pm: Speaker: Cassie

6:00 pm: Data Science Career Fair if you're interested

Lecture Notes

w4d4_Classification_Errors.pdf (1.3 MB)


Precision, recall, sensitivity, specificity
Wikipedia page on precision and recall
Scikit learn on classification metrics
Receiver Operating Characteristic
Area under the ROC curve (AUC)
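The metrics in the links above all come from the same four confusion-matrix counts. Here's a minimal sketch computing them by hand (standard library only; the labels are made-up toy data, and in practice you'd use `sklearn.metrics`):

```python
# Toy binary labels: 1 = positive class, 0 = negative class
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

# Confusion-matrix counts
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

precision = tp / (tp + fp)    # of everything predicted positive, how much was right
recall = tp / (tp + fn)       # a.k.a. sensitivity, the true positive rate
specificity = tn / (tn + fp)  # the true negative rate
```

Recall and specificity are the two axes behind the ROC curve: as you sweep the classification threshold, you trade the true positive rate against the false positive rate (1 − specificity).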

What is the relationship between F1 and Fβ?

If you have found the metrics function in sklearn that spits out your precision, recall, and F score, you may have asked yourself: "What is Fβ? Is it the same as F1?"

The answer is ... yes. F1 combines precision and recall. Fβ does the same thing, but uses a weight β so that you can count one of the two (precision or recall) more heavily than the other when combining them. It is a way to tune your score if, for example, you care more about precision than recall. F1 is simply the Fβ for which β = 1, and in sklearn the default value for β is 1.
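A quick sketch of that tuning knob (the helper `f_beta` and the numbers are illustrative, not from sklearn):

```python
def f_beta(precision, recall, beta):
    """Weighted harmonic mean of precision and recall.
    beta > 1 weights recall more heavily; beta < 1 weights precision."""
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

p, r = 0.9, 0.6                   # a precision-heavy classifier
f1 = f_beta(p, r, beta=1)         # plain harmonic mean -> 0.72
f2 = f_beta(p, r, beta=2)         # recall counts more, so the score drops
f_half = f_beta(p, r, beta=0.5)   # precision counts more, so the score rises
```

With precision above recall, β = 2 penalizes this classifier while β = 0.5 rewards it, which is exactly the "care more about one than the other" behavior described above.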

Here is a screen cap from the Wikipedia page describing this relationship:
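In case the image doesn't come through, the relationship it shows is the standard definition:

```latex
F_\beta = (1 + \beta^2) \cdot \frac{\mathrm{precision} \cdot \mathrm{recall}}{\beta^2 \cdot \mathrm{precision} + \mathrm{recall}}
```

Setting β = 1 reduces this to the familiar F1, the plain harmonic mean of precision and recall.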