15.3 Case Study: Classification with k-Nearest Neighbors and the Digits Dataset, Part 2

In this section, we continue the digit classification case study. We’ll:

  • evaluate the k-NN classification estimator’s accuracy,

  • run multiple estimators and compare their results so you can choose the best one(s), and

  • show how to tune k-NN’s hyperparameter k to get the best performance out of a KNeighborsClassifier (a brief sketch of these last two steps follows this list).
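
To make those last two steps concrete, here is a minimal sketch, assuming scikit-learn’s load_digits, KFold and cross_val_score. The particular estimators compared (SVC and GaussianNB), the random_state value and the range of k values are illustrative choices, not necessarily the ones used later in the chapter.

# Minimal sketch: comparing several classifiers on the Digits dataset with
# k-fold cross-validation, then trying several values of k for k-NN.
from sklearn.datasets import load_digits
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

digits = load_digits()

# one KFold object reused for every estimator, so each is scored on the same folds
kfold = KFold(n_splits=10, random_state=11, shuffle=True)

# compare several estimators' mean cross-validation accuracies
estimators = {
    'KNeighborsClassifier': KNeighborsClassifier(),
    'SVC': SVC(gamma='scale'),
    'GaussianNB': GaussianNB()
}

for name, estimator in estimators.items():
    scores = cross_val_score(estimator=estimator, X=digits.data,
                             y=digits.target, cv=kfold)
    print(f'{name:>20}: mean accuracy={scores.mean():.2%}')

# tune k-NN's hyperparameter k by trying several odd values of n_neighbors
for k in range(1, 20, 2):
    knn = KNeighborsClassifier(n_neighbors=k)
    scores = cross_val_score(estimator=knn, X=digits.data,
                             y=digits.target, cv=kfold)
    print(f'k={k:<2}: mean accuracy={scores.mean():.2%}')

Reusing the same KFold object for every estimator ensures each one is evaluated on identical folds, which keeps the comparison fair.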

15.3.1 Metrics for Model Accuracy

Once you’ve trained and tested a model, you’ll want to measure its accuracy. Here, we’ll look at two ways of doing this—a classification estimator’s score method and a confusion matrix.

Estimator Method score

Each estimator has a score method that returns an indication of how well the estimator performs for the test data you pass as arguments. For classification estimators, this method returns the prediction accuracy for that test data.
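
The following sketch shows a typical call to score, assuming the Digits data has been split with train_test_split (random_state=11 is an arbitrary choice for a reproducible split) and a KNeighborsClassifier has been fit on the training set; score is then passed the test samples and their expected targets.

# Minimal sketch: measuring a KNeighborsClassifier's accuracy with score.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()

# split the Digits data into training and testing sets
# (random_state=11 is an arbitrary choice for reproducibility)
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=11)

# train the k-NN classifier on the training samples and their targets
knn = KNeighborsClassifier()
knn.fit(X=X_train, y=y_train)

# for classifiers, score returns the mean prediction accuracy
# on the test data, as a float between 0.0 and 1.0
accuracy = knn.score(X_test, y_test)
print(f'{accuracy:.2%}')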
