Data Science Decoded

Data Science #18 - The k-nearest neighbors algorithm (1951)



In the 18th episode we go over the original k-nearest neighbors algorithm:

Fix, Evelyn; Hodges, Joseph L. (1951). Discriminatory Analysis. Nonparametric Discrimination: Consistency Properties. USAF School of Aviation Medicine, Randolph Field, Texas.
It introduces a nonparametric method for classifying a new observation z as belonging to one of two distributions, F or G, without assuming specific parametric forms.
Using k-nearest-neighbor density estimates, the paper implements a likelihood-ratio test for classification and rigorously proves the method's consistency.
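The rule described above can be sketched in a few lines: estimate each density at z from the distance to the k-th nearest sample point, then compare their ratio to a threshold. This is a minimal 1-D illustration, not the paper's exact construction; the function names, the Gaussian samples, and the threshold c = 1 (equal priors) are all assumptions for the example.

```python
import numpy as np

def knn_density(z, sample, k):
    # 1-D k-NN density estimate: k points fall in the smallest
    # interval around z of half-width r, so density ~ k / (n * 2r)
    dists = np.sort(np.abs(sample - z))
    r = dists[k - 1]  # distance to the k-th nearest neighbor
    return k / (len(sample) * 2 * r)

def classify(z, sample_f, sample_g, k=5, c=1.0):
    # Likelihood-ratio rule: assign z to F when the estimated
    # density ratio exceeds the threshold c (c = 1 for equal priors)
    ratio = knn_density(z, sample_f, k) / knn_density(z, sample_g, k)
    return "F" if ratio > c else "G"

rng = np.random.default_rng(0)
f = rng.normal(0.0, 1.0, 200)  # sample from F (illustrative)
g = rng.normal(3.0, 1.0, 200)  # sample from G (illustrative)
print(classify(0.2, f, g))     # point near F's mean
print(classify(2.9, f, g))     # point near G's mean
```

A point near F's mean gets labeled "F" and one near G's mean gets labeled "G"; the consistency result in the paper says such decisions converge to the optimal ones as the sample sizes grow.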


The work is a precursor to the modern k-Nearest Neighbors (KNN) algorithm and established nonparametric approaches as viable alternatives to parametric methods.
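The modern descendant of that rule skips the explicit density estimates and simply takes a majority vote among the labels of the k nearest training points. A minimal 1-D sketch (the toy data and names are illustrative, not from the episode):

```python
import numpy as np

def knn_predict(z, X, y, k=3):
    # Classic k-NN classification: find the k training points
    # nearest to z and return the majority label (odd k avoids ties)
    nearest = np.argsort(np.abs(X - z))[:k]
    return int(round(y[nearest].mean()))

X = np.array([0.0, 0.5, 1.0, 4.0, 4.5, 5.0])  # toy 1-D features
y = np.array([0, 0, 0, 1, 1, 1])              # binary class labels
print(knn_predict(0.3, X, y))  # -> 0
print(knn_predict(4.7, X, y))  # -> 1
```

With equal class sizes, voting among the k nearest points is equivalent to comparing the two k-NN density estimates, which is exactly the connection back to Fix and Hodges.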

Its focus on consistency and data-driven learning influenced many modern machine learning techniques, including kernel density estimation and decision trees.


This paper's impact on data science is significant, introducing concepts like neighborhood-based learning and flexible discrimination.


These ideas underpin algorithms widely used today in healthcare, finance, and artificial intelligence, where robust and interpretable models are critical.


Data Science Decoded, by Mike E

Rated 3.8 (5 ratings)

