Data Science Decoded

Data Science #33 - The Backpropagation method, Paul Werbos (1980)



On the 33rd episode we review Paul Werbos's "Applications of Advances in Nonlinear Sensitivity Analysis," which presents efficient methods for computing derivatives in nonlinear systems, drastically reducing the computational cost of sensitivity analysis for large-scale models.


Werbos, Paul J. "Applications of Advances in Nonlinear Sensitivity Analysis." In System Modeling and Optimization: Proceedings of the 10th IFIP Conference, New York City, USA, August 31–September 4, 1981.


These methods, especially the backward differentiation technique, enable better sensitivity analysis, optimization, and stochastic modeling across economics, engineering, and artificial intelligence.
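The core idea behind the backward differentiation technique can be sketched in a few lines. This is a minimal illustration of reverse-mode differentiation (not code from Werbos's paper): the forward pass stores intermediate values, and a single backward pass reuses them via the chain rule, so all derivatives of the output are obtained for roughly the cost of one extra forward pass. The function and variable names here are illustrative.

```python
import math

def forward(w, b, x):
    """Forward pass for y = sin(w*x + b), storing the intermediate z."""
    z = w * x + b
    y = math.sin(z)
    return y, z

def backward(x, z):
    """Backward pass: one sweep yields dy/dw and dy/db via the chain rule."""
    dy_dz = math.cos(z)      # local derivative of sin at the stored z
    dy_dw = dy_dz * x        # chain rule: dz/dw = x
    dy_db = dy_dz * 1.0      # chain rule: dz/db = 1
    return dy_dw, dy_db

w, b, x = 0.5, 0.1, 2.0
y, z = forward(w, b, x)
dw, db = backward(x, z)

# Sanity check against finite differences (illustrative step size)
eps = 1e-6
num_dw = (forward(w + eps, b, x)[0] - y) / eps
print(abs(dw - num_dw) < 1e-4)
```

Forward-mode (or one-sided numerical) differentiation would need one extra pass per parameter; the backward sweep scales to millions of parameters, which is why this ordering of the chain rule became the backbone of neural-network training.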


The paper also introduces Generalized Dynamic Heuristic Programming (GDHP) for adaptive decision-making in uncertain environments. Its importance to modern data science lies in laying the foundation for backpropagation, the core algorithm behind training neural networks.


Werbos’s work bridged traditional optimization and today’s AI, influencing machine learning, reinforcement learning, and data-driven modeling.


Data Science Decoded, by Mike E


