Data Science Decoded
Author: Mike E
We discuss seminal mathematical papers (sometimes really old) that have shaped and established the fields of machine learning and data science as we know them today. The goal of the podcast is to introduce you to the evolution of these fields from a mathematical and slightly philosophical perspective. We will discuss the contribution of these papers, not just from a pure math aspect but also how they influenced the discourse in the field, which areas were opened up as a result, and so on. Our podcast episodes are also available on our YouTube channel: https://youtu.be/wThcXx_vXjQ?sivnMfs
Language: en
Genres: Mathematics, Science
Listen Now...
Data Science #33 - The Backpropagation method, Paul Werbos (1980)
Episode 33
Monday, 3 November, 2025
On the 33rd episode we review Paul Werbos's "Applications of Advances in Nonlinear Sensitivity Analysis," which presents efficient methods for computing derivatives in nonlinear systems, drastically reducing computational costs for large-scale models.

Werbos, Paul J. "Applications of advances in nonlinear sensitivity analysis." System Modeling and Optimization: Proceedings of the 10th IFIP Conference, New York City, USA, August 31–September 4, 1981.

These methods, especially the backward differentiation technique, enable better sensitivity analysis, optimization, and stochastic modeling across economics, engineering, and artificial intelligence. The paper also introduces Generalized Dynamic Heuristic Programming (GDHP) for adaptive decision-making in uncertain environments.

Its importance to modern data science lies in laying the foundation for backpropagation, the core algorithm behind training neural networks. Werbos's work bridged traditional optimization and today's AI, influencing machine learning, reinforcement learning, and data-driven modeling.
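To give a flavor of the idea discussed in the episode: the key insight of backward differentiation is that all partial derivatives of a scalar output can be computed in a single backward sweep over the computation graph, at roughly the cost of one forward evaluation. The toy `Value` class below is a minimal illustrative sketch of modern reverse-mode autodiff, not Werbos's original formulation.

```python
import math

class Value:
    """A scalar that records how it was computed, so gradients can flow back."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # upstream nodes
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def tanh(self):
        t = math.tanh(self.data)
        return Value(t, (self,), (1 - t * t,))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

# One neuron: y = tanh(w*x + b); one backward pass fills dy/dw, dy/dx, dy/db.
x, w, b = Value(0.5), Value(2.0), Value(-1.0)
y = (w * x + b).tanh()
y.backward()
```

Here `w*x + b = 0`, so `y = tanh(0) = 0` and the backward pass yields `w.grad = 0.5`, `x.grad = 2.0`, `b.grad = 1.0`. The same single backward sweep scales to networks with millions of parameters, which is what makes training them tractable.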











