Large-scale optimization and machine learning shape modern data science, and Courtney Paquette, Ph.D., McGill University, studies how to design and analyze algorithms for large-scale optimization problems motivated by applications in data science. Paquette draws on probability, complexity theory, and convex and non-smooth optimization, and she examines the scaling limits of stochastic algorithms. Speaking with Saura Naderi, UC San Diego, Paquette describes an unconventional path from finance to pure mathematics and explains how persistence and comfort with uncertainty sustain long-term research. She highlights the challenge of building missing mathematical foundations while advancing through graduate training, and she connects that experience to the realities of doing original work. Paquette also reflects on the rapid progress in machine learning and frames AI systems as tools to be used thoughtfully. Series: "Science Like Me" [Science] [Show ID: 41119]