Title "On generalization and uncertainty in learning with neural networks"
Bio: Lenka Zdeborová is a Professor of Physics and of Computer Science at École Polytechnique Fédérale de Lausanne, where she leads the Statistical Physics of Computation Laboratory. She received a PhD in physics from University Paris-Sud and from Charles University in Prague in 2008. She spent two years at Los Alamos National Laboratory as a Director's Postdoctoral Fellow. Between 2010 and 2020 she was a researcher at CNRS, working at the Institute of Theoretical Physics at CEA Saclay, France. In 2014 she was awarded the CNRS bronze medal; in 2016, the Philippe Meyer Prize in theoretical physics and an ERC Starting Grant; in 2018, the Irène Joliot-Curie Prize; and in 2021, the Gibbs Lectureship of the AMS and the Neuron Fund Award. She has served as an editorial board member for Journal of Physics A, Physical Review E, Physical Review X, SIMODS, Machine Learning: Science and Technology, and Information and Inference. Lenka's expertise is in applying concepts from statistical physics, such as advanced mean-field methods, the replica method, and related message-passing algorithms, to problems in machine learning, signal processing, inference, and optimization. She enjoys erasing the boundaries between theoretical physics, mathematics, and computer science.
Abstract: Statistical physics has studied exactly solvable models of neural networks for more than four decades. In this talk, we revisit this line of work from the perspective of recent questions stemming from deep learning. We describe several types of phase transitions that appear in the high-dimensional limit as a function of the amount of data. Discontinuous phase transitions are accompanied by adjacent regions of algorithmic hardness; this so-called hard phase influences the behavior of gradient-descent-like algorithms. We show a case where the hardness is mitigated by overparametrization, suggesting that the benefits of overparametrization may be linked to the use of a specific type of algorithm. We then discuss the overconfidence of overparametrized neural networks and evaluate methods to mitigate it and calibrate the uncertainty.
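The abstract's final point concerns calibrating the uncertainty of overconfident, overparametrized networks. As a small, hedged illustration of what post-hoc calibration looks like in practice (this shows temperature scaling, a standard technique, and is not drawn from the talk itself), the sketch below rescales a model's logits by a scalar temperature fitted on held-out data; all data and numbers are synthetic and purely illustrative.

```python
# Minimal sketch of temperature scaling: divide a trained model's logits by a
# scalar T > 0 chosen to minimize negative log-likelihood (NLL) on held-out
# data. Synthetic, illustrative data only; not the speaker's method.
import numpy as np

def softmax(logits, T=1.0):
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    probs = softmax(logits, T)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    # Simple grid search over T; gradient-based optimization works as well.
    return min(grid, key=lambda T: nll(logits, labels, T))

# Synthetic "overconfident" validation set: one class gets a large logit boost,
# but only ~70% of the time on the true class, so confidence exceeds accuracy.
rng = np.random.default_rng(0)
n, k = 1000, 10
labels = rng.integers(0, k, size=n)
logits = rng.normal(size=(n, k))
boosted = np.where(rng.random(n) < 0.7, labels, rng.integers(0, k, size=n))
logits[np.arange(n), boosted] += 4.0

T = fit_temperature(logits, labels)
print(f"fitted temperature T = {T:.2f}")
print(f"NLL at T=1: {nll(logits, labels, 1.0):.3f}  NLL at fitted T: {nll(logits, labels, T):.3f}")
```

Fitting T > 1 softens the predicted probabilities so that reported confidence tracks held-out accuracy more closely, which is one simple baseline among the calibration methods the abstract alludes to.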