When and why do efficient algorithms exist (for constraint satisfaction and beyond)? (via Zoom)
Abstract: Computational problems exhibit a diverse range of behaviors in terms of how quickly and effectively they can be solved. What underlying mathematical structure (or lack thereof) in a computational problem leads to an efficient algorithm for solving it (or dictates its intractability)? Given the vast landscape of problems and algorithmic approaches, it would be simplistic to hope for a universal theory explaining the underpinnings of easiness/hardness. Yet, in the realm of constraint satisfaction problems, the (recently proved) algebraic dichotomy theorem gives a definitive answer: a polynomial-time algorithm exists when there are non-trivial local operations called polymorphisms under which the solution space is closed; otherwise the problem is NP-complete. Inspired and emboldened by this, one might speculate that a polymorphic principle holds in more general contexts, with symmetries in the solution space being the genesis of efficient algorithms. Beginning with some background on constraint satisfaction problems (CSP) and the polymorphic approach to understanding their complexity, the talk will discuss some extensions beyond CSP where the polymorphic principle seems promising (yet far from understood). In particular, we will discuss "promise CSP," where one is allowed to satisfy a relaxed version of the constraints (a framework that includes important problems like approximate graph coloring and discrepancy minimization). We will also point out some of the many challenges that remain, and touch upon connections to fine-grained complexity and optimization.
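As a concrete (if toy) illustration of the closure property the abstract describes, the sketch below brute-forces the solution set of a small 2-SAT instance and checks that it is closed under the ternary majority operation, a classic non-trivial polymorphism of 2-SAT. The instance and variable names are invented for illustration; this is a minimal sanity check of the definition, not the dichotomy algorithm itself.

```python
from itertools import product

# A toy 2-SAT instance over variables x0, x1, x2 (chosen for illustration).
# Each clause is a pair of literals (var_index, polarity); a literal is
# satisfied when the variable equals its polarity.
clauses = [((0, True), (1, False)),   # x0 or not x1
           ((1, True), (2, True)),    # x1 or x2
           ((0, False), (2, True))]   # not x0 or x2

def satisfies(assignment):
    """True if every clause has at least one satisfied literal."""
    return all(any(assignment[v] == pol for v, pol in clause)
               for clause in clauses)

# Brute-force the full solution space of this tiny instance.
solutions = [a for a in product([False, True], repeat=3) if satisfies(a)]

def majority(a, b, c):
    """Ternary majority operation, applied coordinate-wise."""
    return tuple(x + y + z >= 2 for x, y, z in zip(a, b, c))

# Closure check: the majority of any three solutions is again a solution,
# witnessing a non-trivial polymorphism of 2-SAT -- the kind of structure
# that, per the dichotomy theorem, signals polynomial-time solvability.
closed = all(satisfies(majority(a, b, c))
             for a in solutions for b in solutions for c in solutions)
print(closed)  # True
```

For an NP-complete template such as 3-SAT, the analogous check fails in general: the only polymorphisms are trivial (dictator-like) ones, which is exactly the other side of the dichotomy.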
Bio: Venkatesan Guruswami is a Professor of Computer Science and Mathematics at UC Berkeley and a senior scientist at the Simons Institute for the Theory of Computing. Venkat received his Bachelor's degree from IIT Madras, and his Ph.D. from MIT. Venkat's research interests lie in theoretical computer science and related mathematics and include error-correcting codes, approximate optimization, and computational complexity. He is currently the Editor-in-Chief of the Journal of the ACM. Venkat was a co-winner of the 2012 Presburger Award, and his other honors include a Simons Investigator award, Packard and Sloan Fellowships, the ACM Doctoral Dissertation Award, and an IEEE Information Theory Society Paper Award. He is a fellow of the ACM and the IEEE.