Point location with near-optimal bounds (via Zoom)
Abstract: The point location problem is a classical problem in discrete and computational geometry. Given a known partition of R^d into cells by n hyperplanes, and an unknown point, find as efficiently as possible the cell in which the point lies. Access to the unknown point is indirect, via linear queries with respect to the hyperplanes.
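To make the query model concrete, here is a minimal sketch (not the algorithm from the talk): each linear query reveals which side of one hyperplane the unknown point lies on, and the cell is encoded by the resulting sign vector. The names `locate_cell` and `query_sign` are illustrative, and this naive approach spends one query per hyperplane, i.e. n queries, whereas the result discussed below gets close to the d log n information-theoretic bound.

```python
import numpy as np

def locate_cell(query_sign, hyperplanes):
    # Naive point location: one sign query per hyperplane.
    # The cell of the unknown point is exactly its sign vector.
    return tuple(query_sign(a, b) for (a, b) in hyperplanes)

# Toy instance: the unknown point is hidden behind the query oracle.
hidden_x = np.array([0.5, -1.0])

def query_sign(a, b):
    # Linear query: which side of the hyperplane <a, x> = b is x on?
    return int(np.sign(a @ hidden_x - b))

hyperplanes = [(np.array([1.0, 0.0]), 0.0),
               (np.array([0.0, 1.0]), 0.0),
               (np.array([1.0, 1.0]), 1.0)]
print(locate_cell(query_sign, hyperplanes))  # (1, -1, -1)
```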
This problem is equivalent to another well-studied problem, this time in machine learning: active learning of linear classifiers. Here, there are n known points in R^d, which are labeled by an unknown hyperplane. The goal is to find the labeling of all n points as efficiently as possible. Here too, access to the unknown hyperplane is indirect, in the form of label queries.
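The gain from active (rather than exhaustive) label queries is easiest to see in one dimension, where a linear classifier is a threshold. The sketch below, with illustrative names `label_all` and `query_label`, labels n sorted points with O(log n) label queries via binary search instead of querying all n; it is a warm-up for the query model only, not the high-dimensional algorithm from the talk.

```python
def label_all(points, query_label):
    # Active learning of a 1-D threshold classifier: points are sorted,
    # labels are -1 below the hidden threshold and +1 at or above it.
    # Binary search for the boundary index uses O(log n) label queries.
    lo, hi = 0, len(points)  # invariant: boundary index lies in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if query_label(points[mid]) == 1:
            hi = mid
        else:
            lo = mid + 1
    return [-1] * lo + [1] * (len(points) - lo)

# Hidden threshold playing the role of the unknown hyperplane.
threshold = 3.7

def query_label(x):
    return 1 if x >= threshold else -1

pts = [0, 1, 2, 3, 4, 5, 6, 7]
print(label_all(pts, query_label))  # [-1, -1, -1, -1, 1, 1, 1, 1]
```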
For both problems, there is a simple information-theoretic lower bound of Omega(d log n) on the number of queries needed. Despite 40 years of work, the best known upper bounds had a polynomial dependence on the dimension d. In this work, we achieve for the first time a near-linear dependence on the dimension. The proof combines tools from geometry, analysis and machine learning.
Joint work with Max Hopkins, Daniel Kane and Gaurav Mahajan.
The paper will appear in FOCS 2020.
arXiv: https://arxiv.org/abs/2004.11380
Bio: Shachar Lovett has a broad interest in theoretical computer science and mathematics. In particular, he is interested in computational complexity, randomness and pseudorandomness, algebraic constructions, coding theory, discrete mathematics, and additive combinatorics. Lovett is currently an assistant professor at UC San Diego. Before that, he was a postdoc at the Institute for Advanced Study with Avi Wigderson, and a PhD student at the Weizmann Institute, where he was advised by Omer Reingold and Ran Raz.