The following exchange is part of an ongoing CS News series in which Cornell Computer Science faculty share some details about their research initiatives and teaching practice. For this session, we speak with assistant professor Anil Damle.
ANIL DAMLE: At Work on Perturbation Theory, Algorithms, and Teaching Computational Mathematics for Computer Science
Anil, can you please describe some of your ongoing research projects and also share a few representative publications on these topics?
Some of the questions I am most interested and invested in right now involve understanding how properties of mathematical models change when the underlying input is perturbed. Mathematical models help drive algorithm development, theoretical explorations, and scientific discovery. Nevertheless, they are just models, not reality. Therefore, it is important to understand how sensitive the models and resulting algorithms are to changes in input. Practically, these changes can represent noise in measurements, attempts to fit a misspecified model, evolving problem geometry, and more. Concretely, some of my recent work explores how invariant subspaces of matrices change when the matrix is perturbed [Damle and Sun 2019] and develops methods for rapidly solving PDEs in complex, evolving domains [Ryan and Damle 2020].
To dive a bit more into the first example above (namely, on invariant subspaces), while there is a long line of work in so-called invariant subspace perturbation theory, modern applications necessitate new theoretical developments. In our research, we adapt how changes in invariant subspaces are measured. For example, in the simplest setting our work bounds how much any individual entry of an eigenvector can change when the matrix is perturbed. This metric is well suited to many modern applications, such as ranking the centrality of nodes in a graph. In this setting, our findings could be used to help understand how these rankings change as a graph is perturbed.
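The centrality example above can be made concrete with a small numerical sketch. The snippet below is purely illustrative (the graph, the edge that is added, and the helper function are invented for demonstration and are not drawn from the research discussed): it computes the leading eigenvector of a small graph's adjacency matrix, whose entries serve as eigenvector centrality scores, perturbs the graph by adding one edge, and reports how much each individual entry moves.

```python
import numpy as np

def leading_eigenvector(A):
    """Return the unit-norm leading eigenvector of a symmetric matrix,
    signed so that its largest-magnitude entry is positive."""
    vals, vecs = np.linalg.eigh(A)
    v = vecs[:, -1]  # eigenvector for the largest eigenvalue
    return v * np.sign(v[np.argmax(np.abs(v))])

# Adjacency matrix of a small, connected, undirected graph (5 nodes).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

# Perturbation: add a single edge between nodes 0 and 4.
E = np.zeros_like(A)
E[0, 4] = E[4, 0] = 1.0

v = leading_eigenvector(A)        # centrality scores, original graph
v_pert = leading_eigenvector(A + E)  # centrality scores, perturbed graph

# Entry-wise changes: individual nodes' scores can move by very
# different amounts, which a single 2-norm bound on the whole
# eigenvector cannot distinguish.
entrywise_change = np.abs(v_pert - v)
print(entrywise_change)
```

The point of looking at `entrywise_change` rather than `np.linalg.norm(v_pert - v)` mirrors the distinction made above: a norm bound says only how much the vector as a whole can move, while an entry-wise bound can certify that a specific node's centrality score, and hence its rank, is stable under the perturbation.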
More generally, I am interested in a broad range of questions spanning a diverse set of application areas, including computational quantum chemistry, spectral methods, numerical methods for solving PDEs, kernel methods, and more. I have always had an inclination to think about wide-ranging problems that I find interesting, a curiosity that drew me to applied and computational mathematics; often there are amazing mathematical connections at the heart of these problems as well, which is yet another draw. My technical perspective, rooted in (numerical) linear algebra, provides an exciting opportunity to explore connections between seemingly disparate application areas, transfer algorithmic and theoretical insights, and make scientific contributions.
Can you tell us what you are most excited about in this research? And why?
Across all of my work, I am invested in developing algorithms that perform provably well on simple mathematical or statistical models and also generalize nicely to problem instances that deviate from these models. For example, the theoretical work on eigenvectors furthers our understanding of how a broad class of spectral algorithms perform on both model and real-world problems. Furthermore, the work on invariant subspaces complements other aspects of my research developing algorithms for building local basis functions in computational quantum chemistry [Damle and Lin 2018] and graph clustering [Damle, Minden, and Ying 2019]. Independent of these reasons, I just find understanding invariant subspaces a fascinating theoretical question—this is truly a problem I enjoy working on.
Please say something about what topics motivate your current teaching practice. For instance, how have your interests shaped your development of course content?
One of the reasons that I sought a career in academia is that I like the blend of research and teaching that it enables; I deeply value education, and teaching is an important part of my job. Motivated by my own love for the subject, I particularly enjoy teaching classes in numerical analysis and numerical linear algebra here at Cornell. Moreover, many rapidly growing areas of computer science such as machine learning, robotics, and data science greatly benefit from core technical skills in topics such as linear algebra, probability, statistics, and optimization—topics at the core of my work. This collection of research and teaching interests has led me to develop a new course here at Cornell aimed at students transitioning from their introductory mathematics and computer science courses to advanced courses. The class, Computational Mathematics for Computer Science, CS 3220, introduces students to a computational way of thinking about problems and equips them with the core skills necessary to succeed on their educational journeys and beyond.
Thank you, Anil, for sharing this wonderful glimpse of your work in Computer Science at Cornell.