Noah Snavely, associate professor of Computer Science at Cornell Tech and a researcher at Google Research in New York City, will present "Capturing and Rendering the World from Photos" in the College of Information Sciences and Technology Distinguished Lecture Series at Pennsylvania State University. The talk will begin on Zoom at 4 PM on Thursday, March 25.
Snavely, who is a recipient of a PECASE, a Microsoft New Faculty Fellowship, an Alfred P. Sloan Fellowship, a SIGGRAPH Significant New Researcher Award, and a Helmholtz Prize, offers a glimpse of his presentation:
Imagine a futuristic mapping service that could dial up any possible view along any street in the world at any possible time. Effectively, such a service would be a recording of the plenoptic function—the hypothetical function described by Adelson and Bergen that captures all light rays passing through space at all times. While the plenoptic function is completely impractical to capture in its totality, every photo ever taken represents a sample of this function. I will present recent methods we've developed to attempt to reconstruct the plenoptic function from sparse space-time samples of photos—including Google Street View data and tourist photos on the internet. The results of this work include compelling new ways to render novel views of the world in space and time.
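For readers unfamiliar with the term, the plenoptic function referenced in the abstract is, in Adelson and Bergen's formulation, a seven-dimensional function giving the intensity of light at every viewing position, in every direction, at every wavelength and time:

```latex
% The plenoptic function of Adelson and Bergen (1991):
% intensity of the light ray observed at position (x, y, z),
% in direction (\theta, \phi), at wavelength \lambda and time t.
P = P(x, y, z, \theta, \phi, \lambda, t)
```

A single photograph fixes the viewpoint and time and records intensity over a small range of directions and wavelengths, which is the sense in which every photo is a sparse sample of this function.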
In related news: read about Snavely's Helmholtz Prize from ICCV, and research on visual chirality with Cornell CS colleague Abe Davis and other collaborators.