ReCapture: AR-Guided Time-lapse Photography

UIST 2022


Ruyu Yan      Jiatian Sun      Longxiulin Deng      Abe Davis

Paper | Results

We present ReCapture, a system that leverages AR-based guidance to help users capture time-lapse data with hand-held mobile devices. ReCapture works by repeatedly guiding users back to the precise location of previously captured images so they can record time-lapse videos one frame at a time without leaving their camera in the scene. Building on previous work in computational re-photography, we combine three different guidance modes to enable parallel hand-held time-lapse capture in general settings. We demonstrate this versatility on a wide variety of subjects and scenes captured over a year of development and regular use, and explore different visualizations of unstructured hand-held time-lapse data.
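The core guidance step can be pictured as comparing the live camera pose against the pose stored with a previously captured frame and telling the user how to move back. The Swift sketch below illustrates that idea under the assumption of ARKit-style 4x4 transforms; `GuidanceHint` and `guidance(current:target:tolerance:)` are illustrative names, not the paper's actual implementation.

```swift
import simd

// Hypothetical container for the guidance feedback shown to the user.
struct GuidanceHint {
    let offset: SIMD3<Float>   // how far the device is from the target position, in meters
    let distance: Float        // magnitude of that offset
    let isAligned: Bool        // true once the device is back within tolerance
}

// Compare the live camera pose to the pose saved when the previous
// time-lapse frame was captured, and report how to move back.
// This is a minimal sketch, not ReCapture's actual guidance logic.
func guidance(current: simd_float4x4,
              target: simd_float4x4,
              tolerance: Float = 0.02) -> GuidanceHint {
    let currentPos = SIMD3<Float>(current.columns.3.x, current.columns.3.y, current.columns.3.z)
    let targetPos  = SIMD3<Float>(target.columns.3.x, target.columns.3.y, target.columns.3.z)
    let offset = targetPos - currentPos
    let distance = simd_length(offset)
    return GuidanceHint(offset: offset, distance: distance, isAligned: distance < tolerance)
}
```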


Video


The ReCapture App

View Tutorial

ReCapture Time-lapse

Create a time-lapse with your phone

View on App Store

ReCapture Time-lapse is an iOS implementation of our system, which uses Apple's ARKit and CoreImage for 3D tracking and offline image processing. It supports all three capture modes described in the paper and provides real-time time-lapse visualization and sharing features.
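As a rough picture of the ARKit side of such an app, the sketch below runs a world-tracking session and records the camera pose of a captured frame so later sessions can guide the device back to it. It is a minimal sketch under assumed names (`CapturePoseTracker`, `recordCapturePose`), not the app's actual implementation.

```swift
import ARKit
import simd

// Hypothetical wrapper around ARKit world tracking for re-photography guidance.
final class CapturePoseTracker: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var savedCapturePose: simd_float4x4?

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.worldAlignment = .gravity
        session.delegate = self
        session.run(config)
    }

    // Remember the camera pose of the frame the user just captured,
    // so the app can guide the device back to the same viewpoint later.
    func recordCapturePose() {
        guard let frame = session.currentFrame else { return }
        savedCapturePose = frame.camera.transform
        session.add(anchor: ARAnchor(name: "capturePose", transform: frame.camera.transform))
    }

    // Called every frame; the live pose can be compared against the saved one
    // (e.g. a position/rotation difference) to drive the on-screen AR guidance.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        _ = frame.camera.transform
    }
}
```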


Sample Results

More Results

Overlay | 3D Tracking | Light Field


BibTeX

@InProceedings{recapture2022,
    author = {Ruyu Yan and Jiatian Sun and Longxiulin Deng and Abe Davis},
    title = {ReCapture: AR-Guided Time-lapse Photography},
    booktitle = {ACM Symposium on User Interface Software and Technology (UIST)},
    url = {https://doi.org/10.1145/3526113.3545641},
    doi = {10.1145/3526113.3545641},
    month = {Nov},
    year = {2022}
}