We present ReCapture, a system that leverages AR-based guidance to help users capture time-lapse data with hand-held mobile devices. ReCapture works by repeatedly guiding users back to the precise location of previously captured images so they can record time-lapse videos one frame at a time without leaving their camera in the scene. Building on previous work in computational re-photography, we combine three different guidance modes to enable parallel hand-held time-lapse capture in general settings. We demonstrate this versatility on a wide variety of subjects and scenes captured over a year of development and regular use, and explore different visualizations of unstructured hand-held time-lapse data.
ReCapture Time-lapse is an iOS implementation of our system, built on Apple's ARKit and Core Image frameworks for 3D tracking and offline image processing. It supports all three capture modes described in the paper and provides real-time time-lapse visualization and sharing features.
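As a rough illustration of the core guidance idea, consider how an ARKit-based app might decide whether the user has returned close enough to a previously captured viewpoint. The sketch below is hypothetical and not taken from the ReCapture codebase; the function name and the 3 cm threshold are illustrative assumptions. In ARKit, each frame's camera pose is available as a `simd_float4x4` (`ARFrame.camera.transform`), with the translation in the matrix's last column.

```swift
import simd

// Hypothetical sketch: measure how far the current camera pose is from a
// saved one, so on-screen guidance can steer the user back to re-shoot.
// (Names and thresholds are illustrative, not from the ReCapture code.)
func translationDistance(_ saved: simd_float4x4, _ current: simd_float4x4) -> Float {
    // Translation is the last column of a 4x4 rigid-body transform.
    let a = saved.columns.3
    let b = current.columns.3
    return simd_distance(SIMD3(a.x, a.y, a.z), SIMD3(b.x, b.y, b.z))
}

// Example with two hand-made poses 5 cm apart along x:
var saved = matrix_identity_float4x4
var current = matrix_identity_float4x4
current.columns.3.x = 0.05
// Assumed threshold of 3 cm: the user still needs to move closer.
let aligned = translationDistance(saved, current) < 0.03
```

A real guidance loop would also compare orientation (e.g. the angle between the two rotations), not just translation, before declaring the viewpoint matched.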
@InProceedings{recapture2022,
author = {Ruyu Yan and Jiatian Sun and Longxiulin Deng and Abe Davis},
title = {ReCapture: AR-Guided Time-lapse Photography},
booktitle = {ACM Symposium on User Interface Software and Technology (UIST)},
url = {https://doi.org/10.1145/3526113.3545641},
doi = {10.1145/3526113.3545641},
month = {Nov},
year = {2022}
}