User-in-the-Loop View Sampling with Error Peaking Visualization 🧐
IEEE International Conference on Image Processing (ICIP) 2025 (To Appear)

Abstract Augmented reality (AR) provides ways to visualize missing view samples for novel view synthesis. Existing approaches present 3D annotations for new view samples and task users with taking images by aligning the AR display. This data collection task is known to be mentally demanding and restricts capture to small pre-defined areas due to the ideal but restrictive underlying sampling theory. To free users from 3D annotations and limited scene exploration, we propose using locally reconstructed light fields and visualizing the errors to be removed by inserting new views. Our results show that the error-peaking visualization is less invasive, reduces disappointment in the final results, and achieves satisfactory quality with fewer view samples in our mobile view synthesis system. We also show that our approach can contribute to recent radiance field reconstruction methods for larger scenes, such as 3D Gaussian splatting.
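As a rough illustration of the idea, the sketch below overlays a highlight color on pixels where a view rendered from the locally reconstructed light field disagrees with the live camera frame, in the spirit of focus peaking in camera viewfinders. The function name, the error metric (mean absolute color difference), and the threshold are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def error_peaking_overlay(rendered, live, threshold=0.1, highlight=(255, 0, 0)):
    """Highlight pixels whose reconstruction error exceeds a threshold.

    Hypothetical sketch; the paper's actual error measure may differ.
    rendered: (H, W, 3) uint8 view synthesized from the local light field
    live:     (H, W, 3) uint8 current camera frame at the same pose
    """
    # Per-pixel error in [0, 1]: mean absolute difference across RGB channels.
    err = np.abs(rendered.astype(np.float32) - live.astype(np.float32)).mean(axis=-1) / 255.0
    overlay = live.copy()
    overlay[err > threshold] = highlight  # mark regions that need new view samples
    return overlay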
@inproceedings{yasunaga_icip25,
  author    = {Yasunaga, Ayaka and Saito, Hideo and Mori, Shohei},
  booktitle = {IEEE International Conference on Image Processing (ICIP)},
  title     = {User-in-the-Loop View Sampling with Error Peaking Visualization},
  year      = {2025}
}
Acknowledgement This work was supported by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy – EXC 2120/1 – 390831618 and partly by a grant from JST Support for Pioneering Research Initiated by the Next Generation (#JPMJSP2123).