Live Projection Mapping
Student: Paulo Marques
Supervisors: António Ramires (UM) and Nelson Alves (CCG)
Abstract
Traditional projection mapping systems use a static scene that is reconstructed in an offline step. Such setups not only cannot handle runtime interaction with the real scene, but also cannot be reused in other scenarios, since each scene must be recreated in the virtual world, usually hand-measured and hand-modeled.
The main goal of this work is to overcome this problem by exploring a projection mapping system that reconstructs the surface at runtime, adapting the projection as the scene changes. To achieve this, the system must combine two distinct areas that have seen substantial progress in recent years: Surface Reconstruction and Spatial Augmented Reality. Recent advances in real-time surface reconstruction make it possible to build a real-time projection mapping system in which the real-world scene can be interactively modified and reconstructed in the virtual world. To recreate the scene's surface model, a depth sensor is used, providing depth information alongside an RGB image. Once the scene is reconstructed, the projection can serve two purposes: one is simply to manipulate the surface's appearance, and the other is to add virtual objects to the scene. The latter scenario is where Spatial Augmented Reality and its View Dependent Rendering concept are introduced.
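To give a sense of the reconstruction input, the sketch below shows the first step such a pipeline typically performs: back-projecting each depth sample into a camera-space 3D point using the pinhole model. This is a minimal illustration, not the thesis' implementation; the function name and the Kinect-like intrinsic values (fx, fy, cx, cy) are assumed for the example.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to camera-space 3D points.

    Pinhole model: X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth.
    """
    h, w = depth.shape
    # Pixel coordinate grids: u varies along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1)  # shape (h, w, 3)
    return points[depth > 0]               # drop invalid (zero-depth) samples

# Illustrative use: a synthetic 640x480 frame of a flat wall 1.5 m away,
# with intrinsics typical of a Kinect-class sensor (assumed values).
depth = np.full((480, 640), 1.5, dtype=np.float32)
cloud = depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```

In a live system, point clouds like this one are fused frame by frame into a surface model, which then serves as the geometry onto which the projector's output is warped.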
Thesis Download (PDF)