Augmented Studio - Controlled Light Modulation in Television Studios -
Final Report Abstract
This project investigated the general potentials and limitations of applying spatially, temporally, chrominance-, and luminance-modulated illumination to digital video composition effects for enhanced live broadcasts and recordings. While digital video composition methods such as chroma keying in virtual studios are standard in everyday television and film production, little research has addressed the possibilities of high-frequency temporal and spatial lighting modulation for generating advanced effects. In this project, synchronized illumination-camera systems were used to support the robust generation of alpha mattes (i.e., separating foreground from background pixels), to simulate refraction effects, to support actors and moderators by spatially displaying visual hints within the studio environment, and to display camera-related information in such a way that it is not perceived by human observers. The key technical concept applies synchronized illumination units (projectors and/or high-speed LEDs) and cameras to display information that is either visible (e.g., illumination and visual effects) or invisible (e.g., coded patterns for acquiring scene depth and camera pose, or dynamic direction information visible only to the actor or moderator) in broadcast or recorded video streams.

Several engineering and computer-science problems were solved throughout the project. Software and hardware frameworks were developed for synchronizing display and capturing devices. New techniques were developed for automatic projector calibration and for displaying image content in complex, dynamic environments that are not optimized for projection. Methods for offline and online scene acquisition and camera tracking were evaluated, improved, and adapted to the proposed key concept. New projector-based and video-based digital video composition techniques were developed and implemented in proof-of-concept scenarios. New temporal and spatial intensity modulation methods were developed that enable the projection of coded information in such a way that its content is not perceived by human observers but can be reconstructed by a synchronized camera (a minimal sketch of this principle follows after this abstract). These imperceptible codes were dynamically adapted to the displayed image content, the applied hardware configuration, and the properties of the human visual system.

The basic technology investigated within the scope of this project offers a series of future application scenarios, such as:
• New professional chroma keying methods that overcome the constraints of current methods in terms of color dependencies and spill compensation.
• The integration of imperceptible codes into a complete virtual studio system to enable real-time scanning and detection of static as well as dynamic geometry at run-time.
• The application of a synchronized illumination-camera system at arbitrary film sets to support the creation of various digital video composition effects outside of film studios.

The techniques implemented as proof-of-concept prototypes in this project could be extended towards professional studio solutions and combined into a spatially and temporally controllable studio illumination system.
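The complementary-frame idea behind the imperceptible codes can be illustrated with a minimal numerical sketch. The Python/NumPy example below is hypothetical: the function names (embed_code, reconstruct_code), the fixed global amplitude delta, and the noise-free reconstruction are illustrative assumptions only, and it omits the project's dynamic, per-region adaptation of the code to image content, hardware, and the human visual system.

```python
import numpy as np

def embed_code(image, code, delta=0.02):
    """Split one display frame into two complementary projector frames.

    image : float array in [0, 1], the visible studio content
    code  : binary array (0/1) of the same shape, the hidden pattern
    delta : small modulation amplitude so the flicker between the two
            frames stays below the perception threshold at high frame rates
    """
    signed = np.where(code > 0, delta, -delta)
    frame_a = np.clip(image + signed, 0.0, 1.0)   # image plus code
    frame_b = np.clip(image - signed, 0.0, 1.0)   # image minus code (complement)
    return frame_a, frame_b

def reconstruct_code(capture_a, capture_b, threshold=0.0):
    """Recover the hidden pattern from two synchronized camera captures.

    A human observer temporally integrates the two frames to roughly
    (frame_a + frame_b) / 2 = image, so the code remains invisible,
    while a camera synchronized to the projector captures each frame
    separately and recovers the code from their difference.
    """
    difference = capture_a - capture_b            # approximately 2 * signed code
    return (difference > threshold).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((480, 640))                       # arbitrary image content
    code = (rng.random((480, 640)) > 0.5).astype(np.uint8)

    frame_a, frame_b = embed_code(image, code)
    recovered = reconstruct_code(frame_a, frame_b)       # ideal, noise-free case
    print("bit error rate:", np.mean(recovered != code))
```

In the actual system, camera exposure is hardware-synchronized to the projector refresh and the code amplitude varies spatially and temporally; the sketch only shows why the complementary frames cancel for the human eye but remain separable for a synchronized camera.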
Publications
- "Digital Illumination for Augmented Studios", Journal of Virtual Reality and Broadcasting, vol. 3, no. 8, 2006
Bimber, O., Grundhöfer, A., Zollmann, S., and Kolster, D.
- "Coded Projection and Illumination for Television Studios", Bauhaus-University Weimar, Technical Report #843, 2007
Grundhöfer, A., Seeger, M., Häntsch, F., and Bimber, O.
- "Dynamic Adaptation of Projected Imperceptible Codes" Bauhaus-University Weimar, Technical Report #873, 2007
Grundhöfer, A., Seeger, M., Häntsch, F., and Bimber, O.
- "Dynamic Adaptation of Projected Imperceptible Codes", IEEE International Symposium on Mixed and Augmented Reality (ISMAR'07), 2007
Grundhöfer, A., Seeger, M., Häntsch, F., and Bimber, O.
- "Verfahren zur Erzeugung erweiterter Realität in einem Raum", German patent DE 10 2007 041 719.7
Bimber, O. Grundhöfer, A., Zollmann, S., and Kolster, D.
- "Dynamic Bluescreens", ACM Siggraph Research Posters and Talks, 2008
Grundhöfer, A and Bimber, O.
- "Dynamic Bluescreens", Bauhaus-University Weimar, Technical Report #1301, 2008
Grundhöfer, A. and Bimber, O.
- "VirtualStudio2Go: Digital Videocomposition for Real Environments", ACM Siggraph Asia, ACM Transactions on Graphics (TOG), December 2008
Grundhöfer, A. and Bimber, O.
- "Projector-Camera Systems in Entertainment and Art" (book chapter), Handbook of Multimedia for Digital Entertainment and Arts, Borko Furht (ed.), Springer, 2009
Bimber, O. and Yang, X.
- "Color Invariant Chroma Keying and Color Spill Neutralization for Dynamic Scenes and Cameras", Computer Graphics International (CGI), Springer The Visual Computer, 2
Grundhöfer, A., Kurz, D., Thiele, S., and Bimber, O.