Streaming Video Textures for Mixed Reality Applications in Interactive Ray Tracing Environments
Andreas Pomi, Gerd Marmitt, Ingo Wald and Philipp Slusallek
In Ertl, Girod, Greiner, Niemann, Seidel, Steinbach, Westermann (Eds.): Vision, Modeling and Visualization (VMV) 2003 Proceedings, Munich, Germany, November 19-21, 2003, pages 261-269
Mixed reality applications blend rendered images with images of the real world. Seamless blending between the two worlds requires highly realistic rendered images. However, current rasterization technology severely limits the achievable realism, imposing strict limits on scene complexity and on the optical effects that can be simulated efficiently. Real-time ray tracing overcomes many of these constraints and enables completely new approaches to mixed reality applications.
This paper explores this design space based on a framework for live streaming of video textures in a real-time ray tracing engine. We also suggest a novel approach to video-based AR by integrating image compositing with shading computations. We demonstrate the approach with a number of VR/AR applications including video inserts, video billboards, and dynamic lighting from video and HDR video streams. Being seamlessly integrated into the ray tracing framework, all our applications feature ray traced effects, like shadows, reflections and refraction.