INSTITUTE OF PURE AND APPLIED MATHEMATICS

Production framework for full panoramic scenes with photo-realistic augmented reality

Dalai Felinto ⋆, Aldo Zang † and Luiz Velho †
⋆ Fisheries Centre, UBC - Vancouver, Canada
† Visgraf Laboratory, IMPA - Rio de Janeiro, Brazil

October 2012
Outline

Part 1: Introduction (technique pipeline, environment capture)
Part 2: Rendering process (scene primitives, shadow and reflection computation)
Part 3: Results and conclusions
Introduction

From a full panoramic HDR image we reconstruct the real scene, then insert and illuminate synthetic elements in a photorealistic way, with accurate shadows and reflections.

Real Panorama → Processing → Augmented Panorama
Technique pipeline

1. Environment capture and calibration
2. Scene modeling
3. Scene depth image computation
4. Illumination setup
5. Synthetic elements
6. Integration and rendering
Environment Capture

Capture equipment: Manfrotto Pano Head, Nikon DX2S camera and Nikon 10.5mm fisheye lens.
Space partition for the Nikon 10.5mm: 5 photos around the z-axis, 1 zenith photo and 1 nadir photo.
Panorama Stitching

Stitching of multiple images → stitched and blended image.
Panorama Calibration

Advantages of a calibration system:
1. Allows moving important sampling regions away from the poles.
2. Less concern about tripod alignment during capture.
3. Works with panoramas obtained from the internet.
4. Optimally aligned axes for the world reconstruction.
Panorama Calibration

Calibration system: the four points on the floor determine the orientation axes for the environment. The blue, red and green lines determine the xy (horizon), xz and yz slice planes.
Panorama Calibration

Calibration computation. Orientation:
$x = (\vec{p}_0 \times \vec{p}_1) \times (\vec{p}_3 \times \vec{p}_2)$
$y = (\vec{p}_1 \times \vec{p}_2) \times (\vec{p}_0 \times \vec{p}_3)$
$z = x \times y$, then $y = z \times x$ (re-orthogonalization)

We need, however, to gather more data, as the orientation alone is not sufficient: any arbitrary positive non-zero value for the camera height produces a different (scaled) reconstruction of the original geometry in the 3D world. We have a dual system between the rectangle dimensions and the camera height; the user can decide which is the most accurate data available.
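A rough illustration of this orientation step (not the authors' code): a NumPy sketch that assumes p0..p3 are unit view directions from the camera toward the four marked floor corners; the function and variable names are ours.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def orientation_from_floor_points(p0, p1, p2, p3):
    """Recover the world axes from four floor-rectangle view directions.

    Opposite edges of the floor rectangle define two planes through the
    camera; the intersection of each pair of planes gives a world axis.
    """
    x = normalize(np.cross(np.cross(p0, p1), np.cross(p3, p2)))
    y = normalize(np.cross(np.cross(p1, p2), np.cross(p0, p3)))
    z = normalize(np.cross(x, y))   # up axis from the two floor axes
    y = np.cross(z, x)              # re-orthogonalize the basis
    return np.stack([x, y, z])      # rows: world x, y, z in camera coordinates

# Scale remains free: either the camera height or the rectangle dimensions
# must be supplied by the user to fix the scale of the reconstruction.
```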
Illumination Setup: HDR environment map

Full panoramic image in HDR format, shown at exposures EV = 0, -2, -4 and -6.
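The environment map is a full panorama; the sketch below assumes an equirectangular (lat-long) layout and shows the direction-to-pixel mapping used to look up radiance. The axis conventions are our own and may differ from the paper's.

```python
import numpy as np

def dir_to_pixel(d, width, height):
    """Map a unit direction d = (x, y, z) to equirectangular pixel coords."""
    x, y, z = d
    phi = np.arctan2(y, x)                      # azimuth in [-pi, pi]
    theta = np.arccos(np.clip(z, -1.0, 1.0))    # polar angle from the zenith
    u = (phi / (2.0 * np.pi) + 0.5) * width
    v = (theta / np.pi) * height
    return u, v

def pixel_to_dir(u, v, width, height):
    """Inverse mapping: pixel (u, v) to a unit direction."""
    phi = (u / width - 0.5) * 2.0 * np.pi
    theta = (v / height) * np.pi
    return np.array([np.cos(phi) * np.sin(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(theta)])
```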
Environment modeling

The modeled environment geometry:
1. is used to compute the scene depth, for visibility and shadows;
2. provides the light positions in world space for the reflection rays;
3. serves as support surfaces for rendering shadows and reflections of the synthetic elements.
Environment modeling

Modeled base geometry (wireframe)
Environment modeling

Modeled base geometry (polygons)
Environment modeling: Depth image of the scene

Depth image reconstruction from the base geometry.
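A minimal sketch of how such a depth panorama could be produced, assuming a hypothetical intersect(origin, direction) routine that returns the hit distance against the modeled base geometry (or None), and reusing pixel_to_dir from the earlier sketch.

```python
import numpy as np

def render_depth_panorama(intersect, width, height, far=1.0e6):
    """Cast one ray per panorama pixel from the capture point and store the
    distance to the modeled environment geometry."""
    depth = np.full((height, width), far, dtype=np.float32)
    origin = np.zeros(3)                 # panorama was captured from the origin
    for v in range(height):
        for u in range(width):
            d = pixel_to_dir(u + 0.5, v + 0.5, width, height)
            t = intersect(origin, d)     # hypothetical ray/mesh query
            if t is not None:
                depth[v, u] = t
    return depth
```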
Illumination Setup: Light-depth environment map

We combine the panorama and the depth image to obtain the light-depth environment map.
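In spirit, each texel of the light-depth environment map carries both a radiance value and, through the depth image, a world-space position. A small sketch under our own conventions (hdr and depth are arrays of matching resolution; pixel_to_dir as above):

```python
def light_sample(hdr, depth, u, v):
    """Return (world-space position, radiance) for panorama texel (u, v)."""
    height, width = depth.shape
    d = pixel_to_dir(u + 0.5, v + 0.5, width, height)
    position = depth[v, u] * d       # the panorama centre is the world origin
    radiance = hdr[v, u]             # linear HDR RGB value
    return position, radiance
```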
Modeling new synthetic elements

Modeled elements (in blue) placed in the scene interact with the environment.
Integration and rendering

Augmented scene being rendered by our algorithm, implemented in ARLuxrender.
Integration and rendering

Final scene after the rendering process.
RENDERING PROCESS
Primitives classification

1. Synthetic primitives: objects that are new to the scene; they do not exist in the original environment.
2. Support primitives: surfaces present in the original environment that need to receive shadows and reflections from the synthetic primitives.
3. Environment primitives: all surfaces of the original environment that need to be taken into account in the reflection and shadow computation for the other primitive types.
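Not from the paper's renderer: a minimal sketch of how this three-way classification could be tagged in a scene description, with all names hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto

class PrimitiveRole(Enum):
    SYNTHETIC = auto()    # new objects, absent from the captured environment
    SUPPORT = auto()      # real surfaces that receive synthetic shadows/reflections
    ENVIRONMENT = auto()  # remaining real surfaces, used for occlusion and reflection only

@dataclass
class ScenePrimitive:
    name: str
    geometry: object      # mesh, implicit surface, ...
    role: PrimitiveRole
```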
Primitives classification

Scene primitives
Shadows

Figure: shadow-ray geometry, with a point p on a support primitive, a light sample $l_i$ in direction $\omega_i$ of the environment map, and a synthetic primitive acting as occluder.

Directional approach: the ray $r(p, L_{TW}(\omega_i))$ is not occluded, so the light $l_i$ illuminates the point $p$.
Light-positional approach: the ray $r(p, l_i - p)$ is occluded by a synthetic object, thus it gets the correct world-based estimation.
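The contrast between the two shadow tests, as a sketch; occluded() is a hypothetical ray/scene intersection query against the scene primitives, not an actual ARLuxrender call.

```python
import numpy as np

def directional_shadow_test(occluded, p, omega_world):
    """Classic IBL test: shoot an unbounded ray from p along the light
    direction; it ignores where the light actually sits in the world."""
    return not occluded(p, omega_world, t_max=np.inf)

def positional_shadow_test(occluded, p, light_pos):
    """Light-depth test: shoot the ray only up to the world-space light
    position recovered from the light-depth environment map."""
    to_light = light_pos - p
    dist = np.linalg.norm(to_light)
    return not occluded(p, to_light / dist, t_max=dist)
```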
Shadows
Shadows

Realistic shadow effects: shadows are consistent with the world light positions.