Principles of Computer Graphics and Image Processing Textures, Mappings (06) RNDr. Martin Madaras, PhD. martin.madaras@stuba.sk
Overview Texture mapping 3D Models with texture coordinates UV map parametrization Perspective correction Aliasing Anti-aliasing Supersampling Mip Maps Advanced textures Environment mapping Bump mapping Normal mapping, Displacement mapping etc. 2
How the lectures should look #1 Ask questions, please!!! - Be communicative - www.slido.com #PPGSO06 - The more active you are, the better for you! 3
Material Visually distinguishes two objects with identical geometry For now, we focus on the object's own color 4
Texture Used to define object’s color appearance 2D bitmap Volumetric - texels Procedural texture 5
Texture mapping Mapping between object space and 2D texture space New coordinate system: Texture coordinates 6
Texture mapping 7
3D rendering pipeline 3D polygons → Modeling Transformation → Lighting (lighting + color from image could be implemented here) → Viewing Transformation → Projection Transformation → Clipping → Scan Conversion → 2D Image 11
3D rendering pipeline 3D polygons → Modeling Transformation → Lighting → Viewing Transformation → Projection Transformation → Clipping → Scan Conversion (texture mapping happens here on the GPU, typically in a fragment shader implementation) → 2D Image 12
Texture mapping Add visual detail to surfaces of 3D objects 1) Parameterized mesh 2) Final textured model 13
Intermediate pixels Remember polygon rasterization: the projected scanline, shown in screen space and in texture space 14
Texture usage object diffuse color (patterns, decals), modulate surface properties (bumps, displacements), modulate lighting properties (e.g. shininess), simulate physical phenomena (reflection, refraction, global illumination) 15
Texture mapping 16
Example – cartography Unwrapping the Earth onto a plane 17
Parameterization 18
Parameterization Implicit parameterization by geometric primitives 19
Parameterization XYZ to UV for a sphere: $u = \dfrac{x}{\sqrt{x^2 + y^2 + z^2}}$, $v = \dfrac{y}{\sqrt{x^2 + y^2 + z^2}}$ http://tobias.preclik.de/codeblog/?p=9 20
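As a rough illustration of this spherical projection, a minimal C++ sketch (function and variable names are mine, not from the slides; in practice (u, v) is usually remapped from [-1, 1] to [0, 1]):

```cpp
#include <cmath>
#include <cstdio>

// Spherical parameterization from the slide: project a surface point
// onto the unit sphere and take the x and y components as (u, v).
void sphereUV(float x, float y, float z, float &u, float &v) {
    float r = std::sqrt(x * x + y * y + z * z);  // distance from the sphere centre
    u = x / r;                                   // u = x / |p|
    v = y / r;                                   // v = y / |p|
}

int main() {
    float u, v;
    sphereUV(1.0f, 2.0f, 2.0f, u, v);            // point on a sphere of radius 3
    std::printf("u = %f, v = %f\n", u, v);       // expected: 0.333, 0.667
    return 0;
}
```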
Parameterization Parameterization using an intermediate surface 21
Parameterization 22
Texture mapping When drawing pixels, map from ... image coordinate system (x,y) to modeling coordinate system (u,v) to texture coordinate system (t,s) 23
UV mapping Scan conversion: interpolate texture coordinates down/across scan lines; distortion due to the bilinear interpolation approximation; fix by cutting polygons into smaller ones, or by a perspective divide at each pixel 24
Perspective correction Scan conversion: interpolate texture coordinates down/across scan lines; distortion due to the bilinear interpolation approximation; fix by cutting polygons into smaller ones, or by a perspective divide at each pixel 25
Perspective correction INCORRECT: $T = (1 - t)\,T_P + t\,T_Q$ CORRECT: $T = \dfrac{(1 - t)\,T_P / z_P + t\,T_Q / z_Q}{(1 - t)/z_P + t/z_Q}$ 26
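A minimal C++ sketch of the difference, assuming a single texture coordinate interpolated along a projected edge between endpoints P and Q at depths z_P and z_Q; the function names are illustrative:

```cpp
#include <cstdio>

// Linear interpolation in screen space is incorrect for texture coordinates.
float interpolateIncorrect(float tP, float tQ, float t) {
    return (1.0f - t) * tP + t * tQ;
}

// Perspective-correct: interpolate T/z and 1/z linearly, then divide per pixel.
float interpolateCorrect(float tP, float zP, float tQ, float zQ, float t) {
    float num = (1.0f - t) * tP / zP + t * tQ / zQ;  // interpolated T/z
    float den = (1.0f - t) / zP + t / zQ;            // interpolated 1/z
    return num / den;                                // perspective divide at the pixel
}

int main() {
    // midpoint of an edge whose endpoints lie at depths 1 and 10
    std::printf("incorrect: %f\n", interpolateIncorrect(0.0f, 1.0f, 0.5f));         // 0.5
    std::printf("correct:   %f\n", interpolateCorrect(0.0f, 1.0f, 1.0f, 10.0f, 0.5f)); // ~0.09
    return 0;
}
```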
Overview Texture mapping 3D Models with texture coordinates UV map parametrization Diffuse color textures Other textures Bump mapping Environment mapping Aliasing Anti-aliasing Supersampling Mip Maps 27
Aliasing “Moiré pattern” Nyquist criterion: sampling frequency ≥ 2 × signal frequency 28
Texture filtering Ideally, use an elliptically shaped convolution filter; in practice, we use rectangles 29
Texture filtering The size of the filter depends on the projective warp Images can be pre-filtered: Mip Maps, Summed area tables 30
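As a sketch of the second pre-filtering option, a small summed-area table in C++; the class layout and names are assumptions for illustration, not the lecture's code:

```cpp
#include <vector>
#include <cstdio>

// Summed-area table: each entry holds the sum of all texels with smaller or
// equal coordinates. Any axis-aligned rectangle can then be box-averaged
// with four lookups, independent of its size.
struct SummedAreaTable {
    int w, h;
    std::vector<double> sat;  // (w+1) x (h+1), padded with a zero row/column

    SummedAreaTable(const std::vector<float> &texels, int width, int height)
        : w(width), h(height), sat((width + 1) * (height + 1), 0.0) {
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                sat[(y + 1) * (w + 1) + (x + 1)] =
                    texels[y * w + x]
                    + sat[y * (w + 1) + (x + 1)]   // sum above
                    + sat[(y + 1) * (w + 1) + x]   // sum to the left
                    - sat[y * (w + 1) + x];        // overlap counted twice
    }

    // Average over the inclusive texel rectangle [x0,x1] x [y0,y1].
    double average(int x0, int y0, int x1, int y1) const {
        double sum = sat[(y1 + 1) * (w + 1) + (x1 + 1)]
                   - sat[y0 * (w + 1) + (x1 + 1)]
                   - sat[(y1 + 1) * (w + 1) + x0]
                   + sat[y0 * (w + 1) + x0];
        return sum / ((x1 - x0 + 1) * (y1 - y0 + 1));
    }
};

int main() {
    std::vector<float> tex = {1.0f, 2.0f, 3.0f, 4.0f};  // 2x2 texture
    SummedAreaTable sat(tex, 2, 2);
    std::printf("%f\n", sat.average(0, 0, 1, 1));       // expected 2.5
    return 0;
}
```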
Mip maps Keep textures pre-filtered at multiple resolutions For each pixel, linearly interpolate between two closest levels (e.g., trilinear filtering) Fast and easy for hardware 31
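A minimal sketch of level selection and trilinear blending, assuming the footprint size rho (texels covered per screen pixel) is already known; sampleLevelBilinear is a dummy stand-in for a real per-level bilinear lookup:

```cpp
#include <cmath>
#include <algorithm>
#include <cstdio>

// Stand-in for a bilinear lookup in one mip level; a real renderer would
// fetch and filter four texels of that level here.
float sampleLevelBilinear(int level, float u, float v) {
    (void)u; (void)v;
    return static_cast<float>(level);   // dummy value, just so the sketch runs
}

// Pick the mip level from the pixel footprint in texel space and blend the
// two nearest levels linearly -- trilinear filtering.
float sampleTrilinear(float u, float v, float rho, int maxLevel) {
    float lambda = std::log2(std::max(rho, 1.0f));           // continuous level of detail
    lambda = std::min(lambda, static_cast<float>(maxLevel));
    int   lo   = static_cast<int>(lambda);
    int   hi   = std::min(lo + 1, maxLevel);
    float frac = lambda - static_cast<float>(lo);            // blend weight between the two levels
    return (1.0f - frac) * sampleLevelBilinear(lo, u, v) + frac * sampleLevelBilinear(hi, u, v);
}

int main() {
    // a pixel covering ~6 texels falls between levels 2 and 3
    std::printf("%f\n", sampleTrilinear(0.5f, 0.5f, 6.0f, 10));
    return 0;
}
```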
Environment mapping https://www.youtube.com/watch?v=LOeEfkzZ1ps 32
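A minimal sketch of the core step of environment mapping: reflect the view direction about the surface normal to obtain the lookup direction; the actual environment-map (e.g. cube-map) fetch is omitted, and all names are illustrative:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Environment mapping looks up incoming light along the reflected view
// direction: r = d - 2 (d . n) n, with d the view direction and n the unit normal.
Vec3 reflectDir(Vec3 d, Vec3 n) {
    float k = 2.0f * dot(d, n);
    return Vec3{d.x - k * n.x, d.y - k * n.y, d.z - k * n.z};
}

int main() {
    Vec3 view{0.0f, -1.0f, 0.0f};           // looking straight down
    Vec3 normal{0.0f, 1.0f, 0.0f};          // flat, upward-facing surface
    Vec3 r = reflectDir(view, normal);      // reflected straight back up
    std::printf("r = (%f, %f, %f)\n", r.x, r.y, r.z);
    // r would then index the environment (e.g. cube) map
    return 0;
}
```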
Light maps 33
Light maps Pre-computed high-quality lighting Stored into special texture (light map) Light map combined with the texture Texture baking (permanent) 34
Light maps http://www.cs.bath.ac.uk/~pjw/NOTES/pics/lightmap.html 35
Bump mapping A modified surface normal is calculated from the height map Modified normal is used during shading Geometry is not altered 36
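A minimal sketch, assuming the surface lies in the xy-plane so the normal can be perturbed directly from finite differences of a height field; a real implementation does this in tangent space, and the height function here is a dummy:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Placeholder height-map lookup (would be a texture fetch); here a simple ripple.
float height(float u, float v) {
    return 0.05f * std::sin(40.0f * u) * std::sin(40.0f * v);
}

// Bump mapping: perturb the normal of a flat surface in the xy-plane using the
// height-map gradient (finite differences). The geometry itself is unchanged.
Vec3 bumpedNormal(float u, float v) {
    const float eps = 1e-3f;
    float dhdu = (height(u + eps, v) - height(u - eps, v)) / (2.0f * eps);
    float dhdv = (height(u, v + eps) - height(u, v - eps)) / (2.0f * eps);
    Vec3 n{-dhdu, -dhdv, 1.0f};                       // unnormalized perturbed normal
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return Vec3{n.x / len, n.y / len, n.z / len};     // this normal is used during shading
}

int main() {
    Vec3 n = bumpedNormal(0.25f, 0.25f);
    std::printf("n = (%f, %f, %f)\n", n.x, n.y, n.z);
    return 0;
}
```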
Multitexturing Combine multiple textures 37
Overview Advanced Shading and Mapping Deferred Shading Shadow Mapping Normal Mapping Displacement Mapping Vector Displacement Mapping 38
Deferred Shading Compute lighting in screen space Two-pass approach Decoupling of geometry and lighting G-buffer stores positions, normals, materials … Lighting is a per-pixel operation Cost is O(objects + lights) instead of O(objects × lights) Problems with transparency and G-buffer size 39
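A CPU-side sketch of the lighting pass, with an illustrative G-buffer layout (position, normal, diffuse color per pixel); on the GPU this loop would be a screen-space shader, and all names here are assumptions:

```cpp
#include <vector>
#include <cmath>
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) {
    float l = std::sqrt(dot(v, v));
    return {v.x / l, v.y / l, v.z / l};
}

// One G-buffer texel written by the geometry pass.
struct GBufferTexel {
    Vec3 position;   // world-space position
    Vec3 normal;     // world-space normal
    Vec3 diffuse;    // material diffuse color
};

struct PointLight { Vec3 position; Vec3 color; };

// Lighting pass: for every pixel, accumulate diffuse lighting from all lights
// using only the data stored in the G-buffer, independent of scene geometry.
std::vector<Vec3> lightingPass(const std::vector<GBufferTexel> &gbuffer,
                               const std::vector<PointLight> &lights) {
    std::vector<Vec3> out;
    out.reserve(gbuffer.size());
    for (const GBufferTexel &g : gbuffer) {
        Vec3 c{0.0f, 0.0f, 0.0f};
        for (const PointLight &l : lights) {
            Vec3 toLight = normalize(l.position - g.position);
            float ndotl = std::max(0.0f, dot(g.normal, toLight));   // Lambert term
            c.x += g.diffuse.x * l.color.x * ndotl;
            c.y += g.diffuse.y * l.color.y * ndotl;
            c.z += g.diffuse.z * l.color.z * ndotl;
        }
        out.push_back(c);
    }
    return out;
}

int main() {
    std::vector<GBufferTexel> g = {{{0, 0, 0}, {0, 1, 0}, {1, 1, 1}}};  // one pixel
    std::vector<PointLight> lights = {{{0, 1, 0}, {1, 0, 0}}};          // one red light
    std::vector<Vec3> shaded = lightingPass(g, lights);
    std::printf("pixel color r = %f\n", shaded[0].x);
    return 0;
}
```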
Deferred Shading G-buffer contents: diffuse color, Z buffer, surface normals; final composition 40
Normal Mapping Fake lighting of bumps and dents “Dot3 bump mapping” Add lighting details without additional geometry Store normals from high-polygon object in texture Encode X,Y,Z as R,G,B color information 41
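A minimal sketch of decoding one normal-map texel (n = 2·rgb − 1, then renormalize); the decoded normal replaces the interpolated surface normal in the lighting dot products ("Dot3"). Names are illustrative:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Decode one normal-map texel: color channels in [0,1] encode the
// normal components in [-1,1] (n = 2 * rgb - 1), then renormalize.
Vec3 decodeNormal(float r, float g, float b) {
    Vec3 n{2.0f * r - 1.0f, 2.0f * g - 1.0f, 2.0f * b - 1.0f};
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return Vec3{n.x / len, n.y / len, n.z / len};
}

int main() {
    // the typical "flat" normal-map color (128,128,255)/255 decodes to ~(0,0,1)
    Vec3 n = decodeNormal(0.5f, 0.5f, 1.0f);
    std::printf("n = (%f, %f, %f)\n", n.x, n.y, n.z);
    return 0;
}
```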
Normal Mapping 42
Normal Mapping a) Normal map (encoded in object space) b) Original high-res model c) Rendered low-res model d) Applied normal map 43
Displacement Mapping Move geometry as specified in texture Displacement in direction of surface normal Can add additional detail to a subdivided model Relies on dense geometry Usually used with adaptive tessellation techniques 44
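A minimal sketch at the vertex level: each (densely tessellated) vertex is actually moved along its unit normal by a height sampled from the displacement texture; the height lookup here is a dummy placeholder:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Placeholder displacement-texture lookup; returns a height value.
float sampleHeight(float u, float v) {
    return 0.1f * u + 0.05f * v;       // dummy ramp just for the sketch
}

// Displacement mapping: unlike bump/normal mapping, the geometry itself is
// moved -- each vertex is offset along its normal by the sampled height,
// scaled by a user constant.
Vec3 displaceVertex(Vec3 position, Vec3 unitNormal, float u, float v, float scale) {
    float h = scale * sampleHeight(u, v);
    return Vec3{position.x + h * unitNormal.x,
                position.y + h * unitNormal.y,
                position.z + h * unitNormal.z};
}

int main() {
    Vec3 p = displaceVertex({0, 0, 0}, {0, 0, 1}, 0.5f, 0.5f, 1.0f);
    std::printf("displaced z = %f\n", p.z);   // 0.075 for the dummy ramp
    return 0;
}
```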
Displacement Mapping 45
Displacement Mapping Comparison: bump mapping vs. displacement mapping 46
Vector Displacement Mapping Displace geometry in any direction Generalization of displacement mapping Possible to store detailed geometry in textures Excellent for sculpting purposes (Z-Brush) 47
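A minimal sketch of the generalization: the texture stores a full 3D offset instead of a single height, so the vertex can be displaced in any direction; the lookup is again a dummy placeholder and the offset is assumed to already be in the vertex's own space:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Placeholder vector-displacement lookup: a full 3D offset per texel
// instead of a single height along the normal.
Vec3 sampleDisplacement(float u, float v) {
    return Vec3{0.02f * u, 0.05f, -0.01f * v};   // dummy values for the sketch
}

// Vector displacement mapping: move the vertex by the stored vector, which
// may point in any direction (allows overhangs, mushroom-like shapes, ...).
Vec3 vectorDisplace(Vec3 position, float u, float v) {
    Vec3 d = sampleDisplacement(u, v);
    return Vec3{position.x + d.x, position.y + d.y, position.z + d.z};
}

int main() {
    Vec3 p = vectorDisplace({0, 0, 0}, 0.5f, 0.5f);
    std::printf("p = (%f, %f, %f)\n", p.x, p.y, p.z);
    return 0;
}
```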
Vector Displacement Mapping 48
Shadow Mapping Two-pass technique Obtain the light-view depth buffer Compare each rendered pixel's depth with the light's stored depth Pixels further away are in shadow Needs a margin of error for lit pixels Implementations usually have artifacts 49
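A minimal sketch of the per-pixel test from the second pass, assuming the pixel's light-space coordinates (u, v) and its depth as seen from the light are already computed; the bias is the margin of error mentioned above, and all names are illustrative:

```cpp
#include <vector>
#include <cstdio>

// Shadow-map test for one pixel: (u, v) index the light's depth buffer and
// depthFromLight is the pixel's depth as seen from the light.
bool inShadow(const std::vector<float> &shadowMap, int size,
              float u, float v, float depthFromLight, float bias) {
    int x = static_cast<int>(u * (size - 1));        // nearest-texel lookup
    int y = static_cast<int>(v * (size - 1));
    float storedDepth = shadowMap[y * size + x];     // closest surface seen by the light
    // The pixel is shadowed if something nearer to the light was recorded;
    // the bias avoids self-shadowing ("shadow acne") from depth quantization.
    return depthFromLight - bias > storedDepth;
}

int main() {
    std::vector<float> shadowMap(4 * 4, 0.5f);       // toy 4x4 light-view depth buffer
    std::printf("%d\n", inShadow(shadowMap, 4, 0.5f, 0.5f, 0.8f, 0.005f));  // 1: in shadow
    std::printf("%d\n", inShadow(shadowMap, 4, 0.5f, 0.5f, 0.5f, 0.005f));  // 0: lit
    return 0;
}
```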
Shadow Mapping 50
Lights, visibility, texture... 51
What’s missing are the shadows 52
Next Week Shadows 53
Acknowledgements Thanks to all the people whose work is shown here and whose slides were used as material for the creation of these slides: Matej Novotný, GSVM lectures at FMFI UK Peter Drahoš, PPGSO lectures at FIIT STU 54
Questions ?! www.slido.com #PPGSO06 martin.madaras@stuba.sk 55