Simulation Engines TDA571|DIT030 3D Graphics - Part 2 (Tommaso Piazza)


  1. Simulation Engines TDA571|DIT030 3D Graphics - Part 2
     Tommaso Piazza

  2. Administrative stuff
  - Today: 3D Graphics presentation
  - Meet me in front of Grace Hopper
  - Have you started coding?
  - Where are your repositories?
  Pssst... Wikipedia is accepting donations. Wikipedia is a nonprofit project that exists for one reason: the free and open sharing of knowledge. Your donations keep Wikipedia going.
  IDC | Interaction Design Collegium

  3. Camera management
  Three basic kinds of cameras in games:
  - First-person: the camera is attached to the player and inherits the exact motion of the player. Examples: Far Cry, Doom, Quake, etc.
  - Scripted: the camera moves along pre-defined paths. Examples: Alone in the Dark, Resident Evil.
  - Third-person: the camera is located outside the body of the player and shows both the avatar and the environment. Examples: Super Mario 64, Gears of War.

  4. Third-person camera: Constraints
  - The camera should never be closer to a wall than the near plane
  - It should never go outside a level
  - It should translate and rotate smoothly to always try to stay at a specific point in relation to the player character
  - It should smooth out discontinuities in the character's movement
  - It should be tested for collision detection
  - It should be able to enter the character when needed

  5. Third-person camera: Algorithm
  - Calculate the destination point for the camera from the character's position
    - The destination point is calculated by applying a displacement and rotation to the position of the player character. Different camera views can have different displacements, and it may be possible to switch between them.
  - Check the validity of the destination point (it could be on the wrong side of a wall)
    - Perform a ray intersection between the character and the destination point (there should be no intersection with the world geometry)
    - If the point is invalid, move the camera back towards the character so that it is positioned on the correct side of the wall
  - Calculate approximate translation and rotation motion and animate it over a series of frames (animation speed is a tweakable constant)
  - Check for collision using the bounding box of the camera during the motion
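The per-frame logic above can be sketched as follows. This is a minimal Python illustration, not engine code; the `ray_hit` callback stands in for whatever ray-cast query the engine provides, and the pull-back factor and smoothing constant are assumed, tweakable values.

```python
def update_camera(cam_pos, char_pos, offset, ray_hit, smoothing=0.2):
    """One frame of a simple third-person camera update (sketch).

    cam_pos, char_pos, offset are (x, y, z) tuples. ray_hit(a, b) is an
    assumed helper: it returns the first world-geometry hit point on the
    segment from a to b, or None if the segment is clear.
    """
    # 1. Destination point: character position plus a fixed displacement.
    dest = tuple(c + o for c, o in zip(char_pos, offset))

    # 2. Validate: a ray from the character to the destination must not
    #    intersect world geometry (the camera would be behind a wall).
    hit = ray_hit(char_pos, dest)
    if hit is not None:
        # 3. Invalid: pull the destination back toward the character so
        #    it sits just in front of the wall (0.9 is an assumed margin).
        dest = tuple(c + 0.9 * (h - c) for c, h in zip(char_pos, hit))

    # 4. Animate: move a fraction of the way each frame, which smooths
    #    out discontinuities in the character's movement.
    return tuple(p + smoothing * (d - p) for p, d in zip(cam_pos, dest))
```

Calling this once per frame converges the camera toward the (possibly corrected) destination; a real implementation would also sweep the camera's bounding box along the motion, as the slide notes.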

  6. Shaders and shader languages
  - 1990s
    - Development of hardware for rendering of textured 3D primitives
    - Hardware T&L introduced
    - Dynamic lighting done using Gouraud shading
  - 2000s
    - Per-pixel shading
    - Real-time procedural textures
    - Advanced texture mapping techniques

  7. In pursuit of realism
  - Traditionally, research on realistic computer graphics has focused on global illumination methods such as ray tracing and radiosity
    - These do not work in real time
  - Pixar introduced the concept of shaders in RenderMan and showed that global illumination is not strictly necessary for realistic images
    - Instead, shader-based methods built on local reflection models can be used

  8. What is a shader?
  Three different interpretations (Watt & Policarpo, 2003):
  - A C-style module in the RenderMan API used for high-level control of rendering components (surface, volume and light shaders)
  - A combination of render states and texture maps for a multi-pass or multi-texture render of an object on fixed-pipeline GPUs
  - New hardware functionality for controlling the rendering of primitives on a per-pixel or per-vertex level on programmable-pipeline GPUs; these are called pixel shaders and vertex shaders, respectively
  The last one is the important one here.

  9. Shader types
  - Vertex shader
    - Called for every vertex in a 3D primitive (modifies color, lighting, position, ...)
    - Allows for effects such as hardware skinning, perturbation of a water surface, etc.
    - http://www.youtube.com/watch?v=QHXjhfxAns0
  - Pixel (fragment) shader
    - Called once for every fragment in a 3D primitive (not pixel, because a fragment in a 3D primitive could correspond to one or several pixels depending on filtering settings, etc.)
    - Can be used for procedural textures, normal maps, etc.
    - http://www.youtube.com/watch?v=91gjSmfSIgw
  - Geometry shader
    - Can add vertices to and remove vertices from a mesh; can be used for adding geometry that would be too costly to process on the CPU
    - Allows displacement mapping, etc.
    - http://www.youtube.com/watch?v=IRzAxtBWtV8
  - Unified Shader Model in DirectX 10 (Shader Model 4.0)

  10. Shader languages
  - OpenGL Shading Language (GLSL)
    - Part of the OpenGL specification since OpenGL 1.4
    - High-level language similar to C/C++
  - Cg (C for Graphics)
    - Nvidia's proprietary shader language
    - Extremely similar to HLSL, but works with both OpenGL and DirectX
  - Microsoft HLSL
    - Works with DirectX 9 and 10
    - Nvidia and Microsoft collaborated on its development

  11. Reflective surfaces: Environment maps
  - Commonly used for reflections
  - Precomputed textures
  - Standard environment maps
    - A single texture representing the scene as if reflected from a steel ball
  - Can also be used for advanced lighting

  12. Reflective surfaces: Cube maps
  - Cubic environment maps
    - A cube map consists of six textures unfolded onto a cube
    - The most used format on modern hardware
    - The texture coordinate is a vector that specifies which way to look from the center of the cube to get the desired texel
  - http://en.wikipedia.org/wiki/File:Panorama_cube_map.png
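The texel-lookup idea can be sketched in Python: given a direction vector, pick the cube face whose axis has the largest absolute component, then project the other two components onto that face. This follows the common major-axis convention; exact u/v orientations vary between APIs, and the face labels here are illustrative.

```python
def cube_map_lookup(d):
    """Map a direction vector d = (x, y, z) from the cube's center to a
    face name and (u, v) texture coordinates in [0, 1] (sketch of the
    major-axis selection used by cube mapping; orientation conventions
    differ per API)."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                      # X axis dominates
        face, sc, tc, ma = ('+x', -z, -y, ax) if x > 0 else ('-x', z, -y, ax)
    elif ay >= az:                                 # Y axis dominates
        face, sc, tc, ma = ('+y', x, z, ay) if y > 0 else ('-y', x, -z, ay)
    else:                                          # Z axis dominates
        face, sc, tc, ma = ('+z', x, -y, az) if z > 0 else ('-z', -x, -y, az)
    # Normalize the two remaining components from [-1, 1] to [0, 1].
    u = (sc / ma + 1.0) / 2.0
    v = (tc / ma + 1.0) / 2.0
    return face, u, v
```

A direction straight down the +X axis lands in the center of the +x face, which is why a single vector suffices as a cube-map texture coordinate.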

  13. Particle systems
  - One of the most useful tools available
    - Smoke, water, fire, etc.
    - http://www.youtube.com/watch?v=nmd6hIjgexs
  - Each individual particle has a small or zero geometrical extent, but together they form cloud-like objects
  - Each particle system can contain thousands or even tens of thousands of particles
  - Be extremely cautious with multiple render state changes

  14. Particle systems
  - Consist of one or several emitters and a number of particles
  - Each emitter is responsible for spawning new particles every update according to some distribution
  - Each particle contains information about its current position, size, velocity, shape and lifetime
  - Each update:
    - Emitters generate new particles
    - New particles are assigned initial attributes
    - All new particles are injected into the particle system
    - Any particles that have exceeded their lifetime are extinguished (and usually recycled)
    - The current particles are updated according to their scripts
    - The current particles are rendered
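The update loop above can be sketched as a minimal Python class. The spawn rate, lifetime and velocity distribution are illustrative numbers, not from the slides, and real systems would recycle dead particles into a pool rather than discard them.

```python
import random

class Emitter:
    """Minimal particle system: one emitter, per-update spawn/kill/advance
    (a sketch of the slide's loop, with assumed constants)."""

    def __init__(self, pos, spawn_per_update=2, lifetime=3.0):
        self.pos = pos
        self.spawn_per_update = spawn_per_update
        self.lifetime = lifetime
        self.particles = []   # each particle: dict with pos, vel, age

    def update(self, dt=1.0):
        # 1-3. Emitter generates new particles, assigns initial
        #      attributes, and injects them into the system.
        for _ in range(self.spawn_per_update):
            self.particles.append({
                'pos': list(self.pos),
                'vel': [random.uniform(-1, 1), random.uniform(0, 2), 0.0],
                'age': 0.0,
            })
        # 4. Extinguish particles past their lifetime (here discarded;
        #    usually recycled into a free pool).
        self.particles = [p for p in self.particles
                          if p['age'] < self.lifetime]
        # 5. Update the survivors according to their "script"
        #    (here: straight-line motion). Rendering would follow.
        for p in self.particles:
            p['pos'] = [x + v * dt for x, v in zip(p['pos'], p['vel'])]
            p['age'] += dt
```

With a fixed spawn rate and lifetime the particle count reaches a steady state (spawn rate times lifetime), which is how such systems stay bounded.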

  15. Water
  - Ideally based on fluid simulations
    - Far too costly for real-time use
  - Mostly represented by a simple plane/quad
    - In more advanced scenarios represented by a displaced grid
  - Reflections are either entirely faked or done with a planar mirror
    - The scene is inverted and the reflection is rendered to a texture, which is then rendered onto the water mesh using projective texturing
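The "inverted scene" in the planar-mirror approach is just each point mirrored across the water plane. For a horizontal plane this is a one-line transform; a minimal sketch (the engine would apply the equivalent reflection matrix to the camera or the scene before rendering to the texture):

```python
def reflect_across_water(p, water_height=0.0):
    """Mirror a world-space point p = (x, y, z) across the horizontal
    water plane y = water_height. Rendering the scene through this
    transform into a texture, then projecting that texture onto the
    water quad, yields the planar-mirror reflection (sketch only)."""
    x, y, z = p
    return (x, 2.0 * water_height - y, z)
```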

  16. Explosions
  - There is no universal method
  - Usually several effects are combined until the result "looks good"
  - Often animated billboards of prerecorded explosions are combined with debris that is either actual geometry or particles
  - Nowadays, actual models moved by the physics engine are often used

  17. Lightmaps
  - Precomputed lighting and shadows on static geometry in the scene
  - Advantages
    - Allow for baking advanced lighting such as radiosity
    - Extremely fast
  - Disadvantages
    - Only work on static geometry
    - Require a lot of pre-computing for good quality
    - May require a lot of memory

  18. Stencil shadows (shadows as volumes)
  - Used for dynamic shadows
  - Uses the stencil buffer
  - Advantages
    - Creates crisp shadows without aliasing artifacts
    - Stable and potentially very fast algorithm
  - Disadvantages
    - Difficult (but possible) to get soft shadows
    - Extruding the shadow volumes on hardware is cumbersome and requires modifications to the meshes
    - Requires multiple render passes
    - Patent problems with Carmack's reverse

  19. Stencil shadows
  - Empty the stencil buffer
  - Draw the whole scene with ambient lighting
    - The z-buffer is filled, and the color buffer is filled with the color of surfaces in shadow
  - Turn off updates to the z-buffer and color buffer and draw the front-facing polygons of the shadow volumes
    - This increments the stencil buffer; all pixels in or behind shadow volumes receive a positive value
  - Repeat for the back-facing polygons
    - This decrements the stencil buffer; the values in the stencil buffer decrease where we leave the shadow volumes
  - Draw the diffuse and specular materials in the scene where the value of the stencil buffer is zero
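The counting logic behind these passes can be illustrated per pixel: each front-facing shadow-volume polygon that passes the depth test adds one, each back-facing one subtracts one, and the surface point is in shadow exactly when the result is nonzero. A toy sketch of that invariant (not GPU code, and it models the depth-pass variant rather than Carmack's reverse):

```python
def stencil_value(fronts_in_front, backs_in_front):
    """Depth-pass stencil count for one pixel: +1 per front-facing
    shadow-volume polygon in front of the visible surface, -1 per
    back-facing one (toy model of the two stencil passes)."""
    return fronts_in_front - backs_in_front

def in_shadow(fronts_in_front, backs_in_front):
    """Nonzero stencil means the surface point lies inside at least one
    shadow volume, so the diffuse/specular pass skips it."""
    return stencil_value(fronts_in_front, backs_in_front) != 0
```

For a point inside a volume, only the volume's front face lies in front of it (count 1, shadowed); for a point beyond the volume, both front and back faces lie in front and the counts cancel (count 0, lit).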
