Week 2 - Monday C# MonoGame


  1. Week 2 - Monday

  2.  C#  MonoGame

  3.  Program creates a Game1 (or similar) object and starts it running
      - Game1 has: Initialize(), LoadContent(), Update(), Draw()
      - It runs an update-draw loop continuously until told to exit
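The lifecycle above can be sketched language-agnostically. Here is a minimal Python skeleton mirroring the Game1 structure (class and method names are illustrative, not MonoGame's actual API; timing, content loading, and the window are omitted):

```python
class Game:
    """Toy version of the Game1 lifecycle: initialize, load content,
    then loop update/draw until told to exit."""
    def __init__(self):
        self.exiting = False
        self.frames = 0

    def initialize(self):
        pass  # one-time setup

    def load_content(self):
        pass  # load textures, fonts, etc.

    def update(self):
        self.frames += 1
        if self.frames >= 3:     # stand-in for "told to exit"
            self.exiting = True

    def draw(self):
        pass  # render the current frame

    def run(self):
        self.initialize()
        self.load_content()
        while not self.exiting:  # the update-draw loop
            self.update()
            self.draw()
```

Running `Game().run()` executes update and draw once per iteration until the exit flag is set, which is the same shape MonoGame drives for you behind the scenes.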

  4.  We're used to interacting with programs from the command line (console)
      - MonoGame was not designed with this in mind
      - It has pretty easy ways to read from the keyboard, the mouse, and Xbox controllers
      - But you'll need a console for Project 1 so that you can tell it which file to load and what kind of manipulations to perform on it
      To make Console.Write() and Console.Read() work:
      - Go to the Properties page for your project
      - Go to the Application tab
      - Change Output Type to Console Application
      - More information: http://rbwhitaker.wikidot.com/console-windows
      You'll need a separate thread to read from and write to the console if you don't want your game to freeze up
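The separate-thread idea can be sketched in Python (in C# you would use System.Threading with Console.ReadLine; the function names here are invented for illustration): a background thread reads lines and pushes them onto a queue, and the game loop polls the queue each frame instead of blocking on input.

```python
import queue
import threading

def start_reader(source, commands):
    """Read lines from `source` (e.g. stdin) on a background thread
    and push them onto the `commands` queue."""
    def worker():
        for line in source:
            commands.put(line.strip())
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

def game_loop(commands, max_frames=100):
    """Each frame, poll the queue without blocking; the game never
    freezes waiting for input."""
    handled = []
    for _ in range(max_frames):
        try:
            handled.append(commands.get_nowait())
        except queue.Empty:
            pass  # no command this frame; keep updating/drawing
    return handled
```

In the real game, `source` would be the console and each queued command would be handled inside Update().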

  5.  To draw a picture on the screen, we need to load it first
      - Inside a MonoGame project, right-click the Content.mgcb file and choose Open with…
      - Select MonoGame Pipeline Tool
      - Choose Add and then Existing Item…
      - Find an image you want on your hard drive
      - Make sure the Build Action is Build
      - The Importer should be Texture Importer - MonoGame
      Create a Texture2D member variable to hold it
      - Assume the member variable is called cat and the content item is cat.jpg (the Pipeline Tool names the asset "cat", without the extension)
      - In LoadContent(), add the line:
          cat = Content.Load<Texture2D>("cat");

  6.  Now the variable cat contains a loaded 2D texture
      Inside the Draw() method, add the following code:
        spriteBatch.Begin();
        spriteBatch.Draw(cat, new Vector2(x, y), Color.White);
        spriteBatch.End();
      - This will draw cat at location (x, y)
      - All sprites need to be drawn between the spriteBatch Begin() and End() calls

  7.  Modern TrueType and OpenType fonts are vector descriptions of the shapes of characters
      - Vector descriptions are good for quality but bad for speed
      - MonoGame allows us to take a vector-based font and turn it into a picture of characters that can be rendered as a texture, just like everything else

  8.  Inside a MonoGame project, right-click the Content.mgcb file and choose Open with…
      - Select MonoGame Pipeline Tool
      - Right-click on Content in the tool and select Add -> New Item…
      - Choose SpriteFont Description and give your new SpriteFont a name
      Open the .spritefont file in a text editor such as Notepad++
      - By default, the font is Arial at size 12
      - Edit the XML to pick the font, size, and spacing
      You will need multiple SpriteFonts even for different sizes of the same font
      - Repeat the process to make more fonts
      Note: fonts have complex licensing and distribution requirements

  9.  Load the font the same way as texture content:
        font = Content.Load<SpriteFont>("Text");
      Add a DrawString() call in the Draw() method:
        spriteBatch.Begin();
        spriteBatch.DrawString(font, "Hello, World!", new Vector2(100, 100), Color.Black);
        spriteBatch.End();

  10. They "float" above the background like fairies…
      - Multiple sprites are often stored on one texture
      - It's cheaper to store one big image than a lot of small ones
      - This is an idea borrowed from old video games that rendered characters as sprites
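Storing many sprites on one texture works by computing a source rectangle per sprite. A minimal sketch, assuming a sheet laid out left to right, top to bottom (the function name is invented; in MonoGame the result would become the sourceRectangle argument to Draw()):

```python
def source_rect(index, sheet_width, sprite_w, sprite_h):
    """Return (x, y, w, h) of sprite `index` within a sprite sheet
    whose cells are laid out row-major, left to right."""
    cols = sheet_width // sprite_w        # sprites per row
    row, col = divmod(index, cols)        # which cell holds this sprite
    return (col * sprite_w, row * sprite_h, sprite_w, sprite_h)
```

For a 256-pixel-wide sheet of 64x64 sprites, sprite 5 lands in row 1, column 1, i.e. the rectangle starting at (64, 64).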

  11. It is possible to apply all kinds of 3D transformations to a sprite
      - A sprite can be used for billboarding or other image-based techniques in a fully 3D environment
      - But we can also simply rotate one using an overload of Draw():
          spriteBatch.Draw(texture, location, sourceRectangle, Color.White,
                           angle, origin, 1.0f, SpriteEffects.None, 1);

  12. texture             Texture2D to draw
      location            Location to draw it
      sourceRectangle     Portion of the image to draw
      Color.White         Tint (white = full brightness)
      angle               Angle in radians
      origin              Origin of rotation
      1.0f                Scaling factor
      SpriteEffects.None  No flipping effects
      1                   Layer depth
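What the angle and origin parameters do can be checked with plain 2D math. A sketch of rotating a point about an origin (counterclockwise for positive angles in the standard math convention; on a y-down screen the apparent direction flips):

```python
import math

def rotate_about(point, origin, angle):
    """Rotate `point` around `origin` by `angle` radians."""
    # translate so the origin of rotation is at (0, 0)
    px, py = point[0] - origin[0], point[1] - origin[1]
    c, s = math.cos(angle), math.sin(angle)
    # standard 2D rotation, then translate back
    return (origin[0] + px * c - py * s,
            origin[1] + px * s + py * c)
```

Rotating (2, 0) a quarter turn about (0, 0) gives (0, 2); rotating (3, 1) half a turn about (1, 1) gives (-1, 1).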

  13. For API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline
      This pipeline contains three conceptual stages:
        Application -> Geometry -> Rasterizer
      - Application: decides what to render and where to render it
      - Geometry: produces the material, deciding how it is to be rendered
      - Rasterizer: renders the final image

  14. The output of the Application Stage is polygons
      The Geometry Stage processes these polygons using the following pipeline:
        Model and View Transform -> Vertex Shading -> Projection -> Clipping -> Screen Mapping

  15. Each 3D model has its own coordinate system called model space
      - When combining all the models in a scene together, the models must be converted from model space to world space
      - After that, we still have to account for the position of the camera

  16. We transform the models into camera space or eye space with a view transform
      - Then the camera sits at (0,0,0), looking into negative z
      - The z-axis comes out of the screen in the book's examples and in MonoGame (but not in older DirectX)
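The model-to-world and world-to-view steps compose. A sketch using translations only (real pipelines use 4x4 matrices that also rotate and scale; the model and camera positions below are assumptions chosen for illustration):

```python
def translate(t):
    """Return a function that translates a 3D point by vector t."""
    return lambda p: (p[0] + t[0], p[1] + t[1], p[2] + t[2])

def compose(*fs):
    """Apply transforms left to right, like chaining pipeline stages."""
    def g(p):
        for f in fs:
            p = f(p)
        return p
    return g

model_to_world = translate((10, 0, 0))    # model placed at x = 10 in the world
world_to_view  = translate((-10, 0, -5))  # camera at (10, 0, 5): subtract its position
model_to_view  = compose(model_to_world, world_to_view)
```

The model's origin ends up at (0, 0, -5): five units in front of a camera that now sits at the origin looking into negative z, matching the slide.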

  17. Figuring out the effect of light on a material is called shading
      - This involves computing a (sometimes complex) shading equation at different points on an object
      - Typically, information is computed on a per-vertex basis and may include:
        - Location
        - Normals
        - Colors

  18. Projection transforms the view volume into a standardized unit cube
      - Vertices then have a 2D location and a z-value
      There are two common forms of projection:
      - Orthographic: parallel lines stay parallel; objects do not get smaller in the distance
      - Perspective: the farther away an object is, the smaller it appears
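The "farther means smaller" behavior of perspective projection comes from dividing by depth. A minimal sketch, assuming view space with the camera at the origin looking into negative z (as on the previous slide) and a projection plane at distance d:

```python
def perspective_project(point, d=1.0):
    """Project a view-space point onto the plane z = -d.
    Dividing by -z shrinks coordinates of distant points."""
    x, y, z = point
    return (d * x / -z, d * y / -z)
```

A point twice as far away projects to half the size, which is exactly why distant objects look smaller.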

  19. Clipping processes the polygons based on their location relative to the view volume
      - A polygon completely inside the view volume is unchanged
      - A polygon completely outside the view volume is ignored (not rendered)
      - A polygon partially inside is clipped
      - New vertices on the boundary of the volume are created
      Since everything has been transformed into a unit cube, dedicated hardware can do the clipping in exactly the same way, every time
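One step of polygon clipping, against a single boundary, shows where the new boundary vertices come from; a full clipper repeats this once per face of the unit cube. This sketch uses the Sutherland-Hodgman approach (one technique among several) against the plane x <= x_max:

```python
def clip_right(polygon, x_max=1.0):
    """Clip a polygon (list of (x, y) vertices) against x <= x_max.
    Inside vertices are kept; crossing edges emit a new boundary vertex."""
    out = []
    n = len(polygon)
    for i in range(n):
        cur, nxt = polygon[i], polygon[(i + 1) % n]
        cur_in, nxt_in = cur[0] <= x_max, nxt[0] <= x_max
        if cur_in:
            out.append(cur)
        if cur_in != nxt_in:  # edge crosses the boundary: create a vertex on it
            t = (x_max - cur[0]) / (nxt[0] - cur[0])
            out.append((x_max, cur[1] + t * (nxt[1] - cur[1])))
    return out
```

A triangle poking past the boundary gains new vertices exactly on it, while a polygon entirely inside passes through unchanged.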

  20. Screen mapping transforms the x and y coordinates of each polygon from the unit cube to screen coordinates
      A few oddities:
      - DirectX has a weird coordinate system for pixels where the location is the center of the pixel
      - DirectX conforms to the Windows standard of pixel (0,0) being in the upper left of the screen
      - OpenGL conforms to the Cartesian system with pixel (0,0) in the lower left of the screen
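The unit-cube-to-pixels step, including the DirectX/OpenGL origin difference, can be sketched directly (assumes x and y are already in [-1, 1]; the pixel-center offset mentioned above is ignored for simplicity):

```python
def to_screen(x, y, width, height, origin_top_left=True):
    """Map coordinates in [-1, 1] to pixel coordinates.
    origin_top_left=True follows the Windows/DirectX convention
    ((0,0) at upper left); False follows OpenGL ((0,0) at lower left)."""
    sx = (x + 1) / 2 * width
    sy = (y + 1) / 2 * height
    if origin_top_left:
        sy = height - sy  # flip so +y (up in the cube) maps toward row 0
    return (sx, sy)
```

The center of the cube maps to the center of the screen under both conventions; only the corners that get (0, 0) differ.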

  21. Rendering pipeline
      - Rasterizer stage

  22. Keep reading Chapter 2
      Want a Williams-Sonoma internship?
      - Visit http://wsisupplychain.weebly.com/
      Interested in coaching 7- to 18-year-old kids in programming?
      - Consider working at theCoderSchool
      - For more information:
        - Visit https://www.thecoderschool.com/locations/westerville/
        - Contact Kevin Choo at kevin@thecoderschool.com
        - Ask me!
