Introduction to Games
Graphics Development
in Direct3D
Ross Brown
Games Technology Teaching Group Leader
Visual and Media Computing Research Group Leader
Faculty of Information Technology
Queensland University of Technology
Presentation Contents
Aims
Direct3D Graphics Pipeline
Modelling Objects
Animating Objects
Lighting Objects
Projecting Objects
Effects in HLSL
Aims
Introduce you to the main components of a DirectX 10 program
Provide pointers to further information
Outcomes
Basic principles of Direct3D
Graphics Pipeline
Setup and Utility Library
Object Modelling
Object Animation
Scene Lighting
Projections
HLSL
Approach
Direct3D sample BasicHLSL10.cpp (“Tiny”) deconstructed
Mix of theory and practice
Will cover key Direct3D calls
Will not cover small details
Assumptions
Some maths
Windows programming knowledge
Familiarity with callback functions and event-based programming
Graphics tutorial, not a games engine tutorial
VC++ knowledge – the games lingua franca
Direct3D Introduction
Part of the Microsoft DirectX multimedia architecture
Also has a reference software renderer
DirectX 10 is fully integrated into Vista
Exposes Graphics Processing Unit (GPU) functionality via a COM API
The renderer is encapsulated in a device object
[Diagram: the Direct3D 10 driver stack – the Application, Direct3D and the User-Mode Driver in user mode; the GPU Scheduler, DXGKrnl, Video Memory Manager and Kernel-Mode Driver in kernel mode; the graphics hardware below]
http://download.microsoft.com/download/2/2/b/22bfadd8-01b0-4fc4-942b6e7b1635b214/Intro_to_Direct3D10.ppt
Direct3D Graphics Pipeline
The theoretical framework for computer graphics
Defines the processes enacted upon 3D geometry to produce a realistic image
Each API implementation has its own version
Two stages:
Geometry stage
Rasterisation stage
[Diagram: the Direct3D 10 pipeline – the Input Assembler (fed by vertex and index buffers), Vertex Shader, Geometry Shader and Stream Output form the geometry stage; the Rasterizer/Interpolator, Pixel Shader and Output Merger (writing depth/stencil and the render target) form the rasterisation stage; textures feed the vertex, geometry and pixel shaders]
http://download.microsoft.com/download/2/2/b/22bfadd8-01b0-4fc4-942b6e7b1635b214/Intro_to_Direct3D10.ppt
Geometry Stage
3D geometry is handed from the CPU over the bus to the geometry stage
Transforms these points to world coordinates
Applies lighting to the 3D points
Projects them to 2D normalised coordinates
[Diagram: geometry crosses the bus into the Input Assembler (vertex and index buffers), then through the Vertex Shader, Geometry Shader and Stream Output; the geometry stage emits 2D projected polygons to the Rasterizer/Interpolator]
Rasterisation Stage
Transformed, lit and projected points from the geometry stage are now turned into pixels
Rasterisation turns polygons into a series of pixel x, y coordinates within the framebuffer
Generates the final image
[Diagram: the Rasterizer/Interpolator and Pixel Shader feed the Output Merger, which writes depth/stencil and render-target pixels into the framebuffer]
Direct3D Program Structure
A Direct3D program is thus a set of the following steps:
1. Direct3D device setup
2. Modelling of objects as 3D polygons
3. Transformation of objects into world space
4. Lighting of objects
5. Projection of objects to 2D window
6. Rasterisation of polygons into framebuffer as pixels
Direct3D Device Setup
The DirectX utility toolkit (DXUT) removes complexity
There are a number of key callbacks
Other calls are superfluous to this talk’s requirements
WinMain sets up the callbacks on key events:
OnD3D10CreateDevice – the device is created
MsgProc – handles Windows event messages
OnD3D10FrameRender – renders the geometry to the screen
DirectX Device Setup
int WINAPI wWinMain( … )  // Program entry point
{
    …
    DXUTSetCallbackMsgProc( MsgProc );
    DXUTSetCallbackD3D10DeviceCreated( OnD3D10CreateDevice );
    …
    DXUTSetCallbackD3D10FrameRender( OnD3D10FrameRender );
    …
}
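The elided portions of wWinMain typically hold the remaining DXUT setup. A minimal sketch (the window title and back-buffer size here are illustrative assumptions, not from the sample):

DXUTInit( true, true, NULL );        // parse command line, show message boxes on error
DXUTCreateWindow( L"Tiny demo" );    // hypothetical window title
DXUTCreateDevice( true, 640, 480 );  // windowed mode, 640x480 back buffer
DXUTMainLoop();                      // enter the render/message loop
return DXUTGetExitCode();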
Modelling Objects as 3D Points
Concept of Cartesian coordinates
We define vertices v in 3D space via a Cartesian coordinate system x, y, z
The positive z axis points into the screen
Direct3D uses a left-handed coordinate system
3D points are generated by a modelling package, a program – or by hand for the seriously masochistic
[Diagram: a vertex v plotted at (vx, vy, vz) on left-handed x, y, z axes]
Modelling of Objects as 3D Meshes
A simple cube can be modelled with eight 3D points
Two points make an edge
Multiple points form a polygon
Polygons are formed into polyhedrons
A polyhedron is split into a mesh of triangles (see the sketch below)
[Diagram: a cube with vertices v1–v8 joined by labelled edges on the x, y, z axes]
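As an illustrative sketch only (not code from the sample), the same cube as an indexed triangle list – eight corner positions plus three indices per triangle, two triangles per face:

D3DXVECTOR3 cubeVerts[8] =
{
    D3DXVECTOR3( -1, -1, -1 ), D3DXVECTOR3(  1, -1, -1 ),
    D3DXVECTOR3(  1,  1, -1 ), D3DXVECTOR3( -1,  1, -1 ),
    D3DXVECTOR3( -1, -1,  1 ), D3DXVECTOR3(  1, -1,  1 ),
    D3DXVECTOR3(  1,  1,  1 ), D3DXVECTOR3( -1,  1,  1 )
};
WORD cubeIndices[] =        // front face only; the other five faces
{                           // follow the same clockwise (left-handed) winding
    0, 3, 2,   0, 2, 1,
    // …
};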
Polygon and Vertex Normals
Each polygon or vertex is given a normal
Normals are direction vectors n(x, y, z) indicating polygon orientation
Used in lighting calculations and other things
[Diagram: normals n1, n2, n3 drawn perpendicular to the mesh faces]
Polygon Texture Coordinates
Each mesh is then given 2D texture coordinates (s, t) to enable the draping of images onto the mesh
[Diagram: texture image space with corners t(0,0), t(1,0), t(0,1) and t(1,1) mapped onto the mesh in 3D space]
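Putting the pieces together, a vertex carrying position, normal and texture coordinates might look like this – a sketch matching the input layout shown later, with a hypothetical struct name:

struct SimpleVertex
{
    D3DXVECTOR3 Pos;     // x, y, z in model space
    D3DXVECTOR3 Normal;  // unit direction vector for lighting
    D3DXVECTOR2 Tex;     // s, t into the texture image
};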
Modelling of Objects as 3D Meshes
DirectX provides the X file format to store triangle meshes and other information
Direct3D commands to create the mesh are called to allocate memory for that mesh within the Direct3D device
Done on the CPU side
The mesh is handed to the GPU to be rendered
[Images: the texture, the mesh with its normals, and the final rendered mesh]
Direct3D File Format - Tiny.x
Mesh {
  4432;                              // vertex count
  -34.720058;-12.484819;48.088928;,  // x; y; z;
  -25.565304;-9.924385;26.239328;,
  …
MeshNormals {
  4432;
  -0.989571;-0.011953;-0.143551;,
  -0.433214;-0.193876;-0.880192;,
  …
MeshTextureCoords {
  4432;
  0.551713;0.158481;,                // s; t;
  0.442939;0.266364;,
  …
Direct3D Mesh Creation Calls
// Create our vertex input layout (semantic names are narrow strings)
const D3D10_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, …
    { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT, …
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, …
};
pd3dDevice->CreateInputLayout( … &g_pVertexLayout …
// Load the mesh
DXUTFindDXSDKMediaFileCch( str, … L"tiny\\tiny.x" );
g_Mesh10.Create( pd3dDevice, str, … );
Extra Points on Modelling
Meshes are stored in complex data structures called Scene Graphs
Tree structures containing meshes, textures and transformations (see the sketch below)
The scene graph is traversed and preprocessed to optimise the transmission of geometry to the GPU
The scene graph becomes a major component of a games engine
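A minimal scene-graph node sketch, assuming a hypothetical structure (this is not part of DXUT; ID3DX10Mesh is just one possible choice of mesh type):

#include <vector>

struct SceneNode
{
    D3DXMATRIX               mLocal;    // transform relative to the parent node
    ID3DX10Mesh*             pMesh;     // geometry at this node, may be NULL
    std::vector<SceneNode*>  children;  // subtrees inherit this node's transform
};
// Traversal concatenates mLocal matrices parent-to-child,
// then draws each mesh with the accumulated world matrix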
Tiny mesh demo
Transformation – Object to World
Once the points have crossed the bus to the GPU, they undergo a process of geometric transformation
Each object has been modelled in its own Model Space – usually about the origin for the sake of sanity
You need to perform a Model to World transform to make the object take its place in the World
[Diagram: the object at the origin of the Model Space axes, then placed in the scene on the World Space axes]
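As a sketch, a model-to-world transform can be assembled from D3DX helpers (the angle and offsets below are illustrative):

D3DXMATRIX mRot, mTrans, mWorld;
D3DXMatrixRotationY( &mRot, D3DX_PI / 4.0f );         // orient the model
D3DXMatrixTranslation( &mTrans, 10.0f, 0.0f, 5.0f );  // place it in the world
mWorld = mRot * mTrans;  // D3DX composes left-to-right (row vectors),
                         // so the rotation is applied first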
Transformation – World to Camera
Then you orient the world so that it exists in camera space
Called the View Transform
The camera’s view direction and coordinate system are made to match the x, y, z axes
[Diagram: the world axes x, y, z rotated into the camera axes xc, yc, zc, so that xc = x, yc = y, zc = z]
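A sketch of building such a view transform by hand with D3DX (the camera position is an illustrative assumption; the sample instead uses a camera helper class shown later):

D3DXVECTOR3 vEye( 0.0f, 5.0f, -20.0f );  // camera position
D3DXVECTOR3 vAt( 0.0f, 0.0f, 0.0f );     // look-at target
D3DXVECTOR3 vUp( 0.0f, 1.0f, 0.0f );     // world up direction
D3DXMATRIX mView;
D3DXMatrixLookAtLH( &mView, &vEye, &vAt, &vUp );  // left-handed, as Direct3D uses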
Animation of Objects
To animate is to “give life”
Use parameterised transformations that change over time t
Thus for each frame the object has moved a little further, just like a movie
Typical transformations are Translate, Rotate and Scale
[Diagram: an object translated, rotated and scaled relative to the x, y, z axes]
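A sketch of driving such a parameterised transform per frame: fTime is the elapsed time DXUT passes to OnD3D10FrameRender, and mTrans reuses the placement matrix from the earlier sketch:

D3DXMATRIX mSpin;
D3DXMatrixRotationY( &mSpin, (float)fTime );  // angle grows with time t
mWorld = mSpin * mTrans;                      // spin the object, then place it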
DirectX Matrices
Transforms use a 4x4 matrix – homogeneous coordinates
Transforms are updated within onFrameRender
DirectX provides a D3DXMATRIX class
The world transformation is contained in mWorld
The view transformation is contained in mView
In general a transformed point is the matrix product

$$\begin{bmatrix} x' \\ y' \\ z' \\ 1 \end{bmatrix} =
\begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \\ m_{41} & m_{42} & m_{43} & m_{44} \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$

Translation Example – moving a point by (1, 1, 1):

$$\begin{bmatrix} x+1 \\ y+1 \\ z+1 \\ 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$
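The same translation built with D3DX. Note that D3DX multiplies row vectors (v·M), so the offsets sit in the fourth row of the matrix – the transpose of the column-vector form above:

D3DXMATRIX mTrans;
D3DXMatrixTranslation( &mTrans, 1.0f, 1.0f, 1.0f );  // _41, _42, _43 hold the offsets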
Lighting Objects
Light photons emanate from light sources and bounce off into our eyes
Simulated on a computer to produce the lighting effects we perceive
Computational models of lights are built into GPU hardware
Provide a number of parameters:
Normal
Light colour and direction
Material properties – essentially colour
Lighting Model
Integrated into the model
Each vertex has a light value calculated
The vertex forms a position relative to the light direction
The normal is used to calculate the orientation of the polygon
If the polygon faces the light it is brightly lit
If the polygon faces away from the light it is darker
[Diagram: a light with colour and direction, and mesh normals determining how brightly each polygon is lit]
Polygon Material Property
The incident light value is multiplied with the Material Property, or reflectance, of the object to produce its colour due to that light
Summed over all light sources for the final lighting value (see the formula below)
Properties are specified in the model file – the earlier Tiny.x slide
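Written out, this is the diffuse (Lambertian) model that the vertex shader shown later implements, for unit normal $\mathbf{n}$ and light directions $\mathbf{l}_i$ with colours $C_{\text{light},i}$:

$$C = C_{\text{mtrl,diffuse}} \sum_{i} C_{\text{light},i}\,\max(0,\; \mathbf{n} \cdot \mathbf{l}_i) \;+\; C_{\text{mtrl,ambient}}\, C_{\text{light,ambient}}$$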
Direct3D Lighting
D3DXVECTOR3 vLightDir[MAX_LIGHTS];
D3DXVECTOR4 vLightDiffuse[MAX_LIGHTS]; // Light colour
// Material properties of the mesh
D3DXCOLOR colorMtrlDiffuse(1.0f, 1.0f, 1.0f, 1.0f);
D3DXCOLOR colorMtrlAmbient(0.35f, 0.35f, 0.35f, 0);
Projection of Objects
After the polygon mesh has been transformed and lit, it is projected to be shown on the 2D screen
Performed by the projection matrix mProj
Many projections exist: parallel, oblique...
Perspective projection is shown here, as it is commonly used in games
[Image: a perspective scene with a horizon and vanishing point]
Perspective Projection
Perspective projection was discovered in the Renaissance period
Gives a sense of depth by shrinking objects’ x, y by their distance in z
Encoded into the 4x4 mProj matrix as the last transform for vertices in the polygon mesh
Vertices are then scaled to the viewport, for rasterisation into the framebuffer
Tiny demo of transformations
[Diagram: a point P(x, y, z) projected onto the projection plane z = d, giving the projected height yp; the eye sits at z = 0]
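From similar triangles in the diagram (eye at the origin, projection plane at z = d), the projected coordinates are:

$$x_p = \frac{d\,x}{z}, \qquad y_p = \frac{d\,y}{z}$$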
Coordinate Spaces in Direct3D Pipeline
[Flow diagram:]
Pmodel (Model Space)
→ Mworld → World Space
→ Mview → Camera Space
→ Mproj → Projection Space (homogeneous clipping coordinates), where clipping occurs
→ Divide by W → Homogeneous Screen Space
→ Mvs (viewport scale) → Screen Space (Pscreen)
Direct3D Projection
CModelViewerCamera g_Camera;  // Camera object contains viewing matrices
D3DXMATRIX mWorldViewProjection;
D3DXMATRIX mWorld;
D3DXMATRIX mView;
D3DXMATRIX mProj;

g_Camera.SetProjParams( D3DX_PI/4, fAspectRatio, … );
// Get the projection & view matrices from the camera class
mWorld = g_mCenterMesh * *g_Camera.GetWorldMatrix();
mProj = *g_Camera.GetProjMatrix();
mView = *g_Camera.GetViewMatrix();
mWorldViewProjection = mWorld * mView * mProj;
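Without the camera helper class, the same projection matrix could be built directly (the near and far plane values here are illustrative):

D3DXMatrixPerspectiveFovLH( &mProj,
                            D3DX_PI / 4,     // 45 degree vertical field of view
                            fAspectRatio,
                            0.1f, 1000.0f ); // near and far clip planes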
Rasterisation of Polygons
Concludes the Geometry Stage of the pipeline
We now enter the final Rasterisation stage of the DirectX pipeline
So far all the work we have looked at has been performed in a real-valued mathematical space
We now have to make those real-valued meshes appear within the discrete-valued framebuffer in video memory
Triangle Rasterisation
The hardware takes the triangles and turns them into scanlines to be put into the framebuffer
This is why triangles are used – they have good mathematical properties for rasterisation
Light colours calculated in the geometry stage are linearly interpolated across the triangle’s surface (see the formula below)
[Diagram: a triangle with light colours 1, 2 and 3 at its vertices, and the pixels generated inside it]
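This interpolation is conventionally written with barycentric weights (a standard formulation, not specific to Direct3D): a pixel at barycentric coordinates $(\alpha, \beta, \gamma)$ inside the triangle receives

$$C = \alpha C_1 + \beta C_2 + \gamma C_3, \qquad \alpha + \beta + \gamma = 1$$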
Final Texture and Lighting Pixel Value
This is where the texture is sampled, so that the surface of the triangle has the image appear upon it
So the final colour of the pixel is a combination of the lighting value and the texture value
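In this sample the combination is a simple per-channel multiply, matching the pixel shader shown later:

$$C_{\text{pixel}} = C_{\text{texture}}(s, t) \times C_{\text{diffuse}}$$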
HLSL – Short for Eye Candy!
Previously we have described the components of the DirectX Graphics Pipeline as fixed functions
Modern video cards from NVIDIA and ATI are now programmable via small functions called Shaders
These programs are inserted into the geometry and rasterisation stages of the pipeline
[Diagram: the pipeline with its programmable points highlighted – the Vertex Shader transforming V(x,y,z) to V'(x,y,z), the Geometry Shader, and the Pixel Shader operating on rasterised P(x,y) before the Output Merger writes depth/stencil and the render target]
HLSL – Just for the Effect!
Thus the previous stages described are performed by shaders
More formally they are known as Kernel Functions, and the processors are called Stream Processors
For a while they had to be programmed in assembler – first Xbox
Now they are programmed in a number of high-level C-like languages
HLSL is one of those offerings, jointly developed by Microsoft and NVIDIA
HLSL - Structure
Two forms of shader are used in this sample (Direct3D 10 also adds geometry shaders):
Vertex – acting on each vertex that is loaded into the pipeline; output is passed to the pixel shader
Pixel – acting on each pixel generated in the rasterisation stage; output is placed into the framebuffer
Each is stored in the .fx file format (see the sketch below)
Loaded into the Direct3D device on the GPU and compiled ready for execution
DirectX 10 is ALL shaders…
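For orientation, a sketch of how a .fx file binds the two shaders into a technique. The names and entry points follow the conventions of the sample's shaders shown later, but this exact block is illustrative:

technique10 RenderScene
{
    pass P0
    {
        SetVertexShader( CompileShader( vs_4_0, RenderSceneVS() ) );
        SetGeometryShader( NULL );  // not used in this sample
        SetPixelShader( CompileShader( ps_4_0, RenderScenePS() ) );
    }
}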
HLSL
Set the parameters to pass across the bus:
First get a pointer to the parameter structures
Set them before you render the mesh
Shaders are automatically called when a polygon is rendered
Can switch between shaders for visual effects
Allows for multipass algorithms via a layered approach – like Photoshop
HLSL
For our example “Tiny”:
The vertex shader implements the transforms and lighting set up by the DirectX API on the CPU side
The pixel shader samples the texture used for the surface and blends this with the lighting value to give a final pixel value in the framebuffer
HLSL – Shader Load (OnD3D10CreateDevice)
// Load and compile shaders as effects
DXUTFindDXSDKMediaFileCch( str, … L"BasicHLSL10.fx" );
D3DX10CreateEffectFromFile( str, … pd3dDevice, … &g_pEffect10 );
// Obtain variables
g_pLightDir = g_pEffect10->GetVariableByName( "g_LightDir" )->AsVector();
HLSL – Set Parameters (onFrameRender)
g_pLightDir->SetRawValue( vLightDir, 0, sizeof(D3DXVECTOR3)*MAX_LIGHTS );
g_pLightDiffuse->SetFloatVectorArray( (float*)vLightDiffuse, 0, MAX_LIGHTS );
g_pmWorldViewProjection->SetMatrix( (float*)&mWorldViewProjection );
g_pmWorld->SetMatrix( (float*)&mWorld );
g_pfTime->SetFloat( (float)fTime );
g_pnNumLights->SetInt( g_nNumActiveLights );
HLSL – Rendering with Shaders (onFrameRender)
pd3dDevice->IASetInputLayout( g_pVertexLayout );
// Apply the technique contained in the effect
pRenderTechnique->GetDesc( &TechDesc );
ID3D10EffectPass* pCurrentPass = pRenderTechnique->GetPassByIndex( iPass );
pCurrentPass->Apply( 0 );
// Render the mesh with the applied technique
g_Mesh10.Render( pd3dDevice );
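In full, rendering usually loops over every pass the technique declares; a sketch using the names from the slide above:

D3D10_TECHNIQUE_DESC TechDesc;
pRenderTechnique->GetDesc( &TechDesc );
for( UINT iPass = 0; iPass < TechDesc.Passes; ++iPass )
{
    pRenderTechnique->GetPassByIndex( iPass )->Apply( 0 );  // set shaders and state
    g_Mesh10.Render( pd3dDevice );                          // draw with this pass
}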
HLSL – Vertex Shader (HLSL10.fx)
VS_OUTPUT RenderSceneVS( float4 vPos : POSITION,
                         float3 vNormal : NORMAL,
                         float2 vTexCoord0 : … )
Output.Position = mul( vAnimatedPos, g_mWorldViewProjection );
vTotalLightDiffuse += g_LightDiffuse[i] * max( 0, dot( vNormalWorldSpace, g_LightDir[i] ) );
Output.Diffuse.rgb = g_MaterialDiffuseColor * vTotalLightDiffuse +
                     g_MaterialAmbientColor * g_LightAmbient;
return Output;  // Pass light and coordinate info to the pixel shader
HLSL – Pixel Shader (HLSL10.fx)
PS_OUTPUT RenderScenePS( VS_OUTPUT In, … )
// Get value from texture and multiply with light
Output.RGBColor = g_MeshTexture.Sample( MeshTextureSampler, In.TextureUV ) * In.Diffuse;
return Output;  // Final value in the framebuffer
Final Output
More Eye Candy!
www.nvidia.com
Resources
www.microsoft.com/directx
developer.nvidia.com
developer.ati.com
www.gamedev.net
Beginning Direct3D Game Programming, Second Edition (Game Programming) – Wolfgang Engel
ShaderX series – Wolfgang Engel
© 2006 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries.
The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market conditions, it should not
be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation.
MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.