DirectX funda


INTRODUCTION:

Graphics performance has been increasing steadily since 1995, when the first 3dfx Voodoo card was released. The card made graphics run faster, but arguably not better, because of its restrictions: developers were limited to hard-coded, fixed-pipeline algorithms.

To tackle such overheads and give developers the freedom to implement a programmable pipeline for impressive real-time rendering, the principal 3D APIs (DirectX & OpenGL) evolved.

In the DirectX API, after successful initialization of the Direct3D and Direct3DDevice interfaces, an object is rendered either through a fixed, semi-static flow, which restricts scenes to single-colored flat polygons, or through a robust dynamic flow that integrates depth into the scene to achieve realism.

THE RENDERING PIPELINE:

The rendering process is a series of transformations that take the *object from the coordinate system in which it is encoded to the coordinate system of the display surface.

*An object is represented as a triangle mesh: a list of triangles that approximates the shape and contours of the object.

Stages of the Rendering Pipeline: (PLUMBING)

Fixed semi-static flow

Robust dynamic flow

The model used in Microsoft's Direct3D is based on a transformation sequence that starts with polygon vertices being fed into a transformation pipeline. The pipeline performs world, view, projection, and clipping transformations before data is sent to the rasterizer for display (a sketch of this sequence follows the list below).

Some techniques of rendering:

Vertex: A vertex is a special kind of point of a polygon, polyhedron, or polytope, formed by the intersection of edges, faces, or facets of the object.

There are two different ways of processing vertices:

1. The "fixed-function" pipeline: This is the standard Transform & Lighting (T&L) pipeline, where the functionality is essentially fixed. The T&L pipeline is controlled by setting render states, matrices, and lighting and material parameters.

2. Vertex shaders: This is the mechanism introduced in DirectX 8. Instead of setting parameters to control the pipeline, you write a vertex shader program that executes on the graphics hardware.
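
To illustrate the fixed-function transformation sequence, here is a minimal Direct3D 9 sketch (not from this document; g_pDevice is an assumed, already-initialized device and all values are illustrative):

#include <d3d9.h>
#include <d3dx9.h>

extern IDirect3DDevice9* g_pDevice; // assumed to be created during initialization

void SetupTransforms(float angle)
{
    // World transform: rotate the model about the Y axis.
    D3DXMATRIX world;
    D3DXMatrixRotationY(&world, angle);
    g_pDevice->SetTransform(D3DTS_WORLD, &world);

    // View transform: a camera at (0, 3, -5) looking at the origin.
    D3DXMATRIX view;
    D3DXVECTOR3 eye(0.0f, 3.0f, -5.0f), at(0.0f, 0.0f, 0.0f), up(0.0f, 1.0f, 0.0f);
    D3DXMatrixLookAtLH(&view, &eye, &at, &up);
    g_pDevice->SetTransform(D3DTS_VIEW, &view);

    // Projection transform: 45-degree field of view, 4:3 aspect, near/far planes.
    D3DXMATRIX proj;
    D3DXMatrixPerspectiveFovLH(&proj, D3DX_PI / 4.0f, 4.0f / 3.0f, 1.0f, 100.0f);
    g_pDevice->SetTransform(D3DTS_PROJECTION, &proj);
}

Clipping and rasterization then happen inside the pipeline; the application only supplies the matrices.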

Vertex shaders are small programs written mainly to transform the vertices from the *vertex buffer into 3D space. The vertex shader program is called by the GPU for each vertex it needs to process. For example, a 5,000-polygon model will run your vertex shader program 15,000 times each frame (three vertices per triangle) just to draw that single model. So if you lock your graphics program to 60 fps, it will call your vertex shader 900,000 times a second to draw just those 5,000 triangles.

*Vertex Buffer: Each of the triangles in the sphere model has three points to it; we call each point a vertex. So for us to render the sphere model we need to put all the vertices that form the sphere into a special data array that we call a vertex buffer.
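
As a sketch (Direct3D 9 style; the vertex layout and names are illustrative, not from this document), filling and drawing a vertex buffer looks roughly like this:

#include <d3d9.h>
#include <string.h>

struct Vertex { float x, y, z; DWORD color; };
#define VERTEX_FVF (D3DFVF_XYZ | D3DFVF_DIFFUSE)

// Copy an application-side vertex array into a GPU vertex buffer and draw it.
void DrawMesh(IDirect3DDevice9* dev, const Vertex* verts, UINT count)
{
    IDirect3DVertexBuffer9* vb = NULL;
    dev->CreateVertexBuffer(count * sizeof(Vertex), 0, VERTEX_FVF,
                            D3DPOOL_MANAGED, &vb, NULL);

    void* data = NULL;
    vb->Lock(0, 0, &data, 0);            // lock the whole buffer
    memcpy(data, verts, count * sizeof(Vertex));
    vb->Unlock();

    dev->SetStreamSource(0, vb, 0, sizeof(Vertex));
    dev->SetFVF(VERTEX_FVF);
    dev->DrawPrimitive(D3DPT_TRIANGLELIST, 0, count / 3);

    vb->Release();                       // in real code, create once and reuse
}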

WHY VERTEX SHADERS:

To bypass the fixed T&L pipeline

To avoid the redundancy that occurs when shared vertex attribute calculations are processed by both the geometry engine and the CPU

To achieve unique & revolutionary graphical effects

More general syntax for common operations

Vertex shaders even give users the ability to decompress compressed position, normal, color, matrix, and texture coordinate data, saving a lot of valuable bandwidth at no additional cost

More scalable

APPLICATIONS OF VERTEX SHADERS:

Procedural geometry (cloth simulation, soap bubble)

Advanced vertex blending for skinning and vertex morphing (tweening)

Texture generation

Advanced key frame interpolation (complex facial expression and speech)

Particle system rendering

Real-time modifications of the perspective view (lens effects, underwater effects)

Advanced lighting models (often in cooperation with the pixel shader)

First steps to displacement mapping

& many more, including effects previously realized only on SGI hardware

The first & foremost effect used with a vertex shader is vertex lighting. Earlier, in the T&L pipeline, scenes were rendered using default lighting, which meant everything was lit equally & brightly, leaving no room for realism.

Lighting helps set the mood of a scene, be it dim lighting, which instills fear & a sense of unease in the player, or bright lighting, which conveys happiness & comfort instead.

Direct3D provides the developer with multiple types of lights to use within the scene.

Ambient Light:

An ambient light source illuminates the entire scene. It has no orientation or position and is unaffected by the surface characteristics of the illuminated objects. Because all objects are illuminated with equal strength, the position and orientation of the frame are inconsequential.

Directional Light:

A directional light has a specific orientation, but no position. It appears to illuminate all objects with equal intensity. This light source is often used to simulate the effect of distant sources, such as the sun. Rendering speed is highest for directional lights.

Parallel Point Light:

The parallel point light is a variation of the directional light. The orientation of a parallel point light is determined by the position of the source. Whereas a directional light source has orientation but no position, a parallel point light source has both orientation and position. Its rendering performance is similar to that of the directional source.

Point Light:

A point light source radiates light equally in all directions. This light is computationally expensive to render, because retained mode must calculate a new lighting vector for every face it illuminates. On the other hand, a point light source produces a more faithful lighting effect; when visual fidelity is a concern, it is the best option.

Spot Light:

A spotlight is a cone-shaped light with the source located at the cone's vertex. All objects within the cone are illuminated, but at two degrees of intensity: the central area of the cone is brightly lit and is called the umbra, while the surrounding section, which is dimly lit, is called the penumbra. In retained mode the angles of the umbra and penumbra can be specified individually. A sketch of a spotlight configuration follows below.
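
As an illustration (a sketch using the Direct3D 9 immediate-mode API rather than the retained mode mentioned above; all values are made up), a spotlight's umbra and penumbra correspond to the Theta and Phi cone angles:

#include <d3d9.h>
#include <d3dx9.h>

void SetupSpotlight(IDirect3DDevice9* dev)
{
    D3DLIGHT9 spot = {};                       // zero-initialize the light
    spot.Type = D3DLIGHT_SPOT;
    spot.Diffuse.r = spot.Diffuse.g = spot.Diffuse.b = 1.0f; // white light
    spot.Position = D3DXVECTOR3(0.0f, 5.0f, 0.0f);   // above the scene
    spot.Direction = D3DXVECTOR3(0.0f, -1.0f, 0.0f); // pointing straight down
    spot.Range = 20.0f;                        // maximum reach of the light
    spot.Theta = D3DX_PI / 8.0f;               // inner cone (umbra): fully lit
    spot.Phi = D3DX_PI / 4.0f;                 // outer cone (penumbra): dimly lit
    spot.Falloff = 1.0f;                       // falloff between the two cones
    spot.Attenuation1 = 0.1f;                  // linear attenuation with distance
    dev->SetLight(0, &spot);
    dev->LightEnable(0, TRUE);
}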

A scene can include any combination of the light sources mentioned above. Along with selecting the light sources, the developer needs to select the color of the object, which in turn is determined by its material.

A material is determined by the four types of light that the object can reflect and by the sharpness of its specular highlights. These types of light are: diffuse, ambient, specular, and emissive.

For creating lights in DirectX there is no single function call; the developer needs to do the following:

1. Setting Up a New Flexible Vertex Format

2. Turning the Lighting On

3. Setting the Ambient Light

4. Creating and Setting the Diffuse Light

5. Creating and Setting the Material

The detailed explanation of these steps is given in the source code attached along with this document; a minimal sketch follows.
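
Here is a rough Direct3D 9 illustration of the five steps (a sketch under assumed names and values, not the attached source):

#include <d3d9.h>
#include <d3dx9.h>

// Step 1: a flexible vertex format that carries a normal for lighting.
struct LitVertex { float x, y, z; float nx, ny, nz; };
#define LIT_FVF (D3DFVF_XYZ | D3DFVF_NORMAL)

void SetupLighting(IDirect3DDevice9* dev)
{
    dev->SetFVF(LIT_FVF);                                          // step 1
    dev->SetRenderState(D3DRS_LIGHTING, TRUE);                     // step 2: turn lighting on
    dev->SetRenderState(D3DRS_AMBIENT, D3DCOLOR_XRGB(40, 40, 40)); // step 3: dim ambient light

    // Step 4: create and set a white directional light for the diffuse term.
    D3DLIGHT9 light = {};
    light.Type = D3DLIGHT_DIRECTIONAL;
    light.Diffuse.r = light.Diffuse.g = light.Diffuse.b = 1.0f;
    light.Direction = D3DXVECTOR3(-1.0f, -0.3f, -1.0f);
    dev->SetLight(0, &light);
    dev->LightEnable(0, TRUE);

    // Step 5: create and set a material describing how the object reflects light.
    D3DMATERIAL9 mat = {};
    mat.Diffuse.r = mat.Diffuse.g = mat.Diffuse.b = mat.Diffuse.a = 1.0f;
    mat.Ambient = mat.Diffuse;
    dev->SetMaterial(&mat);
}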

In DirectX 10 and above, the robust dynamic flow of rendering is done through HLSL.

HLSL:

HLSL is the language we use in DirectX 10 to code these small vertex and pixel shader programs. The

syntax is pretty much identical to the C language with some pre-defined types. HLSL program files are

composed of global variables, type defines, vertex shaders, pixel shaders, and geometry shaders.
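
For illustration, here is a sketch (not from this document) that compiles a trivial HLSL vertex shader from a C++ string using D3DCompile from d3dcompiler.h; the names and the vs_4_0 profile are assumptions appropriate for DirectX 10-class hardware:

#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

// A minimal HLSL program: one global constant buffer, type defines,
// and a vertex shader entry point, mirroring the file layout described above.
static const char g_shaderSource[] =
    "cbuffer PerFrame { float4x4 worldViewProj; };      \n"
    "struct VSInput  { float4 pos : POSITION; };        \n"
    "struct VSOutput { float4 pos : SV_POSITION; };     \n"
    "VSOutput VSMain(VSInput input)                     \n"
    "{                                                  \n"
    "    VSOutput output;                               \n"
    "    output.pos = mul(input.pos, worldViewProj);    \n"
    "    return output;                                 \n"
    "}                                                  \n";

bool CompileVertexShader(ID3DBlob** bytecode)
{
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompile(g_shaderSource, sizeof(g_shaderSource) - 1,
                            nullptr, nullptr, nullptr,
                            "VSMain", "vs_4_0", 0, 0, bytecode, &errors);
    if (errors) errors->Release(); // compiler messages would be in this blob
    return SUCCEEDED(hr);
}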

FRAMEWORK:

When calculating lighting, the only thing the program looks at is the vertices. For example, on entering a game scene, the situation somewhat resembles this:

In this example the light has a small radius, yet it still manages to illuminate the entire triangle. This is because the small light radius touches a vertex of the triangle mesh.

The problem arises when the light is not able to reach a vertex: it is then unable to light the triangle.

Shadowing is another in-game problem that appears when lighting is not rendered properly.

All these in-game issues can be sorted out using properly tessellated geometry.

The mesh on the left might become completely black in game for the reasons explained previously. The mesh on the right, however, is tessellated and will react better to lighting because it contains more vertices. The lighting can be more precise, and it is less of an issue if parts of the mesh are stuck in other geometry.

*Note: Attached meshes (objects placed on one another) & complex triangulations can also lead to bad in-game lighting.

*FIND THE SOURCE CODE ATTACHED ALONG WITH THIS DOCUMENT

TEXTURING:

Texturing allows us to add photorealism to our scenes by applying photographs and other images onto

polygon faces.

The format of the textures we will be using is .dds. This is the DirectDraw Surface format that DirectX uses. The tool used to produce .dds files comes with the DirectX SDK: under DirectX Utilities, it is called DirectX Texture Tool. You can create a new texture of any size and format, then cut and paste your image or another texture format onto it and save it as a .dds file. It is very simple to use.

To map pixels from the .dds image onto the polygon we use what is called the texel coordinate system. This system converts the integer pixel position into a floating-point value between 0.0f and 1.0f. For example, if a texture is 256 pixels wide, the first pixel maps to 0.0f, the 256th pixel maps to 1.0f, and the middle pixel, 128, maps to 0.5f.

In the texel coordinate system the width value is named "U" and the height value is named "V". The

width goes from 0.0 on the left to 1.0 on the right. The height goes from 0.0 on the top to 1.0 on the

bottom. For example top left would be denoted as U 0.0, V 0.0 and bottom right would be denoted as U

1.0, V 1.0.
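
As a sketch (the vertex layout is an assumption, in Direct3D 9 style), a textured quad maps its corners to the texel coordinate corners described above:

#include <d3d9.h>

struct TexturedVertex { float x, y, z; float u, v; };
#define TEX_FVF (D3DFVF_XYZ | D3DFVF_TEX1)

// A unit quad as two clockwise triangles; U runs left to right, V top to bottom.
TexturedVertex quad[6] =
{
    { -1.0f,  1.0f, 0.0f, 0.0f, 0.0f }, // top left     -> U 0.0, V 0.0
    {  1.0f,  1.0f, 0.0f, 1.0f, 0.0f }, // top right    -> U 1.0, V 0.0
    { -1.0f, -1.0f, 0.0f, 0.0f, 1.0f }, // bottom left  -> U 0.0, V 1.0
    { -1.0f, -1.0f, 0.0f, 0.0f, 1.0f }, // bottom left
    {  1.0f,  1.0f, 0.0f, 1.0f, 0.0f }, // top right
    {  1.0f, -1.0f, 0.0f, 1.0f, 1.0f }, // bottom right -> U 1.0, V 1.0
};

The .dds texture itself can be loaded with D3DXCreateTextureFromFile and selected with SetTexture before drawing.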

Framework:

The source code is explained in detail & attached along with this document.

BUMP MAPPING:

The proper terminology for the bump mapping technique we use is *normal mapping. The reason is that we use a special texture called a normal map, which is essentially a lookup table for surface normals. Each pixel in this normal map indicates the light direction for the corresponding pixel on the color texture map. This technique works by using a normal specified at each pixel, instead of using the normal specified at each vertex and then "interpolated" across the surface.

*NORMAL:

Each face in a mesh has a perpendicular unit normal vector. The

vector's direction is determined by the order in which the vertices

are defined and by whether the coordinate system is right- or left-handed. The face normal points away from the front side of the face.

In Direct3D, only the front of a face is visible. A front face is one in

which vertices are defined in clockwise order.
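
As an illustration (a hypothetical helper using the standard cross-product construction; not from this document), the face normal of a clockwise-wound triangle can be computed as:

#include <cmath>

struct Vec3 { float x, y, z; };

// Face normal from three vertices defined in clockwise order in Direct3D's
// left-handed system: the cross product of the two edge vectors, normalized.
Vec3 FaceNormal(const Vec3& v0, const Vec3& v1, const Vec3& v2)
{
    Vec3 e1 = { v1.x - v0.x, v1.y - v0.y, v1.z - v0.z };
    Vec3 e2 = { v2.x - v0.x, v2.y - v0.y, v2.z - v0.z };
    Vec3 n = { e1.y * e2.z - e1.z * e2.y,
               e1.z * e2.x - e1.x * e2.z,
               e1.x * e2.y - e1.y * e2.x };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}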

We get the normal from the normal specified for each vertex, which is then interpolated across the face

for the pixel shader, where we will do our normal mapping.

Once color-map textures are applied to the scene, we use normal mapping together with lighting to give the scene a more realistic look at lower cost and with greater feasibility.

The tools that create normal maps take the x, y, z coordinates and translate them to red, green, blue

pixels with the intensity of each color indicating the angle of the normal they represent. The normal of

our polygon surface is still calculated the same way as before. However the two other normals we need

to calculate require the vertex and texture coordinates for that polygon surface. These two normals are

called the tangent and binormal.

We will need to do some precalculation to determine the binormal and tangent vector using the normal

and texture coordinates. Also note that you should never do this inside the shader as it is fairly

expensive with all the floating point math involved.
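
A sketch of that CPU-side precalculation for a single triangle (the function and vertex layout are hypothetical; this is the standard derivation from positions and texture coordinates):

#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Tangent and binormal for one triangle, from its positions (p0..p2) and
// texture coordinates (t0..t2): solve E1 = du1*T + dv1*B, E2 = du2*T + dv2*B
// for the tangent T and binormal B.
void CalculateTangentBinormal(Vec3 p0, Vec3 p1, Vec3 p2,
                              Vec2 t0, Vec2 t1, Vec2 t2,
                              Vec3& tangent, Vec3& binormal)
{
    Vec3 e1 = { p1.x - p0.x, p1.y - p0.y, p1.z - p0.z };
    Vec3 e2 = { p2.x - p0.x, p2.y - p0.y, p2.z - p0.z };
    float du1 = t1.u - t0.u, dv1 = t1.v - t0.v;
    float du2 = t2.u - t0.u, dv2 = t2.v - t0.v;

    float den = 1.0f / (du1 * dv2 - du2 * dv1);
    tangent  = { (dv2 * e1.x - dv1 * e2.x) * den,
                 (dv2 * e1.y - dv1 * e2.y) * den,
                 (dv2 * e1.z - dv1 * e2.z) * den };
    binormal = { (du1 * e2.x - du2 * e1.x) * den,
                 (du1 * e2.y - du2 * e1.y) * den,
                 (du1 * e2.z - du2 * e1.z) * den };

    // Normalize both vectors before storing them per vertex.
    float lt = std::sqrt(tangent.x * tangent.x + tangent.y * tangent.y + tangent.z * tangent.z);
    tangent = { tangent.x / lt, tangent.y / lt, tangent.z / lt };
    float lb = std::sqrt(binormal.x * binormal.x + binormal.y * binormal.y + binormal.z * binormal.z);
    binormal = { binormal.x / lb, binormal.y / lb, binormal.z / lb };
}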

In the pixel shader, the sampled normal-map value (expanded from the [0, 1] color range to the [-1, 1] vector range) is then combined with the interpolated basis vectors:

bumpNormal = (bumpMap.x * input.tangent) + (bumpMap.y * input.binormal) + (bumpMap.z * input.normal);

Framework:

The source code is explained & attached along with this document.

References:

Direct3D ShaderX: Vertex & Pixel Shader Tips and Tricks by Wolfgang F. Engel

Learn Vertex & Pixel Shader Programming by James C. Leiterman

Beginning DirectX 9 by Wendy Jones

www.rastertek.com