CSc 155 Lecture Note Slides
Dr. John Clevenger, Computer Science Dept.
CSUS
Bump Mapping
Overview
• Limitations of Texture Mapping
• Normal Perturbation Functions
• Obtaining Bump Functions
  o Explicit functions
  o Height Fields
  o Normal Maps
• Lighting & Tangent Space
• Normalization Cube Maps
Limitations Of Texture Mapping
• Texture mapping “paints” surfaces
  o Texture image is typically fixed
• Some characteristics are difficult to texture
  o “Roughness”, “Wrinkles”
• Some characteristics might change over time
• Texture illumination direction is fixed
  o Texture looks “wrong” if scene lighting is different
Bumps: Perturbed Normals
Surface normals on a real “bump”
Surface normals on a flat polygon
Modified (“perturbed”) normals
Bump Functions
[Figure: plot of a sinusoidal bump function b(u) over u ∈ [0, 1], a sine wave of frequency 8π with values on the order of 0.1 to 0.2]
Original parametric surface Bump function
Modified “bumpy” surface
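As a concrete illustration of such a bump function, the sketch below evaluates a sinusoid of the form b(u) = a·sin(8πu) + a along with its slope db/du (the quantity used later to perturb normals). The amplitude a = 0.1 is an illustrative assumption, not a value stated on the slide.

```java
// Illustrative sinusoidal bump function and its analytic slope.
// The amplitude (0.1) and frequency (8*pi) are assumptions for this sketch.
public class BumpFunction {
    static final double A = 0.1;              // amplitude (assumed)
    static final double FREQ = 8 * Math.PI;   // frequency (assumed)

    // b(u): height offset to be added along the surface normal
    public static double b(double u) {
        return A * Math.sin(FREQ * u) + A;
    }

    // db/du: slope of the bump function, used to perturb the normal
    public static double slope(double u) {
        return A * FREQ * Math.cos(FREQ * u);
    }

    public static void main(String[] args) {
        // the bump height stays within [0, 2A] for all u
        for (double u = 0.0; u <= 1.0; u += 0.01) {
            double h = b(u);
            if (h < -1e-12 || h > 2 * A + 1e-9) throw new AssertionError();
        }
        System.out.println("ok");
    }
}
```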
Computing Perturbed Normals
• “New surface points” P'(u) are implied by the unit normal N̂ and the bump function b(u):

      P'(u) = P(u) + N̂·b(u)

[Figure: original surface P(u) with unit normal N̂, and the implied “bumpy surface” P'(u)]
Perturbed Normals (cont.)
• Approximating the normal at new surface point
  The new surface is

      P'(u) = P(u) + N̂·b(u)

  and its tangent is approximately

      T' ≈ T + N̂·(db(u)/du)        where db(u)/du = slope of b(u)

  A good approximation for the new normal (perpendicular to T') is then

      N' = N̂ − (db(u)/du)·T        [Blinn]

[Figure: bump function b(u); original surface P(u) with vectors T and N̂; new surface P'(u) with perturbed normal N']
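A tiny numeric sketch of this approximation (my own illustration, not from the slides): taking the orthonormal pair T = (1, 0) and N = (0, 1), the perturbed normal N' = N − b'(u)·T tilts away from T wherever the bump slope is positive.

```java
// Illustration of the 2D perturbed-normal approximation N' = N - b'(u)*T,
// assuming an orthonormal tangent T = (1,0) and normal N = (0,1).
public class PerturbedNormal2D {
    // returns the normalized perturbed normal {x, y} for a given bump slope
    public static double[] perturb(double slope) {
        double x = -slope;   // N - slope*T : x component
        double y = 1.0;      //               y component
        double len = Math.sqrt(x * x + y * y);
        return new double[] { x / len, y / len };
    }

    public static void main(String[] args) {
        double[] flat = perturb(0.0);     // zero slope: normal unchanged
        double[] tilted = perturb(1.0);   // positive slope: tilts away from T
        if (tilted[0] >= 0) throw new AssertionError();
        System.out.println("ok");
    }
}
```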
Perturbed Normals (cont.)
• Relationship of bump function slope to modified normals
[Figure: bump function b(u) with three sample points and their perturbed normals N'1, N'2, N'3; where the slope is positive the normal tilts one way, where slope = 0 the normal is unperturbed, and where the slope is negative it tilts the opposite way]
Perturbed Normals (cont.)
• Similar steps apply for 3D surfaces:
      P'(u,v) = P(u,v) + N̂·b(u,v)

      T = slope in u direction = ∂P(u,v)/∂u
      B = slope in v direction = ∂P(u,v)/∂v
      N = T × B

  T, B, and N are vectors in local (“tangent”) space.

[Figure: parametric surface P(u,v) with vectors T, B, N, and the surface P'(u,v) implied by bump function b(u,v)]
Perturbed Normals (cont.)
• Approximating N':

      P'(u,v) = P(u,v) + N̂·b(u,v)

      T' = ∂P'(u,v)/∂u ≈ T + N̂·(∂b(u,v)/∂u)      (vector in T direction; ∂b/∂u = bump function slope in u)

      B' = ∂P'(u,v)/∂v ≈ B + N̂·(∂b(u,v)/∂v)      (vector in B direction; ∂b/∂v = bump function slope in v)

      N' = T' × B'

[Figure: surface P(u,v) with T, B, N, and the bumpy surface P'(u,v) with perturbed T', B', N']
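A minimal numeric sketch of these equations (my own illustration, taking T, B, N as the standard basis): T' and B' each pick up the bump slope along N, and the cross product T' × B' yields the perturbed normal, which works out to N − bu·T − bv·B before normalization.

```java
// Sketch: N' = T' x B', with T' = T + N*(db/du) and B' = B + N*(db/dv),
// assuming the local frame T = (1,0,0), B = (0,1,0), N = (0,0,1).
public class PerturbedNormal3D {
    public static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    // bu, bv: bump-function slopes in the u and v directions
    public static double[] perturbedNormal(double bu, double bv) {
        double[] tPrime = { 1.0, 0.0, bu };   // T + N*bu
        double[] bPrime = { 0.0, 1.0, bv };   // B + N*bv
        double[] n = cross(tPrime, bPrime);   // = (-bu, -bv, 1)
        double len = Math.sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
        return new double[] { n[0] / len, n[1] / len, n[2] / len };
    }

    public static void main(String[] args) {
        double[] flat = perturbedNormal(0.0, 0.0);   // flat surface: N' = N
        if (Math.abs(flat[2] - 1.0) > 1e-12) throw new AssertionError();
        System.out.println("ok");
    }
}
```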
Say What Again?
• In (somewhat) plain English:
o Given a parametric surface P(u,v),
o the existing normal N at (u,v) can be perturbed in each of the two parametric directions u and v,
o by adding small changes along the tangent (T) and binormal (B) directions respectively,
o where those changes are derived from the slopes (partial derivatives) of a bump function b(u,v).
o The new normal N’ is then used in the lighting and shading equations.
Example Bump-Mapped Objects
Obtaining Bump Functions
Bump Function Frag. Shader

/* This fragment shader generates a sinusoidal pattern defined by a bump function
 * f(x,y) = a*sin(bx)*sin(by). The partial derivatives of this function are used to
 * offset the true normal; the input texture coordinate values s&t are used for x and y.
 * Based on an example in Interactive Computer Graphics, 5th Ed., by Ed Angel. */
#version 330

in vec3 L;            //vector to Light
in vec3 V;            //vector to Viewpoint (eye)
in vec2 texCoord;     //fragment texture coordinate
out vec3 fragColor;   //fragment output color

void main() {
    float a = 0.5;                    //wave height
    float b = 10.0;                   //wave frequency
    vec3 N = vec3(0.0, 0.0, 1.0);     //local space normal

    //get the x,y values from the texture coordinates
    float x = texCoord.s;
    float y = texCoord.t;

    //modify x & y components of the normal using the partial derivatives of the function
    N.x = a*(b*cos(b*x))*sin(b*y);    //partial derivative with respect to x
    N.y = a*sin(b*x)*(b*cos(b*y));    //partial derivative with respect to y
    N = normalize(N);

    vec3 LL = normalize(L);           //ensure the light vector is normalized

    float Kd = max(dot(N, LL), 0.0);            //compute simple Lambertian reflection term
    vec3 materialColor = vec3(0.2, 0.2, 0.9);   //assume a mostly-blue material

    //output a color based on the hard-coded material, reduced by the Lambertian term
    fragColor = Kd * materialColor;
}
Height Fields
• Consider a monochrome image where pixel intensity represents “height”
  o Low values (black) = low height
  o High values (white) = high height

[Figure: a “Height Field” image shown head-on, in a “tilted” view, and in an “edge-on” view]
Height Fields (cont.)
Original Height Field Image
Expanded Portion
Forward Differences As Slope
• We can use the difference between adjacent pixels as an approximation of the “rate of change”…

      “rate of change” ≈ slope

  Pixel values:    0  43  86  129  172  215  250  255  250  215  172  129 …
  Forward diffs:     43  43   43   43   43   35    5   -5  -35  -43  -43 …
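This computation can be sketched directly (my own illustration, using the pixel row shown above):

```java
// Forward differences of a row of height-field pixel values:
// each entry approximates the slope between adjacent pixels.
public class ForwardDiffs {
    public static int[] forwardDifferences(int[] pixels) {
        int[] d = new int[pixels.length - 1];
        for (int i = 0; i < d.length; i++) {
            d[i] = pixels[i + 1] - pixels[i];
        }
        return d;
    }

    public static void main(String[] args) {
        int[] row = {0, 43, 86, 129, 172, 215, 250, 255, 250, 215, 172, 129};
        int[] d = forwardDifferences(row);
        // rising part of the height field gives positive slopes, falling part negative
        System.out.println(java.util.Arrays.toString(d));
        // -> [43, 43, 43, 43, 43, 35, 5, -5, -35, -43, -43]
    }
}
```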
Height Field Example
A bump map “height field” texture file A checker-board textured
bump-mapped cylinder
Normal Mapping
• Basic bump-mapping goal: alter the normal
• Alternative: replace normals
o Need a source of new normals…
• Normals (when normalized) are unit length
  o hence, (x*x)+(y*y)+(z*z) = 1, and therefore -1 <= x,y,z <= 1
• New normals can be stored in 3 bytes (e.g. in a color image):
      r = (Nx+1)/2    g = (Ny+1)/2    b = (Nz+1)/2
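A sketch of this packing (assuming the usual 8-bit quantization, channel = round(255·(N+1)/2); the 255 scale factor is an implementation detail the slide leaves implicit):

```java
// Pack a unit normal into 0..255 RGB channel values using (N+1)/2,
// and unpack with the inverse mapping. The 8-bit scale is assumed.
public class NormalPacking {
    public static int[] pack(double nx, double ny, double nz) {
        return new int[] {
            (int) Math.round(255 * (nx + 1) / 2),
            (int) Math.round(255 * (ny + 1) / 2),
            (int) Math.round(255 * (nz + 1) / 2)
        };
    }

    public static double[] unpack(int r, int g, int b) {
        return new double[] { 2.0 * r / 255 - 1, 2.0 * g / 255 - 1, 2.0 * b / 255 - 1 };
    }

    public static void main(String[] args) {
        int[] rgb = pack(0.0, 0.0, 1.0);   // straight-up normal -> mid, mid, max
        System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]);   // -> 128 128 255
    }
}
```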
Encoding Normals In RGB
• A true unit normal is perpendicular to the surface: N̂ = [0 0 1]
  o Need to represent ∆x/∆y “deviations” (positive or negative)
• RGB values can range from 0..255
  o Choose “midrange” R & G values to represent ∆x = ∆y = 0

                 R      G      B
  As ints:      128    128    255      (∆x = 0, ∆y = 0, z = 1)
  As floats:    0.5    0.5    1.0
Normal Mapping Example
Image credit: Paul Baker, www.paulsprojects.net
A Normal Map image file…
Diffusely lit textured torus
Normal-mapped torus
Lighting Calculations
• Problem: vectors are frequently not in the same “space”
  o E.g. the light position is (typically) in WORLD coordinates
• Solution: transform vectors to a common local or tangent space
  o Mapping is different for each vertex

[Figure: lighting vectors L, N, H, and V at a surface point]
Tangent Space
      TBN = | Tx  Ty  Tz  0 |
            | Bx  By  Bz  0 |
            | Nx  Ny  Nz  0 |
            |  0   0   0  1 |

Needed: the Light vector in Tangent Space, for each vertex.

Example:
    objSpaceLightPosition = inverseModelMatrix * worldLightPosition;
    objSpaceLightVector = objSpaceLightPosition - objSpaceVertPosition;
    vertTanSpaceLightVector = TBNobjSpace * objSpaceLightVector;
Coordinate spaces and transforms:

    Object Space --(Model Matrix)--> World Space --(View Matrix)--> Eye Space --(Proj Matrix)--> Clip Space
    Object Space ----------(ModelView Matrix)----------> Eye Space

    A TBN matrix transforms vectors from these spaces into Tangent Space.
Storing TBN values
• Must be computed/saved per vertex
Vertex3D
...
- normal : Vector3D
- tangent : Vector3D
- binormal : Vector3D
- tangentSpaceLight : Vector3D
...
Torus With TBN Values
• Sweep a ring of vertices about Y
[Figure: torus construction, Top View and Front View; a ring of vertices (“Ring 0”, defined by the Inner Radius and Outer Radius, with “Precision” vertices) is swept about the Y axis, and each vertex carries tangent (T), binormal (B), and normal (N) vectors]
Saving Torus TBN Values

/* This class defines a torus constructed as previously shown, but augmented with TBN values */
public class Torus extends Shape3D {
    ...
    private void initTorus(double innerRadius, double outerRadius, int precision) {
        for (int i=0; i<precision+1; i++) {
            //...code here as before to compute the torus ring 0 vertices...

            //set ring 0 coordinates for texture mapping
            vertices[i].setS(0.0f);
            vertices[i].setT((float)i/precision);

            //set the tangent, binormal, and normal vectors for the vertex
            vertices[i].setTangent(0.0f, 0.0f, -1.0f);   //"into screen", for right-handed
            Vector3D binormal = new Vector3D(0.0, -1.0, 0.0);
            binormal = binormal.mult(rotateZ);
            vertices[i].setBinormal(binormal);
            Vector3D normal = vertices[i].getBinormal().cross(vertices[i].getTangent());
            vertices[i].setNormal(normal);
        } //end for each vertex in first ring
        ...
        for (int ring=1; ring<precision+1; ring++) {     //for each new ring
            ...
            for (int i=0; i<precision+1; i++) {          //for each vertex of new ring
                //...code here as before to generate other ring vertices

                //set tex coords: S increases around the torus while T is constant
                vertices[ring*(precision+1)+i].setS(2.0f*ring/precision);
                vertices[ring*(precision+1)+i].setT(vertices[i].getT());

                //set the tangent space vectors
                vertices[ring*(precision+1)+i].setTangent(vertices[i].getTangent().mult(rotate));
                vertices[ring*(precision+1)+i].setBinormal(vertices[i].getBinormal().mult(rotate));
                vertices[ring*(precision+1)+i].setNormal(vertices[i].getNormal().mult(rotate));
            }
        }
        //...code here to generate torus indices as before...
    } //end initTorus()
    ...
}
Saving Tangent Space Lights

//compute the inverse MV matrix
double[] mvValues = new double[16];
gl.glGetDoublev(GL2.GL_MODELVIEW_MATRIX, mvValues, 0);   //get current MV
Matrix3D mv = new Matrix3D(mvValues);
Matrix3D mvInverse = mv.inverse();                       //form inverse of MV

//get the current position of the light -- note that in the fixed-function pipeline this will be in
// EYE coordinates; note also that this code assumes the light of interest is GL_LIGHT0
float[] eyeLightValues = new float[4];
gl.glGetLightfv(GL2.GL_LIGHT0, GL2.GL_POSITION, eyeLightValues, 0);
Point3D eyeLightPos = new Point3D(eyeLightValues);

//apply mvInverse to convert the light position from eye space to object space
Point3D objSpaceLightPos = eyeLightPos.mult(mvInverse);

//for each vertex, compute and save the vector from that vertex to the light in tangent space
for (int i=0; i<vertices.length; i++) {

    //compute vector from vertex to light in object space
    Vector3D objSpaceLightVector = new Vector3D(objSpaceLightPos.minus(vertices[i].getLocation()));

    //multiply objSpaceLightVector by the TBN matrix for this vertex, giving the tangent-space light vector.
    //Note that multiplying a vector by a matrix is the same as forming the dot product of the
    // vector with each row of the matrix.
    Vector3D tangent = vertices[i].getTangent();
    Vector3D binormal = vertices[i].getBinormal();
    Vector3D normal = vertices[i].getNormal();

    Vector3D tanSpaceLightVector = new Vector3D();
    tanSpaceLightVector.setX(tangent.dot(objSpaceLightVector));
    tanSpaceLightVector.setY(binormal.dot(objSpaceLightVector));
    tanSpaceLightVector.setZ(normal.dot(objSpaceLightVector));

    vertices[i].setTangentSpaceLightVector(tanSpaceLightVector);
}
Tangent Space In Shaders
• Application code:
  o Provide vertex tangent and normal vectors
• Vertex shader:
  o Use vertex tangent and normal to build TBN matrix
  o Use TBN to convert light and view vectors to “tangent space”
  o Compute vertex position using ModelViewProjection matrix
  o Send to fragment processor:
      vertex position
      texture coords
      L and V in tangent space
• Fragment shader (two approaches):
  o Shader-defined “bump function” in fragment shader (seen previously)
  o Store bump/normal map in a texture unit
Tangent Space Vertex Shader

/* This vertex shader expects the application to provide input attributes for vertex position,
 * object-space normal and tangent, and texture coordinates. It also expects the application to
 * assign uniforms for the relevant matrices (Model, View, and Projection each separately, plus
 * the Normal matrix), and also for the world light position.
 * The shader transforms the Light and View vectors into tangent space and forwards them to the
 * fragment shader. Based on an example in Interactive Computer Graphics, 5th Ed., by Ed Angel. */
#version 330

in vec3 vertPos;        //vertex position in object (model) space
in vec3 vertNormal;     //vertex normal in object space
in vec2 vertTexCoord;   //vertex texture coordinates
in vec3 vertTangent;    //vertex tangent in object space

out vec3 L;             //output vector from vertex to light, in tangent space
out vec3 V;             //output vector from vertex to eye, in tangent space
out vec2 texCoord;      //output vertex texture coordinates

uniform mat4 modelMat;             //the transform from object to world space
uniform mat4 viewMat;              //the transform from world to eye space
uniform mat4 projMat;              //the transform from eye to clip space
uniform mat4 normalMat;            //the transform for vectors from object space to eye space
uniform vec3 worldSpaceLightPos;   //the light position in the world

//...continued
Tangent Space Vertex Shader (cont.)

//vertex shader, cont.

void main() {
    //get eye-space vertex and light positions
    vec3 eyeSpaceVertexPos = vec3(viewMat * modelMat * vec4(vertPos, 1.0));
    vec3 eyeSpaceLightPos = vec3(viewMat * vec4(worldSpaceLightPos, 1.0));

    //compute normal, tangent, and binormal vectors in eye space
    //(w = 0 because these are direction vectors, not positions)
    vec3 N = normalize(vec3(normalMat * vec4(vertNormal, 0.0)));
    vec3 T = normalize(vec3(normalMat * vec4(vertTangent, 0.0)));
    vec3 B = cross(N, T);

    //compute light vector L in tangent space
    vec3 eyeSpaceVectorToLight = eyeSpaceLightPos - eyeSpaceVertexPos;
    L.x = dot(T, eyeSpaceVectorToLight);
    L.y = dot(B, eyeSpaceVectorToLight);
    L.z = dot(N, eyeSpaceVectorToLight);
    L = normalize(L);

    //compute view vector V in tangent space
    vec3 eyeSpaceVectorToEye = vec3(0.0) - eyeSpaceVertexPos;
    V.x = dot(T, eyeSpaceVectorToEye);
    V.y = dot(B, eyeSpaceVectorToEye);
    V.z = dot(N, eyeSpaceVectorToEye);
    V = normalize(V);

    //output the clip-space vertex position and texture coords
    gl_Position = projMat * viewMat * modelMat * vec4(vertPos, 1.0);
    texCoord = vertTexCoord;
}
Normal Mapping Frag. Shader

/* This fragment shader computes a diffuse lighting value for a fragment based on combining
 * a texture map with a normal map. It expects to receive the texture and normal maps via
 * uniform sampler2D variables from the application, along with tangent-space light and eye
 * (view) vectors from the vertex shader.
 * Based on an example in Interactive Computer Graphics, 5th Ed., by Edward Angel. */
#version 330

in vec3 L;          //vector from fragment to light, in tangent space
in vec3 V;          //vector from fragment to eye, in tangent space
in vec2 texCoord;   //fragment texture coordinate

uniform sampler2D texMap;      //texture to be applied to fragment
uniform sampler2D normalMap;   //normal vectors for fragments

out vec4 fragColor;

void main() {
    //get the normal from the normal map
    vec3 normal = texture(normalMap, texCoord.st).rgb;

    //unpack the normal: since normals are packed using (N+1)/2, we apply the inverse of that
    vec3 N = normalize(2.0 * normal - 1.0);

    //ensure the input light vector is normalized
    vec3 LL = normalize(L);

    //compute Lambertian diffuse coefficient from the normal and light vectors
    float Kd = max(dot(N, LL), 0.0);

    //get texel color from the texture map
    vec4 texColor = texture(texMap, texCoord.st);

    //output the texture color modulated by the diffuse coefficient
    fragColor = Kd * texColor;
}
Normalization Cube Maps
• Normalization is expensive
  o Especially when done for every pixel
      N̂ = N / |N| = [Nx Ny Nz] / sqrt(Nx² + Ny² + Nz²)
• Texture lookups are fast
• Solution: create a cube map such that accessing it with a vector (x,y,z) returns the normalized version of that vector
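One way to build such a map can be sketched as follows (my own illustration; the face layout and texel-center mapping are assumed conventions and vary by API; this fills only the +Z face, storing unit vectors packed with the (N+1)/2 encoding):

```java
// Build one face (+Z) of a normalization cube map: each texel stores
// the unit-length version of the direction vector that indexes it,
// packed into [0,1] with the (N+1)/2 encoding. Conventions are assumed.
public class NormalizationCubeMapFace {
    public static float[] buildPositiveZFace(int size) {
        float[] texels = new float[size * size * 3];
        int k = 0;
        for (int row = 0; row < size; row++) {
            for (int col = 0; col < size; col++) {
                // map the texel center to a point on the +Z face, in [-1, 1]
                double x = 2.0 * (col + 0.5) / size - 1.0;
                double y = 2.0 * (row + 0.5) / size - 1.0;
                double z = 1.0;
                double len = Math.sqrt(x * x + y * y + z * z);
                texels[k++] = (float) ((x / len + 1) / 2);   // pack to [0,1]
                texels[k++] = (float) ((y / len + 1) / 2);
                texels[k++] = (float) ((z / len + 1) / 2);
            }
        }
        return texels;
    }

    public static void main(String[] args) {
        // a 1x1 face has its single texel at the face center: direction (0,0,1)
        float[] t = buildPositiveZFace(1);
        System.out.println(t[0] + " " + t[1] + " " + t[2]);   // -> 0.5 0.5 1.0
    }
}
```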
Normalization Cube Maps (cont.)
[Figure: a texture cube map surrounding the origin; a vector V = [x, y, z] is used as the “texture coords” to select a texel on one face]

The value stored at [x,y,z] is the normalized version of [x,y,z]; hence,

      N̂ = texture( cubeMap, [Nx, Ny, Nz] )