© Machiraju/Zhang/Möller/Klaffenböck
Image Formation
Introduction to Computer Graphics
Today

• Input and displays of a graphics system
• Raster display basics: pixels, the frame buffer, raster scan, LCD displays
• Historical perspective
• Image formation
• The ray tracing algorithm
• The z-buffer algorithm

Readings: Sections 1, 12.2 (Ray tracing), 5.8 (z-buffering)
Overview of a graphics system
(Block diagram: input devices → image formed and stored in frame buffer → output device)
Input devices

• Pointing/locator devices: indicate location on screen
  – Mouse/trackball/spaceball
  – Data tablet
  – Joystick
  – Touch pad and touch screen
• Keyboard device: send character input
• Choice devices: mouse buttons, function keys
More sophisticated interaction
Input devices for this class
Use simple keyboard and mouse commands to interact with your WebGL program
Output: cathode-ray tube (CRT)
http://www.vintage-computer.com/vcforum/album.php?albumid=49&attachmentid=2721
CRT basics

• The screen output is stored in the frame buffer and is converted into voltages across the deflection plates via a digital-to-analog converter (DAC)
• Light is emitted when electrons hit phosphor
• But light output from the phosphor decays exponentially with time, typically in 10 – 60 microseconds
  – Thus the screen needs to be redrawn or refreshed
  – Refresh rate is typically 60 Hz to avoid flicker (“twinkling”)
  – Flicker: when the eye can no longer integrate individual light pulses from a point on screen, e.g., due to low refresh rate
Shadow-mask color CRTs
• Three different colored phosphor dots (R, G, B) are arranged in very small groups (triads) on the coating
• We see a mixture of the three colors
• Three electron guns (R, G, B) emit electron beams in a controlled fashion so that only phosphors of the proper colors are excited

Shadow-mask CRT
Flat-panel displays

• CRTs are no longer the most widely used display devices nowadays
• Flat-panel displays, e.g., LCDs (liquid-crystal displays) and plasma displays, dominate
• Flat-panel displays can be emissive (plasma) or non-emissive (LCD), allowing light to pass through
• Colors can be produced using triads (sub-pixels)
• Screen refresh is no longer necessary
• See Chapter 3.1 of Marschner et al. for more on flat-panel technologies (or check out Wikipedia)
A generic flat-panel display
Figure 3.2 + 3.3 from Shirley et al.
High Dynamic Range Displays
(Figure: a high-resolution colour LCD in front of a low-resolution, individually modulated LED array yields a high dynamic range display)
Courtesy:
Raster display basics
• The screen is a rectangular array of picture elements, or pixels
• Resolution: determines the details you can see
  – number of pixels in an image, e.g., 1024×768, 1280×1024, 1366×768, etc.
  – also measured in ppi or dpi (pixels or dots per inch)
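The ppi figure follows directly from the pixel counts and the physical diagonal; a small sketch in JavaScript (the 17-inch display below is an illustrative assumption, not from the slides):

```javascript
// Pixels per inch from pixel resolution and physical screen diagonal:
// ppi = sqrt(width^2 + height^2) / diagonal (in inches)
function ppi(widthPx, heightPx, diagonalInches) {
  return Math.sqrt(widthPx * widthPx + heightPx * heightPx) / diagonalInches;
}

// A hypothetical 17-inch display at 1024×768: the diagonal is exactly 1280 px.
console.log(ppi(1024, 768, 17).toFixed(1)); // → 75.3
```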
Raster scan pattern
• Horizontal scan rate: # scan lines per second
• Interlaced (TV) vs. non-interlaced displays

(Figure: scan line, horizontal retrace, vertical retrace)
Video display standards

• NTSC (National Television System Committee):
  – 525 lines, 30 frames/sec interlaced (visual system tricked into thinking that the refresh rate is 60 f/s)
  – 480 lines visible
  – North America, South America, Japan
• PAL/SECAM:
  – 625 lines, 25 frames/sec interlaced
  – Europe, Russia, Africa, etc.
• HDTV: 1,080 lines
The frame buffer

• Stores per-pixel information
  – Depth of a frame buffer: number of bits per pixel
  – E.g. for color representation, 1 bit => 2 colors, 8 bits => 256 colors, 24 bits => true color (about 16.7 million colors)
• Color buffer is only one of many buffers; other information, e.g., depth, can also be stored
• Implemented with a special type of memory in standard PCs or on a graphics card for fast redisplay
• Part of standard memory in earlier systems
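As a sanity check on the color counts above, the number of representable colors is simply 2 raised to the frame-buffer depth:

```javascript
// Distinct colors representable with a given frame-buffer depth (bits/pixel).
function colorCount(bitsPerPixel) {
  return 2 ** bitsPerPixel;
}

console.log(colorCount(1));  // → 2 colors
console.log(colorCount(8));  // → 256 colors
console.log(colorCount(24)); // → 16777216 (true color, about 16.7 million)
```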
Today

• Input and displays of a graphics system
• Raster display basics: pixels, the frame buffer, raster scan, LCD displays
• Historical perspective
• Image formation
• The ray tracing algorithm
• The z-buffer algorithm

Readings: Sections 1.2-1.5, 11.2 (Ray tracing), 6.11.5 (z-buffering)
Comp Graphics: 1950-1960
• Computer graphics goes back to the earliest days of computing
  – Strip charts
  – Pen plotters
  – Simple displays using A/D converters to go from computer to calligraphic CRT
• Cost of refresh for CRT too high
  – Computers slow, expensive, unreliable

adapted from Angel and Shreiner
http://en.wikipedia.org/wiki/File:Polygraaf.PNG
Comp Graphics: 1960-1970
• Wireframe graphics
  – Draw only lines
• Sketchpad
• Display Processors
• Storage tube

wireframe representation of sun object
adapted from Angel and Shreiner
Sketchpad
• Ivan Sutherland’s PhD thesis at MIT
  – Recognized the potential of man-machine interaction
  – Loop:
    • Display something
    • User moves light pen
    • Computer generates new display
  – Sutherland also created many of the now common algorithms for computer graphics

adapted from Angel and Shreiner
Interactive Computer Graphics
Ivan Sutherland, Sketchpad, 1963
Doug Engelbart, 1968
http://web.stanford.edu/dept/SUL/library/extra4/sloan/MouseSite/1968Demo.html
Comp Graphics: 1970-1980
• Raster Graphics
• Beginning of graphics standards
  – IFIPS
    • GKS: European effort
      – Becomes ISO 2D standard
    • Core: North American effort
      – 3D but fails to become ISO standard
• Workstations and PCs

adapted from Angel and Shreiner
Raster Graphics
• Allows us to go from lines and wire frame images to filled polygons
adapted from Angel and Shreiner
Comp Graphics: 1980-1990
• Realism comes to computer graphics
smooth shading, environment mapping, bump mapping

adapted from Angel and Shreiner
Comp Graphics: 1980-1990
• Special purpose hardware
  – Silicon Graphics geometry engine
    • VLSI implementation of graphics pipeline
• Industry-based standards
  – PHIGS
  – RenderMan
• Networked graphics: X Window System
• Human-Computer Interaction (HCI)

adapted from Angel and Shreiner
Comp Graphics: 1990-2000
• OpenGL API
• Completely computer-generated feature-length movies (Toy Story) are successful
• New hardware capabilities
  – Texture mapping
  – Blending
  – Accumulation, stencil buffers

adapted from Angel and Shreiner
Computer Graphics: 2000-
• Photorealism
• Graphics cards for PCs dominate market
  – Nvidia, ATI
• Game boxes and game players determine direction of market
• Computer graphics routine in movie industry: Maya, Lightwave
• Programmable pipelines

adapted from Angel and Shreiner
Today

• Input and displays of a graphics system
• Raster display basics: pixels, the frame buffer, raster scan, LCD displays
• Historical perspective
• Image formation
• The ray tracing algorithm
• The z-buffer algorithm

Readings: Sections 1.2-1.5, 11.2 (Ray tracing), 6.11.5 (z-buffering)
Image formation (aside)
• In computer graphics, we form 2D images via a process analogous to how images are formed by physical imaging systems
  – Cameras
  – Microscopes
  – Human visual system
Elements of image formation

• Object:
  – Exists independent of image formation and viewer
  – Defined in its own object space
  – Formed by geometric primitives, e.g., polygons
• Viewer:
  – Forms the image of the objects via projection onto an image plane
  – Image is produced in 2D, e.g., on retina, film, etc.
  – Objects need to be transformed into the image space
Also, light and color

• No light => nothing is visible
• How do we, or the camera, see?
  – Light reflected off objects in the scene, or
  – Light transmitted directly into our eyes, e.g., from a light source
• What is reflected color?
  – Light and object have color
  – When colored light reaches a colored object, some colors are reflected and some absorbed
Light
The electromagnetic spectrum

• Light is the (visible) part of the electromagnetic spectrum that causes a reaction in our visual systems
• Generally these are wavelengths in the range of about 350-750 nm (nanometers)
• Long wavelengths appear as reds and short wavelengths as blues
Simplistic color representation

• RGB-color system (R: red, G: green, B: blue)
• Each displayed color has 3 components: R, G, B
• 8 bits per component in a 24-bit true-color display; component values: 0 – 255
• In OpenGL/WebGL, RGB values are in [0,1]
• More details on lights and colors later in the course

Examples: (R,G,B) = (255,255,255) is white, (0,0,0) black, (255,0,0) red, (255,255,0) yellow, (0,255,255) cyan
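Since WebGL expects RGB components in [0,1] while true-color values run 0 – 255, a per-component division converts between the two; a minimal sketch (the function name is illustrative):

```javascript
// Convert an 8-bit-per-channel (0-255) color to WebGL's [0,1] range.
function toWebGLColor(r, g, b) {
  return [r / 255, g / 255, b / 255];
}

console.log(toWebGLColor(255, 255, 0)); // yellow → [1, 1, 0]
console.log(toWebGLColor(0, 255, 255)); // cyan → [0, 1, 1]
```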
Imaging system: pinhole camera
• Use trigonometry (similar triangles) to find the projection of a point at (x, y, z) onto the image plane at z = −d:

  xp = −dx/z
  yp = −dy/z
  zp = −d
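The projection equations can be checked numerically; a small sketch, assuming the image plane sits at distance d in front of the pinhole (sign conventions follow the similar-triangles derivation; names and values are illustrative):

```javascript
// Pinhole projection of a point (x, y, z) onto the image plane z = -d,
// derived from similar triangles: xp / -d = x / z and yp / -d = y / z.
function project(x, y, z, d) {
  return { xp: -d * x / z, yp: -d * y / z, zp: -d };
}

const p = project(2, 4, -8, 1); // a point 8 units in front of the camera
console.log(p); // → { xp: 0.25, yp: 0.5, zp: -1 }
```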
Synthetic camera model
(Figure: center of projection, image plane, projector)
Imaging model adopted by 3D computer graphics
First imaging algorithm
• Directly derived from the synthetic camera model
  – Follow rays of light from a point light source
  – Determine which rays enter the lens of the camera through the imaging window
  – Compute color of projection
• Why is this not a good idea?
Ray tracing

• When following light from a light source, many (reflected) rays do not intersect the image window and do not contribute to the scene
• Reverse the process: cast one ray per pixel from the eye and shoot it into the scene

(Figure: eye, pixel, image plane; shadow, reflected, and refracted rays)
Ray tracing: the basics
• A point on an object may be illuminated by
  – Light source directly: through a shadow ray
  – Light reflected off an object: through a reflected ray
  – Light transmitted through a transparent object: through a refracted ray

(Figure: eye, pixel, image plane; shadow, reflected, and refracted rays)
Ray tracing: the algorithm

for each pixel on screen
    determine ray from eye through pixel
    if ray shoots into infinity, return a background color
    if ray shoots into light source, return light color appropriately
    find closest intersection of ray with an object
    cast off shadow ray (towards light sources)
    if shadow ray hits a light source, compute light contribution
        according to some illumination model
    cast reflected and refracted ray, recursively calculate
        pixel color contribution
    return pixel color after some absorption
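A minimal sketch of the loop above in JavaScript, with one sphere, one point light, and a Lambertian illumination model; the reflection/refraction recursion is omitted, and all scene values and names are illustrative assumptions, not from the slides:

```javascript
// Minimal ray tracer sketch: one sphere, one point light, no recursion.
// Vectors are plain [x, y, z] arrays.
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const add = (a, b) => [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const scale = (a, s) => [a[0] * s, a[1] * s, a[2] * s];
const norm = a => scale(a, 1 / Math.sqrt(dot(a, a)));

// Smallest positive t with origin + t*dir on the sphere, or null if missed.
// (Returns only the nearer root, which is sufficient for this sketch.)
function intersectSphere(origin, dir, center, radius) {
  const oc = sub(origin, center);
  const a = dot(dir, dir);
  const b = 2 * dot(oc, dir);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - 4 * a * c;
  if (disc < 0) return null;
  const t = (-b - Math.sqrt(disc)) / (2 * a);
  return t > 1e-6 ? t : null;
}

// "determine ray from eye through pixel ... find closest intersection ...
//  cast off shadow ray (towards light sources)"
function tracePixel(eye, pixelDir, sphere, light, background) {
  const t = intersectSphere(eye, pixelDir, sphere.center, sphere.radius);
  if (t === null) return background;            // ray shoots into infinity
  const hit = add(eye, scale(pixelDir, t));
  const n = norm(sub(hit, sphere.center));
  const toLight = norm(sub(light.pos, hit));
  // Shadow ray: offset the origin slightly so the sphere cannot shadow itself.
  const shadowOrigin = add(hit, scale(n, 1e-4));
  if (intersectSphere(shadowOrigin, toLight, sphere.center, sphere.radius) !== null) {
    return [0, 0, 0];                           // point is in shadow
  }
  const diffuse = Math.max(0, dot(n, toLight)); // Lambertian illumination model
  return scale(sphere.color, diffuse);
}

const sphere = { center: [0, 0, -5], radius: 1, color: [1, 0, 0] };
const light = { pos: [0, 5, 0] };
const color = tracePixel([0, 0, 0], norm([0, 0, -1]), sphere, light, [0.2, 0.2, 0.2]);
console.log(color); // diffuse-lit red, roughly [0.62, 0, 0]
```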
(Images: The Wings; Pov Civilisation Museum by Eric Ouvrard)
Ray tracing: pros and cons

• Pros:
  – Follows (approximately) the physics of light transport
  – High level of visual realism: can model both light-object interactions and inter-surface reflections and refractions
• Cons:
  – Expensive: intersection tests and the levels of recursion (rays generated)
  – Inadequate for modeling non-reflective (dull-looking) objects (why is that?)
Which is the most important ray?
(Figure: eye, pixel, image plane; shadow, reflected, and refracted rays)
The z-buffer algorithm

• Per-polygon operations vs. per-pixel for ray tracing
• Simple and often accelerated with hardware
• Works regardless of the order in which the polygons are processed; no need to sort them back to front
• A visibility algorithm, not designed to compute colors
• WebGL implements this fundamental algorithm:
  – gl.enable(gl.DEPTH_TEST); …
  – gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT); …
The algorithm

for each polygon in the scene
    project its vertices onto viewing (image) plane
    for each pixel inside the polygon formed on viewing plane
        determine point on polygon corresponding to this pixel
        get pixel color according to some illumination model
        get depth value for this pixel (distance from point to plane)
        if depth value < stored depth value for the pixel
            update pixel color in frame buffer
            update depth value in depth buffer (z-buffer)
        end if
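The loop above can be sketched directly; this minimal version (all names illustrative) feeds per-pixel depths over a one-dimensional row of pixels rather than projecting real polygons, to isolate just the depth test:

```javascript
// Minimal z-buffer sketch over a 1D row of pixels.
// Each "polygon" supplies, for some pixels, a depth value; its color is flat.
const WIDTH = 8;
const colorBuffer = new Array(WIDTH).fill("background");
const depthBuffer = new Array(WIDTH).fill(Infinity); // initialize to far plane

function rasterize(polygon) {
  for (const { x, depth } of polygon.fragments) {
    // The core z-buffer test: keep this fragment only if it is closer
    // than whatever is already stored for the pixel.
    if (depth < depthBuffer[x]) {
      colorBuffer[x] = polygon.color; // update pixel color in frame buffer
      depthBuffer[x] = depth;         // update depth value in z-buffer
    }
  }
}

// Two overlapping polygons, drawn in arbitrary order: the nearer one
// (smaller depth) wins wherever they overlap, with no back-to-front sorting.
rasterize({ color: "red",  fragments: [{ x: 2, depth: 5 }, { x: 3, depth: 5 }, { x: 4, depth: 5 }] });
rasterize({ color: "blue", fragments: [{ x: 3, depth: 2 }, { x: 4, depth: 2 }, { x: 5, depth: 2 }] });

console.log(colorBuffer);
// → ["background", "background", "red", "blue", "blue", "blue", "background", "background"]
```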