CS348b Final Project:Ray-tracing Interference and Diffraction

Douglas V. [email protected]

Paul N. G. [email protected]

June 12, 2006

Abstract

Although inherently a wave phenomenon, diffraction can be implemented in a ray tracer by taking advantage of a combination of Geometric Diffraction Theory and the Huygens-Fresnel principle. In this paper we show how to implement interference and diffraction in a computationally tractable manner, while retaining the properties of the full volume-integration solution.

1 Introduction

Physical reality is a goal for many ray tracing applications. The complex interaction of lighting is often one of the more challenging components for a ray tracer to render in a realistic fashion. When rendering small scenes with visible light, or when rendering long wavelengths, diffraction plays an important and often overlooked role.

Diffraction, the bending of light around objects, is a difficult phenomenon to represent in the context of ray tracers [1]. Because ray tracers inherently calculate straight-line paths, and only compute ray-object intersections, the continuous, wave-like attributes of diffraction must be discretely approximated.

Interference is caused by light waves having a phase associated with them, not just an intensity. When two light waves hit the same object they could be in phase or out of phase, which determines how much intensity is observed at that point. Ray tracers are well suited to associating phase with each light ray, which makes creating interference patterns a minor change to most ray tracers.

We will first introduce the basic principles behind interference and diffraction, and then show a viable implementation strategy in the context of the PBRT ray tracer [4]. Finally we will show results highlighting the benefits and increased physical realism of the diffraction system described.

2 Interference

Interference of light is the superposition of multiple light waves. This manifests itself as a changing intensity due to the constructive and destructive interference caused by the interaction of coherent light sources. In-phase waves will add their respective intensities, and out-of-phase waves will instead cancel.

(a) Without interference (b) With interference (c) Phase contours (d) Phase and intensity

Figure 1: Two monochromatic coherent point light sources spaced by 8λ projected onto a plane. (a) If interference is neglected, we get the same image we would expect at a macroscopic scale. (b) However, when the effects of interference are taken into account, the pattern of constructive and destructive interference becomes apparent. (c) The combined phase can be computed at each point. Red indicates a phase between 0 and π, and blue between π and 2π. (d) The phase plot is shown with the proper intensities.

Most ray tracers sum two samples by simply adding their intensities. This is correct if the two waves have the same phase, but they could sum to zero if they are perfectly out of phase. This effect is only noticeable in certain situations, but is nonetheless part of the real world.
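Concretely, writing each wave as a complex amplitude (the notation here is illustrative, not taken from the paper), the two extreme cases of the sum are:

```latex
A_1 e^{i\phi_1} + A_2 e^{i\phi_2} =
\begin{cases}
(A_1 + A_2)\, e^{i\phi_1} & \text{if } \phi_2 = \phi_1 \text{ (constructive)}\\
(A_1 - A_2)\, e^{i\phi_1} & \text{if } \phi_2 = \phi_1 + \pi \text{ (destructive)}
\end{cases}
```

The observed intensity is the magnitude of the sum, so two equal-amplitude waves that are perfectly out of phase yield zero intensity; ordinary intensity-only ray tracers effectively assume only the first case.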

In order to simulate interference, each color is stored as a complex number (whereas most ray tracers store just the intensity). All operations are done in the complex plane. When the sampling is completed, the final image may be written with just the magnitude of the complex number, to show the intensity as we would see it in the real world, or one can associate a coloring scheme with the phase, which allows visualization of light phases (see Figure 1).

Each complex number is stored in Cartesian coordinates. This allows for quick addition, which is the most common operation in the ray tracer. The final number is given in polar notation, since the amplitude corresponds to the intensity of the light, while the phase is unseen (unless using it to visualize diffraction phenomena).

3 Diffraction

Diffraction is a physical phenomenon of all waves. A simple, everyday example of diffraction can be observed by listening to a speaker. One does not need to be in the direct path of the sound waves for the speaker to be heard: the sound will travel around a corner by bending due to diffraction. Wave-particle duality states that light is also a wave, and will bend around objects. Visible light, however, bends a relatively small amount, because its wavelength is much smaller than most objects. When an object's size begins to approach the size of the wavelength, though, the diffraction effects are readily observed and become important.

The diffraction pattern for an aperture can be determined by integrating the phase and intensity of all light incident on the aperture [2]. For a circular aperture, the resulting pattern is the famous Airy disk (see Figure 2).

Figure 2: An area light source of radius 7000/6 λ shining on a diffuse plane 10000/6 λ away, using 1024 samples per pixel, where λ is the wavelength of light. This pattern is known as the Airy disk.

Figure 3: Huygens-Fresnel principle: a planar wave can be represented by point sources along the wavefront. The resulting wave can be regarded as the sum of all the secondary wavefronts arising from the point sources.

The theory of geometric diffraction states that the bending of light can be approximated by changing the angle of a ray of light at the edge of an object. In order to compute the proper angle and phase of the redirected light, a diffraction coefficient must be computed for the material and shape. This is impractical for general objects and materials, as coefficients are not easily found and must be empirically measured. Although approximations about the shapes can be used, in general the error is unbounded.

The Huygens-Fresnel principle states that a light wave may be represented by an infinite number of hemispherical light sources along the wavefront. Their principal direction is normal to the wavefront, facing the direction of travel. In the general case, this would involve placing light sources filling the volume of the scene, which is computationally intractable. To make this computationally feasible, we must limit the number of Huygens-Fresnel light sources. We accomplish this by placing light sources only at the edges of objects. This is possible due to the destructive interference of all light sources which are not incident on an edge, as stated in the Huygens-Fresnel theory. Hence, an aperture may be represented only by light sources around its edge. The resulting pattern is the same as if integrating over the entire area.

Babinet's principle states that a duality exists between an aperture and an opaque body with the same silhouette as seen from the light source. This principle allows the rendering of opaque objects by placing light sources around their outer edge, as is done for apertures. Edges are defined as points where the surface normal is orthogonal to the incident light rays. A more formal description of the computation of the edges is given in Section 4.2.

We use a discretized approximation of the above theory, in which light sources are spaced more finely than the Nyquist sampling limit dictates.


4 General Implementation

PBRT, like most ray tracers, has no support for the wave-like properties of light. In order to implement the physical interaction of waves in PBRT, many parts of the underlying structure need to be changed. In particular, the spectrum class must be augmented to handle phase in addition to amplitude (intensity) of light, and the light sources must support the proper bending of their rays.

4.1 Complex Valued Lighting

The spectrum class modifications to account for interference of light are handled in a rather straightforward way. Instead of a single real-valued number representing the intensity of light, we use a complex-valued number, which represents the amplitude and phase of the light. In order to simplify the evaluation of the complex signal, we use a Cartesian coordinate representation instead of the amplitude-and-phase representation. This change of coordinate systems removes the need for trigonometric evaluations when performing operations on the spectrum class.
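A minimal sketch of such a complex-valued spectral sample is shown below. The names (ComplexSpectrum, FromPolar, Intensity) are illustrative, not PBRT's actual API; the point is that the common operation, summing samples, stays a cheap component-wise addition in Cartesian form, while conversion to and from polar form is only needed at the boundaries.

```cpp
#include <cmath>

// Sketch of a complex-valued spectral sample (hypothetical names, not
// PBRT's). The wave is stored in Cartesian form (re, im) so that summing
// samples requires no trigonometric evaluation.
struct ComplexSpectrum {
    double re, im;

    ComplexSpectrum operator+(const ComplexSpectrum &o) const {
        return {re + o.re, im + o.im};
    }

    // Construct from amplitude and phase (polar form), e.g. at a light.
    static ComplexSpectrum FromPolar(double amplitude, double phase) {
        return {amplitude * std::cos(phase), amplitude * std::sin(phase)};
    }

    // The observed intensity is the magnitude of the complex amplitude.
    double Intensity() const { return std::sqrt(re * re + im * im); }

    // The phase, used only when visualizing diffraction phenomena.
    double Phase() const { return std::atan2(im, re); }
};
```

Two equal-amplitude samples that are π out of phase then sum to zero intensity, while two in-phase samples double it, exactly the behavior Figure 1 illustrates.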

4.2 Silhouette Edge Computation

In order to properly place the diffracted light sources on the occluding objects, the edge boundary must be computed for all objects. For spheres, this is computed by first finding the plane through the center of the sphere that is orthogonal to the ray passing from the light source through the center of the sphere. Then, light sources are uniformly distributed around the circle of intersection of this plane and the sphere.
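This construction can be sketched as follows (a self-contained illustration with hypothetical names, not PBRT code): build an orthonormal basis in the plane through the center orthogonal to the light-to-center axis, then step uniformly around the circle of intersection.

```cpp
#include <cmath>
#include <vector>

// Minimal vector type for the sketch.
struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3 &o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3 &o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    double Dot(const Vec3 &o) const { return x * o.x + y * o.y + z * o.z; }
    Vec3 Cross(const Vec3 &o) const {
        return {y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x};
    }
    Vec3 Normalized() const {
        double len = std::sqrt(Dot(*this));
        return {x / len, y / len, z / len};
    }
};

// Place nLights point sources uniformly on the sphere's silhouette circle,
// taken (as in the text) to lie in the plane through the center orthogonal
// to the light->center ray.
std::vector<Vec3> SphereSilhouetteLights(const Vec3 &lightPos,
                                         const Vec3 &center, double radius,
                                         int nLights) {
    Vec3 n = (center - lightPos).Normalized();
    // Orthonormal basis (u, v) spanning the plane orthogonal to n.
    Vec3 up = std::fabs(n.x) < 0.9 ? Vec3{1, 0, 0} : Vec3{0, 1, 0};
    Vec3 u = n.Cross(up).Normalized();
    Vec3 v = n.Cross(u);
    std::vector<Vec3> pts;
    for (int i = 0; i < nLights; ++i) {
        double theta = 2.0 * std::acos(-1.0) * i / nLights;
        pts.push_back(center + u * (radius * std::cos(theta)) +
                      v * (radius * std::sin(theta)));
    }
    return pts;
}
```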

The edge boundaries of triangle meshes are slightly more complicated. For each edge E in the triangulation, check the following. First, compute the vector N mutually orthogonal to the ray passing from the light source through the center of the edge in question, and to the direction of the edge; this vector N represents the edge normal. A point P is located a small distance ε from the edge surface in the direction N. If the edge lies along the silhouette as seen from the light source, then a ray from the light source through P will not intersect any of the faces incident on E. This is tested by ensuring that a ray from the light source to P, plus some ε, does not intersect the object.
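The same classification can be expressed without shadow rays: an edge lies on the silhouette exactly when one of its two adjacent faces is front-facing toward the light and the other is back-facing. The sketch below uses this equivalent dot-product test rather than the ε-offset ray test described above; all names are illustrative, not PBRT's.

```cpp
// Minimal vector type for the sketch.
struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3 &o) const { return {x - o.x, y - o.y, z - o.z}; }
    double Dot(const Vec3 &o) const { return x * o.x + y * o.y + z * o.z; }
};

// n0, n1: outward normals of the two faces sharing the edge. The edge is on
// the silhouette as seen from the light iff exactly one face is lit.
bool IsSilhouetteEdge(const Vec3 &lightPos, const Vec3 &edgeMidpoint,
                      const Vec3 &n0, const Vec3 &n1) {
    Vec3 toLight = lightPos - edgeMidpoint;
    bool facing0 = n0.Dot(toLight) > 0.0;
    bool facing1 = n1.Dot(toLight) > 0.0;
    return facing0 != facing1;  // exactly one face sees the light
}
```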

Once it has been determined that an edge is part of the boundary set of edges, light sources are placed along the edge, in accordance with the Huygens-Fresnel approximation.


5 Results

The changes to the rendering system allow any scene comprised of spheres and triangle meshes to be rendered with diffraction and interference. Other quadric shapes, such as cylinders and cones, can be easily implemented by computing their edge bounds. A simple test scene is shown in Figure 6; here, the entire model is comprised of spheres. Comparing Figure 6(c) to Figure 6(b), the effect of diffraction can easily be seen.

Since the edge bounds are computed per face for triangle mesh objects, the distribution of the mesh geometry is important. The diffraction model assumes that the geometry distribution will be of uniform detail, so that when light sources are placed on the edges, the overall distribution will be uniform. The use of models in which the geometry is not near-uniformly distributed will cause anomalous diffraction effects. This effect can be seen on the head region of the killeroos in Figure 5(b): the head appears to have much more diffracted light than the rest of the model, due to the increased detail of the polygonal model in this area.

Figure 4: The shadow mask as seen from a light source shining from the side.

The extra preprocessing work is insignificant compared to the overhead created by adding numerous light sources on each edge. For the images rendered in Figure 5, three light sources were added per edge, totaling approximately 5000 light sources in the scene. The rendering times for the image with and without diffraction were 6510 seconds and 34 seconds, respectively. Although this seems a detrimental slowdown, it should be noted that the image in Figure 2 took well over two days to render on comparable hardware, which shows the clear downside of volume integration. The approach we have taken maintains a good balance between accuracy and rendering time, with a user-defined variable in PBRT available to bias toward one or the other, in the form of the number of lights to be placed per edge. Rendering time increases as O(n) in the number of lights in the scene.

6 Conclusion

Most ray tracers work well for macroscopic objects, but fall short when trying to render microscopic phenomena. Interference is not a large change for most ray tracers, but adding diffraction seems to be a paradigm shift. The Huygens-Fresnel principle will yield diffraction for general scenes but is computationally infeasible. Geometric diffraction solves the main computational task by requiring extra computation only at the edges of objects. We place light sources at the edges of objects instead of bending existing light, in order to accommodate the usual framework that ray tracers are built on. The system is flexible, working for arbitrary geometry, and avoids the high costs associated with volume rendering.


(a) Without diffraction (b) With diffraction. The model geometry is biased towards the head, which leads to an unequal diffraction appearance.

Figure 5: Diffraction from a triangle mesh. The approach will work with any polygon model.

7 Acknowledgments

A large thanks to Prof. Essam Marouf for taking time to help us through the details of Geometric Diffraction Theory, and for pointing us to invaluable resources. Thanks also to Fraser Thomson for test data, and to William Schlotter for evaluating our experiments and proposing new ones.

References

[1] Edward R. Freniere, G. Groot Gregory, and Richard A. Hassler. Edge diffraction in Monte Carlo ray tracing. Proceedings of SPIE, 1999.

[2] J. Goodman. Introduction to Fourier Optics. McGraw-Hill, 2nd edition, 1996.

[3] Eugene Hecht. Optics. Addison-Wesley, 3rd edition, 1998.

[4] Matt Pharr and Greg Humphreys. Physically Based Rendering. Elsevier Inc., 2004.

[5] Len Tyler. EE354: Introduction to Scattering, course reader, Winter 2001.

[6] M. Young. Optics and Lasers. Springer, 5th edition, 2000.


(a) Without interference or diffraction (b) Only interference

(c) With diffraction

Figure 6: Using only spheres, the spacing of the light sources is uniform.
