
COMPUTER GRAPHICS

PROGRAMMING IN OPENGL WITH JAVA


LICENSE, DISCLAIMER OF LIABILITY, AND LIMITED WARRANTY

By purchasing or using this book (the “Work”), you agree that this license grants permission to use the contents contained herein, but does not give you the right of ownership to any of the textual content in the book or ownership to any of the information or products contained in it. This license does not permit uploading of the Work onto the Internet or on a network (of any kind) without the written consent of the Publisher. Duplication or dissemination of any text, code, simulations, images, etc. contained herein is limited to and subject to licensing terms for the respective products, and permission must be obtained from the Publisher or the owner of the content, etc., in order to reproduce or network any portion of the textual material (in any media) that is contained in the Work.

MERCURY LEARNING AND INFORMATION (“MLI” or “the Publisher”) and anyone involved in the creation, writing, or production of the companion disc, accompanying algorithms, code, or computer programs (“the software”), and any accompanying Web site or software of the Work, cannot and do not warrant the performance or results that might be obtained by using the contents of the Work. The authors, developers, and the Publisher have used their best efforts to insure the accuracy and functionality of the textual material and/or programs contained in this package; we, however, make no warranty of any kind, express or implied, regarding the performance of these contents or programs. The Work is sold “as is” without warranty (except for defective materials used in manufacturing the book or due to faulty workmanship).

The authors, developers, and the publisher of any accompanying content, and anyone involved in the composition, production, and manufacturing of this work will not be liable for damages of any kind arising out of the use of (or the inability to use) the algorithms, source code, computer programs, or textual material contained in this publication. This includes, but is not limited to, loss of revenue or profit, or other incidental, physical, or consequential damages arising out of the use of this Work.

The sole remedy in the event of a claim of any kind is expressly limited to replacement of the book, and only at the discretion of the Publisher. The use of “implied warranty” and certain “exclusions” vary from state to state, and might not apply to the purchaser of this product.

Companion disc files are available for download from the publisher by writing to info@merclearning.com.


COMPUTER GRAPHICS PROGRAMMING IN OPENGL

WITH JAVA

V. Scott Gordon, Ph.D.
California State University, Sacramento

John Clevenger, Ph.D.
California State University, Sacramento

MERCURY LEARNING AND INFORMATION
Dulles, Virginia
Boston, Massachusetts
New Delhi


Copyright © 2017 by MERCURY LEARNING AND INFORMATION LLC. All rights reserved.

This publication, portions of it, or any accompanying software may not be reproduced in any way, stored in a retrieval system of any type, or transmitted by any means, media, electronic display or mechanical display, including, but not limited to, photocopy, recording, Internet postings, or scanning, without prior permission in writing from the publisher.

Publisher: David Pallai

MERCURY LEARNING AND INFORMATION
22841 Quicksilver Drive
Dulles, VA
info@merclearning.com
(800) 232-0223

V. Scott Gordon & John Clevenger
Computer Graphics Programming in OpenGL with Java
ISBN: 978-1-683920-27-4

The publisher recognizes and respects all marks used by companies, manufacturers, and developers as a means to distinguish their products. All brand names and product names mentioned in this book are trademarks or service marks of their respective companies. Any omission or misuse (of any kind) of service marks or trademarks, etc. is not an attempt to infringe on the property of others.

Library of Congress Control Number: 2016962393

17 18 19   3 2 1
Printed in the United States of America on acid-free paper

Our titles are available for adoption, license, or bulk purchase by institutions, corporations, etc. For additional information, please contact the Customer Service Dept. at 800-232-0223 (toll free). Digital versions of our titles are available at: www.authorcloudware.com and other e-vendors. All companion files are available by writing to the publisher at info@merclearning.com.

The sole obligation of MERCURY LEARNING AND INFORMATION to the purchaser is to replace the book and/or disc, based on defective materials or faulty workmanship, but not based on the operation or functionality of the product.


Contents

Preface
  Intended Audience
  How to Use This Book
  Acknowledgments
  About the Authors

Chapter 1 Getting Started
  1.1 Languages and Libraries
    1.1.1 Java
    1.1.2 OpenGL / GLSL
    1.1.3 JOGL
    1.1.4 graphicslib3D
  1.2 Installation and Configuration
    1.2.1 Installing Java
    1.2.2 Installing OpenGL / GLSL
    1.2.3 Installing JOGL
    1.2.4 Installing graphicslib3D

Chapter 2 JOGL and the OpenGL Graphics Pipeline
  2.1 The OpenGL Pipeline
    2.1.1 Java/JOGL Application
    2.1.2 Vertex and Fragment Shaders
    2.1.3 Tessellation
    2.1.4 Geometry Shader
    2.1.5 Rasterization
    2.1.6 Fragment Shader
    2.1.7 Pixel Operations
  2.2 Detecting OpenGL and GLSL Errors
  2.3 Reading GLSL Source Code from Files
  2.4 Building Objects from Vertices
  2.5 Animating a Scene

Chapter 3 Mathematical Foundations
  3.1 3D Coordinate Systems
  3.2 Points
  3.3 Matrices
  3.4 Transformation Matrices
    3.4.1 Translation
    3.4.2 Scaling
    3.4.3 Rotation
  3.5 Vectors
    3.5.1 Uses for Dot Product
    3.5.2 Uses for Cross Product
  3.6 Local and World Space
  3.7 Eye Space and the Synthetic Camera
  3.8 Projection Matrices
    3.8.1 The Perspective Projection Matrix
    3.8.2 The Orthographic Projection Matrix
  3.9 Look-At Matrix
  3.10 GLSL Functions for Building Matrix Transforms

Chapter 4 Managing 3D Graphics Data
  4.1 Buffers & Vertex Attributes
  4.2 Uniform Variables
  4.3 Interpolation of Vertex Attributes
  4.4 Model-View and Perspective Matrices
  4.5 Our First 3D Program – a 3D Cube
  4.6 Rendering Multiple Copies of an Object
    4.6.1 Instancing
  4.7 Rendering Multiple Different Models in a Scene
  4.8 Matrix Stacks
  4.9 Combating “Z-Fighting” Artifacts
  4.10 Other Options for Primitives
  4.11 Back-Face Culling

Chapter 5 Texture Mapping
  5.1 Loading Texture Image Files
  5.2 Texture Coordinates
  5.3 Creating a Texture Object
  5.4 Constructing Texture Coordinates
  5.5 Loading Texture Coordinates into Buffers
  5.6 Using the Texture in a Shader: Sampler Variables and Texture Units
  5.7 Texture Mapping: Example Program
  5.8 Mipmapping
  5.9 Anisotropic Filtering
  5.10 Wrapping and Tiling
  5.11 Perspective Distortion
  5.12 Loading Texture Image Files Using Java AWT Classes

Chapter 6 3D Models
  6.1 Procedural Models – Building a Sphere
  6.2 OpenGL Indexing – Building a Torus
    6.2.1 The Torus
    6.2.2 Indexing in OpenGL
  6.3 Loading Externally Produced Models

Chapter 7 Lighting
  7.1 Lighting Models
  7.2 Lights
  7.3 Materials
  7.4 ADS Lighting Computations
  7.5 Implementing ADS Lighting
    7.5.1 Gouraud Shading
    7.5.2 Phong Shading
  7.6 Combining Lighting and Textures

Chapter 8 Shadows
  8.1 The Importance of Shadows
  8.2 Projective Shadows
  8.3 Shadow Volumes
  8.4 Shadow Mapping
    8.4.1 Shadow Mapping (PASS ONE) – “Draw” Objects from Light Position
    8.4.2 Shadow Mapping (Intermediate Step) – Copying the Z-Buffer to a Texture
    8.4.3 Shadow Mapping (PASS TWO) – Rendering the Scene with Shadows
  8.5 A Shadow Mapping Example
  8.6 Shadow Mapping Artifacts

Chapter 9 Sky and Backgrounds
  9.1 Skyboxes
  9.2 Skydomes
  9.3 Implementing a Skybox
    9.3.1 Building a Skybox from Scratch
    9.3.2 Using OpenGL Cube Maps
  9.4 Environment Mapping

Chapter 10 Enhancing Surface Detail
  10.1 Bump Mapping
  10.2 Normal Mapping
  10.3 Height Mapping

Chapter 11 Parametric Surfaces
  11.1 Quadratic Bézier Curves
  11.2 Cubic Bézier Curves
  11.3 Quadratic Bézier Surfaces
  11.4 Cubic Bézier Surfaces

Chapter 12 Tessellation
  12.1 Tessellation in OpenGL
  12.2 Tessellation for Bézier Surfaces
  12.3 Tessellation for Terrain / Height Maps
  12.4 Controlling Level of Detail (LOD)

Chapter 13 Geometry Shaders
  13.1 Per-Primitive Processing in OpenGL
  13.2 Altering Primitives
  13.3 Deleting Primitives
  13.4 Adding Primitives

Chapter 14 Other Techniques
  14.1 Fog
  14.2 Compositing / Blending / Transparency
  14.3 User-Defined Clipping Planes
  14.4 3D Textures
  14.5 Noise
  14.6 Noise Application – Marble
  14.7 Noise Application – Wood
  14.8 Noise Application – Clouds
  14.9 Noise Application – Special Effects

Index


Preface

This book is designed primarily as a textbook for a typical computer science undergraduate course in OpenGL 3D graphics programming. However, we have also wanted to create a text that could be used to teach oneself, without an accompanying course. With both of those aims in mind, we have tried to explain things as clearly and simply as we can. Every programming example is stripped-down and simplified as much as possible, but still complete so that the reader may run them all as presented.

One of the things that we hope is unique about this book is that we have strived to make it accessible to a beginner—that is, someone new to 3D graphics programming. While there is by no means a lack of information available on the topic—quite the contrary—many students are initially overwhelmed. This text is our attempt to write the book we wish we had had when we were starting out, with step-by-step explanations of the basics, progressing in an organized manner up through advanced topics. We considered titling the book “shader programming made easy”; however, we don’t think that there really is any way of making shader programming “easy.” We hope that we have come close.

Another thing that makes this book unique is that it teaches OpenGL programming in Java, using JOGL—a Java “wrapper” for OpenGL’s native C calls [JO16]. There are several advantages to learning graphics programming in Java rather than in C:

• It is more convenient for students at schools that conduct most of their curriculum in Java.
• Java’s I/O, window, and event handling are arguably cleaner than in C.
• Java’s excellent support for object-oriented design patterns can foster good design.

It is worth mentioning that there do exist other Java bindings for OpenGL. One that is becoming increasingly popular is Lightweight Java Game Library, or LWJGL [LW16]. Like JOGL, LWJGL also offers bindings for OpenAL and OpenCL. This textbook focuses only on JOGL.

Another point of clarification is that there exist both different versions of OpenGL (briefly discussed later) and different variants of OpenGL. For example, in addition to “standard OpenGL” (sometimes called “desktop OpenGL”), there exists a variant called “OpenGL ES,” which is tailored for development of embedded systems (hence the “ES”). “Embedded systems” include devices such as mobile phones, game consoles, automobiles, and industrial control systems. OpenGL ES is mostly a subset of standard OpenGL, eliminating a large number of operations that are typically not needed for embedded systems. OpenGL ES also adds some additional functionality, typically application-specific operations for particular target environments. The JOGL suite of Java bindings includes interfaces for different versions of OpenGL ES, although we do not use them in this book.

Yet another variant of OpenGL is called “WebGL.” Based on OpenGL ES, WebGL is designed to support the use of OpenGL in web browsers. WebGL allows an application to use JavaScript¹ to invoke OpenGL ES operations, which makes it easy to embed OpenGL graphics into standard HTML (web) documents. Most modern web browsers support WebGL, including Apple Safari, Google Chrome, Microsoft Internet Explorer, Mozilla Firefox, and Opera. Since web programming is outside the scope of this book, we will not cover any WebGL specifics. Note however that because WebGL is based on OpenGL ES, which in turn is based on standard OpenGL, much of what is covered in this book can be transferred directly to learning about these OpenGL variants.

The very topic of 3D graphics lends itself to impressive, even beautiful images. Indeed, many popular textbooks on the topic are filled with breathtaking scenes, and it is enticing to leaf through their galleries. While we acknowledge the motivational utility of such examples, our aim is to teach, not to impress. The images in this book are simply the outputs of the example programs, and since this is an introductory text, the resulting scenes are unlikely to impress an expert. However, the techniques presented do constitute the foundational elements for producing today’s stunning 3D effects.

We also haven’t tried to create an OpenGL or JOGL “reference.” Our coverage of OpenGL and JOGL represents only a tiny fraction of their capabilities. Rather, our aim is to use OpenGL and JOGL as vehicles for teaching the fundamentals of modern shader-based 3D graphics programming, and provide the reader with a sufficiently deep understanding for further study. If along the way this text helps to expand awareness of JOGL and other JogAmp technologies, that would be nice too.

Intended Audience

This book is targeted at students of computer science. This could mean undergraduates pursuing a BS degree, but it could also mean anyone who studies computer science. As such, we are assuming that the reader has at least a solid background in object-oriented programming, at the level of someone who is, say, a computer science major at the junior or senior level.

There are also some specific things that we use in this book, but that we don’t cover because we assume the reader already has sufficient background. In particular:


• Java and its Abstract Window Toolkit (AWT) or Swing library, especially for GUI-building
• Java configuration details, such as manipulating the CLASSPATH
• event-driven programming
• basic matrix algebra and trigonometry
• awareness of color models, such as RGB, RGBA, etc.

How to Use This Book

This book is designed to be read from front to back. That is, material in later chapters frequently relies on information learned in earlier chapters. So it probably won’t work to jump back and forth in the chapters; rather, work your way forward through the material.

This is also intended mostly as a practical, hands-on guide. While there is plenty of theoretical material included, the reader should treat this text as a sort of “workbook,” in which you learn basic concepts by actually programming them yourself. We have provided code for all of the examples, but to really learn the concepts you will want to play with those examples—extend them to build your own 3D scenes.

At the end of each chapter are a few problems to solve. Some are very simple, involving merely making simple modifications to the provided code. The problems that are marked “(PROJECT),” however, are expected to take some time to solve, and require writing a significant amount of code, or combining techniques from various examples. There are also a few marked “(RESEARCH)”—those are problems that encourage independent study because this textbook doesn’t provide sufficient detail to solve them.

OpenGL calls, whether made in C or in Java through JOGL, often involve long lists of parameters. While writing this book, the authors debated whether or not to, in each case, describe all of the parameters. We decided that at the very beginning, we would describe every detail. But as the topics progress, we decided to avoid getting bogged down in every piece of minutiae in the OpenGL calls (and there are many), for fear of the reader losing sight of the big picture. For this reason, it is essential when working through the examples to have ready access to reference material for Java, OpenGL, and JOGL.

For this, there are a number of excellent reference sources that we recommend using in conjunction with this book. The javadocs for Java and JOGL are absolutely essential, and can be accessed online or downloaded (we explain in Chapter 1 how to download the JOGL javadoc). The reader should bookmark them for easy access in a browser, and expect to access them continuously for looking up items such as parameter and constructor details. The URLs for the Java and JOGL javadocs are:

https://docs.oracle.com/javase/8/docs/api/

https://jogamp.org/deployment/webstart/javadoc/jogl/javadoc

Many of the entries in the JOGL javadoc are simply pointers to the corresponding entry in the OpenGL documentation, available here:

https://www.opengl.org/sdk/docs/man/

Our examples utilize a mathematics library called graphicslib3D. This is a Java library that also has its own set of javadocs included. After installing graphicslib3D (described in Chapter 1), the reader should locate the accompanying javadoc folder and bookmark its root file (index.html).

Finally, there are many other books on 3D graphics programming that we recommend reading in parallel with this book (such as for solving the “research” problems). Here are five that we often refer to:

• (Sellers et al.) OpenGL SuperBible [SW15]
• (Kessenich et al.) OpenGL Programming Guide [KS16] (the “red book”)
• (Wolff) OpenGL 4 Shading Language Cookbook [WO13]
• (Angel and Shreiner) Interactive Computer Graphics [AS14]
• (Luna) Introduction to 3D Game Programming with DirectX 12 [LU16]

Acknowledgments

Early drafts of this book were used in the CSc-155 (Advanced Computer Graphics Programming) course at CSU Sacramento, and benefited from many student corrections and comments (and in some cases, code). The authors would particularly like to thank Mitchell Brannan, Tiffany Chiapuzio-Wong, Samson Chua, Anthony Doan, Kian Faroughi, Cody Jackson, John Johnston, Zeeshan Khaliq, Raymond Rivera, Oscar Solorzano, Darren Takemoto, Jon Tinney, James Womack, and Victor Zepeda for their suggestions. We apologize if we have missed anyone.

We are extremely grateful for the invaluable advice provided to us by Julien Gouesse, engine support maintainer at JogAmp. Mr. Gouesse provided technical information on JOGL textures, cube maps, buffer handling, and proper loading of shader source files that led to improvements in our text.

Jay Turberville of Studio 522 Productions in Scottsdale (Arizona) built the dolphin model shown on the cover and used throughout this book. Studio 522 Productions does incredibly high-quality 3D animation and video production, as well as custom 3D modeling. We were thrilled that Mr. Turberville kindly offered to build such a wonderful new model just for this book project.

We wish to thank a few other artists and researchers who were gracious enough to allow us to utilize their models and textures. James Hastings-Trew of Planet Pixel Emporium provided many of the planetary surface textures. Paul Bourke allowed us to use his wonderful star field. Dr. Marc Levoy of Stanford University granted us permission to use the famous “Stanford Dragon” model. Paul Baker’s bump-mapping tutorial formed the basis of the “torus” model we used in many examples. We also thank Mercury Learning for allowing us to use some of the textures from [LU16].

Dr. Danny Kopec connected us with Mercury Learning and introduced us to its publisher, David Pallai. Dr. Kopec’s Artificial Intelligence textbook inspired us to consider Mercury, and our telephone conversations with him were extremely informative. We were deeply saddened to hear of Dr. Kopec’s untimely passing, and regret that he didn’t have the chance to see our book come to fruition.

Finally, we wish to thank David Pallai and Jennifer Blaney of Mercury Learning for believing in this project and guiding us through the textbook publishing process.

About the Authors

Dr. V. Scott Gordon has been a professor in the California State University system for over twenty years, and currently teaches advanced graphics and game engineering courses at CSU Sacramento. He has authored or coauthored over thirty publications in a variety of areas including video and strategy game programming, artificial intelligence, neural networks, software engineering, and computer science education. Dr. Gordon obtained his PhD at Colorado State University. He is also a jazz drummer and a competitive table tennis player.

Dr. John Clevenger has over forty years of experience teaching a wide variety of courses including advanced graphics, game architecture, operating systems, VLSI chip design, system simulation, and other topics. He is the developer of several software frameworks and tools for teaching graphics and game architecture, including the graphicslib3D library used in this textbook. He is the technical director of the ACM International Collegiate Programming Contest, and oversees the ongoing development of PC^2, the most widely used programming contest support system in the world. Dr. Clevenger obtained his PhD at the University of California, Davis. He is also a performing jazz musician, and spends summer vacations in his mountain cabin.

References

[AS14] E. Angel and D. Shreiner, Interactive Computer Graphics: A Top-Down Approach with WebGL, 7th ed. (Pearson, 2014).

[JO16] JogAmp, accessed July 2016, http://jogamp.org/.

[LW16] Lightweight Java Game Library (LWJGL), accessed July 2016, https://www.lwjgl.org/.

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).

[KS16] J. Kessenich, G. Sellers, and D. Shreiner, OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 4.5 with SPIR-V, 9th ed. (Addison-Wesley, 2016).

[SW15] G. Sellers, R. Wright Jr., and N. Haemel, OpenGL SuperBible: Comprehensive Tutorial and Reference, 7th ed. (Addison-Wesley, 2015).

[WO13] D. Wolff, OpenGL Shading Language Cookbook, 2nd ed. (Packt Publishing, 2013).

¹ JavaScript is a scripting language that can be used to embed code in web pages. It has strong similarities to Java, but also many important differences.


CHAPTER 1

GETTING STARTED

1.1 Languages and Libraries
1.2 Installation and Configuration

Graphics programming has a reputation of being among the most challenging computer science topics to learn. These days, graphics programming is shader based—that is, some of the program is written in a standard language such as Java or C++ for running on the CPU, and some is written in a special-purpose shader language for running directly on the graphics card (GPU). Shader programming has a steep learning curve, so that even drawing something simple requires a convoluted set of steps to pass graphics data down a “pipeline.” Modern graphics cards are able to process this data in parallel, and so the graphics programmer must understand the parallel architecture of the GPU, even when drawing simple shapes.

The payoff, however, is extraordinary power. The blossoming of stunning virtual reality in video games and increasingly realistic effects in Hollywood movies can be greatly attributed to advances in shader programming. If reading this book is your entrée into 3D graphics, you are taking on a personal challenge that will reward you not only with pretty pictures, but with a level of control over your machine that you never imagined was possible. Welcome to the exciting world of computer graphics!

1.1 LANGUAGES AND LIBRARIES

Running the programs in this book requires the use of the following four languages and libraries:

• Java
• OpenGL / GLSL
• JOGL
• graphicslib3D

It is likely that the reader will need to do a few preparatory steps to ensure that each of these are installed and properly accessible on his/her system. In this section we briefly describe each of them. Then in the following section we show how to install and/or configure them for use.

1.1.1 Java

Java was developed at Sun Microsystems in the early 1990s, and the first stable release of a development kit (JDK) was in 1995. In 2010, Oracle Corp. acquired Sun, and has maintained Java since that time [OR16]. This book assumes Java version 8, which was released in 2014.

1.1.2 OpenGL/GLSL

Version 1.0 of OpenGL appeared in 1992 as an “open” alternative to vendor-specific Application Programming Interfaces (APIs) for computer graphics. Its specification and development was managed and controlled by the OpenGL Architecture Review Board (ARB), a then newly formed group of industry participants. In 2006 the ARB transferred control of the OpenGL specification to the Khronos Group, a nonprofit consortium which manages not only the OpenGL specification but a wide variety of other open industry standards.

Since its beginning OpenGL has been revised and extended regularly. In 2004, version 2.0 introduced the OpenGL Shading Language (GLSL), allowing “shader programs” to be installed and run directly in graphics pipeline stages.

In 2009, version 3.1 removed a large number of features that had been deprecated, to enforce the use of shader programming as opposed to earlier approaches (referred to as “immediate mode”).¹ Among the more recent features, version 4.0 (in 2010) added a tessellation stage to the programmable pipeline.

This textbook assumes that the user is using a machine with a graphics card that supports at least version 4.3 of OpenGL. If you are not sure which version of OpenGL your GPU supports, there are free applications available on the web that can be used to find out. One such application is GLView, by a company named “realtech VR” [GV16].

1.1.3 JOGL

JOGL first appeared in 2003, published on the website Java.net. Since 2010 it has been an independent open source project, and part of a suite of Java bindings maintained by JogAmp [JO16], an online community of developers. JogAmp also maintains JOAL and JOCL, bindings for OpenAL and OpenCL, respectively. As new versions of OpenGL and/or Java are released, new versions of JOGL are developed to support continued compatibility. JogAmp also maintains a short online user’s guide that includes valuable guidelines for installing and using JOGL efficiently and effectively [JU16]. This book assumes at least version 2.3 of JOGL.

1.1.4 graphicslib3D

3D graphics programming makes heavy use of vector and matrix algebra. For this reason, use of OpenGL is greatly facilitated by accompanying it with a function library or class package to support common mathematical functions. For example, the popular OpenGL SuperBible [SW15] utilizes a C library called “vecmath”. In this book, we use a Java library called graphicslib3D.

graphicslib3D provides classes for basic math functions related to graphics concepts, such as vector, matrix, point, vertex, and quaternion. It also contains a few utility classes for storing frequently used 3D graphics structures, such as a stack for building hierarchical structures, lighting and material information, and a few basic shapes such as a sphere and a torus.

graphicslib3D was first developed in 2005 by John Clevenger at California State University, Sacramento, and is maintained by the authors.

1.2 INSTALLATION AND CONFIGURATION

1.2.1 Installing Java

To use Java for the examples in this book, you will need both the JRE (Java Runtime Environment) and the JDK (Java Development Kit). To install them, use Oracle’s download site, http://www.oracle.com/technetwork/java, and click the “Java SE” (Standard Edition) link under “Software Downloads.” From there you can find instructions for downloading the latest JDK, which includes both the Java compiler and the JRE. It is assumed that the reader is experienced with programming in Java. At the time of this writing, the current version of Java is version 8.

1.2.2 Installing OpenGL / GLSL

It is not necessary to “install” OpenGL or GLSL, but it is necessary to ensure that your graphics card supports at least version 4.3 of OpenGL. If you do not know what version of OpenGL your machine supports, you can use one of the various free applications (such as GLView [GV16]) to find out.

1.2.3 Installing JOGL

To install JOGL, visit http://jogamp.org. As of this writing, the current version of JOGL is in the “Builds / Downloads” section—look under “Current” and click on [zip]. This displays the latest stable JOGL files in a folder named “/deployment/jogamp-current/archive”. Download the following:

jogamp-all-platforms.7z

jogl-javadoc.7z

Unzip these files into the folder on your machine where you would like to store the JOGL system. A typical location in Windows could be, for example, in a folder at the root of the C: drive.

The unzipped “jogamp-all-platforms” file contains a folder named “jar”, which contains two important files that will be used by your applications:

jogl-all.jar

gluegen-rt.jar

Add the full path name of each of these two files to your CLASSPATH environment variable.
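How this is done depends on the platform. As a hedged illustration (the install location /opt/jogl is an assumed example path, not one prescribed here), on a Unix-like shell the two jars could be appended like this:

```shell
# Assumed example install location; substitute the folder where you unzipped JOGL.
JOGL_HOME=/opt/jogl

# Append both JOGL jars to the CLASSPATH environment variable.
export CLASSPATH="$CLASSPATH:$JOGL_HOME/jar/jogl-all.jar:$JOGL_HOME/jar/gluegen-rt.jar"
echo "$CLASSPATH"
```

On Windows, the equivalent is done in the System environment-variable settings, with “;” rather than “:” as the path separator.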

In the jogl-javadoc folder, double-click the file named index.html. This opens the JOGL javadocs in a browser, which you should then bookmark.

1.2.4 Installing graphicslib3D

To install graphicslib3D, download the graphicslib3D.zip file from the textbook support website or the accompanying disc. Unzipping this file produces a folder containing the following two items:

• A file named graphicslib3D.jar
• A folder named javadoc

Move these files to wherever you would like to store graphicslib3D—a typical location in Windows could be, for example, in a folder at the root of the C: drive.

Add the full path name of the .jar file to your CLASSPATH environment variable.

Open the index.html file in the javadoc folder, and bookmark it as you did for the JOGL javadoc.

References

[GV16] GLView, realtech VR, accessed July 2016, http://www.realtech-vr.com/glview/.

[JO16] JogAmp, accessed July 2016, http://jogamp.org/.

[JU16] JOGL User’s Guide, accessed July 2016, https://jogamp.org/jogl/doc/userguide/.

[OR16] Java Software, Oracle Corp., accessed July 2016, https://www.oracle.com/java/index.html.


[SW15] G. Sellers, R. Wright Jr., and N. Haemel, OpenGL SuperBible: Comprehensive Tutorial and Reference, 7th ed. (Addison-Wesley, 2015).

¹ Despite this, many graphics card manufacturers (notably NVIDIA) continue to support deprecated functionality.


CHAPTER 2

JOGL AND THE OPENGL GRAPHICS PIPELINE

2.1 The OpenGL Pipeline
2.2 Detecting OpenGL and GLSL Errors
2.3 Reading GLSL Source Code from Files
2.4 Building Objects from Vertices
2.5 Animating a Scene

Supplemental Notes

OpenGL (Open Graphics Library) is a multi-platform 2D and 3D graphics API that incorporates both hardware and software. Using OpenGL requires a graphics card (GPU) that supports a sufficiently up-to-date version of OpenGL (as described in Chapter 1).

On the hardware side, OpenGL provides a multi-stage graphics pipeline that is partially programmable using a language called GLSL (OpenGL Shading Language).

On the software side, OpenGL’s API is written in C, and thus the calls are directly compatible with C and C++. However, stable language bindings (or “wrappers”) are available for more than a dozen other popular languages (Java, Perl, Python, Visual Basic, Delphi, Haskell, Lisp, Ruby, etc.) with virtually equivalent performance. This textbook uses the popular Java wrapper JOGL (Java OpenGL). When using JOGL, the programmer writes a Java program that runs on the CPU (more specifically, on the Java Virtual Machine, or JVM) and includes JOGL (and thus, OpenGL) calls. We will refer to a Java program that contains JOGL calls as a Java/JOGL application. One important task of a Java/JOGL application is to install the programmer’s GLSL code onto the GPU.

An overview of a JOGL-based graphics application is shown in Figure 2.1, with the software components highlighted in red:


Figure 2.1 Overview of a JOGL-based graphics application.

Some of the code we will write will be in Java, with JOGL calls, and some will be written in GLSL. Our Java/JOGL application will work together with our GLSL modules, and the hardware, to create our 3D graphics output. Once our application is complete, the end user will interact with the Java application.

GLSL is an example of a shader language. Shader languages are intended to run on a GPU, in the context of a graphics pipeline. There are other shader languages, such as HLSL, which works with Microsoft's 3D framework DirectX. GLSL is the specific shader language that is compatible with OpenGL, and thus we will write shader code in GLSL, in addition to our Java/JOGL application code.

For the rest of this chapter, we will take a brief "tour" of the OpenGL pipeline. The reader is not expected to understand every detail thoroughly, but just to get a feel for how the stages work together.

2.1 THE OPENGL PIPELINE

Modern 3D graphics programming utilizes a pipeline, in which the process of converting a 3D scene to a 2D image is broken down into a series of steps. OpenGL and DirectX both utilize similar pipelines.

A simplified overview of the OpenGL graphics pipeline is shown in Figure 2.2 (not every stage is shown, just the major ones we will study). The Java/JOGL application sends graphics data into the vertex shader; processing proceeds through the pipeline, and pixels emerge for display on the monitor.

The stages shaded in blue (vertex, tessellation, geometry, and fragment) are programmable in GLSL. It is one of the responsibilities of the Java/JOGL application to load GLSL programs into these shader stages, as follows:


1. It uses Java to obtain the GLSL code, either from text files or hardcoded as strings.
2. It then creates OpenGL shader objects, and loads the GLSL shader code into them.
3. Finally, it uses OpenGL commands to compile and link objects and install them on the GPU.

Figure 2.2 Overview of the OpenGL pipeline.

In practice, it is usually necessary to provide GLSL code for at least the vertex and fragment stages, whereas the tessellation and geometry stages are optional. Let's walk through the entire process and see what takes place at each step.

2.1.1 Java/JOGL Application

The bulk of our graphics application is written in Java. Depending on the purpose of the program, it may interact with the end user using standard Java libraries such as AWT or Swing. For tasks related to 3D rendering, it uses the JOGL library. Other windowing libraries exist that interface with JOGL, such as SWT and NEWT, that have some performance advantages; in this book, however, we use AWT and Swing because of the likelihood the reader already has familiarity with them.

JOGL includes a class called GLCanvas that is compatible with the standard Java JFrame, and on which we can draw 3D scenes. As already mentioned, JOGL also gives us commands for installing GLSL programs onto the programmable shader stages and compiling them. Finally, JOGL uses buffers for sending 3D models and other related graphics data down the pipeline.

Before we try writing shaders, let's write a simple Java/JOGL application that instantiates a GLCanvas and sets its background color. Doing that won't require any shaders at all! The code is shown in Program 2.1 (translated to Java from the C++ version in [SW15]). It extends JFrame, and instantiates a GLCanvas, adding it to the JFrame. It also implements GLEventListener, required to utilize OpenGL; this necessitates implementing some methods, specifically display(), init(), reshape(), and dispose(). The display() method is where we place code that draws to the GLCanvas. In this example, we load a FloatBuffer with values


(1, 0, 0, 1), corresponding to the RGB values of the color red (plus a "1" for the opacity component), and use the OpenGL call glClearBuffer() to fill the display buffer with that color.

Program 2.1 First Java/JOGL Application

import java.nio.*;
import javax.swing.*;
import static com.jogamp.opengl.GL4.*;
import com.jogamp.opengl.*;
import com.jogamp.opengl.awt.GLCanvas;
import com.jogamp.common.nio.Buffers;

public class Code extends JFrame implements GLEventListener
{	private GLCanvas myCanvas;

	public Code()
	{	setTitle("Chapter2 - program1");
		setSize(600, 400);
		setLocation(200, 200);
		myCanvas = new GLCanvas();
		myCanvas.addGLEventListener(this);
		this.add(myCanvas);
		setVisible(true);
	}

	public void display(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		float bkg[] = { 1.0f, 0.0f, 0.0f, 1.0f };
		FloatBuffer bkgBuffer = Buffers.newDirectFloatBuffer(bkg);
		gl.glClearBufferfv(GL_COLOR, 0, bkgBuffer);
	}

	public static void main(String[] args)
	{	new Code();
	}

	public void init(GLAutoDrawable drawable) {}
	public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) {}
	public void dispose(GLAutoDrawable drawable) {}
}

When running a Java/JOGL application (such as the above Program 2.1) on a Microsoft Windows machine, it is advisable to add the command-line option to disable the use of Direct3D acceleration, such as:


java -Dsun.java2d.d3d=false Code

Figure 2.3 Output of Program 2.1.

The mechanism by which these functions are deployed is as follows: When a GLCanvas is made "visible" (by our calling "setVisible(true)" on the JFrame that contains it), it initializes OpenGL, which in turn creates a "GL4" object that our application can use for making OpenGL function calls. OpenGL then does a "callback," calling init(), and passes it a "drawable" object (in this case the drawable object is the GLCanvas, although that isn't obvious from the code). In this particular example, init() doesn't do anything; in most applications it is where we would read in GLSL code, load 3D models, and so on. OpenGL next calls display(), also sending it the drawable object. It is typical to immediately obtain the GL4 object and put it in a variable called "gl". (Actually, GL4 is an interface; in practice we don't need to know the actual GL object class.)

Later we will see that if we want our scene to be animated, our Java/JOGL application will need to tell OpenGL to make additional calls to display().

Now is an appropriate time to take a closer look at the JOGL calls in Program 2.1. Consider this one:

gl.glClearBufferfv(GL_COLOR, 0, bkgBuffer);

Since JOGL is a Java binding for OpenGL, calls to JOGL in turn generate calls to OpenGL's library of C functions. In this case, the C function being called, as described in the OpenGL reference documentation (available on the web at https://www.opengl.org/sdk/docs), is:

glClearBufferfv(GLenum buffer, GLint drawbuffer, const GLfloat *value);

The first thing to notice is that the name of the JOGL function is the same as that of the original OpenGL C function, except it is preceded by "gl.", which is the name of the GL4 object. The period "." after the "gl" is significant because "gl" is the object on which we are invoking the OpenGL function.

To reiterate, GL4 is a Java interface to the OpenGL functions. We can obtain it in one of two ways: (a) by calling drawable.getGL(), utilizing the "GLAutoDrawable" object provided automatically when the various GLEventListener functions are invoked (called back) by OpenGL, or (b) by calling GLContext.getCurrentGL() as done in Program 2.1. Obtaining the current GL4 object is important because, in general, any OpenGL function described in the


OpenGL documentation can be called from JOGL by preceding it with the name of the appropriate GL4 object (such as "gl.", here).

The next detail to notice is that the first parameter makes reference to a "GLenum". OpenGL has many predefined constants (enums); this one references the color buffer GL_COLOR that contains the pixels as they are rendered. In actuality, there is more than one color buffer, and the second parameter is used to specify which one we are using (in this case, the 0th, or first, one).

Next, notice that the third parameter is defined in OpenGL as a C pointer. While JOGL makes every effort to match the original OpenGL C calls, Java does not have pointers. Any parameter in OpenGL that is a pointer is changed in JOGL. In this case, the JOGL version utilizes a FloatBuffer instead. Whenever there is a discrepancy between a JOGL call and the original C call, consult the JOGL Javadoc for details on the revised parameter(s).1

Immediately before the call to glClearBufferfv() is a call to newDirectFloatBuffer(). We will learn about this function later in Chapter 4 when we discuss buffers and the JOGL buffer tools.
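In the meantime, the general idea of staging a four-component RGBA tuple in a FloatBuffer can be previewed with only the standard java.nio library. The sketch below (the class name is hypothetical) uses FloatBuffer.wrap(); JOGL's Buffers.newDirectFloatBuffer plays a similar role but allocates a direct (off-heap) buffer suitable for passing to native OpenGL calls:

```java
import java.nio.FloatBuffer;

// Staging an RGBA color (opaque red) in a FloatBuffer -- a java.nio-only
// sketch of what Program 2.1 does with Buffers.newDirectFloatBuffer.
public class ColorBufferDemo {
    public static void main(String[] args) {
        float[] bkg = { 1.0f, 0.0f, 0.0f, 1.0f };   // R, G, B, A
        FloatBuffer bkgBuffer = FloatBuffer.wrap(bkg);
        System.out.println(bkgBuffer.get(0));       // red component: 1.0
        System.out.println(bkgBuffer.capacity());   // four components
    }
}
```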

Finally, besides display() and init(), we also must implement reshape() and dispose(). The reshape() function is called when a GLCanvas is resized, and dispose() is called when the application exits. In Program 2.1 we left them both empty.

2.1.2 Vertex and Fragment Shaders

Our first JOGL program didn't actually draw anything; it simply filled the color buffer with a single color. To actually draw something, we need to include a vertex shader and a fragment shader.

You may be surprised to learn that OpenGL is capable of drawing only a few kinds of very simple things, such as points, lines, or triangles. These simple things are called primitives, and for this reason, most 3D models are made up of lots and lots of primitives, usually triangles.

Primitives are made up of vertices; for example, a triangle consists of three vertices. The vertices can come from a variety of sources: they can be read from files and then loaded into buffers by the Java/JOGL application, or they can be hardcoded in the Java code or even in the GLSL code.

Before any of this can happen, the Java/JOGL application must compile and link appropriate GLSL vertex and fragment shader programs, and then load them into the pipeline. We will see the commands for doing this shortly.

The application also is responsible for telling OpenGL to construct triangles. We do this by using JOGL to call the following OpenGL function:

glDrawArrays(GLenum mode, GLint first, GLsizei count);


The mode is the type of primitive; for triangles we use GL_TRIANGLES. The parameter "first" indicates which vertex to start with (generally vertex number 0, the first one), and count specifies the total number of vertices to be drawn.

When glDrawArrays() is called, the GLSL code in the pipeline starts executing. Let's now add some GLSL code to that pipeline.

Regardless of where they originate, all of the vertices pass through the vertex shader. They do so one by one; that is, the shader is executed once per vertex. For a large and complex model with a lot of vertices, the vertex shader may execute hundreds, thousands, or even millions of times, often in parallel.

Let’swriteasimpleprogramwithonlyonevertex,hardcodedinthevertexshader.That’snotenoughtodrawatriangle,butitisenoughtodrawapoint.Forittodisplay,wealsoneedto provide a fragment shader. For simplicity we will declare the two shader programs asarraysofstrings.

Program 2.2 Shaders, Drawing a POINT

(. . . . . imports as before)

public class Code extends JFrame implements GLEventListener
{	private int rendering_program;
	private int vao[] = new int[1];		// new declarations

	public Code() { (. . . . . constructor as before) }

	public void display(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		gl.glUseProgram(rendering_program);
		gl.glDrawArrays(GL_POINTS, 0, 1);
	}

	public void init(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		rendering_program = createShaderProgram();
		gl.glGenVertexArrays(vao.length, vao, 0);
		gl.glBindVertexArray(vao[0]);
	}

	private int createShaderProgram()
	{	GL4 gl = (GL4) GLContext.getCurrentGL();

		String vshaderSource[] =
		{	"#version 430 \n",
			"void main(void) \n",
			"{ gl_Position = vec4(0.0, 0.0, 0.0, 1.0); } \n",
		};

		String fshaderSource[] =
		{	"#version 430 \n",
			"out vec4 color; \n",
			"void main(void) \n",
			"{ color = vec4(0.0, 0.0, 1.0, 1.0); } \n",
		};

		int vShader = gl.glCreateShader(GL_VERTEX_SHADER);
		gl.glShaderSource(vShader, 3, vshaderSource, null, 0);	// note: 3 lines of code
		gl.glCompileShader(vShader);

		int fShader = gl.glCreateShader(GL_FRAGMENT_SHADER);
		gl.glShaderSource(fShader, 4, fshaderSource, null, 0);	// note: 4 lines of code
		gl.glCompileShader(fShader);

		int vfprogram = gl.glCreateProgram();
		gl.glAttachShader(vfprogram, vShader);
		gl.glAttachShader(vfprogram, fShader);
		gl.glLinkProgram(vfprogram);

		gl.glDeleteShader(vShader);
		gl.glDeleteShader(fShader);
		return vfprogram;
	}

	. . . main(), reshape(), and dispose() as before

The program appears to have output a blank canvas. But close examination reveals a tiny blue dot in the center of the window (assuming that this printed page is of sufficient resolution). The default size of a point in OpenGL is one pixel.

Figure 2.4 Output of Program 2.2.

There are many important details in Program 2.2 (color-coded in the program, for convenience) for us to discuss. First, note that init() is no longer empty; it now calls another function (that we wrote) named "createShaderProgram()". This function starts by declaring two shaders as arrays of strings called vshaderSource and fshaderSource. It then calls glCreateShader(), which generates the desired type of shader (note the predefined value GL_VERTEX_SHADER, and then later GL_FRAGMENT_SHADER). OpenGL creates the shader object (initially empty), and returns an integer ID that is an index for referencing it later; the code stores this ID in the variable vShader (and fShader). It then calls glShaderSource(), which loads the GLSL code from the string array into the empty shader object. glShaderSource() has five parameters: (a) the shader object in which to load the shader, (b) the number of strings in the shader source code, (c) the array of strings containing the source code, and two additional parameters we aren't using (they will be explained later, in the supplementary chapter notes). Note also the two commented lines of code in the blue section, highlighting the parameter values 3 and 4; these refer to the number of lines of code in the shader (the \n's delineate each line in the shader source code). The shaders are then each compiled using glCompileShader().

The application then creates a program object named vfprogram, and saves the integer ID that points to it. An OpenGL "program" object contains a series of compiled shaders, and here we see the commands glCreateProgram() to create the program object, glAttachShader() to attach each of the shaders to it, and then glLinkProgram() to request that the GLSL compiler ensure that they are compatible.

After init() finishes, display() is called (recall this is also an OpenGL callback). One of the first things that display() does is call glUseProgram(), which loads the program containing the two compiled shaders into the OpenGL pipeline stages (onto the GPU!). Note that glUseProgram() doesn't run the shaders; it just loads them onto the hardware.

As we will see later in Chapter 4, ordinarily at this point the Java/JOGL program would prepare the vertices of the model being drawn for sending down the pipeline. But not in this case, because for our first shader program we simply hardcoded a single vertex in the vertex shader. Therefore in this example the display() function next proceeds to the glDrawArrays() call, which initiates pipeline processing. The primitive type is GL_POINTS, and there is just one point to display.

Nowlet’slookattheshadersthemselves,showningreenearlier(andduplicatedahead).Aswesaw,theyhavebeendeclaredintheJava/JOGLprogramasarraysofstrings.Thisisaclumsywaytocode,butitissufficientinthisverysimplecase.Thevertexshaderis:

#version 430
void main(void)
{ gl_Position = vec4(0.0, 0.0, 0.0, 1.0); }

The first line indicates the OpenGL version, in this case 4.30. There follows a "main" function (as we will see, GLSL is somewhat Java-like in syntax). The primary purpose of any vertex shader is to send a vertex down the pipeline (which, as mentioned before, it does for every vertex). The built-in variable gl_Position is used to set a vertex's coordinate position in 3D space, and is sent to the next stage in the pipeline. The GLSL datatype vec4 is used to hold a 4-tuple, suitable for such coordinates, with the associated four values representing X, Y, Z, and a fourth value set here to 1.0 (we will learn the purpose of this fourth value in Chapter 3). In this case, the vertex is hardcoded to the origin location (0, 0, 0).

The vertex moves through the pipeline, eventually reaching the fragment shader:

#version 430
out vec4 color;
void main(void)
{ color = vec4(0.0, 0.0, 1.0, 1.0); }

The purpose of any fragment shader is to set the RGB color of a pixel to be displayed. In this case the specified output color (0, 0, 1) is blue (the fourth value 1.0 specifies the level of opacity). Note the "out" tag indicating that the variable color is an output. (It wasn't necessary to specify an "out" tag for gl_Position in the vertex shader, because gl_Position is a predefined output variable.)

There is one detail in the code that we haven't discussed, in the last two lines in the init() function (shown in red). They probably appear a bit cryptic. As we will see in Chapter 4, when sets of data are prepared for sending down the pipeline, they are organized into buffers. Those buffers are in turn organized into Vertex Array Objects (VAOs). In our example, we hardcoded a single point in the vertex shader, so we didn't need any buffers. However, OpenGL still requires at least one VAO to be created whenever shaders are being used, even if the application isn't using any buffers. So the two lines create the required VAO.

Finally, there is the issue of how the vertex that came out of the vertex shader became a pixel in the fragment shader. Recall from Figure 2.2 that between vertex processing and pixel processing is the rasterization stage. It is there that primitives (such as points or triangles) are converted into sets of pixels. The default size of an OpenGL "point" is one pixel, so that is why our single point was rendered as a single pixel.

Figure 2.5 Changing glPointSize.

Let’saddthefollowingcommandindisplay(),rightbeforetheglDrawArrays()call:gl.glPointSize(30.0f);

Now, when the rasterizer receives the vertex from the vertex shader, it will generate pixels that form a point that has a size of 30 pixels. The resulting output is shown in Figure 2.5.

Let’snowcontinueexaminingtheremainderoftheOpenGLpipeline.

2.1.3 Tessellation

We cover tessellation in Chapter 12. The programmable tessellation stage is one of the most recent additions to OpenGL (in version 4.0). It provides a tessellator that can generate a large number of triangles, typically as a grid, and also some tools to manipulate those triangles in a variety of ways. For example, the programmer might manipulate a tessellated grid of triangles as shown in Figure 2.6:


Figure 2.6 Grid produced by tessellator.

Tessellation is useful when a lot of vertices are needed on what is otherwise a simple shape, such as on a square area or curved surface. It is also very useful for generating complex terrain, as we will see later. In such instances, it is sometimes much more efficient to have the tessellator in the GPU generate the triangle mesh in hardware, rather than doing it in Java.

2.1.4 Geometry Shader

We cover the geometry shader stage in Chapter 13. Whereas the vertex shader gives the programmer the ability to manipulate one vertex at a time (i.e., "per-vertex" processing), and the fragment shader (as we will see) allows manipulating one pixel at a time ("per-fragment" processing), the geometry shader provides the capability to manipulate one primitive at a time ("per-primitive" processing).

Recalling that the most common primitive is the triangle, by the time we have reached the geometry stage, the pipeline must have completed grouping the vertices into triangles (a process called primitive assembly). The geometry shader then makes all three vertices in each triangle accessible to the programmer simultaneously.

There are a number of uses for per-primitive processing. The primitives could be altered, such as by stretching or shrinking them. Some of the primitives could be deleted, thus putting "holes" in the object being rendered; this is one way of turning a simple model into a more complex one.

The geometry shader also provides a mechanism for generating additional primitives. Here too, this opens the door to many possibilities for turning simple models into more complex ones.

An interesting use for the geometry shader is for adding "hair" or "fur" to an object. Consider, for example, the simple torus shown in Figure 2.7 (we will see how to generate this later in the book). The surface of this torus is built out of many hundreds of triangles. If at each triangle, we use a geometry shader to add additional long, narrow triangles that face outward, we get the result shown in Figure 2.8. This "furry torus" would be computationally expensive to try and model from scratch on the Java/JOGL application side.


It might seem redundant to provide a per-primitive shader stage, when the tessellation stage(s) give the programmer access to all of the vertices in an entire model simultaneously. The difference is that tessellation only offers this capability in very limited circumstances; specifically, when the model is a grid of triangles generated by the tessellator. It does not provide such simultaneous access to all the vertices of, say, an arbitrary model being sent in from Java through a buffer.

Figure 2.7 Torus model.

Figure 2.8 Torus modified in geometry shader.

2.1.5 Rasterization

Ultimately, our 3D world of vertices, triangles, colors, and so on needs to be displayed on a 2D monitor. That 2D monitor screen is made up of a raster: a rectangular array of pixels.

When a 3D object is rasterized, OpenGL converts the primitives in the object (usually triangles) into fragments. A fragment holds the information associated with a pixel. Rasterization determines the locations of pixels that need to be drawn in order to produce the triangle specified by its three vertices.

Rasterization starts by interpolating, pairwise, between the three vertices of the triangle. There are some options for doing this interpolation; for now it is sufficient to consider simple linear interpolation as shown in Figure 2.9. The original three vertices are shown in red.


Figure 2.9 Rasterization (step 1).
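The pairwise interpolation just described can be illustrated in ordinary Java (a conceptual sketch, not JOGL code; the vertex values, class name, and step count are arbitrary). The same lerp operation, applied to each coordinate, yields the sample points along one edge of a triangle:

```java
// Linear interpolation between two 2D vertices -- the basic operation the
// rasterizer performs (conceptually) along each edge of a triangle.
public class EdgeLerp {
    static float lerp(float a, float b, float t) {
        return a + t * (b - a);     // t = 0 gives a, t = 1 gives b
    }

    public static void main(String[] args) {
        float[] v0 = { 0.0f, 0.0f };    // first vertex (x, y)
        float[] v1 = { 4.0f, 8.0f };    // second vertex (x, y)
        int steps = 4;                  // number of sample points along the edge
        for (int i = 0; i <= steps; i++) {
            float t = (float) i / steps;
            System.out.println(lerp(v0[0], v1[0], t) + ", " + lerp(v0[1], v1[1], t));
        }
    }
}
```

The same formula, applied to per-vertex colors or other attributes instead of positions, is what produces the smooth gradations mentioned later in this section.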

If rasterization were to stop here, the resulting image would appear as wireframe. This is an option in OpenGL, by adding the following command in the display() function, before the call to glDrawArrays():

gl.glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

If the torus shown previously in Section 2.1.4 is rendered with the addition of this line of code, it appears as shown in Figure 2.10.

Figure 2.10 Torus with wireframe rendering.

Ifwedidn’tinserttheprecedinglineofcode(orifGL_FILLhadbeenspecifiedinsteadofGL_LINE),interpolationwouldcontinuealongrasterlinesandfilltheinteriorofthetriangle,asshowninFigure2.11.Whenapplied to the torus, this results in the solid torus shown inFigure2.12.

As we will see in later chapters, the rasterizer can interpolate more than just pixels. Any variable that is output by the vertex shader and input by the fragment shader will be interpolated based on the corresponding pixel position. We will use this capability to generate smooth color gradations, achieve realistic lighting, and many more effects.

2.1.6 Fragment Shader

As mentioned earlier, the purpose of the fragment shader is to assign colors to the rasterized pixels. We have already seen an example of a fragment shader in Program 2.2. There, the fragment shader simply hardcoded its output to a specific value, so every generated pixel had the same color. However, GLSL affords us virtually limitless creativity to calculate colors in other ways.

Figure 2.11 Fully rasterized triangle.

Figure 2.12 Torus with fully rasterized primitives (wireframe grid superimposed).

One simple example would be to base the output color of a pixel on its location. Recall that in the vertex shader, the outgoing coordinates of a vertex are specified using the predefined variable gl_Position. In the fragment shader, there is a similar variable available to the programmer for accessing the coordinates of an incoming fragment, called gl_FragCoord. We can modify the fragment shader from Program 2.2 so that it uses gl_FragCoord (in this case referencing its x component using the GLSL field selector notation) to set each pixel's color based on its location, as shown here:

Figure 2.13 Fragment shader color variation.


#version 430
out vec4 color;
void main(void)
{	if (gl_FragCoord.x < 200) color = vec4(1.0, 0.0, 0.0, 1.0);
	else color = vec4(0.0, 0.0, 1.0, 1.0);
}

Assuming that we increase the point size (via glPointSize()) as we did at the end of Section 2.1.2, the pixel colors will now vary across the rendered point: red where the x coordinates are less than 200, and blue otherwise, as seen in Figure 2.13.
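The branch in that shader can be mimicked in plain Java to make the split concrete (a sketch only; the class and method names are hypothetical, and the 200-pixel threshold matches the GLSL above):

```java
// Mimics the fragment shader's branch on gl_FragCoord.x: fragments left of
// x = 200 get red, all others get blue (RGBA, as in the GLSL version).
public class FragColorDemo {
    static float[] colorAt(float fragX) {
        if (fragX < 200.0f) return new float[] { 1.0f, 0.0f, 0.0f, 1.0f }; // red
        else                return new float[] { 0.0f, 0.0f, 1.0f, 1.0f }; // blue
    }

    public static void main(String[] args) {
        System.out.println(colorAt(150.0f)[0]);  // red channel, left of x = 200
        System.out.println(colorAt(250.0f)[2]);  // blue channel, at/right of x = 200
    }
}
```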

2.1.7 Pixel Operations

As objects in our scene are drawn in the display() function using the glDrawArrays() command, we usually expect objects in front to block our view of objects behind them. This also extends to the objects themselves, wherein we expect to see the front of an object, but generally not the back.

To achieve this, we need hidden surface removal, or HSR. OpenGL can perform a variety of HSR operations, depending on the effect we want in our scene. And even though this phase is not programmable, it is extremely important that we understand how it works. Not only will we need to configure it properly, we will later need to carefully manipulate it when we add shadows to our scene.

Hidden surface removal is accomplished by OpenGL through the cleverly coordinated use of two buffers: the color buffer and the depth buffer (sometimes called the Z-buffer). Both of these buffers are the same size as the raster; that is, there is an entry in each buffer for every pixel on the screen.

As various objects are drawn in a scene, pixel colors are generated by the fragment shader. The pixel colors are placed in the color buffer; it is the color buffer that is ultimately written to the screen. When multiple objects occupy some of the same pixels in the color buffer, a determination must be made as to which pixel color(s) are retained, based on which object is nearest the viewer.

Hidden surface removal is done as follows:

Before a scene is rendered, the depth buffer is filled with values representing maximum depth.

As a pixel color is output by the fragment shader, its distance from the viewer is calculated. If the computed distance is less than the distance stored in the depth buffer (for that pixel), then: (a) the pixel color replaces the color in the color buffer, and (b) the distance replaces the value in the depth buffer. Otherwise, the pixel is discarded.

This procedure is called the Z-buffer algorithm, as expressed in Figure 2.14:


Figure 2.14 Z-buffer algorithm.
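The procedure can also be sketched in plain Java (a hypothetical class, not JOGL; depths are simple floats and colors are packed RGBA ints). Each incoming fragment's depth is compared against the stored depth for its pixel, and both buffers are updated only when the fragment is nearer:

```java
import java.util.Arrays;

// A sketch of the Z-buffer algorithm: per-pixel depth and color buffers,
// with nearer fragments replacing farther ones.
public class ZBufferDemo {
    final float[] depth;    // depth buffer: one entry per pixel
    final int[] color;      // color buffer: packed RGBA, one entry per pixel

    ZBufferDemo(int numPixels) {
        depth = new float[numPixels];
        color = new int[numPixels];
        Arrays.fill(depth, Float.MAX_VALUE);   // initialize to maximum depth
    }

    // Returns true if the fragment "wins": it is nearer than what is stored,
    // so its color and depth replace the stored values; otherwise it is discarded.
    boolean testAndSet(int pixel, float fragDepth, int fragColor) {
        if (fragDepth < depth[pixel]) {
            depth[pixel] = fragDepth;   // (b) distance replaces the stored depth
            color[pixel] = fragColor;   // (a) color replaces the stored color
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        ZBufferDemo zb = new ZBufferDemo(1);
        System.out.println(zb.testAndSet(0, 10.0f, 0xFF0000FF));  // near fragment kept
        System.out.println(zb.testAndSet(0, 20.0f, 0x00FF00FF));  // farther one discarded
    }
}
```

Note that drawing order stops mattering: a farther fragment arriving after a nearer one is simply discarded, which is exactly why objects in front block objects behind them regardless of the order of glDrawArrays() calls.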

2.2 DETECTING OPENGL AND GLSL ERRORS

The workflow for compiling and running GLSL code differs from standard coding, in that GLSL compilation happens at Java runtime. Another complication is that GLSL code doesn't run on the CPU (it runs on the GPU), so the operating system cannot always catch OpenGL runtime errors. This makes debugging difficult, because it is often hard to detect if a shader failed, and why.

Program 2.3 (which follows) presents some modules for catching and displaying GLSL errors. They make use of the error string lookup tool gluErrorString() from the "GLU" library [GL16], as well as OpenGL functions glGetShaderiv() and glGetProgramiv(), which are used to provide information about compiled GLSL shaders and programs. Accompanying them is the createShaderProgram() function (on the right) from the previous Program 2.2, but with the error-detecting calls added. These error-detecting modules are also available in graphicslib3D.

Program 2.3 contains the following three utilities:

checkOpenGLError – checks the OpenGL error flag for the occurrence of an OpenGL error
printShaderLog – displays the contents of OpenGL's log when GLSL compilation failed
printProgramLog – displays the contents of OpenGL's log when GLSL linking failed

The first, checkOpenGLError(), is useful for both GLSL compilation errors and OpenGL runtime errors, so it is highly recommended to use it throughout a Java/JOGL application during development. For example, in the prior example (Program 2.2), the calls to glCompileShader() and glLinkProgram() could easily be augmented with the code shown in Program 2.3 to ensure that any typos or other compile errors would be caught and their cause reported. Calls to checkOpenGLError() could be added after runtime OpenGL calls, such as immediately after the call to glDrawArrays().

Another reason that it is important to use these tools is that a GLSL error does not cause the JOGL program to stop. So unless the programmer takes steps to catch errors at the point that they happen, debugging will be very difficult.

Program 2.3 Modules to Catch GLSL Errors

. . . .
import com.jogamp.opengl.glu.GLU;
. . . .

private void printShaderLog(int shader)
{	GL4 gl = (GL4) GLContext.getCurrentGL();
	int[] len = new int[1];
	int[] chWrittn = new int[1];
	byte[] log = null;

	// determine the length of the shader compilation log
	gl.glGetShaderiv(shader, GL_INFO_LOG_LENGTH, len, 0);
	if (len[0] > 0)
	{	log = new byte[len[0]];
		gl.glGetShaderInfoLog(shader, len[0], chWrittn, 0, log, 0);
		System.out.println("Shader Info Log: ");
		for (int i = 0; i < log.length; i++)
		{	System.out.print((char) log[i]);
		}
	}
}

void printProgramLog(int prog)
{	GL4 gl = (GL4) GLContext.getCurrentGL();
	int[] len = new int[1];
	int[] chWrittn = new int[1];
	byte[] log = null;

	// determine the length of the program linking log
	gl.glGetProgramiv(prog, GL_INFO_LOG_LENGTH, len, 0);
	if (len[0] > 0)
	{	log = new byte[len[0]];
		gl.glGetProgramInfoLog(prog, len[0], chWrittn, 0, log, 0);
		System.out.println("Program Info Log: ");
		for (int i = 0; i < log.length; i++)
		{	System.out.print((char) log[i]);
		}
	}
}

boolean checkOpenGLError()
{	GL4 gl = (GL4) GLContext.getCurrentGL();
	boolean foundError = false;
	GLU glu = new GLU();
	int glErr = gl.glGetError();
	while (glErr != GL_NO_ERROR)
	{	System.err.println("glError: " + glu.gluErrorString(glErr));
		foundError = true;
		glErr = gl.glGetError();
	}
	return foundError;
}

Example of checking for OpenGL errors:

private int createShaderProgram()
{	// arrays to collect GLSL compilation status values.
	// note: one-element arrays are used because the associated JOGL calls require arrays.
	int[] vertCompiled = new int[1];
	int[] fragCompiled = new int[1];
	int[] linked = new int[1];
	. . .
	// catch errors while compiling shaders
	gl.glCompileShader(vShader);
	checkOpenGLError();	// can use returned boolean
	gl.glGetShaderiv(vShader, GL_COMPILE_STATUS, vertCompiled, 0);
	if (vertCompiled[0] == 1)
	{	System.out.println(". . . vertex compilation success.");
	} else
	{	System.out.println(". . . vertex compilation failed.");
		printShaderLog(vShader);
	}
	gl.glCompileShader(fShader);
	checkOpenGLError();	// can use returned boolean
	gl.glGetShaderiv(fShader, GL_COMPILE_STATUS, fragCompiled, 0);
	if (fragCompiled[0] == 1)
	{	System.out.println(". . . fragment compilation success.");
	} else
	{	System.out.println(". . . fragment compilation failed.");
		printShaderLog(fShader);
	}
	if ((vertCompiled[0] != 1) || (fragCompiled[0] != 1))
	{	System.out.println("\nCompilation error; return-flags:");
		System.out.println("vertCompiled = " + vertCompiled[0] + " ; fragCompiled = " + fragCompiled[0]);
	} else
	{	System.out.println("Successful compilation");
	}
	. . .
	// catch errors while linking shaders
	gl.glLinkProgram(vfprogram);
	checkOpenGLError();
	gl.glGetProgramiv(vfprogram, GL_LINK_STATUS, linked, 0);
	if (linked[0] == 1)
	{	System.out.println(". . . linking succeeded.");
	} else


	{	System.out.println(". . . linking failed.");
		printProgramLog(vfprogram);
	}
	. . .
}

Another set of tools that can help in tracking down the source of OpenGL and GLSL errors is to use JOGL's composable pipeline mechanism. There is a rich set of capabilities available in the DebugGL and TraceGL JOGL classes, which provide debugging and tracing support, respectively. One way of utilizing these capabilities in simple cases is to add one or both of the following command-line options:

-Djogl.debug.DebugGL

-Djogl.debug.TraceGL

For example, the application can be run with both capabilities enabled as follows:

java -Dsun.java2d.d3d=false -Djogl.debug.DebugGL -Djogl.debug.TraceGL Code

Enabling debugging causes glGetError() to be invoked at each OpenGL call. Although any error messages generated tend not to be as informative as is the case when retrieving the error codes as shown in Program 2.3, it can be a quick way of narrowing down the likely location where an error occurred.

Enabling tracing causes a line of output on the command window to be displayed for each OpenGL call executed, including those called directly by the application, and others invoked by JOGL. For example, a trace for Program 2.2 produces the following output, which reflects the order of calls in a typical run:

glFinish()
glCreateShader(<int> 0x8B31) = 1
glShaderSource(<int> 0x1, <int> 0x3, <[Ljava.lang.String;>, <java.nio.IntBuffer> null)
glCompileShader(<int> 0x1)
glCreateShader(<int> 0x8B30) = 2
glShaderSource(<int> 0x2, <int> 0x4, <[Ljava.lang.String;>, <java.nio.IntBuffer> null)
glCompileShader(<int> 0x2)
glCreateProgram() = 3
glAttachShader(<int> 0x3, <int> 0x1)
glAttachShader(<int> 0x3, <int> 0x2)
glLinkProgram(<int> 0x3)
glDeleteShader(<int> 0x1)
glDeleteShader(<int> 0x2)
glGenVertexArrays(<int> 0x1, <[I>, <int> 0x0)
glBindVertexArray(<int> 0x1)
glGetError() = 0
glViewport(<int> 0x0, <int> 0x0, <int> 0x180, <int> 0xA2)
glUseProgram(<int> 0x3)
glDrawArrays(<int> 0x0, <int> 0x0, <int> 0x1)

Although extremely useful during debugging, as with most debugging tools the composable pipeline incurs considerable overhead, and should not be enabled in production code.


There are other tricks for deducing the causes of runtime errors in shader code. A common result of shader runtime errors is for the output screen to be completely blank, essentially with no output at all. This can happen even if the error is a very small typo in a shader, yet it can be difficult to tell at which stage of the pipeline the error occurred. With no output at all, it's like looking for a needle in a haystack.

One useful trick in such cases is to temporarily replace the fragment shader with the one shown in Program 2.2. Recall that in that example, the fragment shader simply output a particular color (solid blue, for example). If the subsequent output is of the correct geometric form (but solid blue), the vertex shader is probably correct, and there is an error in the original fragment shader. If the output is still a blank screen, the error is more likely earlier in the pipeline, such as in the vertex shader.

2.3 READING GLSL SOURCE CODE FROM FILES

So far, our GLSL shader code was stored inline in strings. As our programs grow in complexity, this will become impractical. We should instead store our shader code in files, and read them in.

Reading text files is a basic Java skill, and won't be covered here. However, for practicality, code to read shaders is provided in readShaderSource(), shown in Program 2.4 and also available in graphicslib3D. It reads the shader text file and returns an array of strings, where each string is one line of text from the file. It then determines the size of that array based on how many lines were read in. Note that here, createShaderProgram() replaces the version from Program 2.2.

Program 2.4 Reading GLSL Source from Files

(. . . imports as before, plus the following . . .)

import java.io.File;
import java.io.IOException;
import java.util.Scanner;
import java.util.Vector;	// used by readShaderSource() below

public class Code extends JFrame implements GLEventListener
{	(. . . declarations same as before, display() as before)

	private int createShaderProgram()
	{	(. . . as before, plus . . .)

		vshaderSource = readShaderSource("vert.shader");
		fshaderSource = readShaderSource("frag.shader");

		gl.glShaderSource(vertexShader, vshaderSource.length, vshaderSource, null, 0);
		gl.glShaderSource(fragmentShader, fshaderSource.length, fshaderSource, null, 0);

		(. . . etc., building rendering program as before)
	}

	(. . . main, constructor, reshape, init, dispose as before)


	private String[] readShaderSource(String filename)
	{	Vector<String> lines = new Vector<String>();
		Scanner sc;
		try { sc = new Scanner(new File(filename)); }
		catch (IOException e)
		{	System.err.println("IOException reading file: " + e);
			return null;
		}
		while (sc.hasNext())
		{	lines.addElement(sc.nextLine());
		}
		String[] program = new String[lines.size()];
		for (int i = 0; i < lines.size(); i++)
		{	program[i] = (String) lines.elementAt(i) + "\n";
		}
		return program;
	}
}

2.4 BUILDING OBJECTS FROM VERTICES

Ultimately we want to draw more than just a single point. We'd like to draw objects that are constructed of many vertices. Large sections of this book will be devoted to this topic. For now we just start with a simple example: we will define three vertices and use them to draw a triangle.

We can do this by making two small changes to Program 2.2 (actually, the version in Program 2.4 which reads the shaders from files): (a) modify the vertex shader so that three different vertices are output to the subsequent stages of the pipeline, and (b) modify the glDrawArrays() call to specify that we are using three vertices.

In the Java/JOGL application (specifically in the glDrawArrays() call) we specify GL_TRIANGLES (rather than GL_POINTS), and also specify that there are three vertices sent through the pipeline. This causes the vertex shader to run three times, and at each iteration, the built-in variable gl_VertexID is automatically incremented (it is initially set to 0). By testing the value of gl_VertexID, the shader is designed to output a different point in each of the three times it is executed. Recall that the three points then pass through the rasterization stage, producing a filled-in triangle. The modifications are shown in Program 2.5 (the remainder of the code is the same as previously shown in Program 2.4).

Program 2.5 Drawing a Triangle

Vertex Shader:

#version 430

void main(void)
{	if (gl_VertexID == 0) gl_Position = vec4( 0.25, -0.25, 0.0, 1.0);
	else if (gl_VertexID == 1) gl_Position = vec4(-0.25, -0.25, 0.0, 1.0);
	else gl_Position = vec4( 0.25,  0.25, 0.0, 1.0);


}

Java/JOGL application – in display() . . .

gl.glDrawArrays(GL_TRIANGLES, 0, 3);

Figure 2.15 Drawing a simple triangle.

2.5 ANIMATING A SCENE

Many of the techniques in this book can be animated. This is when things in the scene are moving or changing, and the scene is rendered repeatedly to reflect these changes in real time.

Recall from Section 2.1.1 that OpenGL makes a single call to init(), and then to display(), when it is initialized. After that, if there are changes to our scene, it becomes the programmer's responsibility to tell OpenGL to call display() again. This is done by invoking the display() function in the GLCanvas from the Java/JOGL application, as follows:

myCanvas.display();

That is, the GLCanvas has its own display() function. If we call it, the GLCanvas will then in turn call back the display() function in our Java/JOGL application. Technically, the two display()s are completely different functions, but they are intertwined.

One approach is to call display() whenever a change in the scene occurs. In a complex scene, this can become unwieldy. A better approach is to make this call repeatedly, at fixed intervals, at a rate typically called the frame rate. Each rendering of our scene is then called a frame.

There are many ways to organize the code for animating a scene. One way is to create an "animator" class in Java, using either Timer, or Thread.sleep(), or better yet, a ScheduledThreadPoolExecutor.


Animator

FPSAnimator

These classes are designed specifically for animating 3D scenes. An FPSAnimator ("FPS" stands for "frames per second"), when instantiated, calls the display() function on a drawable object repeatedly, at a specified frame rate.

An example is shown in Program 2.6. We have taken the triangle from Program 2.5 and animated it so that it moves to the right, then moves to the left, back and forth.

The animator is instantiated by the Java/JOGL application in its constructor. After that, the application is free to make changes to the scene at any time and at any speed. The screen, however, will only be updated at intervals corresponding to the specified frame rate. The first parameter on the FPSAnimator constructor call specifies the drawable object, and the second parameter specifies the frame rate (frames per second). In this example, there are 50 calls per second to the GLCanvas's display() function.

In Program 2.6, the application's display() method maintains a variable "x" used to offset the triangle's X coordinate position. Its value changes at each frame, and it reverses direction each time it reaches 1.0 or -1.0. The value in x is copied to a corresponding variable called "offset" in the vertex shader. The mechanism that performs this copy uses something called a uniform variable, which we will study later in Chapter 4. It isn't necessary to understand the details of uniform variables yet. For now, just note that the Java/JOGL application first calls glGetUniformLocation() to get a pointer to the "offset" variable, then calls glProgramUniform1f() to copy the value of x into offset. The vertex shader then adds the offset to the X coordinate of the triangle being drawn. Note also that the background is cleared at each call to display(), to avoid the triangle leaving a trail as it moves. Figure 2.16 illustrates the display at three time instances (of course, the movement can't be shown in a still figure).

Program 2.6 Simple Animation Example

Java/JOGL application:

// same imports and declarations as before, plus the following:
import com.jogamp.opengl.util.*;
. . .

public class Code extends JFrame implements GLEventListener
{	// same declarations as before, plus:
	private float x = 0.0f;	// location of triangle
	private float inc = 0.01f;	// offset for moving the triangle

	public Code()
	{	// same constructor as before, plus this at the end, after the call to setVisible(true).
		// The second parameter specifies the frames per second
		FPSAnimator animtr = new FPSAnimator(myCanvas, 50);
		animtr.start();


	}

	public void display(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		gl.glUseProgram(rendering_program);
		float bkg[] = { 0.0f, 0.0f, 0.0f, 1.0f };	// clear the background to black, each time
		FloatBuffer bkgBuffer = Buffers.newDirectFloatBuffer(bkg);
		gl.glClearBufferfv(GL_COLOR, 0, bkgBuffer);

Figure 2.16 An animated, moving triangle.

SUPPLEMENTAL NOTES

There are many details of the OpenGL pipeline that we have not discussed in this introductory chapter. We have skipped a number of internal stages, and have completely omitted how textures are processed. Our goal was to map out, as simply as possible, the framework in which we will be writing our code. As we proceed we will continue to learn additional details.

We have also deferred presenting code examples for tessellation and geometry. In later chapters we will build complete systems that show how to write practical shaders for each of the stages.

We ignored one detail on the glShaderSource() command. The fourth parameter is used to specify a "lengths array" that contains the integer string lengths of each line of code in the given shader program. If this parameter is set to null, as we have done, OpenGL will build this array automatically if the strings are null-terminated. JOGL ensures that strings sent to glShaderSource() are null-terminated. However, it is not uncommon to encounter applications that build these arrays manually rather than sending null.

The composable pipeline can also be configured within the Java/JOGL application code [JU16], rather than just enabling it on the command line (as was described in Section 2.2). This can be useful for utilizing the debugging and tracing tools based on interactive input (e.g., a user keystroke).
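For instance, the debug or trace pipelines can be installed by wrapping the current GL object. The following is a configuration sketch based on the JOGL user's guide (DebugGL4 and TraceGL4 are the GL4-profile wrapper classes; "drawable" is assumed to be the GLAutoDrawable, e.g., inside init()):

```java
// Configuration sketch: install the composable pipeline from application
// code instead of via command-line flags.
GL4 gl = drawable.getGL().getGL4();
drawable.setGL(new DebugGL4(gl));   // invokes glGetError() after each GL call
// or, to echo every GL call to the console:
// drawable.setGL(new TraceGL4(gl, System.err));
```

Because this is ordinary application code, it can be executed conditionally, for example only after the user presses a designated key.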

Throughout this book, the reader may at times wish to know one or more of OpenGL's upper limits. For example, the programmer might need to know the maximum number of outputs that can be produced by the geometry shader, or the maximum size that can be specified for rendering a point. Many such values are implementation-dependent, meaning for example that they can vary between different machines. OpenGL provides a mechanism for retrieving such limits using the glGet() command, which takes various forms depending on the type of the parameter being queried. For example, to find the maximum allowable point size, the following call will place the minimum and maximum values (for your machine's OpenGL implementation) into the first two elements of the float array named "size":

gl.glGetFloatv(GL_POINT_SIZE_RANGE, size, 0);

Many such queries are possible. Consult the OpenGL reference [OP16] documentation for examples.

Exercises

2.1 Modify Program 2.2 to add animation that causes the drawn point to grow and shrink, in a cycle. Hint: use the glPointSize() function, with a variable as the parameter.


2.2 Modify Program 2.5 so that it draws an isosceles triangle (rather than the right triangle shown in Figure 2.15).

2.3 (PROJECT) Modify Program 2.5 to include the error-checking modules shown in Program 2.3. After you have that working, try inserting various errors into the shaders and observing both the resulting behavior and the error messages generated.

References

[GV16] GLUT and OpenGL Utility Libraries, accessed July 2016, https://www.opengl.org/resources/libraries/.

[JU16] JOGL Users Guide, accessed July 2016, https://jogamp.org/jogl/doc/userguide/.

[OP16] OpenGL 4.5 Reference Pages, accessed July 2016, https://www.opengl.org/sdk/docs/man/.

[SW15] G. Sellers, R. Wright Jr., and N. Haemel, OpenGL SuperBible: Comprehensive Tutorial and Reference, 7th ed. (Addison-Wesley, 2015).

1 In this example, we have described each parameter in the call. However, as the book proceeds, we will sometimes not bother describing a parameter when we believe that doing so would complicate matters unnecessarily. The reader should get used to using the JOGL/OpenGL documentation to fill in such details as necessary.


CHAPTER 3

MATHEMATICAL FOUNDATIONS

3.1 3D Coordinate Systems
3.2 Points
3.3 Matrices
3.4 Transformation Matrices
3.5 Vectors
3.6 Local and World Space
3.7 Eye Space and the Synthetic Camera
3.8 Projection Matrices
3.9 Look-At Matrix
3.10 GLSL Functions for Building Matrix Transforms

Supplemental Notes

Computer graphics makes heavy use of mathematics, particularly matrices and matrix algebra. Although we tend to consider 3D graphics programming to be among the most contemporary of technical fields (and in many respects it is), many of the techniques that are used actually date back hundreds of years. Some of them were first understood and codified by the great philosophers of the Renaissance era.

Virtually every facet of 3D graphics, every effect (movement, scale, perspective, texturing, lighting, shadows, and so on) will be accomplished largely mathematically. Therefore this chapter lays the groundwork upon which every subsequent chapter relies.

It is assumed the reader has a basic knowledge of matrix operations; a full coverage of basic matrix algebra is beyond the scope of this text. Therefore, if at any point a particular matrix operation is unfamiliar, it may be necessary to do some supplementary background reading to ensure full understanding before proceeding.


3.1 3D COORDINATE SYSTEMS

3D space is generally represented with three axes: X, Y, and Z. The three axes can be arranged into two configurations, right-handed or left-handed. (The name derives from the orientation of the axes as if constructed by pointing the thumb and first two fingers of the right versus the left hand, at right angles.)

Figure 3.1 3D coordinate systems.

It is important to know which coordinate system your graphics programming environment uses. For example, the majority of coordinate systems in OpenGL are right-handed, whereas in Direct3D the majority are left-handed. Throughout this book, we will assume a right-handed configuration unless otherwise stated.

3.2 POINTS

Points in 3D space can be specified by listing the X, Y, Z values, using a notation such as (2, 8, -3). However, it turns out to be much more useful to specify points using homogeneous notation, a representation first described in the early 1800s. Points in homogeneous notation contain four values, the first three corresponding to X, Y, and Z, and the fourth, W, which is always a fixed nonzero value, usually 1. Thus, we represent this point as [2, 8, -3, 1]. As we will see shortly, homogeneous notation will make many of our graphics computations more efficient.

The appropriate GLSL data type for storing points in homogeneous 3D notation is vec4 ("vec" refers to vector, but it can also be used for a point). The graphicslib3D library includes a class for creating and storing homogeneous points in the Java application, called Point3D.

3.3 MATRICES

A matrix is a rectangular array of values, and its elements are typically accessed by means of subscripts. The first subscript refers to the row number, and the second subscript refers to the column number, with the subscripts starting at 0. Most of the matrices that we will use for 3D graphics computations are of size 4x4, as shown in Figure 3.2:

Figure 3.2 A 4x4 matrix.

The GLSL language includes a data type called mat4 that can be used for storing 4x4 matrices. Similarly, graphicslib3D includes a class called Matrix3D for instantiating and storing 4x4 matrices.

The identity matrix contains ones along the diagonal and zeros everywhere else. Any item multiplied by the identity matrix is unchanged:

In graphicslib3D the identity matrix is available through the function Matrix3D.setToIdentity().

The transpose of a matrix is computed by interchanging its rows and columns. For example:

The graphicslib3D library and GLSL both have transpose functions: Matrix3D.transpose() and transpose(mat4), respectively.
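The operation itself is simple enough to sketch in plain Java (a standalone illustration of the definition, not the graphicslib3D implementation):

```java
// Transpose of a 4x4 matrix: element [row][col] moves to [col][row].
class MatrixTranspose {
    public static float[][] transpose(float[][] m) {
        float[][] t = new float[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                t[j][i] = m[i][j];
        return t;
    }
}
```

Transposing twice returns the original matrix, which is an easy sanity check on the implementation.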

Matrix addition is straightforward:


In GLSL the + operator is overloaded on mat4 to support matrix addition.

There are various multiplication operations that can be done with matrices that are useful in 3D graphics. Matrix multiplication in general can be done either left-to-right or right-to-left (note that since these operations are different, it follows that matrix multiplication is not commutative). Most of the time we will use right-to-left multiplication.

In 3D graphics, multiplying a point by a matrix is in most cases done right-to-left, as follows:

Note that we represent the point (X,Y,Z) in homogeneous notation as a 1-column matrix.

GLSL and graphicslib3D support multiplying a point by a matrix: Point3D.mult(Matrix3D) in graphicslib3D, and in GLSL with the * operator.

Multiplying a 4x4 matrix by another 4x4 matrix is done as follows:

Matrix multiplication is frequently referred to as concatenation, because as will be seen, it is used to combine a set of matrix transforms into a single matrix. This ability to combine matrix transforms is made possible because of the associative property of matrix multiplication. Consider the following sequence of operations:

NewPoint = Matrix1 * (Matrix2 * (Matrix3 * Point))

Here, we multiply a point by Matrix3, then multiply that result by Matrix2, and that result finally by Matrix1. The result is a new point. The associative property ensures that the above computation is equivalent to:

NewPoint = (Matrix1 * Matrix2 * Matrix3) * Point

Here, we first multiply the three matrices together, forming the concatenation of Matrix1, Matrix2, and Matrix3 (which itself is also a 4x4 matrix). If we refer to this concatenation as MatrixC, we can rewrite the above operation as:


NewPoint = MatrixC * Point

The advantage here, as we will see in Chapter 4, is that we will frequently need to apply the same sequence of matrix transformations to every point in our scene. By pre-computing the concatenation of all of those matrices once, it turns out that we can reduce the total number of matrix operations needed manyfold.

GLSL and graphicslib3D both support matrix multiplication: in GLSL with the overloaded * operator, and in graphicslib3D with the function Matrix3D.concatenate(Matrix3D).
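The equivalence of the two orderings is easy to check numerically. The following standalone Java sketch (our own helper class, not graphicslib3D) applies three transforms to a point one at a time, and then applies their pre-computed concatenation, with identical results:

```java
// Demonstrates the associative property: applying three 4x4 transforms to
// a point one at a time gives the same result as applying their
// concatenation (computed once) to the point.
class MatConcat {
    // c = a * b (4x4 matrix multiplication, right-to-left convention)
    public static float[][] mul(float[][] a, float[][] b) {
        float[][] c = new float[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }
    // r = m * p, where p is a homogeneous point stored as a column vector
    public static float[] mulPoint(float[][] m, float[] p) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++)
            for (int k = 0; k < 4; k++)
                r[i] += m[i][k] * p[k];
        return r;
    }
}
```

With the concatenation computed once, each additional point costs one matrix-point multiply instead of three, which is the source of the savings described above.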

The inverse of a 4x4 matrix M is another 4x4 matrix, denoted M⁻¹, that has the following property under matrix multiplication:

M * (M⁻¹) = (M⁻¹) * M = identity matrix

We won't present the details of computing the inverse here. However, it is worth knowing that determining the inverse of a matrix can be computationally expensive; fortunately, we will rarely need it. In the rare instances when we do, it is available in graphicslib3D through the function Matrix3D.inverse(), and in GLSL through the inverse(mat4) function.

3.4 TRANSFORMATION MATRICES

In graphics, matrices are typically used for performing transformations on objects. For example, a matrix can be used to move a point from one location to another. In this chapter we will learn several useful transformation matrices:

Translation
Rotation
Scale
Projection
Look-At

An important property of our transformation matrices is that they are all of size 4x4. This is made possible by our decision to use the homogeneous notation. Otherwise, some of the transforms would be of diverse and incompatible dimensions. As we have seen, ensuring they are the same size is not just for convenience; it also makes it possible to combine them arbitrarily, and pre-compute groups of transforms for improved performance.

3.4.1 Translation

A translation matrix is used to move items from one location to another. It consists of an identity matrix, with the X, Y, and Z movement(s) given in locations A03, A13, A23. Figure 3.3 shows the form of a translation matrix, and its effect when multiplied by a homogeneous point; the result is a new point "moved" by the translate values.


Figure 3.3 Translation matrix transform.

Note that point (X,Y,Z) is translated (or moved) to location (X+Tx, Y+Ty, Z+Tz) as a result of being multiplied by the translation matrix. Also note that multiplication is specified right-to-left.

For example, if we wish to move a group of points upwards, 5 units along the positive Y direction, we could build a translation matrix by taking an identity matrix and placing the value 5 in the Ty position shown above. Then we simply multiply each of the points we wish to move by the matrix.

There are several functions in graphicslib3D for building translation matrices, and for multiplying points by matrices. Some relevant functions are:

Matrix3D.translate(x,y,z)

Point3D.mult(Matrix3D)
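As a numeric check of the pattern in Figure 3.3, here is a standalone Java sketch (our own illustration, not the graphicslib3D implementation) that builds a translation matrix and applies it to a homogeneous point:

```java
// A translation matrix is an identity matrix with (tx, ty, tz) placed in
// locations A03, A13, A23 (the rightmost column). Multiplying a
// homogeneous point by it adds the offsets to X, Y, and Z.
class Translation {
    public static float[][] translationMatrix(float tx, float ty, float tz) {
        float[][] m = new float[4][4];
        for (int i = 0; i < 4; i++) m[i][i] = 1.0f;  // start from identity
        m[0][3] = tx;
        m[1][3] = ty;
        m[2][3] = tz;
        return m;
    }
    // right-to-left: newPoint = m * p
    public static float[] apply(float[][] m, float[] p) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++)
            for (int k = 0; k < 4; k++)
                r[i] += m[i][k] * p[k];
        return r;
    }
}
```

Moving the point [2, 8, -3, 1] up 5 units along Y, as in the example above, yields [2, 13, -3, 1].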

3.4.2 Scaling

A scale matrix is used to change the size of objects, or to move points towards or away from the origin. Although it may initially seem strange to scale a point, objects in OpenGL are defined by groups of points. So, scaling an object involves expanding or contracting its set of points.

The scale matrix transform consists of an identity matrix, with the X, Y, and Z scale factors given in locations A00, A11, A22. Figure 3.4 shows the form of a scale matrix, and its effect when multiplied by a homogeneous point; the result is a new point modified by the scale values.

Figure 3.4 Scale matrix transform.

There are several functions in graphicslib3D for building scale matrices, multiplying

Page 53: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

pointsbymatrices,andscalingapointbyasinglescalefactor.Somerelevantfunctionsare:

Matrix3D.scale(x,y,z)

Point3D.mult(Matrix3D)

Point3D.scale(f)

Scaling can be used to switch coordinate systems. For example, we can use scale todetermine what the left-hand coordinates would be, given a set of right-hand coordinates.FromFigure3.1weseethatnegatingtheZcoordinatewouldtogglebetweenright-handandleft-handsystems,sothescalematrixtransformtoaccomplishthisis:

3.4.3 Rotation

Rotation is a bit more complex, because rotating an item in 3D space requires specifying (a) an axis of rotation, and (b) a rotation amount in degrees or radians.

In the mid-1700s, the mathematician Leonhard Euler showed that a rotation around any desired axis could be specified instead as a combination of rotations around the X, Y, and Z axes [EU76]. These three rotation angles, around the respective axes, have come to be known as Euler angles. The discovery, known as Euler's Theorem, is very useful to us, because rotations around each of the three axes can be specified using matrix transforms.

The three rotation transforms, around the X, Y, and Z axes respectively, are shown in Figure 3.5. There are several functions in graphicslib3D for building and using rotation matrices as well:

Matrix3D.rotateX(degrees)

Matrix3D.rotateY(degrees)

Matrix3D.rotateZ(degrees)

Matrix3D.rotate(θx,θy,θz)

Point3D.mult(Matrix3D)


Figure 3.5 Rotation transform matrices.

In practice, using Euler angles to rotate an item around an arbitrary line in 3D space takes a couple of additional steps if the line doesn't pass through the origin. In general:

1. Translate the axis of rotation so that it goes through the origin.
2. Rotate by appropriate Euler angles around X, Y, and Z.
3. Undo the translation of Step 1.

The three rotation transforms shown in Figure 3.5 each have the interesting property that the inverse rotation happens to equal the transpose of the matrix. This can be verified by examining the above matrices, recalling that cos(-θ) = cos(θ), and sin(-θ) = -sin(θ). This property will become useful later.
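This can be confirmed numerically for, say, rotation about the Z axis (a standalone Java sketch of the standard Z-rotation matrix; the angle is in radians here):

```java
// Rotation about Z by angle theta. The transpose of R(theta) equals
// R(-theta), i.e., the inverse rotation.
class RotZ {
    public static float[][] rotationZ(double theta) {
        float[][] m = new float[4][4];
        m[0][0] = (float)  Math.cos(theta);
        m[0][1] = (float) -Math.sin(theta);
        m[1][0] = (float)  Math.sin(theta);
        m[1][1] = (float)  Math.cos(theta);
        m[2][2] = 1.0f;
        m[3][3] = 1.0f;
        return m;
    }
    public static float[][] transpose(float[][] m) {
        float[][] t = new float[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                t[j][i] = m[i][j];
        return t;
    }
}
```

Comparing transpose(rotationZ(θ)) with rotationZ(-θ) element by element shows they are the same matrix, which is what makes the transpose such a cheap way to undo a rotation.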

Euler angles can cause certain artifacts in some 3D graphic applications. For that reason it is often advisable to use quaternions for computing rotations. Many resources exist for those readers interested in exploring quaternions (e.g., [KU98]). Euler angles will suffice for most of our needs.

3.5 VECTORS

Vectors specify a magnitude and direction. They are not bound to a specific location; a vector can be "moved" without changing what it represents.

There are various ways to notate a vector, such as a line segment with an arrowhead at one end, or as a pair (magnitude, direction), or as the difference between two points. In 3D graphics, vectors are frequently represented as a single point in space, where the vector is the distance and direction from the origin to that point. In Figure 3.6, vector V (shown in red) can be specified either as the difference between points P1 and P2, or as an equivalent distance from the origin to P3. In all of our applications, we specify V as simply (x, y, z), the same notation used to specify the point P3.

Figure 3.6 Two representations for a vector V.

It is convenient to represent a vector the same way as a point, because we can use our matrix transforms on points or vectors interchangeably. However, it also can be confusing. For this reason we sometimes will notate a vector with a small arrow above it. Some graphics systems do not distinguish between a point and a vector at all, such as in GLSL, which provides data types vec3 and vec4 that can hold either points or vectors. On the other hand, graphicslib3D has separate Point3D and Vector3D classes, and it enforces appropriate use of one or the other depending on the operation being done. It is an open debate as to whether it is simpler to use one data type for both, or separate data types.

There are several vector operations that are used frequently in 3D graphics, forwhichthere are functions available in graphicslib3D and GLSL. For example, assuming vectorsA(u,v,w)andB(x,y,z):

AdditionandSubtraction:

A±B=(u±x,v±y,w±z)

graphicslib3D:Vector3D.add(Vector3D)

GLSL:vec3±vec3

Normalize(changetolength=1):

Â=A/|A|=A/sqrt(u2+v2+w2),where|A|≡lengthofvectorA

graphicslib3D:Vector3D.normalize()

GLSL:normalize(vec3)ornormalize(vec4)

DotProduct:

A•B=ux+vy+wz

graphicslib3D:Vector3D.dot(Vector3D)

GLSL:dot(vec3,vec3)ordot(vec4,vec4)

CrossProduct:

A×B=(vz-wy,wx-uz,uy-vx)


graphicslib3D: Vector3D.cross(Vector3D)

GLSL: cross(vec3, vec3)

Other useful vector functions are magnitude (which is available in both graphicslib3D and GLSL), and reflection and refraction (both of which are available in GLSL).

We shall now take a closer look at the dot product and cross product functions.

3.5.1 Uses for Dot Product

Throughout this book, our programs make heavy use of the dot product. The most important and fundamental use is for finding the angle between two vectors. Consider two vectors A and B, and say we wish to find the angle θ separating them. From the definition of the dot product:

cos(θ) = (A • B) / (|A| |B|)

Therefore, if A and B are normalized (i.e., of unit length; recall the "^" notation for normalization), then:

cos(θ) = Â • B̂

Interestingly, we will later see that often it is cos(θ) that we need, rather than θ itself. So, both of the above derivations will be directly useful.

The dot product also has a variety of other uses:

finding a vector V's magnitude: |V| = √(V • V)
two vectors are perpendicular if Â • B̂ = 0
two vectors are parallel if Â • B̂ = 1
parallel but pointing in opposite directions: Â • B̂ = -1
the angle between the vectors lies in the range [-90°…+90°] if Â • B̂ ≥ 0

minimum signed distance from point P = (x,y,z) to plane S = (a,b,c,d). First, find the unit vector normal to S: n̂ = (a,b,c) / √(a²+b²+c²), and the shortest distance from the origin to the plane: D = d / √(a²+b²+c²).

Then, the minimum signed distance from P to S is (P • n̂) + D, and the sign of this distance determines on which side of the plane S point P lies.
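These uses are easy to exercise in a small standalone Java sketch (our own helper, mirroring the formulas above):

```java
// Dot product, magnitude, and angle between two 3D vectors.
class DotOps {
    public static float dot(float[] a, float[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }
    public static float magnitude(float[] v) {
        return (float) Math.sqrt(dot(v, v));   // |V| = sqrt(V . V)
    }
    // angle between a and b, in degrees
    public static float angleDegrees(float[] a, float[] b) {
        double c = dot(a, b) / (magnitude(a) * magnitude(b));
        return (float) Math.toDegrees(Math.acos(c));
    }
}
```

For example, the axes (1,0,0) and (0,1,0) have dot product 0 and are therefore perpendicular, and the vector (3,4,0) has magnitude 5.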


3.5.2 Uses for Cross Product

An important property of the cross product of two vectors, which we will make heavy use of throughout this book, is that it produces a vector that is normal (perpendicular) to the plane defined by the original two vectors.

Any two non-collinear vectors define a plane. For example, consider two arbitrary vectors A and B. Since vectors can be moved without changing their meaning, they can be moved so that their origins coincide. Figure 3.7 shows a plane defined by A and B, and the normal vector N = A × B resulting from their cross product. The direction of the resulting normal obeys the right-hand rule, wherein curling the fingers of one's right hand from A to B causes the thumb to point in the direction of the normal vector N.

Figure 3.7 Cross product produces normal vector.

Note that the order is significant; B × A would produce a vector in the opposite direction from A × B.
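A standalone Java sketch of the formula given earlier shows both the perpendicularity and the order-dependence:

```java
// Cross product of two 3D vectors A(u,v,w) and B(x,y,z):
// A x B = (vz - wy, wx - uz, uy - vx), a vector perpendicular to both.
class CrossOps {
    public static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]
        };
    }
}
```

Crossing (1,0,0) with (0,1,0) gives (0,0,1), while reversing the operands gives (0,0,-1), illustrating the right-hand rule.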

The ability to find normal vectors by using the cross product will become extremelyuseful laterwhenwe study lighting. In order to determine lighting effects,wewill need toknow outward normals associated with the model we are rendering. Figure 3.8 shows anexampleofasimplemodelmadeupofsixpoints(vertices),andthecomputationemployingcrossproductthatdeterminestheoutwardnormalofoneofitsfaces.

Figure3.8Computingoutwardnormals.

3.6 LOCAL AND WORLD SPACE

The most common use for 3D graphics (with OpenGL or any other framework) is to simulate a three-dimensional world, place objects in it, and then view that simulated world on a monitor. The objects placed in the 3D world are usually modeled as collections of triangles. Later, in Chapter 6, we will dive into the modeling process. But we can start looking at the overall process now.

When building a 3D model of an object, we generally orient the model in the most convenient manner for describing it. For example, when modeling a sphere, we might orient the model with the sphere’s center at the origin (0,0,0), and give it a convenient radius, such as 1. The space in which a model is defined is called its local space, or model space. OpenGL documentation uses the term object space.

The sphere might then be used as a piece of a larger model, such as becoming the head on a robot. The robot would, of course, be defined in its own local/model space. Positioning the sphere model into the robot model space can be done using the matrix transforms for scale, rotation, and translation, as illustrated in Figure 3.9. In this manner, complex models can be built hierarchically (this is developed further in Section 4.8 of Chapter 4, using a stack of matrices).

Figure 3.9 Model spaces for a sphere and a robot.

In the same manner, modeled objects are placed in a simulated world by deciding on the orientation and dimensions of that world, called world space. The matrix that positions and orients an object into world space is called a model matrix, or M.

3.7 EYE SPACE AND THE SYNTHETIC CAMERA

So far, the transform matrices we have seen all operate in 3D space. Ultimately, however, we will want to display our 3D space—or a portion of it—on a 2D monitor. In order to do this, we need to decide on a vantage point. Just as we see our real world through our eyes from a particular point, in a particular direction, so too must we establish a position and orientation as the window into our virtual world. This vantage point is called “view” or “eye” space, or the “synthetic camera.”

Page 59: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

Figure 3.10 Positioning a camera in the 3D world.

As shown in Figure 3.10 and Figure 3.12, viewing involves: (a) placing the camera at some world location; (b) orienting the camera, which usually requires maintaining its own set of orthogonal axes U, V, and N; (c) defining a view volume; and (d) projecting objects within the volume onto a projection plane.

OpenGL includes a camera that is permanently fixed at the origin (0,0,0), and faces down the negative Z-axis, as shown in Figure 3.11.

Figure 3.11 OpenGL fixed camera.

In order to use the OpenGL camera, one of the things we need to do is simulate moving it to some desired location and orientation. This is done by figuring out where our objects in the world are located relative to the desired camera position (i.e., where they are located in “camera space,” as defined by the U, V, and N axes of the camera, as illustrated in Figure 3.12). Given a point at world space location PW, we need a transform to convert it to the equivalent point in camera space, making it appear as though we are viewing it from the desired camera location CW. We do this by computing its camera space position PC. Knowing that the OpenGL camera location is always at the fixed position (0,0,0), what transform would achieve this?


Figure 3.12 Camera orientation.

The necessary transforms are:

1. Translate PW by the negative of the desired camera location.
2. Rotate PW by the negative of the desired camera orientation Euler angles.

We can build a single transform that does both the rotation and the translation in one matrix, called the viewing transform matrix, or V. The matrix V is produced by concatenating the two matrices T (a translation matrix containing the negative of the desired camera location) and R (a rotation matrix containing the negative of the desired camera orientation). In this case, working from right to left, we first translate world point P, then rotate it:

PC = R * (T * PW)

As we saw earlier, the associative rule allows us to group the operations instead thusly: PC = (R * T) * PW

If we save the concatenation R * T in the matrix V, the operation now looks like: PC = V * PW

The complete computation, and the exact contents of matrices T and R, is shown in Figure 3.13 (we omit the derivation of matrix R; a derivation is available in [FV95]).

Figure 3.13 Deriving a View matrix.
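The equivalence of the two groupings can be checked numerically. The following is a minimal sketch (hypothetical helper code, not the book's graphicslib3D) that builds T from the negative of a camera location and R from the negative of a camera heading, and confirms that (R*T)*PW and R*(T*PW) yield the same camera-space point:

```java
// Verifying PC = (R*T)*PW equals PC = R*(T*PW) with row-major 4x4 matrices.
public class ViewMatrixDemo {
    // row-major 4x4 matrix multiply
    static float[][] multiply(float[][] a, float[][] b) {
        float[][] c = new float[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    // applies a 4x4 matrix to a homogeneous point (x, y, z, 1)
    static float[] transform(float[][] m, float[] p) {
        float[] out = new float[4];
        for (int i = 0; i < 4; i++)
            for (int k = 0; k < 4; k++)
                out[i] += m[i][k] * p[k];
        return out;
    }

    // translation by (x, y, z)
    static float[][] translate(float x, float y, float z) {
        return new float[][] { {1,0,0,x}, {0,1,0,y}, {0,0,1,z}, {0,0,0,1} };
    }

    // rotation about the Y axis by rad radians
    static float[][] rotateY(double rad) {
        float c = (float) Math.cos(rad), s = (float) Math.sin(rad);
        return new float[][] { {c,0,s,0}, {0,1,0,0}, {-s,0,c,0}, {0,0,0,1} };
    }

    public static void main(String[] args) {
        float[] pw = { 1, 2, 3, 1 };                    // a world-space point
        float[][] t = translate(0, 0, -8);              // negative of camera location (0,0,8)
        float[][] r = rotateY(Math.toRadians(-30));     // negative of a 30-degree camera heading
        float[][] v = multiply(r, t);                   // V = R * T

        float[] pc1 = transform(v, pw);                 // PC = (R*T) * PW
        float[] pc2 = transform(r, transform(t, pw));   // PC = R * (T*PW)
        // both groupings give the same camera-space point
        System.out.println(Math.abs(pc1[0] - pc2[0]) < 1e-5);  // true
    }
}
```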

More commonly, the V matrix is concatenated with the model matrix M to form a single model-view (MV) matrix:

MV = V * M

Then, a point PM in its own model space is transformed directly to camera space in one step as follows:

PC = MV * PM

The advantage of this approach becomes clear when one considers that, in a complex scene, we will need to perform this transformation not on just one point, but on every vertex in the scene. By pre-computing MV, transforming each point into view space will require us to do just one matrix multiplication per vertex, rather than two. Later, we will see that we can extend this process to pre-computing several matrix concatenations, reducing the per-vertex computations considerably.

3.8 PROJECTION MATRICES

Now that we have established the camera, we can examine projection matrices. Two important projection matrices that we will now examine are (a) perspective and (b) orthographic.

3.8.1 The Perspective Projection Matrix

Perspective projection attempts to make a 2D picture appear 3D, by utilizing the concept of perspective to mimic what we see when we look at the real world. Objects that are close appear larger than objects that are far away, and in some cases, lines that are parallel in 3D space are no longer parallel when drawn with perspective.

Perspective was one of the great discoveries of the Renaissance era in the 1400–1500s, when artists started painting with more realism than did their predecessors.


Figure 3.14 Annunciation, with Saint Emidius (Crivelli – 1486).

An excellent example can be seen in Figure 3.14, the “Annunciation with St. Emidius,” by Carlo Crivelli, painted in 1486 (currently held at the National Gallery in London [CR86]). The intense use of perspective is clear—the receding lines of the left-facing wall of the building on the right are slanted towards each other dramatically. This creates the illusion of depth and 3D space, and in the process lines that are parallel in reality are not parallel in the picture. Also, the people in the foreground are larger than the people in the background. While today we take these devices for granted, finding a transformation matrix to accomplish this requires some mathematical analysis.

We achieve this effect by using a matrix transform that converts parallel lines into appropriate non-parallel lines. Such a matrix is called a perspective matrix or perspective transform, and is built by defining the four parameters of a view volume. Those parameters are (a) aspect ratio, (b) field of view, (c) projection plane or near clipping plane, and (d) far clipping plane.

Only objects between the near and far clipping planes are rendered. The near clipping plane also serves as the plane on which objects are projected, and is generally positioned close to the eye or camera (shown on the left in Figure 3.15). Selection of an appropriate value for the far clipping plane is discussed in Chapter 4. The field of view is the vertical angle of viewable space. The aspect ratio is the ratio width/height of the near and far clipping planes. The shape formed by these elements and shown in Figure 3.15 is called a frustum.


Figure 3.15 Perspective view volume or frustum.

The perspective matrix is used to transform points in 3D space to their appropriate position on the near clipping plane, and is built by first computing values q, A, B, and C, and then using those values to construct the matrix, as shown in Figure 3.16 (and derived in [FV95]).

Figure 3.16 Building a Perspective matrix.

In Chapter 4 we will implement a function that generates the perspective transform matrix.

3.8.2 The Orthographic Projection Matrix


In orthographic projection, parallel lines remain parallel; that is, perspective isn’t employed. Instead, objects that are within the view volume are projected directly, without any adjustment of their sizes due to their distances from the camera.

Figure 3.17 Orthographic projection.

An orthographic projection is a parallel projection in which all projections are at right angles with the projection plane. An orthographic matrix is built by defining the following parameters: (a) the distance Znear from the camera to the projection plane, (b) the distance Zfar from the camera to the far clipping plane, and (c) values for L, R, T, and B, with L and R corresponding to the X coordinates of the left and right boundaries of the projection plane, respectively, and T and B corresponding to the Y coordinates of the top and bottom boundaries of the projection plane, respectively. The orthographic projection matrix, as derived in [FV95], is then:

Figure 3.18 Orthographic projection matrix.

Not all parallel projections are orthographic, but the other types are outside the scope of this textbook.

Parallel projections don’t match what the eye sees when looking at the real world. But they are useful in a variety of situations, such as in casting shadows, performing 3D clipping, and in CAD (computer-aided design)—the latter because they preserve measurement regardless of the placement of the objects. Regardless, the great majority of examples in this book use perspective projection.
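Since Figure 3.18 isn't reproduced in this extraction, the following is a minimal sketch (a hypothetical helper, not graphicslib3D) that assembles the parameters above into the common OpenGL form of the orthographic matrix:

```java
// Building a row-major orthographic projection matrix from L, R, B, T,
// and the near/far distances, using the standard OpenGL form.
public class OrthoMatrix {
    static float[][] ortho(float l, float r, float b, float t, float n, float f) {
        return new float[][] {
            { 2/(r-l), 0,       0,        -(r+l)/(r-l) },
            { 0,       2/(t-b), 0,        -(t+b)/(t-b) },
            { 0,       0,      -2/(f-n),  -(f+n)/(f-n) },
            { 0,       0,       0,         1           }
        };
    }

    public static void main(String[] args) {
        // a symmetric volume maps x in [-2,2] onto [-1,1]: the X scale term is 0.5
        float[][] m = ortho(-2, 2, -2, 2, 0.1f, 100f);
        System.out.println(m[0][0]);  // 0.5
    }
}
```

Note how the matrix only scales and translates; there is no row that divides by depth, which is exactly why object size is unaffected by distance from the camera.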


3.9 LOOK-AT MATRIX

The final transformation we will examine is the look-at matrix. This is handy when you wish to place the camera at one location, and look towards a particular other location, as illustrated in Figure 3.19. Of course, it would be possible to achieve this using the methods we have already seen, but it is such a common operation that building one matrix transform to do it is often useful.

Figure 3.19 Elements of Look-At.

A look-at transform still requires deciding on a camera orientation. We do this by specifying a vector approximating the general orientation desired (such as the world Y axis). Typically, a sequence of cross products can be used to then generate a suitable set of forward, side, and up vectors for the desired camera orientation. Figure 3.20 shows the computations, starting with the camera location (eye), target location, and initial up vector, to build the look-at matrix, as derived in [FV95].


Figure 3.20 Look-At matrix.

We can encode this as a simple Java/JOGL utility function that builds a look-at matrix, given specified values for camera location, target location, and the initial “up” vector. This function, shown in Figure 3.21, will be useful later in this textbook, particularly in Chapter 8 when we generate shadows.
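Since the listing in Figure 3.21 isn't reproduced in this extraction, the following is a minimal sketch of such a function using plain float arrays instead of the book's graphicslib3D types; the row layout follows the standard [FV95]-style derivation (forward, side, and recomputed up vectors from a sequence of cross products):

```java
// Building a row-major 4x4 look-at (view) matrix from eye, target, and up.
public class LookAt {
    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }
    static float[] cross(float[] a, float[] b) {
        return new float[] { a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0] };
    }
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

    static float[][] lookAt(float[] eye, float[] target, float[] up) {
        // forward vector from eye towards target
        float[] f = normalize(new float[] { target[0]-eye[0], target[1]-eye[1], target[2]-eye[2] });
        float[] s = normalize(cross(f, up));   // side (right) vector
        float[] u = cross(s, f);               // recomputed orthogonal up vector
        return new float[][] {
            {  s[0],  s[1],  s[2], -dot(s, eye) },
            {  u[0],  u[1],  u[2], -dot(u, eye) },
            { -f[0], -f[1], -f[2],  dot(f, eye) },
            {  0,     0,     0,     1           }
        };
    }

    public static void main(String[] args) {
        // camera at (0,0,8) looking at the origin: the view matrix should
        // translate points by -8 along Z (the world origin maps to z = -8)
        float[][] v = lookAt(new float[]{0,0,8}, new float[]{0,0,0}, new float[]{0,1,0});
        System.out.println(v[2][3]);  // -8.0
    }
}
```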

3.10 GLSL FUNCTIONS FOR BUILDING MATRIX TRANSFORMS

Although graphicslib3D includes predefined functions for performing many of the 3D transformations covered in this chapter, such as translation, rotation, and scale, GLSL only includes basic matrix operations such as addition, concatenation, and so on. It is therefore sometimes necessary to write our own GLSL utility functions for building 3D transformation matrices when we need them to perform certain 3D computations in a shader. The appropriate data type to hold such a matrix in GLSL is mat4.

The syntax for initializing mat4 matrices in GLSL loads values by columns. The first four values are put into the first column, the next four into the next column, and so forth, as illustrated in the following example:


Figure 3.21 Java/JOGL function for generating a Look-At matrix.

mat4 translationMatrix =
     mat4(1.0, 0.0, 0.0, 0.0,   // note this is the leftmost column, not the top row
          0.0, 1.0, 0.0, 0.0,
          0.0, 0.0, 1.0, 0.0,
          tx,  ty,  tz,  1.0);

which builds the translation matrix described previously in Figure 3.3.
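The column-by-column ordering can be mimicked on the Java side. Here is a minimal sketch (a hypothetical helper, not part of the book's code) that flattens a translation matrix into the 16-float, column-major order that mat4() and glUniformMatrix4fv() expect:

```java
// Flattening a 4x4 translation matrix into column-major order.
public class ColumnMajor {
    // returns the matrix as 16 floats, one column at a time:
    // the first four floats are the leftmost column, not the top row
    static float[] translationColumnMajor(float tx, float ty, float tz) {
        return new float[] {
            1, 0, 0, 0,     // column 0
            0, 1, 0, 0,     // column 1
            0, 0, 1, 0,     // column 2
            tx, ty, tz, 1   // column 3 holds the translation components
        };
    }

    public static void main(String[] args) {
        float[] m = translationColumnMajor(2, 3, 4);
        // element (row 0, column 3) lives at flat index 12 in column-major order
        System.out.println(m[12]);  // 2.0
    }
}
```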

Program 3.1 includes five GLSL functions for building 4x4 translation, rotation, and scale matrices, each corresponding to formulas given earlier in this chapter. We will use some of these functions later in the book.

Program 3.1 Building Transformation Matrices in GLSL

// builds and returns a translation matrix
mat4 buildTranslate(float x, float y, float z)
{ mat4 trans = mat4(1.0, 0.0, 0.0, 0.0,
                    0.0, 1.0, 0.0, 0.0,
                    0.0, 0.0, 1.0, 0.0,
                    x, y, z, 1.0);
  return trans;
}

// builds and returns a matrix that performs a rotation around the X axis
mat4 buildRotateX(float rad)
{ mat4 xrot = mat4(1.0, 0.0, 0.0, 0.0,
                   0.0, cos(rad), -sin(rad), 0.0,
                   0.0, sin(rad), cos(rad), 0.0,
                   0.0, 0.0, 0.0, 1.0);
  return xrot;
}

// builds and returns a matrix that performs a rotation around the Y axis
mat4 buildRotateY(float rad)
{ mat4 yrot = mat4(cos(rad), 0.0, sin(rad), 0.0,
                   0.0, 1.0, 0.0, 0.0,
                   -sin(rad), 0.0, cos(rad), 0.0,
                   0.0, 0.0, 0.0, 1.0);
  return yrot;
}

// builds and returns a matrix that performs a rotation around the Z axis
mat4 buildRotateZ(float rad)
{ mat4 zrot = mat4(cos(rad), -sin(rad), 0.0, 0.0,
                   sin(rad), cos(rad), 0.0, 0.0,
                   0.0, 0.0, 1.0, 0.0,
                   0.0, 0.0, 0.0, 1.0);
  return zrot;
}

// builds and returns a scale matrix
mat4 buildScale(float x, float y, float z)
{ mat4 scale = mat4(x, 0.0, 0.0, 0.0,
                    0.0, y, 0.0, 0.0,
                    0.0, 0.0, z, 0.0,
                    0.0, 0.0, 0.0, 1.0);
  return scale;
}

SUPPLEMENTAL NOTES

In this chapter we have seen examples of applying matrix transformations to points. Later, we will also want to apply these same transforms to vectors. In order to accomplish a transform on a vector V equivalent to applying some matrix transform M to a point, it is necessary in the general case to compute the inverse transpose of M, denoted (M^-1)^T, and multiply V by that matrix. In some cases, M = (M^-1)^T, and in those cases it is possible to simply use M. For example, the basic rotation matrices we have seen in this chapter are equal to their own inverse transpose, and can be applied directly to vectors as well as points. Thus, the examples in this book sometimes use (M^-1)^T when applying a transform to a vector, and sometimes simply use M.
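A quick numeric illustration (not from the book) of why the inverse transpose matters: under a non-uniform scale, a surface tangent and its normal stop being perpendicular if both are transformed by M, but remain perpendicular if the normal is transformed by (M^-1)^T instead. For a diagonal scale matrix the inverse transpose is simply the diagonal of reciprocals, so the check is easy:

```java
// Transforming a normal by M versus by the inverse transpose of M.
public class NormalTransform {
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

    // applies a diagonal (scale) matrix diag(sx, sy, sz) to a vector
    static float[] scale(float sx, float sy, float sz, float[] v) {
        return new float[] { sx*v[0], sy*v[1], sz*v[2] };
    }

    public static void main(String[] args) {
        float[] tangent = { 1, -1, 0 };   // lies in the surface
        float[] normal  = { 1,  1, 0 };   // perpendicular to the tangent

        // non-uniform scale M = diag(2,1,1); its inverse transpose is diag(0.5,1,1)
        float[] t2     = scale(2, 1, 1, tangent);
        float[] nWrong = scale(2, 1, 1, normal);      // normal transformed by M
        float[] nRight = scale(0.5f, 1, 1, normal);   // normal transformed by (M^-1)^T

        System.out.println(dot(t2, nWrong));  // 3.0 -- no longer perpendicular
        System.out.println(dot(t2, nRight));  // 0.0 -- still perpendicular
    }
}
```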

One of the things we haven’t discussed in this chapter is techniques for moving the camera smoothly through space. This is very useful, especially for games and CGI movies, but also for visualization, virtual reality, and for 3D modeling.

We didn’t include complete derivations for all of the matrix transforms that were presented (they can be found in other sources, such as [FV95]). We strove instead for a concise summary of the point, vector, and matrix operations necessary for doing basic 3D graphics programming. As this book proceeds, we will encounter many practical uses for the methods presented.

Exercises

3.1 Modify Program 2.5 so that the vertex shader includes one of the buildRotate() functions from Program 3.1, and applies it to the points comprising the triangle. This should cause the triangle to be rotated from its original orientation. You don’t need to animate the rotation.

3.2 (RESEARCH) At the end of Section 3.4 we indicated that Euler angles can in some cases lead to undesirable artifacts. The most common is called “gimbal lock.” Describe gimbal lock, give an example, and explain why gimbal lock can cause problems.

3.3 (RESEARCH) One way of avoiding the artifacts that can manifest when using Euler angles is to use quaternions. We didn’t study quaternions; however, graphicslib3D includes a Quaternion class. Do some independent study on quaternions, and familiarize yourself with the related graphicslib3D Quaternion class.


References

[CR86] C. Crivelli, The Annunciation, with Saint Emidius (1486), in the National Gallery, London, England, accessed July 2016, https://www.nationalgallery.org.uk/paintings/carlo-crivelli-the-annunciation-with-saint-emidius.

[EU76] L. Euler, Formulae generales pro translatione quacunque corporum rigidorum (General formulas for the translation of arbitrary rigid bodies), Novi Commentarii academiae scientiarum Petropolitanae 20, 1776.

[FV95] J. Foley, A. van Dam, S. Feiner, and J. Hughes, Computer Graphics: Principles and Practice, 2nd ed. (Addison-Wesley, 1995).

[KU98] J. B. Kuipers, Quaternions and Rotation Sequences (Princeton University Press, 1998).


CHAPTER 4

MANAGING 3D GRAPHICS DATA

4.1 Buffers & Vertex Attributes
4.2 Uniform Variables
4.3 Interpolation of Vertex Attributes
4.4 Model-View and Perspective Matrices
4.5 Our First 3D Program – a 3D Cube
4.6 Rendering Multiple Copies of an Object
4.7 Rendering Multiple Different Models in a Scene
4.8 Matrix Stacks
4.9 Combating “Z-Fighting” Artifacts
4.10 Other Options for Primitives
4.11 Back-Face Culling

Supplemental Notes

Using OpenGL to render 3D images generally involves sending several datasets through the OpenGL shader pipeline. For example, to draw a simple 3D object such as a cube, you will need to at least send the following items:

• the vertices for the cube model
• some transformation matrices to control the appearance of the cube’s orientation in 3D space

To complicate matters a bit, there are two ways of sending data through the OpenGL pipeline:

• through a buffer to a vertex attribute, or
• directly to a uniform variable

It is important to understand exactly how these two mechanisms work, so as to use the appropriate method for each item we are sending through.

Let’s start by rendering a simple cube.

4.1 BUFFERS & VERTEX ATTRIBUTES

For an object to be drawn, its vertices must be sent to the vertex shader. Vertices are usually sent by putting them in a buffer on the Java side, and associating that buffer with a vertex attribute declared in the shader. There are several steps to accomplish this, some of which only need to be done once, and some of which—if the scene is animated—must be done at every frame:

Done once – typically in init():

1. create a buffer
2. copy the vertices into the buffer

Done at each frame – typically in display():

1. enable the buffer containing the vertices
2. associate the buffer with a vertex attribute
3. enable the vertex attribute
4. use glDrawArrays(…) to draw the object

Buffers are typically created all at once at the start of the program, either in init() or in a function called by init(). In OpenGL, a buffer is contained in a Vertex Buffer Object, or VBO, which is declared and instantiated in the Java/JOGL application. A scene may require many VBOs, so it is customary to generate and then fill several of them in init(), so that they are available whenever your program needs to draw one or more of them.

A buffer interacts with a vertex attribute in a specific way. When glDrawArrays() is executed, the data in the buffer starts flowing, sequentially from the beginning of the buffer, through the vertex shader. As described in Chapter 2, the vertex shader executes once per vertex. A vertex in 3D space requires three values, so an appropriate vertex attribute in the shader to receive these three values would be of type vec3. Then, for each three values in the buffer, the shader is invoked, as illustrated in Figure 4.1:

A related structure in OpenGL is called a Vertex Array Object, or VAO. VAOs were introduced in version 3.0 of OpenGL, and are provided as a way of organizing buffers and making them easier to manipulate in complex scenes. OpenGL requires at least one VAO be created, and for our purposes one will be sufficient.

For example, suppose that we wish to display two objects. On the Java/JOGL side, we could do this by declaring a single VAO and an associated set of two VBOs (one per object), as follows:


Figure 4.1 Data transmission between a VBO and a vertex attribute.

private int vao[] = new int[1];   // OpenGL requires these values be specified in arrays
private int vbo[] = new int[2];
...
gl.glGenVertexArrays(1, vao, 0);
gl.glBindVertexArray(vao[0]);
gl.glGenBuffers(2, vbo, 0);

The two OpenGL commands glGenVertexArrays() and glGenBuffers() create VAOs and VBOs, respectively, and return integer IDs for them. We store those IDs into the int arrays vao and vbo. The three parameters on each of them refer to how many are created, an array to hold the returned IDs, and an offset into that array (usually set to 0). The purpose of glBindVertexArray() is to make the specified VAO “active” so that the generated buffers will be associated with that VAO.

A buffer needs to have a corresponding vertex attribute variable declared in the vertex shader. Vertex attributes are generally the first variables declared in a shader. In our cube example, a vertex attribute to receive the cube vertices could be declared in the vertex shader as follows:

layout (location = 0) in vec3 position;

The keyword in means “input” and indicates that this vertex attribute will be receiving values from a buffer (as we will see later, vertex attributes can also be used for “output”). As seen before, the “vec3” means that each invocation of the shader will grab three float values (presumably x, y, z, comprising a single vertex). The variable name is “position”. The “layout (location = 0)” portion of the command is called a “layout qualifier” and is how we will associate the vertex attribute with a particular buffer. Thus this vertex attribute has an identifier 0 that we will use later for this purpose.

The manner in which we load the vertices of a model into a buffer (VBO) depends on where the model’s vertex values are stored. In Chapter 6 we will see how models are commonly built in a modeling tool (such as Blender [B16] or Maya [M16]), exported to a standard file format (such as .obj—also described in Chapter 6), and imported into the Java/JOGL program. We will also see how a model’s vertices can be calculated on the fly, or generated inside the pipeline, such as in the tessellation shader.

For now, let’s say that we wish to draw a cube, and let’s presume that the vertices of our cube are hardcoded in an array in the Java/JOGL application. In that case, we need to copy those values into one of our two buffers that we generated above. To do that, we need to (a) make that buffer (say, the 0th buffer) “active” with the OpenGL glBindBuffer() command, (b) copy the vertices into a Java FloatBuffer, and (c) use the glBufferData() command to copy the FloatBuffer into the active buffer (the 0th VBO in this case). Presuming that the vertices are stored in a float array named vPositions, the following JOGL code would copy those values into the 0th VBO:

gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
FloatBuffer vBuf = Buffers.newDirectFloatBuffer(vPositions);
gl.glBufferData(GL_ARRAY_BUFFER, vBuf.limit()*4, vBuf, GL_STATIC_DRAW);

Java has two types of buffers: direct and non-direct. For performance reasons, direct buffers should be used in JOGL applications. JOGL provides tools in the class com.jogamp.common.nio.Buffers that facilitate the use of direct buffers. In the example above, the JOGL function newDirectFloatBuffer() copies values from an array to a FloatBuffer; in this case the vertices of the cube to be drawn.
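For readers curious what the Buffers helper does under the hood, an equivalent direct buffer can be built with the standard java.nio classes alone; a minimal sketch (hypothetical vertex data, not the book's cube):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class DirectBufferDemo {
    public static void main(String[] args) {
        float[] vPositions = { -1f, -1f, 0f,  1f, -1f, 0f,  0f, 1f, 0f };

        // allocate a direct buffer (4 bytes per float) in native byte order,
        // which is what the graphics driver expects
        FloatBuffer vBuf = ByteBuffer.allocateDirect(vPositions.length * 4)
                                     .order(ByteOrder.nativeOrder())
                                     .asFloatBuffer();
        vBuf.put(vPositions).flip();   // copy the vertices and rewind for reading

        System.out.println(vBuf.isDirect());  // true
        System.out.println(vBuf.limit());     // 9
    }
}
```

Note that vBuf.limit()*4 in the glBufferData() call above is exactly this byte count: the number of floats times 4 bytes each.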

Next, we add code to display() that will cause the values in the buffer to be sent to the vertex attribute in the shader. We do this with the following three steps: (a) make that buffer “active” with the glBindBuffer() command as we did above, (b) associate the active buffer with a vertex attribute in the shader, and (c) enable the vertex attribute. The following three lines of code will accomplish these steps:

gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);                // make the 0th buffer "active"
gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);   // associate 0th vertex attribute with active buffer
gl.glEnableVertexAttribArray(0);                         // enable the 0th vertex attribute

Now when we execute glDrawArrays(), data in the 0th VBO will be transmitted to the vertex attribute that has a layout qualifier with location 0. This sends the cube vertices to the shader.

4.2 UNIFORM VARIABLES

Rendering a scene so that it appears 3D requires building appropriate transformation matrices, such as those described in Chapter 3, and applying them to each of the models’ vertices. It is most efficient to apply the required matrix operations in the vertex shader, and it is customary to send these matrices from the Java/JOGL application to the shader in a uniform variable.

Uniform variables are declared in a shader by using the “uniform” keyword. The following example, declaring variables to hold model-view and projection matrices, will be suitable for our cube program:

uniform mat4 mv_matrix;
uniform mat4 proj_matrix;

The keyword “mat4” indicates that these are 4×4 matrices. Here we have named the variables mv_matrix to hold the model-view matrix, and proj_matrix to hold the projection matrix. Since 3D transformations are 4×4, mat4 is a commonly used data type in GLSL shader uniforms.

Sending data from a Java/JOGL application to a uniform variable requires the following steps: (a) acquire a pointer to the uniform variable, and (b) associate a Java float array containing the desired values with the acquired uniform pointer. Assuming that the linked rendering program is saved in a variable called “program”, the following lines of code would specify that we will be sending model-view and projection matrices to the two uniforms mv_matrix and proj_matrix in our cube example:

mLoc = gl.glGetUniformLocation(program, "mv_matrix");     // get the locations of the uniforms
pLoc = gl.glGetUniformLocation(program, "proj_matrix");   // in the shader program
gl.glUniformMatrix4fv(mLoc, 1, false, mvMat.getFloatValues(), 0);   // send matrix data to the
gl.glUniformMatrix4fv(pLoc, 1, false, pMat.getFloatValues(), 0);    // uniform variables

The above example assumes that we have utilized the graphicslib3D utilities to build model-view and projection matrix transforms mvMat and pMat, as will be discussed in greater detail shortly. They are of type Matrix3D (a graphicslib3D class), and the associated function call to getFloatValues() returns those matrix values in a float array, as needed by glUniformMatrix4fv().

4.3 INTERPOLATION OF VERTEX ATTRIBUTES

It is important to understand how vertex attributes are processed in the OpenGL pipeline, versus how uniform variables are processed. Recall that immediately before the fragment shader is rasterization, where primitives (e.g., triangles) defined by vertices are converted to fragments. Rasterization linearly interpolates vertex attribute values so that the displayed pixels seamlessly connect the modeled surfaces.


By contrast, uniform variables behave like initialized constants, and remain unchanged across each vertex shader invocation (i.e., for each vertex sent from the buffer). A uniform variable is not interpolated; it always contains the same value regardless of the number of vertices.

The interpolation done on vertex attributes by the rasterizer is useful in many ways. Later, we will use rasterization to interpolate colors, texture coordinates, and surface normals. It is important to understand that all values sent through a buffer to a vertex attribute will be interpolated further down the pipeline.

We have seen vertex attributes in a vertex shader declared as “in”, to indicate that they receive values from a buffer. Vertex attributes may instead be declared as “out”, meaning that they send their values forward towards the next stage in the pipeline. For example, the following declaration in a vertex shader specifies a vertex attribute named “color” that outputs a vec4:

out vec4 color;

It is not necessary to declare an “out” variable for the vertex positions, because OpenGL has a built-in out vec4 variable named gl_Position for that purpose. In the vertex shader, we apply the matrix transformations to the incoming vertex (declared earlier as position), assigning the result to gl_Position:

gl_Position = proj_matrix * mv_matrix * position;

The transformed vertices will then be automatically output to the rasterizer, with corresponding pixel locations ultimately sent to the fragment shader.

The rasterization process is illustrated in Figure 4.2. When specifying GL_TRIANGLES in the glDrawArrays() function, rasterization is done per triangle. Interpolation starts along the lines connecting the vertices, at a level of precision corresponding to the pixel display density. The pixels in the interior space of the triangle are then filled by interpolating along the horizontal lines connecting the edge pixels.

Figure 4.2 Rasterization of vertices.
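The per-channel interpolation the rasterizer performs along an edge can be mimicked in plain Java. A simplified sketch (hypothetical helper, ignoring perspective correction) that linearly interpolates a color attribute between two vertices:

```java
// Linear interpolation of a vertex attribute along a triangle edge.
public class EdgeInterpolation {
    // interpolate between attribute values a and b, where t ranges from
    // 0.0 (at the first vertex) to 1.0 (at the second vertex)
    static float[] lerp(float[] a, float[] b, float t) {
        float[] out = new float[a.length];
        for (int i = 0; i < a.length; i++) out[i] = a[i] + t * (b[i] - a[i]);
        return out;
    }

    public static void main(String[] args) {
        float[] red  = { 1f, 0f, 0f };
        float[] blue = { 0f, 0f, 1f };
        // halfway along the edge, the fragment receives 50% red, 50% blue
        float[] mid = lerp(red, blue, 0.5f);
        System.out.println(mid[0] + " " + mid[1] + " " + mid[2]);  // 0.5 0.0 0.5
    }
}
```

The same per-channel formula applies to texture coordinates and surface normals sent through vertex attributes; a uniform, by contrast, would reach every fragment unchanged.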


4.4 MODEL-VIEW AND PERSPECTIVE MATRICES

A fundamental step in rendering an object in 3D is to create appropriate transformation matrices and send them to uniform variables like we did in Section 4.2. We start by defining three matrices:

1. a Model matrix
2. a View matrix
3. a Perspective matrix

The Model matrix positions and orients the object in the world coordinate space. Each model has its own model matrix, and that matrix would need to be continuously rebuilt if the model moves.

The View matrix moves and rotates the models in the world to simulate the effect of a camera at a desired location. Recall from Chapter 2 that the OpenGL camera exists at location (0,0,0) and faces down the negative Z axis. To simulate the appearance of that camera being moved a certain way, we will need to move the objects themselves in the opposite way. For example, moving a camera to the right would cause the objects in the scene to appear to move to the left; although the OpenGL camera is fixed, we can make it appear as though we have moved it to the right by moving the objects to the left.

The Perspective matrix is a transform that provides the 3D effect according to the desired frustum, as described earlier in Chapter 3.

It is also important to understand when to compute each type of matrix. Matrices that never change can be built in init(), but those that change would need to be built in display() so that they are rebuilt for each frame. Let’s assume that the models are animated, and the camera is movable. Then:

• A model matrix needs to be created for each model, and at each frame.
• The view matrix needs to be created once per frame (because the camera can be moved), but is the same for all objects rendered during that frame.
• The perspective matrix is created once (in init()) using the screen window’s width and height (and desired frustum parameters), and usually remains unchanged unless the window is resized.

Generating model and view transformation matrices then happens in the display() function, as follows:

1. Build the view matrix based on the desired camera location and orientation.
2. For each model, do the following:
   i.   Build a model matrix based on the model’s location and orientation.
   ii.  Concatenate the model and view matrices into a single “MV” matrix.
   iii. Send the MV and projection matrices to the corresponding shader uniforms.

Technically, it isn’t necessary to combine the model and view matrices into a single matrix. That is, they could be sent to the vertex shader in individual, separate matrices. However, there are certain advantages to combining them, while keeping the perspective matrix separate. For example, in the vertex shader, each vertex in the model is multiplied by the matrices. Since complex models may have hundreds or even thousands of vertices, performance can be improved by pre-multiplying the model and view matrices once before sending them to the vertex shader. Later, we will see the need to keep the perspective matrix separate for lighting purposes.

4.5 OUR FIRST 3D PROGRAM – A 3D CUBE

It’s time to put all the pieces together! In order to build a complete Java/JOGL/GLSL system to render our cube in a 3D “world,” all of the mechanisms described so far will need to be put together and perfectly coordinated. We can reuse some of the code that we have seen previously in Chapter 2. Specifically, we won’t repeat the following functions for reading in files containing shader code, compiling and linking them, and detecting GLSL errors (but you still should use them):

createShaderProgram()

readShaderSource()

checkOpenGLError()

printProgramLog()

printShaderLog()

First, let’s create a utility function that builds a perspective matrix, given a specified field-of-view angle for the Y axis, the screen aspect ratio, and the desired near and far clipping planes (selecting appropriate values for near and far clipping planes is discussed in Section 4.9). This function, based on the formulas given in Figure 3.16, will be useful in our final cube program:

private Matrix3D perspective(float fovy, float aspect, float n, float f)
{   float q = 1.0f / ((float) Math.tan(Math.toRadians(0.5f * fovy)));
    float A = q / aspect;
    float B = (n + f) / (n - f);
    float C = (2.0f * n * f) / (n - f);
    Matrix3D r = new Matrix3D();
    r.setElementAt(0, 0, A);
    r.setElementAt(1, 1, q);
    r.setElementAt(2, 2, B);
    r.setElementAt(3, 2, -1.0f);
    r.setElementAt(2, 3, C);
    r.setElementAt(3, 3, 0.0f);
    return r;
}
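As a sanity check on these formulas, a point on the near clipping plane (eye-space z = -n) should map to NDC depth -1, and a point on the far plane (z = -f) to +1, after the perspective divide. The sketch below is independent of graphicslib3D; ndcDepth() is a made-up helper.

```java
public class PerspectiveCheck {
    // NDC depth produced by the perspective matrix above for an eye-space z value
    static float ndcDepth(float zEye, float n, float f) {
        float B = (n + f) / (n - f);          // element (2,2) of the matrix
        float C = (2.0f * n * f) / (n - f);   // element (2,3)
        float clipZ = B * zEye + C;           // third row applied to (x, y, zEye, 1)
        float clipW = -zEye;                  // fourth row is (0, 0, -1, 0)
        return clipZ / clipW;                 // the perspective divide
    }

    public static void main(String[] args) {
        float n = 0.1f, f = 1000.0f;
        // near-plane points should land near NDC depth -1, far-plane points near +1
        System.out.println(ndcDepth(-n, n, f));
        System.out.println(ndcDepth(-f, n, f));
    }
}
```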


We can now build the complete 3D cube program, shown in Program 4.1 that follows:

Program 4.1 Plain Red Cube

Java/JOGL Application

import graphicslib3D.*;
import graphicslib3D.GLSLUtils.*;
import java.nio.*;
import javax.swing.*;
import static com.jogamp.opengl.GL4.*;
import com.jogamp.opengl.*;
import com.jogamp.opengl.awt.GLCanvas;
import com.jogamp.common.nio.Buffers;
import com.jogamp.opengl.GLContext;

public class Code extends JFrame implements GLEventListener
{   private GLCanvas myCanvas;
    private int rendering_program;
    private int vao[] = new int[1];
    private int vbo[] = new int[2];
    private float cameraX, cameraY, cameraZ;
    private float cubeLocX, cubeLocY, cubeLocZ;
    private GLSLUtils util = new GLSLUtils();
    private Matrix3D pMat;

    public Code()
    {   setTitle("Chapter 4 - program 1");
        setSize(600, 600);
        myCanvas = new GLCanvas();
        myCanvas.addGLEventListener(this);
        this.add(myCanvas);
        setVisible(true);
    }

    public void init(GLAutoDrawable drawable)
    {   GL4 gl = (GL4) GLContext.getCurrentGL();
        rendering_program = createShaderProgram();
        setupVertices();
        cameraX = 0.0f; cameraY = 0.0f; cameraZ = 8.0f;
        cubeLocX = 0.0f; cubeLocY = -2.0f; cubeLocZ = 0.0f;   // shifted down along the Y-axis to reveal perspective

        // create a perspective matrix; this one has fovy = 60, and the aspect ratio matches the screen window.
        // values for near and far clipping planes can vary as discussed in Section 4.9.
        float aspect = (float) myCanvas.getWidth() / (float) myCanvas.getHeight();
        pMat = perspective(60.0f, aspect, 0.1f, 1000.0f);
    }

    // main(), reshape(), and dispose() are unchanged

    public static void main(String[] args)
    {   new Code();
    }

    public void reshape(GLAutoDrawable drawable, int x, int y, int width, int height) { }

    public void dispose(GLAutoDrawable drawable) { }

    public void display(GLAutoDrawable drawable)
    {   GL4 gl = (GL4) GLContext.getCurrentGL();
        gl.glClear(GL_DEPTH_BUFFER_BIT);
        gl.glUseProgram(rendering_program);

        // build view matrix
        Matrix3D vMat = new Matrix3D();
        vMat.translate(-cameraX, -cameraY, -cameraZ);

        // build model matrix
        Matrix3D mMat = new Matrix3D();
        mMat.translate(cubeLocX, cubeLocY, cubeLocZ);

        // concatenate model and view matrices to create the MV matrix
        Matrix3D mvMat = new Matrix3D();
        mvMat.concatenate(vMat);
        mvMat.concatenate(mMat);

        // copy perspective and MV matrices to corresponding uniform variables
        int mv_loc = gl.glGetUniformLocation(rendering_program, "mv_matrix");
        int proj_loc = gl.glGetUniformLocation(rendering_program, "proj_matrix");
        gl.glUniformMatrix4fv(proj_loc, 1, false, pMat.getFloatValues(), 0);
        gl.glUniformMatrix4fv(mv_loc, 1, false, mvMat.getFloatValues(), 0);

        // associate VBO with the corresponding vertex attribute in the vertex shader
        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
        gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
        gl.glEnableVertexAttribArray(0);

        // adjust OpenGL settings and draw model
        gl.glEnable(GL_DEPTH_TEST);
        gl.glDepthFunc(GL_LEQUAL);
        gl.glDrawArrays(GL_TRIANGLES, 0, 36);
    }

    private void setupVertices()
    {   GL4 gl = (GL4) GLContext.getCurrentGL();
        // 36 vertices of the 12 triangles making up a 2 x 2 x 2 cube centered at the origin
        float[] vertex_positions =
        {   -1.0f,  1.0f, -1.0f,  -1.0f, -1.0f, -1.0f,   1.0f, -1.0f, -1.0f,
             1.0f, -1.0f, -1.0f,   1.0f,  1.0f, -1.0f,  -1.0f,  1.0f, -1.0f,
             1.0f, -1.0f, -1.0f,   1.0f, -1.0f,  1.0f,   1.0f,  1.0f, -1.0f,
             1.0f, -1.0f,  1.0f,   1.0f,  1.0f,  1.0f,   1.0f,  1.0f, -1.0f,
             1.0f, -1.0f,  1.0f,  -1.0f, -1.0f,  1.0f,   1.0f,  1.0f,  1.0f,
            -1.0f, -1.0f,  1.0f,  -1.0f,  1.0f,  1.0f,   1.0f,  1.0f,  1.0f,
            -1.0f, -1.0f,  1.0f,  -1.0f, -1.0f, -1.0f,  -1.0f,  1.0f,  1.0f,
            -1.0f, -1.0f, -1.0f,  -1.0f,  1.0f, -1.0f,  -1.0f,  1.0f,  1.0f,
            -1.0f, -1.0f,  1.0f,   1.0f, -1.0f,  1.0f,   1.0f, -1.0f, -1.0f,
             1.0f, -1.0f, -1.0f,  -1.0f, -1.0f, -1.0f,  -1.0f, -1.0f,  1.0f,
            -1.0f,  1.0f, -1.0f,   1.0f,  1.0f, -1.0f,   1.0f,  1.0f,  1.0f,
             1.0f,  1.0f,  1.0f,  -1.0f,  1.0f,  1.0f,  -1.0f,  1.0f, -1.0f
        };
        gl.glGenVertexArrays(vao.length, vao, 0);
        gl.glBindVertexArray(vao[0]);
        gl.glGenBuffers(vbo.length, vbo, 0);
        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
        FloatBuffer vertBuf = Buffers.newDirectFloatBuffer(vertex_positions);
        gl.glBufferData(GL_ARRAY_BUFFER, vertBuf.limit()*4, vertBuf, GL_STATIC_DRAW);
    }

    // . . . createShaderProgram() and perspective() are unchanged

}

Vertex shader

#version 430

layout (location = 0) in vec3 position;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;

void main(void)
{   gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
}

Fragment shader

#version 430

out vec4 color;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;

void main(void)
{   color = vec4(1.0, 0.0, 0.0, 1.0);
}

Let's take a close look at the code in Program 4.1. It is important that we understand how all of the pieces work, and how they work together.


Figure 4.3 Output of Program 4.1. Red cube positioned at (0, -2, 0) viewed from (0, 0, 8).

Start by examining the function near the end of the Java/JOGL listing, setupVertices(), called by init(). At the start of this function, an array is declared called vertex_positions that contains the 36 vertices comprising the cube. At first you might wonder why this cube has 36 vertices, when logically a cube should only require 8. The answer is that we need to build our cube out of triangles, and so each of the six cube faces needs to be built of two triangles, for a total of 6 × 2 = 12 triangles (see Figure 4.4). Since each triangle is specified by three vertices, this totals 36 vertices. Since each vertex has three values (x, y, z), there are a total of 36 × 3 = 108 values in the array. It is true that each vertex participates in multiple triangles, but we still specify each vertex separately because for now we are sending the vertices of each triangle down the pipeline separately.
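The arithmetic behind those counts can be spelled out as a tiny check:

```java
public class CubeCounts {
    // triangles, vertices, and float values needed for a triangle-soup cube
    static int[] counts() {
        int faces = 6, trianglesPerFace = 2, verticesPerTriangle = 3, floatsPerVertex = 3;
        int triangles = faces * trianglesPerFace;          // 12
        int vertices  = triangles * verticesPerTriangle;   // 36, the glDrawArrays count
        int floats    = vertices * floatsPerVertex;        // 108, the array length
        return new int[] { triangles, vertices, floats };
    }
    public static void main(String[] args) {
        int[] c = counts();
        System.out.println(c[0] + " triangles, " + c[1] + " vertices, " + c[2] + " floats");
    }
}
```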

The cube is defined in its own coordinate system, with (0, 0, 0) at its center, and with its corners ranging from -1.0 to +1.0 along the x, y, and z axes. The rest of the setupVertices() function sets up the VAO, two VBOs (although only one is used), and loads the cube vertices into the 0th VBO buffer.

Note that the init() function performs tasks that only need to be done once: reading in the shader code and building the rendering program, building the perspective matrix, and loading the cube vertices into the buffer. Note that it also positions the cube and the camera in the world . . . later we will animate the cube and also see how to move the camera around, at which point we may need to remove this hardcoded positioning.


Figure 4.4 Cube made of triangles.

Now let's look at the display() function. Recall that display() may be called repeatedly, and the rate at which it is called is referred to as the frame rate. That is, animation works by continually drawing and re-drawing the scene, or frame, very quickly. It is usually necessary to clear the depth buffer before rendering a frame, so that hidden surface removal occurs properly (not clearing the depth buffer can sometimes result in every surface being removed, resulting in a completely black screen). By default, depth values in OpenGL range from 0.0 to 1.0. Clearing the depth buffer is done by calling glClear(GL_DEPTH_BUFFER_BIT), which fills the depth buffer with the default value (usually 1.0).

Next, display() enables the shaders by calling glUseProgram(), which installs the GLSL code on the GPU. Recall that this doesn't run the program, but it does allow JOGL to determine the shader's vertex attribute and uniform locations. The display() function next builds the view and model matrices, concatenates them into a single MV matrix, gets the uniform variable locations, and assigns the perspective and MV matrices to the corresponding uniforms.

Next, display() enables the buffer containing the cube vertices, and attaches it to the 0th vertex attribute to prepare for sending the vertices to the shader.

The last thing display() does is draw the model by calling glDrawArrays(), specifying that the model is comprised of triangles and that the model has 36 total vertices. The call to glDrawArrays() is typically preceded by additional commands that adjust rendering settings for this model. In this example, there are two such commands, both of which are related to depth testing. Recall from Chapter 2 that depth testing is used by OpenGL to perform hidden surface removal. Here, we enable depth testing, and specify the particular depth test we wish OpenGL to use. The settings shown here correspond to the description in Chapter 2; later in the book we will see other uses for these commands.

Finally, consider the shaders. First, note that they both include the same block of uniform variable declarations. Although this is not always required, it is often a good practice to include the same block of uniform variable declarations in all of the shaders within a particular rendering program.

Note also in the vertex shader the presence of the layout qualifier on the incoming vertex attribute position. Since the location is specified as "0", the display() function can reference this variable simply by using 0 in the first parameter of the glVertexAttribPointer() function call, and in the glEnableVertexAttribArray() function call. Note also that the position vertex attribute is declared as a vec3, and so it needs to be converted to a vec4 in order to be compatible with the 4×4 matrices with which it will be multiplied. This conversion is done with vec4(position, 1.0), which builds a vec4 out of the variable named "position", putting a value of 1.0 in the newly added 4th spot.
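Why the added 1.0 matters can be sketched with just the translation column of a 4x4 matrix (the helper below is made up for illustration): a w of 1.0 makes the point respond to translation, while a w of 0.0, which GLSL uses for directions, suppresses it.

```java
public class HomogeneousDemo {
    // apply only the translation part of a 4x4 transform to (x, y, z, w);
    // this is exactly what the fourth column of a translation matrix contributes
    static float[] translatePoint(float tx, float ty, float tz, float[] v) {
        return new float[] { v[0] + tx*v[3], v[1] + ty*v[3], v[2] + tz*v[3], v[3] };
    }

    public static void main(String[] args) {
        float[] point = { 1, 2, 3, 1 };   // vec4(position, 1.0): a position
        float[] dir   = { 1, 2, 3, 0 };   // w = 0.0: a direction
        float[] p2 = translatePoint(5, 0, 0, point);   // x moves from 1 to 6
        float[] d2 = translatePoint(5, 0, 0, dir);     // x stays 1: directions ignore translation
        System.out.println(p2[0] + " " + d2[0]);
    }
}
```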

The multiplication in the vertex shader applies the matrix transforms to the vertex, converting it to camera space (note the right-to-left concatenation order). Those values are put in the built-in OpenGL output variable gl_Position, and then proceed through the pipeline and are interpolated by the rasterizer.

The interpolated pixel locations (referred to as fragments) are then sent to the fragment shader. Recall that the primary purpose of the fragment shader is to set the color of an outputted pixel. In a manner similar to the vertex shader, the fragment shader processes the pixels one-by-one, with a separate invocation for each pixel. In this case, it outputs a hardcoded value corresponding to red. For reasons indicated earlier, the uniform variables have been included in the fragment shader even though they aren't being used there in this example.

An overview of the flow of data starting with the Java/JOGL application and passing through the pipeline is shown in Figure 4.5, below:

Figure 4.5 Data flow through Program 4.1.

Let's make a slight modification to the shaders. In particular, we will assign a color to each vertex according to its location, and put that color in the outgoing vertex attribute varyingColor. The fragment shader is similarly revised to accept the incoming color (interpolated by the rasterizer) and use that to set the color of the output pixel. Note that the code also multiplies the location by ½ and then adds ½ to convert the range of values from (-1, +1) to (0, 1). Note also the use of the common convention of assigning variable names that include the word "varying" to programmer-defined interpolated vertex attributes. The changes in each shader are highlighted, and the resulting output shown below.

Revised vertex shader:

#version 430

layout (location = 0) in vec3 position;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
out vec4 varyingColor;

void main(void)
{   gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
    varyingColor = vec4(position, 1.0) * 0.5 + vec4(0.5, 0.5, 0.5, 0.5);
}

Revised fragment shader:

#version 430

in vec4 varyingColor;
out vec4 color;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;

void main(void)
{   color = varyingColor;
}

Note that because the colors are sent out from the vertex shader in a vertex attribute (varyingColor), they too are interpolated by the rasterizer! The effect of this can be seen in Figure 4.6, where the colors from corner to corner are clearly interpolated smoothly throughout the cube.
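To a first approximation, the rasterizer's interpolation is a linear blend of the vertex values; the sketch below (which ignores the perspective correction OpenGL actually applies) shows a fragment halfway along an edge receiving the average of the two endpoint colors.

```java
public class LerpColor {
    // linear blend of two same-length float vectors; t = 0 gives a, t = 1 gives b
    static float[] lerp(float[] a, float[] b, float t) {
        float[] r = new float[a.length];
        for (int i = 0; i < a.length; i++) r[i] = a[i] * (1 - t) + b[i] * t;
        return r;
    }

    public static void main(String[] args) {
        float[] red  = { 1, 0, 0, 1 };
        float[] blue = { 0, 0, 1, 1 };
        float[] mid = lerp(red, blue, 0.5f);   // a fragment halfway along the edge
        System.out.println(mid[0] + " " + mid[1] + " " + mid[2]);   // a purple blend
    }
}
```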

Note also that the "out" variable varyingColor in the vertex shader is also the "in" variable in the fragment shader. The two shaders know which variable from the vertex shader feeds which variable in the fragment shader because they have the same name "varyingColor" in both shaders.


Figure 4.6 Cube with interpolated colors.

We can animate the cube using the FPSAnimator class as in Program 2.6, by building the model matrix using a varying translation and rotation based on the elapsed time. For example, the code in the display() function in Program 4.1 could be modified as follows (changes are highlighted):

gl.glClear(GL_DEPTH_BUFFER_BIT);
float bkg[] = { 0.0f, 0.0f, 0.0f, 1.0f };
FloatBuffer bkgBuffer = Buffers.newDirectFloatBuffer(bkg);
gl.glClearBufferfv(GL_COLOR, 0, bkgBuffer);
. . .
Matrix3D mMat = new Matrix3D();
// use system time to generate a slowly-increasing sequence of floating-point values
double t = (double) (System.currentTimeMillis()) / 10000.0;
// use t to compute different translations in x, y, and z
mMat.translate(Math.sin(2*t)*2.0, Math.sin(3*t)*2.0, Math.sin(4*t)*2.0);
mMat.rotate(1000*t, 1000*t, 1000*t);   // the 1000 adjusts the rotation speed

The use of elapsed time (and a variety of trigonometric functions) in the model matrix causes the cube to appear to tumble around in space. Note that adding this animation illustrates the importance of clearing the depth buffer each time through display() to ensure correct hidden surface removal. It also necessitates clearing the GL_COLOR buffer as shown; otherwise, the cube will leave a trail as it moves. The Java FloatBuffer is filled with zeros, resulting in a black background color.

The translate() and rotate() functions are part of the graphicslib3D library. Note that in the code, translate() is called before rotate(). This results in a concatenation of the two transforms, with translation on the left and rotation on the right. When a vertex is subsequently multiplied by this matrix, the computation is right-to-left, meaning that the rotation is done first, followed by the translation. The order of application of transforms is significant, and changing the order would result in different behavior. Figure 4.7 shows some of the frames that are displayed after animating the cube:

Figure 4.7 Animated ("tumbling") 3D cube.
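The significance of the transform order can be verified numerically with a 2D analogue (helper names are made up for illustration): rotating then translating lands a point somewhere different than translating then rotating.

```java
public class OrderDemo {
    // rotate (x, y) by 90 degrees about the origin
    static double[] rot90(double[] p) { return new double[] { -p[1], p[0] }; }
    // translate (x, y) along x
    static double[] transX(double[] p, double tx) { return new double[] { p[0] + tx, p[1] }; }

    public static void main(String[] args) {
        double[] p = { 1, 0 };
        // T * R * p : rotation applied first, then translation (the order used above)
        double[] rotateThenTranslate = transX(rot90(p), 5);   // (0, 1) then (5, 1)
        // R * T * p : translation applied first, then rotation
        double[] translateThenRotate = rot90(transX(p, 5));   // (6, 0) then (0, 6)
        System.out.println(rotateThenTranslate[0] + "," + rotateThenTranslate[1]
            + " vs " + translateThenRotate[0] + "," + translateThenRotate[1]);
    }
}
```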

4.6 RENDERING MULTIPLE COPIES OF AN OBJECT

We now extend what we have learned to rendering multiple objects. Before we tackle the general case of rendering a variety of models in a single scene, let's consider the simpler case of multiple occurrences of the same model. Suppose, for instance, that we wish to expand the previous example so that it renders a "swarm" of 24 tumbling cubes. We can do this by moving the portions of the code in display() that build the MV matrix and that draw the cube (shown below in blue) into a loop that executes 24 times. We incorporate the loop variable into the cube's rotation and translation, so that each time the cube is drawn a different model matrix is built. (We also positioned the camera further down the positive Z axis so we can see all of the cubes.) The resulting animated scene is shown thereafter in Figure 4.8.

public void display(GLAutoDrawable drawable)
{   . . .
    double timeFactor = (double) (System.currentTimeMillis()) / 10000.0;
    for (int i = 0; i < 24; i++)
    {   double x = i + timeFactor;
        Matrix3D mMat = new Matrix3D();
        mMat.translate(Math.sin(2*x)*6.0, Math.sin(3*x)*6.0, Math.sin(4*x)*6.0);
        mMat.rotate(1000*x, 1000*x, 1000*x);
        Matrix3D mvMat = new Matrix3D();
        mvMat.concatenate(vMat);
        mvMat.concatenate(mMat);
        gl.glUniformMatrix4fv(mv_loc, 1, false, mvMat.getFloatValues(), 0);
        gl.glEnable(GL_DEPTH_TEST);
        gl.glDepthFunc(GL_LEQUAL);
        gl.glDrawArrays(GL_TRIANGLES, 0, 36);
    }
}

4.6.1 Instancing

Instancing provides a mechanism for telling the graphics card to render multiple copies of an object using only a single Java call. This can result in a significant performance benefit, particularly when there are thousands or millions of copies of the object being drawn, such as when rendering many flowers in a field, or many zombies in an army.

We start by changing the glDrawArrays() call in our Java/JOGL application to glDrawArraysInstanced(). Now, we can ask OpenGL to draw as many copies as we want. We can specify drawing 24 cubes as follows:

glDrawArraysInstanced(GL_TRIANGLES, 0, 36, 24);

Figure 4.8 Multiple tumbling cubes.

When using instancing, the vertex shader has access to a built-in variable gl_InstanceID, an integer that refers to which numeric instance of the object is currently being processed.

To replicate our previous tumbling cubes example using instancing, we will need to move the computations that build the different model matrices (previously inside a loop in display()) into the vertex shader. Since GLSL does not provide translate or rotate functions, and we cannot make calls to graphicslib3D from inside a shader, we will need to use the utility functions from Program 3.1. We will also need to pass the "time factor" variable from the Java/JOGL application to the vertex shader in a uniform. We also need to pass the model and view matrices into separate uniforms, because the rotation computations are applied to each cube's model matrix. The revisions, including those in the Java/JOGL application and those in the new vertex shader, are shown in Program 4.2.

Program 4.2 Instancing – 24 Animated Cubes


Vertex Shader:

#version 430

layout (location = 0) in vec3 position;
uniform mat4 m_matrix;     // these are now separate model and view matrices
uniform mat4 v_matrix;
uniform mat4 proj_matrix;
uniform float tf;          // time factor for animation and placement of cubes
out vec4 varyingColor;

// declarations of matrix transformation utility functions
// (GLSL requires functions to be declared prior to invocation)
mat4 buildRotateX(float rad);
mat4 buildRotateY(float rad);
mat4 buildRotateZ(float rad);
mat4 buildTranslate(float x, float y, float z);

void main(void)
{   float i = gl_InstanceID + tf;   // value based on time factor, but different for each cube instance

    // these are the x, y, and z components for the translation, below
    float a = sin(2.0 * i) * 8.0;
    float b = sin(3.0 * i) * 8.0;
    float c = sin(4.0 * i) * 8.0;

    // build the rotation and translation matrices to be applied to this cube's model matrix
    mat4 localRotX = buildRotateX(1000*i);
    mat4 localRotY = buildRotateY(1000*i);
    mat4 localRotZ = buildRotateZ(1000*i);
    mat4 localTrans = buildTranslate(a, b, c);

    // build the model matrix and then the model-view matrix
    mat4 newM_matrix = m_matrix * localTrans * localRotX * localRotY * localRotZ;
    mat4 mv_matrix = v_matrix * newM_matrix;

    gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
    varyingColor = vec4(position, 1.0) * 0.5 + vec4(0.5, 0.5, 0.5, 0.5);
}

// utility function to build a translation matrix (from Chapter 3)
mat4 buildTranslate(float x, float y, float z)
{   mat4 trans = mat4(1.0, 0.0, 0.0, 0.0,
                      0.0, 1.0, 0.0, 0.0,
                      0.0, 0.0, 1.0, 0.0,
                      x,   y,   z,   1.0);
    return trans;
}

// similar functions included for rotation around the X, Y, and Z axes (also from Chapter 3)
. . .


Java/JOGL Application (in display()):

. . .
// since the timeFactor is used in the shader, use mod (%) to reduce its magnitude
double timeFactor = (double) (System.currentTimeMillis() % 3600000) / 10000.0;

// computations that transform mMat have been moved to the vertex shader
gl.glUniformMatrix4fv(m_loc, 1, false, mMat.getFloatValues(), 0);   // glUniformMatrix4fv() is now split
gl.glUniformMatrix4fv(v_loc, 1, false, vMat.getFloatValues(), 0);   // into two separate calls

int tf_loc = gl.glGetUniformLocation(rendering_program, "tf");   // uniform for the time factor
gl.glUniform1f(tf_loc, (float) timeFactor);
. . .
gl.glDrawArraysInstanced(GL_TRIANGLES, 0, 36, 24);

The resulting output of Program 4.2 is identical to that for the previous example, and can be seen in the previous Figure 4.8.

Instancing makes it possible to greatly expand the number of copies of an object; in this example, animating 100,000 cubes is still feasible even for a modest GPU. The changes to the code (mainly just a few modified constants to spread the large number of cubes further apart) are as follows:

Vertex Shader:

. . .
float a = sin(203.0 * i / 4000.0) * 403.0;
float b = cos(301.0 * i / 2001.0) * 401.0;
float c = sin(400.0 * i / 3003.0) * 405.0;
. . .

Java/JOGL Application:

. . .
gl.glDrawArraysInstanced(GL_TRIANGLES, 0, 36, 100000);

The resulting output is shown below in Figure 4.9.


Figure 4.9 Instancing: 100,000 animated cubes.

4.7 RENDERING MULTIPLE DIFFERENT MODELS IN A SCENE

To render more than one model in a single scene, a simple approach is to use a separate buffer for each model. Each model will need its own model matrix, and thus a new model-view matrix will be generated for each model that we render. There will also need to be separate calls to glDrawArrays() for each model. There are changes both in init() and in display().

Another consideration is whether or not we will need different shaders (or a different rendering program) for each of the objects we wish to draw. As it turns out, in many cases we can use the same shaders (and thus the same rendering program) for the various objects we are drawing. We usually only need to employ different rendering programs for the various objects if they are built of different primitives (such as lines instead of triangles), or if there are complex lighting or other effects involved. For now, that isn't the case, so we can reuse the same vertex and fragment shaders, and just modify our Java/JOGL application to send each model down the pipeline when display() is called.

Let's proceed by adding a simple pyramid, so our scene includes both a single cube and a pyramid. The code is shown in Program 4.3. A few of the key details are highlighted, such as where we specify one or the other buffer, and where we specify the number of vertices contained in the model. Note that the pyramid is comprised of 6 triangles: four on the sides and two on the bottom, totaling 6 × 3 = 18 vertices.

The resulting scene, containing both the cube and the pyramid, is then shown in Figure 4.10.

Figure 4.10 3D cube and pyramid.

Program 4.3 Cube and Pyramid


public void display(GLAutoDrawable drawable)
{   GL4 gl = (GL4) GLContext.getCurrentGL();
    . . . clear the color and depth buffers as before (not shown here)
    . . . use the rendering program and obtain uniform locations as before (not shown here)

    // --------------------- set up the projection and view matrices
    float aspect = (float) myCanvas.getWidth() / (float) myCanvas.getHeight();
    Matrix3D pMat = perspective(60.0f, aspect, 0.1f, 1000.0f);
    Matrix3D vMat = new Matrix3D();
    vMat.translate(-cameraX, -cameraY, -cameraZ);

    // --------------------- draw the cube (use buffer #0)
    Matrix3D mMat = new Matrix3D();
    mMat.translate(cubeLocX, cubeLocY, cubeLocZ);
    Matrix3D mvMat = new Matrix3D();
    mvMat.concatenate(vMat);
    mvMat.concatenate(mMat);
    gl.glUniformMatrix4fv(mv_loc, 1, false, mvMat.getFloatValues(), 0);
    gl.glUniformMatrix4fv(proj_loc, 1, false, pMat.getFloatValues(), 0);
    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
    gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
    gl.glEnableVertexAttribArray(0);
    gl.glEnable(GL_DEPTH_TEST);
    gl.glDepthFunc(GL_LEQUAL);
    gl.glDrawArrays(GL_TRIANGLES, 0, 36);

    // --------------------- draw the pyramid (use buffer #1)
    mMat = new Matrix3D();
    mMat.translate(pyrLocX, pyrLocY, pyrLocZ);
    mvMat = new Matrix3D();
    mvMat.concatenate(vMat);
    mvMat.concatenate(mMat);
    gl.glUniformMatrix4fv(mv_loc, 1, false, mvMat.getFloatValues(), 0);
    gl.glUniformMatrix4fv(proj_loc, 1, false, pMat.getFloatValues(), 0);   // (repeated for clarity)
    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
    gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
    gl.glEnableVertexAttribArray(0);
    gl.glEnable(GL_DEPTH_TEST);
    gl.glDepthFunc(GL_LEQUAL);
    gl.glDrawArrays(GL_TRIANGLES, 0, 18);
}

A few other minor details to note regarding Program 4.3:

- The variables pyrLocX, pyrLocY, and pyrLocZ need to be declared in the class, and initialized in init() to the desired pyramid location, as was done for the cube location.
- The view matrix vMat is built at the top of display(), and then used in both the cube's and the pyramid's model-view matrices.
- The vertex and fragment shaders are not shown; they are unchanged from Section 4.5.

4.8 MATRIX STACKS

So far, the models we have rendered have each been constructed of a single set of vertices. It is often desired, however, to build complex models by assembling smaller, simple models. For example, a model of a "robot" could be created by separately drawing the head, body, legs, and arms, where each of those is a separate model. An object built in this manner is often called a hierarchical model. The tricky part of building hierarchical models is keeping track of all the model-view matrices, and making sure they stay perfectly coordinated; otherwise the robot might fly apart into pieces!

Hierarchical models are useful not only for building complex objects; they can also be used to generate complex scenes. For example, consider how our planet Earth revolves around the sun, and in turn how the moon revolves around the Earth. Computing the moon's actual path through space could be complex. However, if we can combine the transforms representing the two simple circular paths (the moon's path around the Earth and the Earth's path around the sun) we avoid having to explicitly compute the moon's trajectory.
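The idea can be sketched in 2D (radii and angles here are made up for illustration): the moon's world position falls out of composing two simple circular transforms, with no explicit formula for its combined path.

```java
public class OrbitDemo {
    // position on a circle of the given radius and angle around a center point
    static double[] orbit(double[] center, double radius, double angle) {
        return new double[] { center[0] + radius * Math.cos(angle),
                              center[1] + radius * Math.sin(angle) };
    }

    public static void main(String[] args) {
        double[] sun   = { 0, 0 };
        double[] earth = orbit(sun, 10.0, Math.PI / 2);   // earth's circle around the sun
        double[] moon  = orbit(earth, 2.0, 0.0);          // moon's circle around the earth
        // the moon's world position comes out of composing the two simple transforms
        System.out.println(moon[0] + ", " + moon[1]);
    }
}
```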

Figure 4.11 Animated planetary system (sun and earth textures from [HT16], moon texture from [NA16]).

It turns out that we can do this fairly easily with a matrix stack. A matrix stack is, as its name implies, a stack of transformation matrices. As we will see, matrix stacks make it easy to create and manage complex hierarchical objects and scenes, where transforms can be built upon (and removed from) other transforms.

OpenGL has a built-in matrix stack, but as part of the older fixed-function (non-programmable) pipeline it has long been deprecated. It was nicely devised and convenient to use, so we have created a similar Java class in our graphicslib3D library called MatrixStack, patterned after the one that became popular in older versions of OpenGL [OL16]. As we will see, many of the model, view, and model-view matrices that would normally be needed in a complex scene can be replaced by a single instance of MatrixStack.

We will first examine the basic commands for instantiating and utilizing a matrix stack, then use MatrixStack to build a complex animated scene. The most important MatrixStack functions are:

- pushMatrix() – makes a copy of the top matrix, and pushes the copy onto the stack
- popMatrix() – removes and returns the top matrix
- multMatrix(m) – concatenates matrix m onto the top matrix in the stack
- peek() – returns the topmost matrix, without removing it from the stack
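A minimal stand-in for such a stack (a made-up class handling only translations, not graphicslib3D's actual MatrixStack) illustrates the push-copies-the-top behavior these functions describe:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class MiniMatrixStack {
    private final Deque<float[]> stack = new ArrayDeque<>();

    public MiniMatrixStack() {   // start with an identity matrix on top
        stack.push(new float[] { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 });
    }
    public void pushMatrix()   { stack.push(stack.peek().clone()); }   // copy the top, push the copy
    public float[] popMatrix() { return stack.pop(); }                 // remove and return the top
    public float[] peek()      { return stack.peek(); }                // look at the top, leave it

    // concatenate a translation onto the top matrix; adding the offsets directly
    // is valid here only because this sketch composes nothing but translations
    public void translate(float x, float y, float z) {
        float[] top = stack.peek();
        top[12] += x; top[13] += y; top[14] += z;   // column-major translation slots
    }

    public static void main(String[] args) {
        MiniMatrixStack mv = new MiniMatrixStack();
        mv.pushMatrix(); mv.translate(0, 0, -8);   // view matrix
        mv.pushMatrix(); mv.translate(0, -2, 0);   // a model-view matrix: inherits the view translation
        System.out.println(mv.peek()[13] + " " + mv.peek()[14]);   // -2.0 -8.0
        mv.popMatrix();                            // back to the plain view matrix
        System.out.println(mv.peek()[13] + " " + mv.peek()[14]);   // 0.0 -8.0
    }
}
```

Each pushMatrix() duplicates whatever transforms have accumulated so far, which is exactly what makes parent transforms automatically carry over to children.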

Now, rather than building transforms by creating instances of Matrix3D, we instead use the pushMatrix() command to create new matrices at the top of the stack. Desired transforms are then concatenated as needed to the newly created matrix on the top of the stack.

The first matrix pushed on the stack is frequently the VIEW matrix. The matrices above it are model-view matrices of increasing complexity; that is, they have an increasing number of model transforms applied to them. These transforms can either be applied directly, or by first concatenating other matrices.

In our planetary system example, the matrix positioned immediately above the VIEW matrix would be the sun's MV matrix. The matrix on top of that matrix would be the earth's MV matrix, which consists of the sun's MV matrix concatenated with the Earth's model matrix. That is, the Earth's MV matrix is built by incorporating the planet's transforms into the sun's transforms. Similarly, the moon's MV matrix sits on top of the planet's MV matrix, and is constructed by concatenating the planet's MV matrix with the moon's model matrix.

After rendering the moon, a second "moon" could be rendered by "popping" the first moon's matrix off of the stack (restoring the top of the stack to the planet's model-view matrix) and then repeating the process for the second moon.

The basic approach is as follows:

1. When a new object is introduced relative to a parent object, do a "push" and a "multiply."
2. When an object or sub-object is completed, "pop" its model-view matrix, removing it from atop the matrix stack.

In later chapters we will learn how to create spheres and make them look like planets and moons. For now, to keep things simple, we will build a "planetary system" using our pyramid and a couple of cubes.

Here is an overview of how a display() function using a matrix stack is typically organized:

setup:    Instantiate the MatrixStack.

camera:   Push a new matrix onto the stack (this will instantiate an empty VIEW matrix).
          Apply transform(s) to the view matrix on the top of the stack.

parent:   Push a new matrix onto the stack (this will be the parent MV matrix; for the first parent, it duplicates the view matrix).
          Apply transforms to incorporate the parent's M matrix into the duplicated view matrix.
          Send the topmost matrix ("peek") to the MV uniform variable in the vertex shader.
          Draw the parent object.

child:    Push a new matrix onto the stack (this will be the child's MV matrix, duplicating the parent MV matrix).
          Apply transforms to incorporate the child's M matrix into the duplicated parent MV matrix.
          Send the topmost matrix ("peek") to the MV uniform variable in the vertex shader.
          Draw the child object.

cleanup:  Pop the child's MV matrix off the stack.
          Pop the parent's MV matrix off the stack.
          Pop the VIEW matrix off the stack.

The pyramid ("sun") rotation on its axis is in its own local coordinate space, and does not affect the "children." Therefore, the sun's rotation is pushed onto the stack, but then after drawing the sun, it would be removed (popped) from the stack.

The big cube's (planet) revolution around the sun (left figure below) will affect the moon's movement, and so it is pushed on the stack and remains there when drawing the moon as well. By contrast, the planet's rotation on its axis (right figure below) is local and does not affect the moon, so it is popped off the stack before drawing the moon.

Similarly, we would push transforms onto the stack for the moon's rotations (on its axis, and around the planet), indicated in the following images.

Here are the steps for the "planet":

- push() a new matrix. This will be the portion of the planet's MV matrix that will also affect children.
- translate(. . .) to incorporate the planet's movement around the sun into the planet's MV matrix. In this example we use trigonometry to calculate the planet movement as a translation.
- push() a new matrix. This will be the planet's MV matrix.
- rotate(. . .) to incorporate the planet's axis rotation (this will later be popped, and will not affect children).
- peek() to obtain the MV matrix, and then send it to the MV uniform. Draw the planet.
- pop() the planet's MV matrix off the stack, exposing underneath it an earlier copy of the planet's MV matrix without the planet's axis rotation (so that only the planet's translation affects the moon).

We now can write the complete display() routine, shown in Program 4.4.


Program 4.4 Simple Solar System Using Matrix Stack

public void display(GLAutoDrawable drawable)
{	// setup of background, depth buffer, rendering program, and proj matrices unchanged
	. . .
	MatrixStack mvStack = new MatrixStack(20);

	// push view matrix onto the stack
	mvStack.pushMatrix();
	mvStack.translate(-cameraX, -cameraY, -cameraZ);
	double amt = (double) (System.currentTimeMillis()) / 1000.0;

	// ---------------------  pyramid == sun  --------------------------
	mvStack.pushMatrix();
	mvStack.translate(pyrLocX, pyrLocY, pyrLocZ);	// sun’s position
	mvStack.pushMatrix();
	mvStack.rotate((System.currentTimeMillis()) / 10.0, 1.0, 0.0, 0.0);	// sun’s rotation on its axis
	gl.glUniformMatrix4fv(mv_loc, 1, false, mvStack.peek().getFloatValues(), 0);
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
	gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
	gl.glEnableVertexAttribArray(0);
	gl.glEnable(GL_DEPTH_TEST);
	gl.glDepthFunc(GL_LEQUAL);
	gl.glDrawArrays(GL_TRIANGLES, 0, 18);	// draw the sun
	mvStack.popMatrix();	// remove the sun’s axial rotation from the stack

	// ---------------------  cube == planet  --------------------------
	mvStack.pushMatrix();
	mvStack.translate(Math.sin(amt)*4.0f, 0.0f, Math.cos(amt)*4.0f);	// planet moves around sun
	mvStack.pushMatrix();
	mvStack.rotate((System.currentTimeMillis()) / 10.0, 0.0, 1.0, 0.0);	// planet axis rotation
	gl.glUniformMatrix4fv(mv_loc, 1, false, mvStack.peek().getFloatValues(), 0);
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
	gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
	gl.glEnableVertexAttribArray(0);
	gl.glDrawArrays(GL_TRIANGLES, 0, 36);	// draw the planet
	mvStack.popMatrix();	// remove the planet’s axial rotation from the stack

	// ---------------------  smaller cube == moon  --------------------------
	mvStack.pushMatrix();
	mvStack.translate(0.0f, Math.sin(amt)*2.0f, Math.cos(amt)*2.0f);	// moon moves around planet
	mvStack.rotate((System.currentTimeMillis()) / 10.0, 0.0, 0.0, 1.0);	// moon’s rotation on its axis
	mvStack.scale(0.25, 0.25, 0.25);	// make the moon smaller
	gl.glUniformMatrix4fv(mv_loc, 1, false, mvStack.peek().getFloatValues(), 0);
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
	gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
	gl.glEnableVertexAttribArray(0);
	gl.glDrawArrays(GL_TRIANGLES, 0, 36);	// draw the moon

	// remove moon scale/rotation/position, planet position, sun position, and view matrices from stack
	mvStack.popMatrix();  mvStack.popMatrix();  mvStack.popMatrix();  mvStack.popMatrix();
}

The matrix stack operations have been highlighted. There are several details worth noting:

We have introduced a scale operation in a model matrix. We want the moon to be a smaller cube than the planet, so we use a call to scale() when building the MV matrix for the moon.

In this example, we are using the trigonometric operations sin() and cos() to compute the revolution of the planet around the sun (as a translation), and also for the moon around the planet.

The two buffers #0 and #1 contain the cube and pyramid vertices respectively.

Note the use of the peek() function call within the glUniformMatrix() command. The peek() call retrieves the values in the matrix on top of the stack, and those values are then sent to the uniform variable (in this case, the sun, planet, and then moon’s MV matrices).

The vertex and fragment shaders are not shown—they are unchanged from the previous example. We also moved the initial position of the pyramid (sun) and the camera to center the scene on the screen.

4.9 COMBATING “Z-FIGHTING” ARTIFACTS

Recall that when rendering multiple objects, OpenGL uses the Z-buffer algorithm (shown earlier in Figure 2.14) for performing hidden surface removal. Ordinarily, this resolves which object surfaces are visible and rendered to the screen, versus which surfaces lie behind other objects and thus should not be rendered, by choosing a pixel’s color to be that of the corresponding fragment closest to the camera.

However, there can be occasions when two object surfaces in a scene overlap and lie in coincident planes, making it problematic for the Z-buffer algorithm to determine which of the two surfaces should be rendered (since neither is “closest” to the camera). When this happens, floating point rounding errors can lead to some portions of the rendered surface using the color of one of the objects, and other portions using the color of the other object. This artifact is known as Z-fighting or depth-fighting, because the effect is the result of rendered fragments “fighting” over mutually corresponding pixel entries in the Z-buffer. Figure 4.12 shows an example of Z-fighting between two boxes with overlapping coincident (top) faces.

Situations like this often occur when creating terrain or shadows. It is often possible to predict Z-fighting in such instances, and a common way of correcting it in these cases is to move one object slightly, so that the surfaces are no longer coplanar. We will see an example of this in Chapter 8.


Figure 4.12 Z-fighting example.

Z-fighting can also occur due to limited precision of the values in the depth buffer. For each pixel processed by the Z-buffer algorithm, the accuracy of its depth information is limited by the number of bits available for storing it in the depth buffer. The greater the range between near and far clipping planes, the more likely two objects’ points with similar (but not equal) actual depths will end up being represented by the same numeric value in the depth buffer. Therefore, it is up to the programmer to select near and far clipping plane values to minimize the distance between the two planes, while still ensuring that all objects essential to the scene lie within the viewing frustum.

It is also important to understand that, due to the effect of the perspective transform, changing the near clipping plane value can have a greater impact on the likelihood of Z-fighting artifacts than making an equivalent change in the far clipping plane. Therefore, it is advisable to avoid selecting a near clipping plane that is too close to the eye.

Previous examples in this book have simply used values of 0.1 and 1000 (in our calls to perspective()) for the near and far clipping planes. These may need to be adjusted for your scene.
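The effect of the near plane on depth precision can be sketched numerically. The mapping below is the standard perspective depth mapping (a sketch, not code from the book); the two surface distances 900.0 and 900.05 are made-up example values:

```java
// Sketch: how depth-buffer precision depends on the near plane.
// windowDepth maps an eye-space distance z in [n, f] to the [0,1] depth-buffer range
// using the standard perspective depth mapping; quantize24 mimics a 24-bit depth buffer.
public class DepthPrecision {
    static double windowDepth(double z, double n, double f) {
        return (1.0 / n - 1.0 / z) / (1.0 / n - 1.0 / f);   // 0 at near plane, 1 at far plane
    }
    static long quantize24(double d) {
        return Math.round(d * ((1 << 24) - 1));             // nearest 24-bit depth value
    }
    public static void main(String[] args) {
        double zA = 900.0, zB = 900.05;                     // two nearly coplanar surfaces
        long coarse = quantize24(windowDepth(zB, 0.1, 1000.0))
                    - quantize24(windowDepth(zA, 0.1, 1000.0));
        long finer  = quantize24(windowDepth(zB, 1.0, 1000.0))
                    - quantize24(windowDepth(zA, 1.0, 1000.0));
        // Pulling the near plane from 0.1 out to 1.0 leaves more depth resolution far
        // from the camera, so the two surfaces are easier to tell apart in the buffer.
        System.out.println(coarse + " vs " + finer);
    }
}
```

With the near plane at 0.1, the two surfaces collapse into (nearly) the same 24-bit depth value; with the near plane at 1.0 they remain distinguishable, which is why a too-small near plane invites Z-fighting.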

4.10 OTHER OPTIONS FOR PRIMITIVES

OpenGL supports a number of primitive types—so far we have seen two: GL_TRIANGLES and GL_POINTS. In fact, there are several others. All of the available primitive types supported in OpenGL fall into the categories of triangles, lines, points, and patches. Here is a complete list:

Triangle primitives:

GL_TRIANGLES	The most common primitive type in this book. Vertices that pass through the pipeline form distinct triangles:


vertices: 0 1 2 3 4 5 etc.	triangles: (0,1,2) (3,4,5) etc.

GL_TRIANGLE_STRIP	Each vertex that passes through the pipeline efficiently forms a triangle with the previous two vertices:

vertices: 0 1 2 3 4 etc.	triangles: (0,1,2) (2,1,3) (2,3,4) etc.

GL_TRIANGLE_FAN	Each pair of vertices that passes through the pipeline forms a triangle with the very first vertex:

vertices: 0 1 2 3 4 etc.	triangles: (0,1,2) (0,2,3) (0,3,4) etc.

GL_TRIANGLES_ADJACENCY	Intended only for use with geometry shaders. Allows the shader to access the vertices in the current triangle, plus additional adjacent vertices.

GL_TRIANGLE_STRIP_ADJACENCY	Intended only for use with geometry shaders. Similar to GL_TRIANGLES_ADJACENCY, except that triangle vertices overlap as for GL_TRIANGLE_STRIP.

Line primitives:

GL_LINES	Vertices that pass through the pipeline form distinct lines:

vertices: 0 1 2 3 etc.	lines: (0,1) (2,3) etc.

GL_LINE_STRIP	Each vertex that passes through the pipeline efficiently forms a line with the previous vertex:

vertices: 0 1 2 3 etc.	lines: (0,1) (1,2) (2,3) etc.

GL_LINE_LOOP	Same as GL_LINE_STRIP, except a line is also formed between the very first and very last vertices.

GL_LINES_ADJACENCY	Intended for use with geometry shaders. Allows the shader to access the vertices in the current line, plus additional adjacent vertices.


GL_LINE_STRIP_ADJACENCY	Similar to GL_LINES_ADJACENCY, except that line vertices overlap as for GL_LINE_STRIP.

Point primitives:

GL_POINTS	Each vertex that passes through the pipeline is a point.

Patch primitives:

GL_PATCHES	Intended for use only with tessellation shaders. Indicates that a set of vertices passes from the vertex shader to the tessellation control shader, where they are typically used to shape a tessellated grid into a curved surface.

4.11 BACK-FACE CULLING

As the complexity of our 3D scenes grows, we will become increasingly concerned with performance. One simple and effective way of improving rendering efficiency is to take advantage of OpenGL’s ability to do back-face culling.

When a 3D model is entirely “closed,” meaning the interior is never visible (such as for the cube and for the pyramid), then it turns out that those portions of the outer surface that are angled away from the viewer will always be obscured by some other portion of the same model. That is, those triangles that face away from the viewer cannot possibly be seen (they would be overwritten by hidden surface removal anyway), and thus there is no reason to rasterize or render them.

We can ask OpenGL to identify and “cull” (not render) back-facing triangles with the command glEnable(GL_CULL_FACE). We can also disable face culling with glDisable(GL_CULL_FACE). By default, face culling is disabled, so if you want OpenGL to cull back-facing triangles, you must enable it.

When face culling is enabled, by default triangles are rendered only if they are front-facing. Also by default, a triangle is considered front-facing if its three vertices progress in a counter-clockwise direction (based on the order that they were defined in the buffer) as viewed from the OpenGL camera. Triangles whose vertices progress in a clockwise direction (as viewed from the OpenGL camera) are back-facing, and are not rendered. This counter-clockwise definition of “front-facing” is sometimes called the winding order, and can be set explicitly using the function call glFrontFace(GL_CCW) for counter-clockwise (the default), or glFrontFace(GL_CW) for clockwise. Similarly, whether it is the front-facing or the back-facing triangles that are rendered can also be set explicitly. Actually, for this purpose we specify which ones are not to be rendered—that is, which ones are “culled.” We can specify that the back-facing triangles be culled (although this would be unnecessary because it is the default) by calling glCullFace(GL_BACK). Alternatively, we can specify instead that the front-facing triangles be culled, or even that all of the triangles be culled, by replacing the parameter GL_BACK with either GL_FRONT or GL_FRONT_AND_BACK, respectively.

As we will see in Chapter 6, 3D models are typically designed so that the outer surface is constructed of triangles with the same winding order—most commonly counter-clockwise—so that if culling is enabled, then by default the portion of the model’s outer surface that faces the camera is rendered. Since by default OpenGL assumes the winding order is counter-clockwise, if a model is designed to be displayed with a clockwise winding order, it is up to the programmer to call glFrontFace(GL_CW) to account for this if back-face culling is enabled.
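The facing test itself reduces to the sign of a triangle’s signed area in screen space. A minimal sketch of that computation (a plain-Java illustration, not code from the book or from OpenGL):

```java
// Sketch: the winding test a rasterizer effectively performs on a screen-space triangle.
// The doubled signed area is positive when the vertices run counter-clockwise and
// negative when they run clockwise.
public class Winding {
    static double signedArea2(double ax, double ay, double bx, double by,
                              double cx, double cy) {
        return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay);   // 2D cross product
    }
    public static void main(String[] args) {
        // (0,0) -> (1,0) -> (0,1) runs counter-clockwise: front-facing under GL_CCW
        System.out.println(signedArea2(0, 0, 1, 0, 0, 1) > 0);  // prints true
        // swapping two vertices makes it clockwise: culled when GL_CULL_FACE is on
        System.out.println(signedArea2(0, 0, 0, 1, 1, 0) > 0);  // prints false
    }
}
```

Swapping any two vertices flips the sign, which is why an accidental reordering of buffer data (or a wrong glFrontFace setting) can make a whole surface disappear.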

Note that in the case of GL_TRIANGLE_STRIP, the winding order of each triangle alternates. OpenGL compensates for this by “flipping” the vertex sequence when building each successive triangle, as follows: 0-1-2, then 2-1-3, 2-3-4, 4-3-5, 4-5-6, and so on.
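The flipped sequence can be generated with a short loop, as in this sketch (the class and method names are illustrative only):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the vertex order used for GL_TRIANGLE_STRIP, flipping the first two
// vertices of every other triangle so all triangles keep a consistent winding.
public class StripOrder {
    static List<int[]> stripTriangles(int vertexCount) {
        List<int[]> tris = new ArrayList<>();
        for (int i = 0; i + 2 < vertexCount; i++) {
            if (i % 2 == 0)
                tris.add(new int[]{ i, i + 1, i + 2 });     // even triangles: natural order
            else
                tris.add(new int[]{ i + 1, i, i + 2 });     // odd triangles: first two flipped
        }
        return tris;
    }
    public static void main(String[] args) {
        for (int[] t : stripTriangles(7))
            System.out.print(t[0] + "-" + t[1] + "-" + t[2] + " ");
        // prints the sequence from the text: 0-1-2 2-1-3 2-3-4 4-3-5 4-5-6
    }
}
```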

Back-face culling improves performance by ensuring that OpenGL doesn’t spend time rasterizing and rendering surfaces that are never intended to be seen. Most of the examples we have seen in this chapter are so small that there is little motivation to enable face culling (an exception is the example shown in Figure 4.9, with the 100,000 instanced animated cubes, which may pose a performance challenge on some systems). In practice, it is common for most 3D models to be “closed,” and so it is customary to routinely enable back-face culling. For example, we can add back-face culling to Program 4.3 by modifying the display() function as follows:

public void display(GLAutoDrawable drawable)
{	. . .
	gl.glEnable(GL_CULL_FACE);

	// --------------------------- draw the cube
	. . .
	gl.glEnable(GL_DEPTH_TEST);
	gl.glDepthFunc(GL_LEQUAL);
	gl.glFrontFace(GL_CW);	// the cube vertices have clockwise winding order
	gl.glDrawArrays(GL_TRIANGLES, 0, 36);

	// ------------------------------ draw the pyramid (use buffer #1)
	. . .
	gl.glEnable(GL_DEPTH_TEST);
	gl.glDepthFunc(GL_LEQUAL);
	gl.glFrontFace(GL_CCW);	// the pyramid vertices have counter-clockwise winding order
	gl.glDrawArrays(GL_TRIANGLES, 0, 18);
}

Properly setting the winding order is important when using back-face culling. An incorrect setting, such as GL_CW when it should be GL_CCW, can lead to the interior of an object being rendered rather than its exterior, which in turn can produce distortion similar to that of an incorrect perspective matrix.

Efficiency isn’t the only reason for doing face culling. In later chapters we will see other uses, such as for those circumstances when we want to see the inside of a 3D model, or when using transparency.

SUPPLEMENTAL NOTES

As mentioned in this chapter, when using Java buffers it is advisable to ensure that only direct buffers are used. For example, it is tempting to use the Java FloatBuffer.wrap() function to put vertex data stored in an array into a buffer. However, wrap() produces a non-direct buffer, in which the resulting buffer is a wrapper for data that is still stored in an array.4 One approach for copying the data into a direct buffer is to allocate a direct ByteBuffer, use ByteBuffer.asFloatBuffer() to view it as a FloatBuffer, and then use a loop with the put() command to copy the data one element at a time. As described earlier, the JOGL com.jogamp.common.nio.Buffers class makes it easier to populate direct buffers by providing convenience methods for loading array data into a variety of buffer types:

newDirectByteBuffer(byte[])

newDirectCharBuffer(char[])

newDirectDoubleBuffer(double[])

newDirectFloatBuffer(float[])

newDirectIntBuffer(int[])

newDirectLongBuffer(long[])

newDirectShortBuffer(short[])

Using the above JOGL methods for populating buffers with vertex data (that is, data intended to be used by JOGL applications) also ensures correct native byte ordering. It also ensures correct size allocation, and makes an explicit allocation unnecessary.
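The distinction between a wrapped buffer and a direct buffer can be demonstrated with java.nio alone. This sketch performs the allocate/order/view/put sequence described above by hand; JOGL’s Buffers.newDirectFloatBuffer() conveniently does the equivalent work in one call:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Sketch: non-direct (wrapped) vs. direct buffers, using only the Java standard library.
public class DirectBufferDemo {
    public static void main(String[] args) {
        float[] data = { 1.0f, 2.0f, 3.0f };

        FloatBuffer wrapped = FloatBuffer.wrap(data);    // non-direct: backed by the array
        System.out.println(wrapped.isDirect());          // prints false

        FloatBuffer direct = ByteBuffer
                .allocateDirect(data.length * 4)         // 4 bytes per float
                .order(ByteOrder.nativeOrder())          // native byte order, as JOGL requires
                .asFloatBuffer();                        // view the bytes as floats
        for (float f : data) direct.put(f);              // copy elements one at a time
        direct.flip();                                   // rewind for reading
        System.out.println(direct.isDirect());           // prints true
    }
}
```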

There is a myriad of other capabilities and structures available for managing and utilizing data in OpenGL/JOGL/GLSL, and we have only scratched the surface in this chapter. We haven’t, for example, described a uniform block, which is a mechanism for organizing uniform variables similar to a struct in C. Uniform blocks can even be set up to receive data from buffers. Another powerful mechanism is a shader storage block, which is essentially a buffer into which a shader can write.

An excellent reference on the many options for managing data (albeit in C++) is the OpenGL SuperBible [SW15], particularly the chapter entitled “Data” (Chapter 5 in the 7th edition). It also describes many of the details and options for the various commands that we have covered. The first two example programs in this chapter, Program 4.1 and Program 4.2, were inspired by similar examples in the SuperBible.

There are other types of data that we will need to learn how to manage, and how to send down the OpenGL pipeline. One of these is a texture, which contains color image data (such as in a photograph) which can be used to “paint” the objects in our scene. We will study texture images in Chapter 5. Another important buffer that we will study further is the depth buffer (or Z-buffer). This will become important when we study shadows in Chapter 8. We still have much to learn about managing graphics data in OpenGL!

Exercises

4.1	(PROJECT) Modify Program 4.1 to replace the cube with some other simple 3D shape of your own design. Be sure to properly specify the number of vertices in the glDrawArrays() command.

4.2	(PROJECT) In Program 4.1, the “view” matrix is defined in the display() function simply as the negative of the camera location:

Matrix3D vMat = new Matrix3D();
vMat.translate(-cameraX, -cameraY, -cameraZ);

Replace this code with an implementation of the computation shown in Figure 3.13. This will allow you to position the camera by specifying a camera position and three orientation axes. You will find it necessary to store the vectors U, V, N described in Section 3.7. Then, experiment with different camera viewpoints, and observe the resulting appearance of the rendered cube.

4.3	(PROJECT) Modify Program 4.4 to include a second “planet,” which is your custom 3D shape from Exercise 4.1. Make sure that your new “planet” is in a different orbit than the existing planet, so that they don’t collide.

4.4	(PROJECT) Modify Program 4.4 so that the “view” matrix is constructed using the “look-at” function from Figure 3.21. Then experiment with setting the “look-at” parameters to various locations, such as looking at the sun (in which case the scene should appear normal), looking at the planet, or looking at the moon.

4.5	(RESEARCH) Propose a practical use for glCullFace(GL_FRONT_AND_BACK).

References

[BL16]	Blender, The Blender Foundation, accessed July 2016, https://www.blender.org/.

[HT16]	J. Hastings-Trew, JHT’s Planetary Pixel Emporium, accessed July 2016, http://planetpixelemporium.com/.

[MA16]	Maya, AutoDesk, Inc., accessed July 2016, http://www.autodesk.com/products/maya/overview.

[NA16]	NASA 3D Resources, accessed July 2016, http://nasa3d.arc.nasa.gov/.

[OL16]	Legacy OpenGL, accessed July 2016, https://www.opengl.org/wiki/Legacy_OpenGL.

[SW15]	G. Sellers, R. Wright Jr., and N. Haemel, OpenGL SuperBible: Comprehensive Tutorial and Reference, 7th ed. (Addison-Wesley, 2015).

1 Throughout this example, two buffers are declared, to emphasize that usually we will use several buffers. Later we will use the additional buffer(s) to store other information associated with the vertex, such as color. In the current case we are using only one of the declared buffers, so it would have been sufficient to declare just one VBO.

2 Note that here, for the first time, we are refraining from describing every parameter in one or more JOGL calls. As mentioned in Chapter 2, the reader is encouraged to utilize the OpenGL documentation for such details as needed.

3 Often, these calls may be placed in init() rather than in display(). However, it is necessary to place one or more of them in display() when drawing multiple objects with different properties. For simplicity, we always place them in display().

4 Using non-direct buffers is actually allowable in JOGL. However, when a non-direct buffer is passed to a JOGL method, JOGL will convert it to a direct buffer, which incurs a performance penalty.


CHAPTER 5

TEXTURE MAPPING

5.1	Loading Texture Image Files
5.2	Texture Coordinates
5.3	Creating a Texture Object
5.4	Constructing Texture Coordinates
5.5	Loading Texture Coordinates into Buffers
5.6	Using the Texture in a Shader: Sampler Variables and Texture Units
5.7	Texture Mapping: Example Program
5.8	Mipmapping
5.9	Anisotropic Filtering
5.10	Wrapping and Tiling
5.11	Perspective Distortion
5.12	Loading Texture Image Files Using Java AWT Classes

Supplemental Notes

Texture mapping is the technique of overlaying an image across a rasterized model surface. It is one of the most fundamental and important ways of adding realism to a rendered scene.

Texture mapping is so important that there is hardware support for it, allowing for very high performance resulting in real-time photorealism. Texture units are hardware components designed specifically for texturing, and modern graphics cards typically come with several texture units included.


Figure 5.1 Texturing a dolphin model with two different images [JT16].

5.1 LOADING TEXTURE IMAGE FILES

There are a number of datasets and mechanisms that need to be coordinated to accomplish texture mapping efficiently in JOGL/GLSL:

a texture object to hold the texture image (in this chapter we consider only 2D images)

a special uniform sampler variable so that the vertex shader can access the texture

a buffer to hold the texture coordinates

a vertex attribute for passing the texture coordinates down the pipeline

a texture unit on the graphics card

A texture image can be a picture of anything. It can be a picture of something man-made or occurring in nature, such as cloth, grass, or a planetary surface; or, it could be a geometric pattern, such as the checkerboard in Figure 5.1. In video games and animated movies, texture images are commonly used to paint faces and clothing on characters, or skin on creatures such as on the dolphin in Figure 5.1.


Images are typically stored in image files, such as .jpg, .png, .gif, or .tiff. In order to make a texture image available to shaders in the OpenGL pipeline, we need to extract the colors from the image and put them into an OpenGL texture object (a built-in OpenGL structure for holding a texture image).

Java has some useful image file tools in its imageio and awt packages that can be used to read texture images. The steps are: (a) read the image data into a ByteBuffer, using the JOGL buffer tools we saw in Chapter 4, (b) use glGenTextures() to instantiate a texture object and assign it an integer ID, (c) call glBindTexture() to make the newly created texture object active, (d) load the previously read-in image data into the texture object with the glTexImage2D() command, and (e) adjust the texture settings using the glTexParameter() function. The result is an integer pointer to the now-available OpenGL texture object. We will walk through these steps at the end of this chapter, for completeness.
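Step (a) can be sketched on its own with only standard Java classes. This is an illustrative sketch, not the book’s listing: it pulls RGBA texel bytes out of a BufferedImage into a direct ByteBuffer, flipping rows because OpenGL’s image origin is the lower left. A real loader would first obtain the BufferedImage via javax.imageio.ImageIO.read():

```java
import java.awt.image.BufferedImage;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of step (a): extract RGBA bytes from a BufferedImage into a direct buffer,
// bottom row first, ready to hand to glTexImage2D() in steps (b)-(e).
public class TexelExtract {
    static ByteBuffer toRGBABuffer(BufferedImage img) {
        int w = img.getWidth(), h = img.getHeight();
        ByteBuffer buf = ByteBuffer.allocateDirect(w * h * 4).order(ByteOrder.nativeOrder());
        for (int y = h - 1; y >= 0; y--) {           // flip rows: OpenGL origin is bottom-left
            for (int x = 0; x < w; x++) {
                int argb = img.getRGB(x, y);         // packed as 0xAARRGGBB
                buf.put((byte) (argb >> 16));        // R
                buf.put((byte) (argb >> 8));         // G
                buf.put((byte) argb);                // B
                buf.put((byte) (argb >> 24));        // A
            }
        }
        buf.flip();
        return buf;
    }
    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_ARGB);
        img.setRGB(0, 0, 0xFFFF0000);                // an opaque red texel at the top-left
        ByteBuffer buf = toRGBABuffer(img);
        System.out.println(buf.remaining());         // prints 16 (2*2 texels * 4 bytes)
    }
}
```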

However, JOGL includes its own tools for working with textures that make it considerably simpler to load a texture image file into an OpenGL texture object. Those tools are found in the JOGL classes Texture, TextureIO, and TextureData.

Texturing an object starts by declaring a variable of type Texture. This is a JOGL class; a JOGL Texture object serves as a wrapper for an OpenGL texture object. Next, we call newTexture()—a static method in the TextureIO class—to actually generate the texture object. The newTexture() function accepts an image file name as one of its parameters (several standard image file types are supported, including the four mentioned above). These steps are implemented in the following function:

public Texture loadTexture(String textureFileName)
{	Texture tex = null;
	try { tex = TextureIO.newTexture(new File(textureFileName), false); }
	catch (Exception e) { e.printStackTrace(); }
	return tex;
}

The Java/JOGL application calls the above loadTexture() function to create the JOGL texture object, and extracts the OpenGL texture object from it using getTextureObject(). For example:

Texture joglTexture = loadTexture("image.jpg");
int openglTexture = joglTexture.getTextureObject();

where image.jpg is a texture image file.

5.2 TEXTURE COORDINATES

Now that we have a means for loading a texture image into OpenGL, we need to specify how we want the texture to be applied to the rendered surface of an object. We do this by specifying texture coordinates for each vertex in our model.


Texture coordinates are references to the pixels in a (usually 2D) texture image. Pixels in a texture image are referred to as texels, in order to differentiate them from the pixels being rendered on the screen. Texture coordinates are used to map points on a 3D model to locations in a texture. Each point on the surface of the model has, in addition to (x,y,z) coordinates that place it in 3D space, texture coordinates (s,t) that specify which texel in the texture image provides its color. Thus, the surface of the object is “painted” by the texture image. The orientation of a texture across the surface of an object is determined by the assignment of texture coordinates to object vertices.

In order to use texture mapping, it is necessary to provide texture coordinates for every vertex in the object to be textured. OpenGL will use these texture coordinates to determine the color of each rasterized pixel in the model, by looking up the color stored at the referenced texel in the image. In order to ensure that every pixel in your rendered model is painted with an appropriate texel from the texture image, the texture coordinates are put into a vertex attribute so that they are also interpolated by the rasterizer. In that way the texture image is interpolated, or filled in, along with the model vertices.

For each set of vertex coordinates (x,y,z) passing through the vertex shader, there will be an accompanying set of texture coordinates (s,t). We will thus set up two buffers, one for the vertices (with three components x, y, and z in each entry) and one for the corresponding texture coordinates (with two components s and t in each entry). Each vertex shader invocation thus receives one vertex, now comprised of both its spatial coordinates and its corresponding texture coordinates.

Texture coordinates are most often 2D (OpenGL does support some other dimensionalities, but we won’t cover them in this chapter). It is assumed that the image is rectangular, with location (0,0) at the lower left and (1,1) at the upper right.1 Texture coordinates, then, should ideally have values in the range (0,1).

Consider the example shown in Figure 5.2. The cube model, recall, is constructed of triangles. The four corners of one side of the cube are highlighted, but remember that it takes two triangles to specify each square side. The texture coordinates for each of the six vertices that specify this one cube side are listed alongside the four corners, with the corners at the upper left and lower right each comprised of a pair of vertices. A texture image is also shown. The texture coordinates (indexed by s and t) have mapped portions of the image (the texels) onto the rasterized pixels of the front face of the model. Note that all of the intervening pixels in between the vertices have been painted with the intervening texels in the image. This is achieved because the texture coordinates are sent to the fragment shader in a vertex attribute, and thus are also interpolated just like the vertices themselves.

In this example, we purposely specified texture coordinates that result in an oddly-painted surface, for purposes of illustration. If you look closely, you can also see that the image appears slightly stretched—that is because the aspect ratio of the texture image doesn’t match the aspect ratio of the cube face relative to the given texture coordinates.
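The interpolation the rasterizer performs on texture coordinates can be sketched numerically. This simple affine sketch (illustrative only, and ignoring the perspective correction discussed in Section 5.11) blends the (s,t) values at a triangle’s three vertices using barycentric weights that sum to 1:

```java
// Sketch: how a fragment's texture coordinates arise as a weighted average of the
// (s,t) values at the triangle's three vertices (affine, no perspective correction).
public class TexcoordInterp {
    static double[] interp(double[] a, double[] b, double[] c,
                           double wa, double wb, double wc) {
        return new double[] {
            wa * a[0] + wb * b[0] + wc * c[0],   // interpolated s
            wa * a[1] + wb * b[1] + wc * c[1]    // interpolated t
        };
    }
    public static void main(String[] args) {
        // example (s,t) assignments for one triangle (made-up values)
        double[] a = { 0.0, 0.0 }, b = { 1.0, 0.0 }, c = { 0.5, 1.0 };
        double[] center = interp(a, b, c, 1.0/3, 1.0/3, 1.0/3);  // fragment at the centroid
        System.out.println(center[0] + ", " + center[1]);        // approximately 0.5, 0.33
    }
}
```

A fragment at a vertex gets that vertex’s exact (s,t); every interior fragment gets an in-between value, which is what lets the texels “fill in” the surface between vertices.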


Figure 5.2 Texture coordinates.

For simple models like cubes or pyramids, selecting texture coordinates is relatively easy. But for more complex curved models with lots of triangles, it isn’t practical to determine them by hand. In the case of curved geometric shapes, such as a sphere or torus, texture coordinates can be computed algorithmically or mathematically. In the case of a model built with a modeling tool such as Maya [MA16] or Blender [BL16], such tools offer “UV-mapping” (outside of the scope of this book) to make this task easier.

Let us return to rendering our pyramid, only this time texturing it with an image of bricks. We will need to specify: (a) a texture object to hold the texture image, (b) texture coordinates for the model vertices, (c) a buffer for holding the texture coordinates, (d) vertex attributes so that the vertex shader can receive and forward the texture coordinates through the pipeline, (e) a texture unit on the graphics card for holding the texture object, and (f) a uniform sampler variable for accessing the texture unit in GLSL, which we will see shortly. These are each described in the next sections.

5.3 CREATING A TEXTURE OBJECT

Suppose the adjacent image is stored in a file named “brick1.jpg” [LU16].

As shown previously, we can load this image by calling our loadTexture() function, as follows:

Texture joglBrickTexture = loadTexture("brick1.jpg");
int brickTexture = joglBrickTexture.getTextureObject();


Recall that texture objects are identified by integer IDs, so brickTexture is of type int.

5.4 CONSTRUCTING TEXTURE COORDINATES

Our pyramid has four triangular sides and a square on the bottom. Although geometrically this only requires five (5) points, we have been rendering it with triangles. This requires four triangles for the sides, and two triangles for the square bottom, for a total of six triangles. Each triangle has three vertices, for a total of 6*3 = 18 vertices that must be specified in the model.

We already listed the pyramid’s geometric vertices in Program 4.3, in the float array pyramid_positions[]. There are many ways that we could orient our texture coordinates so as to draw our bricks onto the pyramid. One simple (albeit imperfect) way would be to make the top center of the image correspond to the peak of the pyramid, as follows:

We can do this for all four of the triangle sides. We also need to paint the bottom square of the pyramid, which is comprised of two triangles. A simple and reasonable approach would be to texture it with the entire area from the picture (the pyramid has been tipped back and is sitting on its side):

The corresponding set of vertex and texture coordinates using this very simple strategy, for the first eight of the pyramid vertices from Program 4.3, is shown below, in Figure 5.3:


Figure 5.3 Texture coordinates for the pyramid (partial list).

5.5 LOADING TEXTURE COORDINATES INTO BUFFERS

We can load the texture coordinates into a VBO in a similar manner as we saw previously for loading the vertices. In setupVertices(), we add the following declaration of the texture coordinate values:

float[] pyr_texture_coordinates =
{	0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,	// top and right faces
	0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,	// back and left faces
	0.0f, 0.0f,  1.0f, 1.0f,  0.0f, 1.0f,  1.0f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f	// base triangles
};

Then, after the creation of at least two VBOs (one for the vertices, and one for the texture coordinates), we add the following lines of code to load the texture coordinates into VBO #1:

gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
FloatBuffer pTexBuf = Buffers.newDirectFloatBuffer(pyr_texture_coordinates);
gl.glBufferData(GL_ARRAY_BUFFER, pTexBuf.limit()*4, pTexBuf, GL_STATIC_DRAW);

5.6 USING THE TEXTURE IN A SHADER: SAMPLER VARIABLES AND TEXTURE UNITS

To maximize performance, we will want to perform the texturing in hardware. This means that our fragment shader will need a way of accessing the texture object that we created in the Java/JOGL application. The mechanism for doing this is via a special GLSL tool called a uniform sampler variable. This is a variable designed for instructing a texture unit on the graphics card as to which texel to extract or “sample” from a loaded texture object.

Declaring a sampler variable in the shader is easy—just add it to your set of uniforms:

layout (binding=0) uniform sampler2D samp;

Ours is named “samp”.The “layout(binding=0)” portion of the declaration specifiesthatthissampleristobeassociatedwithtextureunit0.

A texture unit (and associated sampler) can be used to sample whichever texture object you wish, and that can change at runtime. Your display() function will need to specify which texture object you want the texture unit to sample for the current frame. So each time you draw an object, you will need to activate a texture unit and bind it to a particular texture object, as follows:

gl.glActiveTexture(GL_TEXTURE0);
gl.glBindTexture(GL_TEXTURE_2D, brickTexture);

The number of available texture units depends on how many are provided on the graphics card. According to the OpenGL API documentation, OpenGL version 4.5 requires that this be at least 16 per shader stage, and at least 80 total units across all stages [OP16]. In this example, we have made the 0th texture unit active by specifying GL_TEXTURE0 in the glActiveTexture() call.

To actually perform the texturing, we will need to modify how our fragment shader outputs colors. Previously, our fragment shader either output a constant color, or it obtained colors from a vertex attribute. This time instead, we need to use the interpolated texture coordinates received from the vertex shader (through the rasterizer) to sample the texture object, by calling the texture() function as follows:

invec2tc; //texturecoordinates

...

color=texture(samp,tc);

5.7 TEXTURE MAPPING: EXAMPLE PROGRAM

Program 5.1 combines the previous steps into a single program. The result, showing the pyramid textured with the brick image, appears in Figure 5.4. Two rotations (not shown in the code listing) were added to the pyramid's model matrix to expose the underside of the pyramid.


Figure 5.4 Pyramid texture mapped with brick image.

It is now a simple matter to replace the brick texture image with other texture images, as desired, simply by changing the filename in the loadTexture() call. For example, if we replace "brick1.jpg" with the image file "ice.jpg" [LU16], we get the result shown in Figure 5.5.

Figure 5.5 Pyramid texture mapped with "ice" image.

Program 5.1 Pyramid with brick texture.

JAVA/JOGL Application

// the following additional imports will be necessary for the loadTexture:
import java.io.*;
import java.nio.*;
import com.jogamp.opengl.util.texture.*;
import com.jogamp.opengl.util.texture.TextureIO.*;

public class Code extends JFrame implements GLEventListener
{   // previous declarations and constructor code applies. We just need to add
    // declarations for the texture:
    ...
    private int brickTexture;
    ...
    public Code()


    {   // unchanged
        ...
    }

    public void display(GLAutoDrawable drawable)
    {   GL4 gl = (GL4) GLContext.getCurrentGL();
        ...
        // setup of background color, depth buffer, VIEW and PROJ matrices unchanged
        ...
        // this time we are drawing only the pyramid.
        // setup of M and MV matrices is unchanged.
        ...
        // activate buffer #0, which contains the vertices
        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
        gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
        gl.glEnableVertexAttribArray(0);

        // activate buffer #1, which contains the texture coordinates
        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
        gl.glVertexAttribPointer(1, 2, GL_FLOAT, false, 0, 0);
        gl.glEnableVertexAttribArray(1);

        // activate texture unit #0 and bind it to the brick texture object
        gl.glActiveTexture(GL_TEXTURE0);
        gl.glBindTexture(GL_TEXTURE_2D, brickTexture);

        gl.glEnable(GL_DEPTH_TEST);
        gl.glDepthFunc(GL_LEQUAL);
        gl.glDrawArrays(GL_TRIANGLES, 0, 18);
    }

    public Texture loadTexture(String textureFileName)
    {   Texture tex = null;
        try { tex = TextureIO.newTexture(new File(textureFileName), false); }
        catch (Exception e) { e.printStackTrace(); }
        return tex;
    }

    public void init(GLAutoDrawable drawable)
    {   // setup of rendering program, camera, and object location unchanged
        ...
        joglBrickTexture = loadTexture("brick1.jpg");
        brickTexture = joglBrickTexture.getTextureObject();
    }

    private void setupVertices()
    {   GL4 gl = (GL4) GLContext.getCurrentGL();
        float[] pyramid_positions = { /* data as listed previously in Program 4.2 */ };
        float[] pyr_texture_coordinates =
        {   0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,
            0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f,  0.5f, 1.0f,
            0.0f, 0.0f,  1.0f, 1.0f,  0.0f, 1.0f,  1.0f, 1.0f,  0.0f, 0.0f,  1.0f, 0.0f };

        // ... generate the VAO as before, and at least two VBOs, then load the two buffers as follows:

        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
        FloatBuffer pyrBuf = Buffers.newDirectFloatBuffer(pyramid_positions);
        gl.glBufferData(GL_ARRAY_BUFFER, pyrBuf.limit()*4, pyrBuf, GL_STATIC_DRAW);

        gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
        FloatBuffer pTexBuf = Buffers.newDirectFloatBuffer(pyr_texture_coordinates);
        gl.glBufferData(GL_ARRAY_BUFFER, pTexBuf.limit()*4, pTexBuf, GL_STATIC_DRAW);
    }

    ...   // remainder of the class definition and utility functions here

}

Vertex shader

#version 430

layout (location = 0) in vec3 pos;
layout (location = 1) in vec2 texCoord;
out vec2 tc;   // texture coordinate output to rasterizer for interpolation

uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding = 0) uniform sampler2D samp;   // not used in vertex shader

void main(void)
{   gl_Position = proj_matrix * mv_matrix * vec4(pos, 1.0);
    tc = texCoord;
}

Fragment shader

#version 430

in vec2 tc;   // interpolated incoming texture coordinate
out vec4 color;

uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding = 0) uniform sampler2D samp;

void main(void)
{   color = texture(samp, tc);
}

5.8 MIPMAPPING

Texture mapping commonly produces a variety of undesirable artifacts in the rendered image. This is because the resolution or aspect ratio of the texture image rarely matches that of the region in the scene being textured.


A very common artifact occurs when the image resolution is less than that of the region being drawn. In this case, the image would need to be stretched to cover the region, becoming blurry (and possibly distorted). This can sometimes be combated, depending on the nature of the texture, by assigning the texture coordinates differently so that applying the texture requires less stretching. Another solution is to use a higher-resolution texture image.

The reverse situation is when the resolution of the image texture is greater than that of the region being drawn. It is probably not at all obvious why this would pose a problem, but it does! In this case, noticeable aliasing artifacts can occur, giving rise to strange-looking false patterns, or "shimmering" effects in moving objects.

Aliasing is caused by sampling errors. It is most often associated with signal processing, where an inadequately sampled signal appears to have different properties (such as wavelength) than it actually does when it is reconstructed. An example is shown in Figure 5.6. The original waveform is shown in red. The yellow dots along the waveform represent the samples. If they are used to reconstruct the wave, and if there aren't enough of them, they can define a different wave (shown in blue).

Figure 5.6 Aliasing due to inadequate sampling.
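The effect in Figure 5.6 can be reproduced numerically. The sketch below uses hypothetical frequencies (not taken from the book): it samples a 7 Hz sine wave at only 8 samples per second, and at every one of those sample times the values coincide with those of a 1 Hz wave, which is the "different wave" a reconstruction would produce.

```java
public class AliasDemo {
    public static void main(String[] args) {
        double fs = 8.0;  // sampling rate (Hz): far too low for a 7 Hz signal
        for (int n = 0; n < 16; n++) {
            double t = n / fs;
            double original = Math.sin(2.0 * Math.PI * 7.0 * t);   // the real 7 Hz wave
            double alias    = Math.sin(-2.0 * Math.PI * 1.0 * t);  // a 1 Hz wave (phase-reversed)
            // at every sample time the two waves agree, so the samples alone
            // cannot distinguish the 7 Hz signal from the 1 Hz alias
            System.out.printf("t=%.3f  signal=%+.6f  alias=%+.6f%n", t, original, alias);
        }
    }
}
```

The alias frequency is |7 − 8| = 1 Hz; in the same way, an under-sampled high-frequency texture pattern shows up as a different, lower-frequency false pattern.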

Similarly, in texture mapping, when a high-resolution (and highly detailed) image is sparsely sampled (such as when using a uniform sampler variable), the colors retrieved will be inadequate to reflect the actual detail in the image, and may instead seem random. If the texture image has a repeated pattern, aliasing can result in a different pattern being produced than the one in the original image. If the object being textured is moving, rounding errors in looking up the texels can result in constant changes in the sampled pixel at a given texture coordinate, resulting in a sparkling effect across the surface of the object being drawn.

Figure 5.7 shows a tilted close-up rendering of the top of a cube which has been textured by a large, high-resolution image of a checkerboard.


Figure 5.7 Aliasing in a texture map.

Aliasing is evident near the top of the image, where the under-sampling of the checkerboard has produced a "striped" effect. Although we can't show it here in a still image, if this were an animated scene, the patterns would likely seem to undulate between various incorrect patterns such as this one.

Another example appears in Figure 5.8, in which the cube has been textured with an image of the surface of the moon [HT16]. At first glance, this image appears sharp and full of detail. However, some of the detail at the upper right of the image is false, and causes "sparkling" as the cube object (or the camera) moves. (Unfortunately, we can't show the sparkling effect clearly in a still image.)

Figure 5.8 "Sparkling" in a texture map.

These and similar sampling error artifacts can be largely corrected by a technique called mipmapping, in which different versions of the texture image are created at various resolutions. The texture image is used that most closely matches the resolution at the point being textured. Even better, colors can be averaged between the images closest in resolution to that of the region being textured. Results of applying mipmapping to the images in Figure 5.7 and Figure 5.8 are shown in Figure 5.9.


Figure 5.9 Mipmapped results.

Mipmapping works by a clever mechanism for storing a series of successively lower-resolution copies of the same image, in a texture image ⅓ larger than the original image. This is achieved by storing the R, G, and B components of the image separately in three (out of four) quadrants of the texture image, then repeating the process in the fourth quadrant for the same image at ¼ the original resolution. This subdividing repeats until the remaining quadrant is too small to contain any useful image data. An example image and the resulting mipmap are shown below in Figure 5.10.

Figure 5.10 Mipmapping an image.

This method of stuffing several images into a small space (well, just a bit bigger than the space needed to store the original image) is how mipmapping got its name. MIP stands for Multum in Parvo [WI83], which is Latin for "much in a small space."
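The "⅓ larger" figure follows from summing the successively quarter-sized levels (each level has half the width and half the height of the previous one). A quick numeric check, using an assumed 1024×1024 base image:

```java
public class MipmapOverhead {
    public static void main(String[] args) {
        long w = 1024, h = 1024;          // assumed base image size
        long base = w * h;
        long total = 0;
        while (w >= 1 && h >= 1) {        // 1024x1024, 512x512, ..., 1x1
            total += w * h;
            w /= 2;
            h /= 2;
        }
        // the geometric series 1 + 1/4 + 1/16 + ... converges to 4/3, so the
        // whole chain needs only about 1/3 more storage than the base image
        System.out.println("base = " + base + ", chain total = " + total
                + ", ratio = " + (double) total / base);
    }
}
```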

When actually texturing an object, the mipmap can be sampled in several ways. In OpenGL, the manner in which the mipmap is sampled can be chosen by setting the GL_TEXTURE_MIN_FILTER parameter to the desired minification technique, which is one of the following:

GL_NEAREST_MIPMAP_NEAREST
    chooses the mipmap with the resolution most similar to that of the region of pixels being textured. It then obtains the nearest texel to the desired texture coordinates.

GL_LINEAR_MIPMAP_NEAREST
    chooses the mipmap with the resolution most similar to that of the region of pixels being textured. It then interpolates the four texels nearest to the texture coordinates. This is called "linear filtering."

GL_NEAREST_MIPMAP_LINEAR
    chooses the two mipmaps with resolutions nearest to that of the region of pixels being textured. It then obtains the nearest texel to the texture coordinates from each mipmap, and interpolates them. This is called "bilinear filtering."

GL_LINEAR_MIPMAP_LINEAR
    chooses the two mipmaps with resolutions nearest to that of the region of pixels being textured. It then interpolates the four nearest texels in each mipmap, and interpolates those two results. This is called "trilinear filtering" and is illustrated below in Figure 5.11.

Figure 5.11 Trilinear filtering.
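The four-texel interpolation step used by the GL_LINEAR_* modes can be sketched in isolation. This is a standard bilinear weighting of the four texels surrounding a sample point, not OpenGL's internal code, and the 2×2 texel grid is made up for illustration:

```java
public class BilinearSample {
    // weight the four texels surrounding the sample point (u, v),
    // where u and v are in texel units within one mipmap level
    public static double sample(double[][] tex, double u, double v) {
        int x0 = (int) Math.floor(u), y0 = (int) Math.floor(v);
        double fx = u - x0, fy = v - y0;                 // fractional offsets
        double top    = tex[y0][x0]     * (1 - fx) + tex[y0][x0 + 1]     * fx;
        double bottom = tex[y0 + 1][x0] * (1 - fx) + tex[y0 + 1][x0 + 1] * fx;
        return top * (1 - fy) + bottom * fy;
    }

    public static void main(String[] args) {
        double[][] tex = { { 0.0, 1.0 }, { 1.0, 0.0 } }; // illustrative 2x2 intensity texels
        // a sample point midway between all four texels averages them: (0+1+1+0)/4
        System.out.println(sample(tex, 0.5, 0.5));       // prints 0.5
    }
}
```

Trilinear filtering performs this computation once in each of the two nearest mipmap levels and then linearly blends the two results.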

Trilinear filtering is usually preferable, as lower levels of blending often produce artifacts, such as visible separations between mipmap levels. Figure 5.12 shows a close-up of the checkerboard using mipmapping with only linear filtering enabled. Note the circled artifacts, where the vertical lines suddenly change from thick to thin at a mipmap boundary. By contrast, the example in Figure 5.9 used trilinear filtering.


Figure 5.12 Linear filtering artifacts.

Mipmapping is richly supported in OpenGL. There are mechanisms provided for building your own mipmap levels, or having OpenGL build them for you. In most cases, the mipmaps built automatically by OpenGL are sufficient. The following lines of code, added to Program 5.1 in init(), immediately after the loadTexture() and getTextureObject() function calls, cause OpenGL to generate the mipmaps:

gl.glBindTexture(GL_TEXTURE_2D, brickTexture);
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
gl.glGenerateMipmap(GL_TEXTURE_2D);

The brick texture is made active with the glBindTexture() call, and then the glTexParameteri() function call enables one of the minification filters listed above, such as GL_LINEAR_MIPMAP_LINEAR shown in the above call, which enables trilinear filtering.

Once the mipmap is built, the filtering option can be changed (although this is rarely necessary) by calling glTexParameteri() again, such as in the display function. Mipmapping can even be disabled by selecting GL_NEAREST or GL_LINEAR.

For critical applications, it is possible to build the mipmaps yourself, using your preferred image-editing software. They can then be added as mipmap levels when creating the texture object by repeatedly calling OpenGL's glTexImage2D() function, or JOGL's updateSubImage() function, for each mipmap level. This approach is outside the scope of this book.

5.9 ANISOTROPIC FILTERING

Mipmapped textures can sometimes appear more blurry than non-mipmapped textures, especially when the textured object is rendered at a heavily tilted viewing angle. We saw an example of this back in Figure 5.9, where reducing artifacts with mipmapping led to reduced detail (compare with Figure 5.8).

This loss of detail occurs because when an object is tilted, its primitives appear smaller along one axis (i.e., width vs. height) than the other. When OpenGL textures a primitive, it


selects the mipmap appropriate for the smaller of the two axes (to avoid "sparkling" artifacts). In Figure 5.9 the surface is tilted heavily away from the viewer, so each rendered primitive will utilize the mipmap appropriate for its reduced height, which is likely to have a resolution lower than appropriate for its width.

One way of restoring some of this lost detail is to use anisotropic filtering (AF). Whereas standard mipmapping samples a texture image at a variety of square resolutions (e.g., 256×256, 128×128, etc.), AF samples the textures at a number of rectangular resolutions as well, such as 256×128, 64×128, and so on. This enables viewing at various angles while retaining as much detail in the texture as possible.

Anisotropic filtering is more computationally expensive than standard mipmapping, and is not a required part of OpenGL. However, most graphics cards support AF (this is referred to as an OpenGL extension), and OpenGL does provide both a way of querying the card to see if it supports AF, and a way of accessing AF if it does. The code is added immediately after generating the mipmap:

...

// if mipmapping
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
gl.glGenerateMipmap(GL_TEXTURE_2D);

// if also anisotropic filtering
if (gl.isExtensionAvailable("GL_EXT_texture_filter_anisotropic"))
{   float max[] = new float[1];
    gl.glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, max, 0);
    gl.glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max[0]);
}

The call to gl.isExtensionAvailable() tests whether the graphics card supports AF. If it does, we set it to the maximum degree of sampling supported, a value retrieved using glGetFloatv() as shown. It is then applied to the active texture object using glTexParameterf(). The result is shown in Figure 5.13. Note that much of the lost detail from Figure 5.8 has been restored, while still removing the sparkling artifacts.

Figure 5.13 Anisotropic filtering.


5.10 WRAPPING AND TILING

So far we have assumed that texture coordinates all fall in the range (0,1). However, OpenGL actually supports texture coordinates of any value. There are several options for specifying what happens when texture coordinates fall outside the range (0,1). The desired behavior is set using glTexParameteri(), and the options are as follows:

GL_REPEAT: The integer portion of the texture coordinates is ignored, generating a repeating or "tiling" pattern. This is the default behavior.
GL_MIRRORED_REPEAT: The integer portion is ignored, except that the coordinates are reversed when the integer portion is odd, so the repeating pattern alternates between normal and mirrored.
GL_CLAMP_TO_EDGE: Coordinates less than 0 and greater than 1 are set to 0 and 1, respectively.
GL_CLAMP_TO_BORDER: Texels outside of (0,1) will be assigned some specified border color.
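The first three of these modes can be mimicked in plain Java to see exactly how an out-of-range coordinate is remapped. This is an illustrative model of the behavior described above, not OpenGL's implementation:

```java
public class WrapModes {
    public static double repeat(double t) {              // GL_REPEAT: drop the integer part
        return t - Math.floor(t);
    }
    public static double mirroredRepeat(double t) {      // GL_MIRRORED_REPEAT
        double frac = t - Math.floor(t);
        long whole = (long) Math.floor(t);
        boolean odd = ((whole % 2) + 2) % 2 == 1;        // handles negative coordinates too
        return odd ? 1.0 - frac : frac;                  // reverse on odd integer parts
    }
    public static double clampToEdge(double t) {         // GL_CLAMP_TO_EDGE
        return Math.min(1.0, Math.max(0.0, t));
    }
    public static void main(String[] args) {
        System.out.println(repeat(2.25));          // prints 0.25
        System.out.println(mirroredRepeat(1.25));  // prints 0.75 (mirrored copy)
        System.out.println(clampToEdge(-0.5));     // prints 0.0
    }
}
```

GL_CLAMP_TO_BORDER differs only in that an out-of-range coordinate yields the configured border color instead of an edge texel.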

For example, consider a pyramid in which the texture coordinates have been defined in the range (0,4) rather than the range (0,1). The default behavior (GL_REPEAT), using the texture image shown previously in Figure 5.2, would result in the texture repeating four times across the surface (sometimes called "tiling"), as shown below in Figure 5.14:

Figure 5.14 Texture coordinate wrapping with GL_REPEAT.

To make the tiles' appearance alternate between normal and mirrored, we can specify the following:

gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT);
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT);

Specifying that values less than 0 and greater than 1 be set to 0 and 1, respectively, can be done by replacing GL_MIRRORED_REPEAT with GL_CLAMP_TO_EDGE.

Specifying that values less than 0 and greater than 1 result in a "border" color can be done as follows:

gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);


gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
float[] redColor = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
gl.glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, redColor, 0);

The effect of each of these options (mirrored repeat, clamp to edge, and clamp to border), with texture coordinates ranging from -1 to +2, is shown respectively (left to right) in Figure 5.15.

Figure 5.15 Textured pyramid with various wrapping options.

In the center example (clamp to edge), the pixels along the edges of the texture image are replicated outward. Note that as a side effect, the lower-left and lower-right regions of the pyramid faces obtain their color from the lower-left and lower-right pixels of the texture image, respectively.

5.11 PERSPECTIVE DISTORTION

We have seen that as texture coordinates are passed from the vertex shader to the fragment shader, they are interpolated as they pass through the rasterizer. We have also seen that this is the result of the automatic linear interpolation that is always performed on vertex attributes.

However, in the case of texture coordinates, linear interpolation can lead to noticeable distortion in a 3D scene with perspective projection.

Consider a rectangle made of two triangles, and textured with a checkerboard image, facing the camera. As the rectangle is rotated around the X axis, the top part of the rectangle tilts away from the camera, while the lower part of the rectangle swings closer to the camera. Thus, we would expect the squares at the top to become smaller, and the squares at the bottom to become larger. However, linear interpolation of the texture coordinates will instead cause the height of all squares to be equal. The distortion is exacerbated along the diagonal defining the two triangles that make up the rectangle. The resulting distortion is shown in Figure 5.16.

Fortunately, there are algorithms for correcting perspective distortion, and by default, OpenGL applies a perspective correction algorithm [OP14] when sampling texture coordinates. Figure 5.17 shows the same rotating checkerboard, properly rendered by OpenGL.
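The correction can be summarized with the standard perspective-correct interpolation formula: rather than interpolating a texture coordinate u directly across a primitive, u/w and 1/w are interpolated (where w is each vertex's clip-space w component), and the fragment's coordinate is recovered by division. The numbers below are made up to show the difference:

```java
public class PerspectiveCorrect {
    public static void main(String[] args) {
        double u0 = 0.0, w0 = 1.0;   // near end of an edge: texture coord and clip-space w
        double u1 = 1.0, w1 = 4.0;   // far end: four times as distant
        double t = 0.5;              // halfway along the edge in *screen space*

        // naive linear interpolation of u: the source of the distortion in Figure 5.16
        double linear = (1 - t) * u0 + t * u1;

        // perspective-correct: interpolate u/w and 1/w, then divide
        double uOverW   = (1 - t) * (u0 / w0) + t * (u1 / w1);
        double oneOverW = (1 - t) * (1.0 / w0) + t * (1.0 / w1);
        double correct  = uOverW / oneOverW;

        // the halfway pixel should sample u = 0.2, not 0.5: most of the texture
        // belongs to the compressed, more distant half of the edge
        System.out.println("linear = " + linear + ", correct = " + correct);
    }
}
```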

Although not common, it is possible to disable OpenGL's perspective correction by adding the keyword "noperspective" in the declaration of the vertex attribute containing the texture coordinates. This has to be done in both the vertex and fragment shaders. For example, the vertex attribute in the vertex shader would be declared as follows:

noperspective out vec2 texCoord;

Figure 5.16 Texture perspective distortion.

Figure 5.17 OpenGL perspective correction.

The corresponding attribute in the fragment shader would be declared:

noperspective in vec2 texCoord;

This syntax was in fact used to produce the distorted checkerboard in Figure 5.16.

5.12 LOADING TEXTURE IMAGE FILES USING JAVA AWT CLASSES

Throughout the rest of this textbook we use the JOGL Texture, TextureIO, and TextureData classes as described earlier in this chapter to load texture image data into OpenGL texture objects. However, it is possible to load texture image file data into OpenGL textures directly, using Java AWT classes and some additional OpenGL commands. The process is quite a bit more complicated, so for simplicity and clarity we will use the JOGL classes in this book whenever possible. However, it is useful to understand the process (and the particular commands) one could use in lieu of the JOGL texture classes. For example, the JOGL texture classes don't support 3D textures, so as we will see later, building an OpenGL 3D texture object will require doing many of the steps ourselves in Java.

Building a loadTexture() function analogous to the one in Program 5.1, without using JOGL texture classes, is shown in Program 5.2. It starts by calling two utility functions (also shown). The first one is getBufferedImage(), which reads the specified image file, assumed to be in a recognized format such as .jpg or .png, and returns a Java BufferedImage containing the image file data. The second utility function, getRGBAPixelData(), extracts the RGBA pixel colors from the specified BufferedImage and returns them in a byte array organized in the form expected by OpenGL.

The loadTexture() function then continues by copying the byte array returned from getRGBAPixelData() into a Java ByteBuffer, using the JOGL Buffers.newDirectByteBuffer() method described in Chapter 4. It then creates the texture object, in a manner similar to the steps we used for creating VBOs. Textures, like buffers, are given integer IDs by calling glGenTextures(). Here, the variable textureID is used to hold the ID of a generated texture object. Next, the texture object is made active by calling glBindTexture(), and then we load the previously read-in image data into the active texture object by using the glTexImage2D() command. Note that the first parameter on this call specifies the type of texture object, in this case GL_TEXTURE_2D (later we will use this command to create other types of OpenGL textures, such as texture cube maps in Chapter 9 and 3D textures in Chapter 14). The next command, glTexParameteri(), can be used to adjust some of the texture settings, such as building mipmaps. When loadTexture() finishes, it returns the integer ID for the now-available OpenGL texture object (whereas the earlier version returned a JOGL Texture object).

The call to loadTexture() in init() would also change slightly. Since the version of loadTexture() in Program 5.2 returns an int (rather than a JOGL Texture), there is no need to call getTextureObject(). Instead, the following single call to loadTexture() suffices to create the integer pointer to the OpenGL texture object:

int brickTexture = loadTexture("brick1.jpg");

Program 5.2 Java AWT Routines for Loading Texture Images

private int loadTexture(String textureFileName)
{   GL4 gl = (GL4) GLContext.getCurrentGL();
    BufferedImage textureImage = getBufferedImage(textureFileName);
    byte[] imgRGBA = getRGBAPixelData(textureImage);
    ByteBuffer rgbaBuffer = Buffers.newDirectByteBuffer(imgRGBA);
    int[] textureIDs = new int[1];                // array to hold generated texture IDs
    gl.glGenTextures(1, textureIDs, 0);
    int textureID = textureIDs[0];                // ID for the 0th texture object
    gl.glBindTexture(GL_TEXTURE_2D, textureID);   // specifies the active 2D texture
    gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,    // mipmap level, color space
        textureImage.getWidth(), textureImage.getHeight(), 0,   // image size, border (ignored)
        GL_RGBA, GL_UNSIGNED_BYTE,                // pixel format and data type
        rgbaBuffer);                              // buffer holding texture data
    gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    return textureID;
}

private BufferedImage getBufferedImage(String fileName)
{   BufferedImage img;
    try { img = ImageIO.read(new File(fileName)); }
    catch (IOException e)
    {   System.err.println("Error reading '" + fileName + "'");
        throw new RuntimeException(e);
    }
    return img;
}

private byte[] getRGBAPixelData(BufferedImage img)
{   byte[] imgRGBA;
    int height = img.getHeight(null);
    int width = img.getWidth(null);
    WritableRaster raster =
        Raster.createInterleavedRaster(DataBuffer.TYPE_BYTE, width, height, 4, null);
    ComponentColorModel colorModel = new ComponentColorModel(
        ColorSpace.getInstance(ColorSpace.CS_sRGB),
        new int[] { 8, 8, 8, 8 }, true, false,    // bits, hasAlpha, isAlphaPreMultiplied
        ComponentColorModel.TRANSLUCENT,          // transparency
        DataBuffer.TYPE_BYTE);                    // data transfer type
    BufferedImage newImage = new BufferedImage(colorModel, raster, false, null);

    // use an affine transform to "flip" the image to conform to OpenGL orientation.
    // In Java the origin is at the upper left; in OpenGL the origin is at the lower left.
    AffineTransform gt = new AffineTransform();
    gt.translate(0, height);
    gt.scale(1, -1d);
    Graphics2D g = newImage.createGraphics();
    g.transform(gt);
    g.drawImage(img, null, null);
    g.dispose();

    DataBufferByte dataBuf = (DataBufferByte) raster.getDataBuffer();
    imgRGBA = dataBuf.getData();
    return imgRGBA;
}

SUPPLEMENTAL NOTES

Researchers have developed a number of uses for texture units beyond just texturing models in a scene. In later chapters, we will see how texture units can be used for altering the way light reflects off an object, making it appear bumpy. We can also use a texture unit to store "height maps" for generating terrain, and for storing "shadow maps" to efficiently add shadows to our scenes. These uses will be described in subsequent chapters.

Shaders can also write to textures, allowing shaders to modify texture images, or even copy part of one texture into some portion of another texture.

Mipmaps and anisotropic filtering are not the only tools for reducing aliasing artifacts in textures. Full-scene anti-aliasing (FSAA) and other super-sampling methods, for example, can also improve the appearance of textures in a 3D scene. Although not part of the OpenGL core, they are supported on many graphics cards through OpenGL's extension mechanism [OE16].

There is an alternative mechanism for configuring and managing textures and samplers. Version 3.3 of OpenGL introduced sampler objects (sometimes called "sampler states," not to be confused with sampler variables) that can be used to hold a set of texture settings independent of the actual texture object. Sampler objects are attached to texture units and allow for conveniently and efficiently changing texture settings. The examples shown in this textbook are sufficiently simple that we decided to omit coverage of sampler objects. For interested readers, sampler objects are easy to learn how to use, and there are many excellent online tutorials (such as [GE11]).

The JOGL Texture class makes a number of OpenGL texture-related functions available directly, without extracting the actual OpenGL texture object as we did in this chapter. For example, there are bind() and setTexParameteri() functions that invoke the OpenGL functions glBindTexture() and glTexParameteri(). We will explore more of the functionality in the JOGL texture classes later in the book when we study cube maps and 3D textures. An excellent source of information on the JOGL Texture, TextureIO, and TextureData classes is their extensive Javadoc pages.

Exercises

5.1 Modify Program 5.1 by adding the "noperspective" declaration to the texture coordinate vertex attributes, as described in Section 5.11. Then re-run the program and compare the output with the original. Is any perspective distortion evident?

5.2 Using a simple "paint" program (such as Windows "Paint" or GIMP [GI16]), draw a freehand picture of your own design. Then use your image to texture the pyramid in Program 5.1.

5.3 (PROJECT) Modify Program 4.4 so that the "sun," "planet," and "moon" are textured. You may continue to use the shapes already present, and you may use any texture you like. Texture coordinates for the cube are available by searching through some of the posted code examples, or you can build them yourself by hand (although that is a bit tedious).


References

[BL16] Blender, The Blender Foundation, accessed July 2016, https://www.blender.org/.

[GE11] Geeks3D, "OpenGL Sampler Objects: Control Your Texture Units," September 8, 2011, accessed July 2016, http://www.geeks3d.com/20110908/.

[GI16] GNU Image Manipulation Program, accessed July 2016, http://www.gimp.org.

[HT16] J. Hastings-Trew, JHT's Planetary Pixel Emporium, accessed July 2016, http://planetpixelemporium.com/.

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).

[MA16] Maya, AutoDesk, Inc., accessed July 2016, http://www.autodesk.com/products/maya/overview.

[OE16] OpenGL Registry, The Khronos Group, accessed July 2016, https://www.opengl.org/registry/.

[OP14] OpenGL Graphics System: A Specification (version 4.4), M. Segal and K. Akeley, March 19, 2014, accessed July 2016, https://www.opengl.org/registry/doc/glspec44.core.pdf.

[OP16] OpenGL 4.5 Reference Pages, accessed July 2016, https://www.opengl.org/sdk/docs/man/.

[TU16] J. Turberville, Studio 522 Productions, Scottsdale, AZ, www.studio522.com (dolphin model developed 2016).

[WI83] L. Williams, "Pyramidal Parametrics," Computer Graphics 17, no. 3 (July 1983).

¹ This is the orientation that OpenGL texture objects assume. However, this is different from the orientation of an image stored in many standard image file formats, in which the origin is at the upper left. Re-orienting the image by flipping it vertically so that it corresponds to OpenGL's expected format is one of the operations performed by the JOGL newTexture() call that we made from the loadTexture() function.


CHAPTER 6

3D MODELS

6.1 Procedural Models – Building a Sphere
6.2 OpenGL Indexing – Building a Torus
6.3 Loading Externally Produced Models
Supplemental Notes

So far we have dealt only with very simple 3D objects, such as cubes and pyramids. These objects are so simple that we have been able to explicitly list all of the vertex information in our source code and place it directly into buffers.

However, most interesting 3D scenes include objects that are too complex to continue building by hand, as we have been doing. In this chapter, we will explore more complex object models, how to build them, and how to load them into our scenes.

3D modeling is itself an extensive field, and our coverage here will necessarily be very limited. We will focus on the following two topics:

building models procedurally
loading models produced externally

While this only scratches the surface of the rich field of 3D modeling, it will give us the capability to include a wide variety of complex and realistically detailed objects in our scenes.

6.1 PROCEDURAL MODELS – BUILDING A SPHERE

Some types of objects, such as spheres, cones, and so forth, have mathematical definitions that lend themselves to algorithmic generation. Consider for example a circle of radius R; the coordinates of points around its perimeter are well defined:


Figure 6.1 Points on a circle.
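The perimeter points in Figure 6.1 follow directly from the parametric form x = R cos θ, y = R sin θ. A minimal sketch (the radius and point count below are arbitrary choices for illustration):

```java
public class CirclePoints {
    public static void main(String[] args) {
        double r = 1.0;   // circle radius (illustrative)
        int n = 8;        // number of evenly spaced perimeter points (illustrative)
        for (int i = 0; i < n; i++) {
            double theta = 2.0 * Math.PI * i / n;
            double x = r * Math.cos(theta);
            double y = r * Math.sin(theta);
            // each point satisfies x*x + y*y = r*r, so it lies on the circle
            System.out.printf("point %d: (%+.4f, %+.4f)%n", i, x, y);
        }
    }
}
```

Generating one such ring per horizontal slice, at the appropriate height and radius, is the essence of the sphere-building strategy described next.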

We can systematically use our knowledge of the geometry of a circle to algorithmically build a sphere model. Our strategy is as follows:

1. Select a precision representing a number of circular "horizontal slices" through the sphere. See the left side of Figure 6.2.

2. Subdivide the circumference of each circular slice into some number of points. See the right side of Figure 6.2. More points and horizontal slices produce a more accurate and smoother model of the sphere.

Figure 6.2 Building circle vertices.

3. Group the vertices into triangles. One approach is to step through the vertices, building two triangles at each step. For example, as we move along the row of the five colored vertices on the sphere in Figure 6.3, for each of those five vertices we build the two triangles shown in the corresponding color (the steps are described in greater detail below).


Figure 6.3 Grouping vertices into triangles.

4. Select texture coordinates depending on the nature of our texture images. In the case of a sphere, there exist many topographical texture images, such as the one shown below in Figure 6.4 [VE16] for planet Earth. If we assume this sort of texture image, then by imagining the image "wrapped" around the sphere as shown in Figure 6.5, we can assign texture coordinates to each vertex according to the resulting corresponding positions of the texels in the image.

Figure 6.4 Topographical texture image [VE16].

Figure 6.5 Sphere texture coordinates.


5. It is also often desirable to generate normal vectors—vectors that are perpendicular to the model's surface—for each vertex. We will use them soon, in Chapter 7, for lighting.

Determining normal vectors can be tricky, but in the case of a sphere, the vector pointing from the center of the sphere to a vertex happens to conveniently equal the normal vector for that vertex! Figure 6.6 illustrates this property (the center of the sphere is indicated with a "star"):

Figure 6.6 Sphere vertex normal vectors.

Some models define triangles using indices. Note in Figure 6.3 that each vertex appears in multiple triangles, which would lead to each vertex being specified multiple times. Rather than doing this, we instead store each vertex once, and then specify indices for each corner of a triangle, referencing the desired vertices. Since we will store a vertex's location, texture coordinates, and normal vector, this can facilitate memory savings for large models.

The vertices are stored in a one-dimensional array, starting with the vertices in the bottommost horizontal slice. When using indexing, the associated array of indices includes an entry for each triangle corner. The contents are integer references (specifically, subscripts) into the vertex array. Assuming that each slice contains n vertices, the vertex array would look as shown in Figure 6.7, along with an example portion of the corresponding index array.

Figure 6.7 Vertex array and corresponding index array.


We can then traverse the vertices in a circular fashion around each horizontal slice, starting at the bottom of the sphere. As we visit each vertex, we build two triangles forming a square region above and to its right, as shown earlier in Figure 6.3. The processing is thus organized into nested loops, as follows:

for each horizontal slice i in the sphere (i ranges from 0 through all the slices in the sphere)
{   for each vertex j in slice i (j ranges from 0 through all the vertices in the slice)
    {   build two triangles out of the neighboring vertices above and to the right of vertex j
}   }

For example, consider the "red" vertex from Figure 6.3 (repeated in Figure 6.8). The vertex in question is at the lower left of the yellow triangles shown in Figure 6.8, and given the loops just described, would be indexed by i*n+j, where i is the slice currently being processed (the outer loop), j is the vertex currently being processed within that slice (the inner loop), and n is the number of vertices per slice. Figure 6.8 shows this vertex (in red) along with its three relevant neighboring vertices, each with formulas showing how they would be indexed.

Figure 6.8 Vertices generated for the jth vertex in the ith slice (n = number of vertices per slice).

These four vertices are then used to build the two triangles (shown in yellow) generated for this (red) vertex. The six entries in the index table for these two triangles are indicated in the figure in the order shown by the numbers 1 through 6. Note that entries 3 and 6 both refer to the same vertex, which is also the case for entries 2 and 4. The two triangles thus defined when we reach the vertex highlighted in red (i.e., vertex[i*n+j]) are built out of these six vertices—one with entries marked 1, 2, 3 referencing vertices vertex[i*n+j], vertex[i*n+j+1], and vertex[(i+1)*n+j], and one with entries marked 4, 5, 6 referencing the three vertices vertex[i*n+j+1], vertex[(i+1)*n+j+1], and vertex[(i+1)*n+j].
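The six-entry pattern described above can be collected into a small helper method (our own sketch, not part of the book's Sphere class) that returns the six index entries generated when visiting the vertex at slice i, position j, with n vertices per slice:

```java
// Sketch (not from the book's code): the six index-array entries generated
// for the vertex at slice i, position j, with n vertices per slice,
// following the 1-6 ordering described in the text.
public class TrianglePattern {
    public static int[] sixIndices(int i, int j, int n) {
        return new int[] {
            i*n + j,          // 1: the "red" vertex itself
            i*n + j + 1,      // 2: neighbor to its right
            (i+1)*n + j,      // 3: neighbor above it
            i*n + j + 1,      // 4: same vertex as entry 2
            (i+1)*n + j + 1,  // 5: neighbor above and to the right
            (i+1)*n + j       // 6: same vertex as entry 3
        };
    }
}
```

Note that, as in the text, entries 2 and 4 coincide, as do entries 3 and 6—the two triangles share an edge.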

Program 6.1 shows the implementation of our sphere model as a class named Sphere. The center of the resulting sphere is at the origin. Code for using Sphere is also shown. Note that each vertex is stored as an instance of the graphicslib3D class Vertex3D (this is different from previous examples, where vertices were stored in float arrays). Vertex3D includes methods for obtaining the desired vertex components as float values, which are then put into float buffers as before.


Note the calculation of triangle indices in the Sphere class, as described earlier in Figure 6.8. The variable "prec" refers to the "precision," which in this case is used both for the number of sphere slices and the number of vertices per slice. Because the texture map wraps completely around the sphere, we will need an extra coincident vertex at each of the points where the left and right edges of the texture map meet. Thus, the total number of vertices is (prec+1)*(prec+1). Since six triangle indices are generated per vertex, the total number of indices is prec*prec*6.

Program 6.1 Procedurally-Generated Sphere

Sphere class

import static java.lang.Math.*;

public class Sphere
{   private int numVertices, numIndices, prec;   // prec = precision
    private int[] indices;
    private Vertex3D[] vertices;

    public Sphere(int p)
    {   prec = p;
        initSphere();
    }

    private void initSphere()
    {   numVertices = (prec+1) * (prec+1);
        numIndices = prec * prec * 6;
        vertices = new Vertex3D[numVertices];
        indices = new int[numIndices];
        for (int i=0; i<numVertices; i++)
        {   vertices[i] = new Vertex3D();
        }

        // calculate triangle vertices
        for (int i=0; i<=prec; i++)
        {   for (int j=0; j<=prec; j++)
            {   float y = (float) cos(toRadians(180 - i*180/prec));
                float x = -(float) cos(toRadians(j*360/prec)) * (float) abs(cos(asin(y)));
                float z = (float) sin(toRadians(j*360/prec)) * (float) abs(cos(asin(y)));
                vertices[i*(prec+1)+j].setLocation(new Point3D(x, y, z));
                vertices[i*(prec+1)+j].setS((float) j / prec);   // texture coordinates (s,t)
                vertices[i*(prec+1)+j].setT((float) i / prec);
                vertices[i*(prec+1)+j].setNormal(new Vector3D(vertices[i*(prec+1)+j].getLocation()));
        }   }

        // calculate triangle indices
        for (int i=0; i<prec; i++)
        {   for (int j=0; j<prec; j++)
            {   indices[6*(i*prec+j)+0] = i*(prec+1)+j;
                indices[6*(i*prec+j)+1] = i*(prec+1)+j+1;
                indices[6*(i*prec+j)+2] = (i+1)*(prec+1)+j;
                indices[6*(i*prec+j)+3] = i*(prec+1)+j+1;
                indices[6*(i*prec+j)+4] = (i+1)*(prec+1)+j+1;
                indices[6*(i*prec+j)+5] = (i+1)*(prec+1)+j;
    }   }   }

    public int[] getIndices() { return indices; }
    public Vertex3D[] getVertices() { return vertices; }
}

Using the Sphere class ...

mySphere = new Sphere(48);   // in init()
...
private void setupVertices()
{   GL4 gl = (GL4) GLContext.getCurrentGL();
    Vertex3D[] vertices = mySphere.getVertices();
    int[] indices = mySphere.getIndices();

    float[] pvalues = new float[indices.length*3];   // vertex positions
    float[] tvalues = new float[indices.length*2];   // texture coordinates
    float[] nvalues = new float[indices.length*3];   // normal vectors

    for (int i=0; i<indices.length; i++)
    {   pvalues[i*3]   = (float) (vertices[indices[i]]).getX();
        pvalues[i*3+1] = (float) (vertices[indices[i]]).getY();
        pvalues[i*3+2] = (float) (vertices[indices[i]]).getZ();
        tvalues[i*2]   = (float) (vertices[indices[i]]).getS();
        tvalues[i*2+1] = (float) (vertices[indices[i]]).getT();
        nvalues[i*3]   = (float) (vertices[indices[i]]).getNormalX();
        nvalues[i*3+1] = (float) (vertices[indices[i]]).getNormalY();
        nvalues[i*3+2] = (float) (vertices[indices[i]]).getNormalZ();
    }
    gl.glGenVertexArrays(vao.length, vao, 0);
    gl.glBindVertexArray(vao[0]);
    gl.glGenBuffers(3, vbo, 0);

    // put the vertices into buffer #0
    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
    FloatBuffer vertBuf = Buffers.newDirectFloatBuffer(pvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, vertBuf.limit()*4, vertBuf, GL_STATIC_DRAW);

    // put the texture coordinates into buffer #1
    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
    FloatBuffer texBuf = Buffers.newDirectFloatBuffer(tvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, texBuf.limit()*4, texBuf, GL_STATIC_DRAW);

    // put the normal coordinates into buffer #2
    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[2]);
    FloatBuffer norBuf = Buffers.newDirectFloatBuffer(nvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, norBuf.limit()*4, norBuf, GL_STATIC_DRAW);
}

in display() ...

int numVerts = mySphere.getIndices().length;
gl.glDrawArrays(GL_TRIANGLES, 0, numVerts);
...

When using the Sphere class, we will need three values for each vertex position and normal vector, but only two values for each texture coordinate. This is reflected in the declarations for the arrays (pvalues, tvalues, and nvalues) that are later populated with values obtained by calls to Sphere functions, and loaded into the buffers.

It is important to note that although indexing is used in the process of building the sphere, the ultimate sphere vertex data stored in the VBOs doesn't utilize indexing. Rather, as setupVertices() loops through the sphere indices, it generates separate (often redundant) vertex entries in the VBO for each of the index entries. OpenGL does have a mechanism for indexing vertex data; for simplicity we didn't use it in this example, but we will use OpenGL's indexing in the next example.
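To get a rough sense of the trade-off, suppose each vertex carries 8 floats (3 for position, 2 for texture coordinates, 3 for the normal) at 4 bytes each—an assumption matching the buffers used here. The following sketch (our own arithmetic, not from the book) compares the expanded, non-indexed layout used in this example with an indexed layout:

```java
// Sketch: approximate buffer sizes (in bytes) for a sphere of a given
// precision, assuming 8 floats per vertex (position 3 + texcoord 2 +
// normal 3) and 4 bytes per float or int.
public class BufferSizes {
    public static int nonIndexedBytes(int prec) {
        int numIndices = prec * prec * 6;   // one expanded vertex per index entry
        return numIndices * 8 * 4;
    }
    public static int indexedBytes(int prec) {
        int numVertices = (prec + 1) * (prec + 1);   // each vertex stored once
        int numIndices  = prec * prec * 6;           // plus one int per index entry
        return numVertices * 8 * 4 + numIndices * 4;
    }
}
```

For the precision-48 sphere used later in this chapter, the expanded layout stores over three times as many bytes, which is why OpenGL indexing pays off for large models.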

Figure 6.9 shows the output of Program 6.1, with a precision of 48. The texture from Figure 6.5 has been loaded as described in Chapter 5.

Figure 6.9 Textured sphere model.

Many other models can be created procedurally, from geometric shapes to real-world objects. One of the most well known is the "Utah teapot" [CH16], which was developed in 1975 by Martin Newell, using a variety of Bézier curves and surfaces. The OpenGL Utility Toolkit (or "GLUT") [GL16] even includes procedures for drawing teapots(!) (see Figure 6.10). We don't cover GLUT in this book, but Bézier surfaces are covered in Chapter 11.


Figure 6.10 OpenGL GLUT teapot.

6.2 OPENGL INDEXING – BUILDING A TORUS

6.2.1 The Torus

Algorithms for producing a torus can be found on various websites. Paul Baker gives a step-by-step description for defining a circular slice, and then rotating the slice around a circle to form a donut [PP07]. Figure 6.11 shows two views: from the side, and from above.

The way that the torus vertex positions are generated is rather different from what was done to build the sphere. For the torus, the algorithm positions a vertex to the right of the origin, and then rotates that vertex in a circle on the XY plane using a rotation around the Z axis to form a "ring." Texture coordinates and normal vectors are computed for each of these vertices as they are built. An additional vector tangent to the surface of the torus (called the tangent vector) is also generated for each vertex.

Figure 6.11 Building a torus.

Vertices for additional torus rings are formed by rotating the original ring around the Y axis. Tangent and normal vectors for each resulting vertex are computed by also rotating the tangent and normal vectors of the original ring around the Y axis. After the vertices are created, they are traversed from ring to ring, and for each vertex two triangles are generated. The generation of the six index table entries comprising the two triangles is done in a similar manner as was done for the sphere.

Our strategy for choosing texture coordinates will be to arrange them so that the S axis of the texture image wraps halfway around the horizontal perimeter of the torus, and then repeats for the other half. This can be done by using the tiling capability in OpenGL textures previously described in Chapter 5. As we rotate around the Y axis generating the rings, we specify a variable ring that starts at 1 and increases up to the specified precision (again dubbed "prec"). We set the value of the S texture coordinate to 2.0*ring/prec, causing S to range between 0 and 2. Recalling that the default behavior for textures when a texture coordinate exceeds 1 is GL_REPEAT, the result is that the texture stretches halfway around the torus, and is then repeated on the second half. The motivation for this approach is to avoid having the texture image appear overly "stretched" horizontally.
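The effect of this choice can be sketched numerically (our own illustration, not from the book's Torus class). S grows linearly from 0 to 2 across the rings, and under GL_REPEAT the coordinate actually sampled is effectively the fractional part of S, so the second half of the torus re-traverses the image:

```java
// Sketch: S texture coordinates for the torus rings, and the effective
// coordinate after GL_REPEAT wrapping (the fractional part of S).
public class TorusTexCoords {
    public static float s(int ring, int prec) {
        return 2.0f * ring / prec;   // ranges from 0 up to 2 across the rings
    }
    public static float wrapped(float s) {
        return s - (float) Math.floor(s);   // what GL_REPEAT effectively samples
    }
}
```

So the ring halfway around the torus reaches S = 1 (the right edge of the image), and the final ring reaches S = 2, which repeat-mode wrapping maps back to the image's left edge.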

Building a torus class in Java/JOGL could be done in a virtually identical manner as for the Sphere class. However, we have the opportunity to take advantage of the indices that we created while building the torus by using OpenGL's support for vertex indexing (we could have also done this for the sphere, but we didn't). For very large models with thousands of vertices, using OpenGL indexing can result in improved performance, and so we will describe how to do that next.

6.2.2 Indexing in OpenGL

In both our sphere and torus models, we generated an array of integer indices referencing into the vertex array. In the case of the sphere, we used the list of indices to build a complete set of individual vertices and loaded them into a VBO just as we did for examples in earlier chapters. Instantiating the torus and loading its vertices, normals, etc. into buffers could be done in a similar manner as was done in Program 6.1, but instead we will use OpenGL's indexing.

When using OpenGL indexing, we also load the indices themselves into a VBO. We generate one additional VBO for holding the indices. Since each index value is simply an integer reference, we first copy the index array into a Java IntBuffer, and then use glBufferData() to load the IntBuffer into the added VBO, specifying that the VBO is of type GL_ELEMENT_ARRAY_BUFFER (this tells OpenGL that the VBO contains indices). The code that does this can be added to setupVertices():

int[] indices = myTorus.getIndices();   // the torus index accessor returns the indices as an int array
...
gl.glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vbo[3]);   // vbo #3 is the additional added vbo
IntBuffer idxBuf = Buffers.newDirectIntBuffer(indices);
gl.glBufferData(GL_ELEMENT_ARRAY_BUFFER, idxBuf.limit()*4, idxBuf, GL_STATIC_DRAW);

In the display() method, we replace the glDrawArrays() call with a call to glDrawElements(), which tells OpenGL to utilize the index VBO for looking up the vertices to be drawn. We also enable the VBO that contains the indices by using glBindBuffer(), specifying which VBO contains the indices, and that it is a GL_ELEMENT_ARRAY_BUFFER. The code is as follows:


int numIndices = myTorus.getIndices().length;
gl.glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vbo[3]);
gl.glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, 0);

Interestingly, the shaders used for drawing the sphere continue to work, unchanged, for the torus, even with the changes that we made in the Java/JOGL application to implement indexing. OpenGL is able to recognize the presence of a GL_ELEMENT_ARRAY_BUFFER and utilize it to access the vertex attributes.

Program 6.2 shows a class named Torus based on Baker's implementation. The "inner" and "outer" variables refer to the corresponding inner and outer radius in Figure 6.11. The prec ("precision") variable has a similar role as in the sphere, with analogous computations for the number of vertices and indices. By contrast, determining normal vectors is much more complex than it was for the sphere. We have used the strategy given in Baker's description, wherein two tangent vectors are computed (dubbed tangent and bitangent), and their cross product forms the normal.

The torus is also available in graphicslib3D, by instantiating the class graphicslib3D.shape.Torus.

Program 6.2 Procedurally-Generated Torus

Torus class

public class Torus
{   private int numVertices, numIndices, prec;
    private int[] indices;
    private Vertex3D[] vertices;
    private float inner, outer;
    private Vector3D[] tTangent, bTangent;

    public Torus(float innerRadius, float outerRadius, int p)
    {   inner = innerRadius; outer = outerRadius; prec = p;
        initTorus();
    }

    private void initTorus()
    {   numVertices = (prec+1) * (prec+1);
        numIndices = prec * prec * 6;
        vertices = new Vertex3D[numVertices];
        indices = new int[numIndices];
        tTangent = new Vector3D[numVertices];
        bTangent = new Vector3D[numVertices];
        for (int i=0; i<numVertices; i++)
        {   vertices[i] = new Vertex3D(); }

        // calculate first ring
        for (int i=0; i<prec+1; i++)
        {   // build the ring by rotating points around the origin, then moving them outward
            Point3D initOuterPos = new Point3D(outer, 0.0, 0.0);
            Point3D rotOuterPos = tRotateZ(initOuterPos, (i*360.0f/prec));
            Point3D initInnerPos = new Point3D(inner, 0.0, 0.0);
            Point3D ringPos = rotOuterPos.add(initInnerPos);
            vertices[i].setLocation(ringPos);

            // compute texture coordinates for each vertex on the ring
            vertices[i].setS(0.0f);
            vertices[i].setT(((float)i)/((float)prec));

            // compute normal vectors for each vertex in the ring
            Vector3D negZ = new Vector3D(0.0f, 0.0f, -1.0f);   // tangent vector of first outer ring = -Z axis
            Vector3D negY = new Vector3D(0.0f, -1.0f, 0.0f);   // initial bitangent vector = -Y axis
            tTangent[i] = negZ;   // the tangent vector is saved (and thus available to the application)
            // the bitangent is then rotated around the Z axis, and also saved
            bTangent[i] = new Vector3D(tRotateZ(new Point3D(negY), (i*360.0f/(prec))));
            vertices[i].setNormal(tTangent[i].cross(bTangent[i]));   // cross product produces the normal
        }

        // rotate the first ring about the Y axis to generate the other rings of the torus
        for (int ring=1; ring<=prec; ring++)
        {   for (int vert=0; vert<=prec; vert++)
            {   // rotate the vertex positions of the original ring around the Y axis
                float rotAmt = (float) ((float)ring * 360.0f / prec);
                Vector3D vp = new Vector3D(vertices[vert].getLocation());
                Vector3D vpr = tRotateY(vp, rotAmt);
                vertices[ring*(prec+1)+vert].setLocation(new Point3D(vpr));

                // compute the texture coordinates for the vertices in the new rings
                vertices[ring*(prec+1)+vert].setS((float)ring/(float)prec);
                vertices[ring*(prec+1)+vert].setT(vertices[vert].getT());

                // rotate the tangent and bitangent vectors around the Y axis
                tTangent[ring*(prec+1)+vert] = tRotateY(tTangent[vert], rotAmt);
                bTangent[ring*(prec+1)+vert] = tRotateY(bTangent[vert], rotAmt);

                // rotate the normal vector around the Y axis
                Vector3D normalRotateY = tRotateY(vertices[vert].getNormal(), rotAmt);
                vertices[ring*(prec+1)+vert].setNormal(normalRotateY);
        }   }

        // calculate triangle indices corresponding to the two triangles built per vertex
        for (int ring=0; ring<prec; ring++)
        {   for (int vert=0; vert<prec; vert++)
            {   indices[((ring*prec+vert)*2)*3+0] = ring*(prec+1)+vert;
                indices[((ring*prec+vert)*2)*3+1] = (ring+1)*(prec+1)+vert;
                indices[((ring*prec+vert)*2)*3+2] = ring*(prec+1)+vert+1;
                indices[((ring*prec+vert)*2+1)*3+0] = ring*(prec+1)+vert+1;
                indices[((ring*prec+vert)*2+1)*3+1] = (ring+1)*(prec+1)+vert;
                indices[((ring*prec+vert)*2+1)*3+2] = (ring+1)*(prec+1)+vert+1;
    }   }   }

    // utility function for rotating a vector around the Y axis
    private Vector3D tRotateY(Vector3D inVec, float amount)
    {   Matrix3D yMat = new Matrix3D();
        yMat.rotateY((double) amount);
        Vector3D result = inVec.mult(yMat);
        return result;
    }

    // utility function for rotating a point around the Z axis
    private Point3D tRotateZ(Point3D inPt, float amount)
    {   Matrix3D zMat = new Matrix3D();
        zMat.rotateZ((double) amount);
        Point3D result = inPt.mult(zMat);
        return result;
    }

    // accessors for the torus indices and vertices
    public int[] getIndices() { return indices; }
    public Vertex3D[] getVertices() { return vertices; }
}

Using the Torus class (with OpenGL indexing) ...

myTorus = new Torus(0.5f, 0.2f, 48);   // in init()
...
private void setupVertices()
{   GL4 gl = (GL4) GLContext.getCurrentGL();
    Vertex3D[] vertices = myTorus.getVertices();
    int[] indices = myTorus.getIndices();

    float[] pvalues = new float[vertices.length*3];
    float[] tvalues = new float[vertices.length*2];
    float[] nvalues = new float[vertices.length*3];

    for (int i=0; i<vertices.length; i++)
    {   pvalues[i*3]   = (float) (vertices[i]).getX();         // vertex position
        pvalues[i*3+1] = (float) (vertices[i]).getY();
        pvalues[i*3+2] = (float) (vertices[i]).getZ();
        tvalues[i*2]   = (float) (vertices[i]).getS();         // texture coordinates
        tvalues[i*2+1] = (float) (vertices[i]).getT();
        nvalues[i*3]   = (float) (vertices[i]).getNormalX();   // normal vector
        nvalues[i*3+1] = (float) (vertices[i]).getNormalY();
        nvalues[i*3+2] = (float) (vertices[i]).getNormalZ();
    }
    gl.glGenVertexArrays(vao.length, vao, 0);
    gl.glBindVertexArray(vao[0]);
    gl.glGenBuffers(vbo.length, vbo, 0);   // generate VBOs as before, plus one for indices

    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);   // vertex positions
    FloatBuffer vertBuf = Buffers.newDirectFloatBuffer(pvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, vertBuf.limit()*4, vertBuf, GL_STATIC_DRAW);

    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);   // texture coordinates
    FloatBuffer texBuf = Buffers.newDirectFloatBuffer(tvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, texBuf.limit()*4, texBuf, GL_STATIC_DRAW);

    gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[2]);   // normal vectors
    FloatBuffer norBuf = Buffers.newDirectFloatBuffer(nvalues);
    gl.glBufferData(GL_ARRAY_BUFFER, norBuf.limit()*4, norBuf, GL_STATIC_DRAW);

    gl.glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vbo[3]);   // indices
    IntBuffer idxBuf = Buffers.newDirectIntBuffer(indices);
    gl.glBufferData(GL_ELEMENT_ARRAY_BUFFER, idxBuf.limit()*4, idxBuf, GL_STATIC_DRAW);
}

in display() ...

int numIndices = myTorus.getIndices().length;
gl.glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vbo[3]);
gl.glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, 0);
...

Note in the code that uses the Torus class that the loop in setupVertices() now stores the data associated with each vertex once, rather than once for each index entry (as was the case in the sphere example). This difference is also reflected in the declared array sizes for the data being entered into the VBOs. Also note that in the torus example, rather than using the index values when retrieving vertex data, they are simply loaded into VBO #3. Since that VBO is designated as a GL_ELEMENT_ARRAY_BUFFER, OpenGL knows that it contains vertex indices.

Figure 6.12 shows the result of instantiating a torus, and texturing it with the brick texture.

Figure 6.12 Procedurally-generated torus.

6.3 LOADING EXTERNALLY PRODUCED MODELS


Complex 3D models, such as characters found in videogames or computer-generated movies, are typically produced using modeling tools. Such "DCC" (Digital Content Creation) tools make it possible for people (such as artists) to build arbitrary shapes in 3D space, and automatically produce the vertices, texture coordinates, vertex normals, and so on. There are too many such tools to list, but some examples are Maya, Blender, Lightwave, Cinema4D, and many others. Blender is free and open source. Figure 6.13 shows an example Blender screen during the editing of a 3D model.

Figure 6.13 Example Blender model creation [BL16].

In order for us to use a DCC-created model in our OpenGL scenes, that model needs to be saved (exported) in a format that we can read (import) into our program. There are several standard 3D model file formats; again, there are too many to list, but some examples are Wavefront (.obj), 3D Studio Max (.3ds), Stanford Scanning Repository (.ply), Ogre3D (.mesh), and many others. Arguably the simplest is Wavefront (usually dubbed OBJ), so we will examine that one.

OBJ files are simple enough that we can develop a basic importer relatively easily. In an OBJ file, lines of text specify vertex geometric data, texture coordinates, normals, and other information. It has some limitations—for example, OBJ files have no way of specifying model animation.

Lines in an OBJ file start with a character tag indicating what kind of data is on that line. Some common tags include:

v    geometric (vertex location) data
vt   texture coordinates
vn   vertex normal
f    face (typically vertices in a triangle)

Other tags make it possible to store the object name, materials it uses, curves, shadows, and many other details. We will limit our discussion to the four tags listed above, which are sufficient for importing a wide variety of complex models.

Suppose we use Blender to build a simple pyramid such as the one we developed for Program 4.3. Here is a screenshot of a similar pyramid being created in Blender:

Figure 6.14 Pyramid built in Blender.

In Blender, if we now export our pyramid model, specify .obj format, and also set Blender to output texture coordinates and vertex normals, an OBJ file is created that includes all of this information. The resulting OBJ file is shown in Figure 6.15. (The actual values of the texture coordinates can vary depending on how the model is built.)


Figure 6.15 Exported OBJ file for the pyramid.

We have color-coded the important sections of the OBJ file for reference. The lines at the top beginning with "#" are comments placed there by Blender, which our importer ignores. This is followed by a line beginning with "o" giving the name of the object. Our importer can ignore this line as well. Later, there is a line beginning with "s" that specifies that the faces shouldn't be smoothed. Our code will also ignore lines starting with "s".

The first substantive set of lines in the OBJ file are those starting with "v", colored blue. They specify the X, Y, and Z local spatial coordinates of the five vertices of our pyramid model relative to the origin, which in this case is at the center of the pyramid.

The values colored red (starting with "vt") are the various texture coordinates. The reason that the list of texture coordinates is longer than the list of vertices is that some of the vertices participate in more than one triangle, and in those cases different texture coordinates might be used.

The values colored green (starting with "vn") are the various normal vectors. This list too is often longer than the list of vertices (although not in this example), again because some of the vertices participate in more than one triangle, and in those cases different normal vectors might be used.

The values colored purple (starting with "f"), near the bottom of the file, specify the triangles (i.e., "faces"). In this example, each face (triangle) has three elements, each with three values separated by "/" (OBJ allows other formats as well). The values for each element are indices into the lists of vertices, texture coordinates, and normal vectors, respectively. For example, the third face is:

f 2/7/3 5/8/3 3/9/3

This indicates that the 2nd, 5th, and 3rd vertices from the list of vertices (in blue) comprise a triangle (note that OBJ indices start at 1). The corresponding texture coordinates are the 7th, 8th, and 9th from the list of texture coordinates in the section colored red. All three vertices have the same normal vector, the 3rd in the list of normals in the section colored green.
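Decoding one such face element can be sketched as follows (our own helper, not the book's ModelImporter); note the conversion from OBJ's 1-based references to 0-based Java array subscripts:

```java
// Sketch (not from the book's importer): decode one OBJ face element of
// the form "v/vt/vn", converting the 1-based OBJ indices to 0-based
// array subscripts.
public class FaceElement {
    public static int[] decode(String element) {
        String[] parts = element.split("/");
        return new int[] {
            Integer.parseInt(parts[0]) - 1,   // vertex position index
            Integer.parseInt(parts[1]) - 1,   // texture coordinate index
            Integer.parseInt(parts[2]) - 1    // normal vector index
        };
    }
}
```

For instance, the element "2/7/3" above refers to the 2nd vertex, the 7th texture coordinate, and the 3rd normal, i.e., subscripts 1, 6, and 2 in 0-based arrays.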

Models in OBJ format are not required to have normal vectors, or even texture coordinates. If a model does not have texture coordinates or normals, the face values would specify only the vertex indices:

f 2 5 3

If a model has texture coordinates, but not normal vectors, the format would be as follows:

f 2/7 5/8 3/9

And, if the model has normals but not texture coordinates, the format would be:

f 2//3 5//3 3//3

It is not unusual for a model to have tens of thousands of vertices. There are hundreds of such models available for download on the Internet for nearly every conceivable application, including models of animals, buildings, cars, planes, mythical creatures, people, and so on.

Programs of varying sophistication that can import an OBJ model are available on the Internet. Alternatively, it is relatively easy to write a very simple OBJ loader function that can handle the basic tags we have seen (v, vt, vn, and f). Program 6.3 shows one such loader, albeit a very limited one. It incorporates a class to hold an arbitrary imported model, which in turn calls the importer.

Before we describe the code in our simple OBJ importer, we must warn the reader of its limitations:

It only supports models that include all three face attribute fields. That is, vertex positions, texture coordinates, and normals must all be present and in the form:
f #/#/# #/#/# #/#/#

The material tag is ignored—texturing must be done using the methods seen in Chapter 5.
Only OBJ models comprised of a single triangle mesh are supported (the OBJ format supports composite models, but our simple importer does not).
It assumes that elements on each line are separated by exactly one space.

If you have an OBJ model that doesn't satisfy all of the above criteria, and you wish to import it using the simple loader in Program 6.3, it may still be feasible to do so. It is often possible to load such a model into Blender, and then export it to another OBJ file that accommodates the loader's limitations. For instance, if the model doesn't include normal vectors, it is possible to have Blender produce normal vectors while it exports the revised OBJ file.

Another limitation of our OBJ loader has to do with indexing. Observe in the previous descriptions that the "face" tag allows for the possibility of mix-and-matching vertex positions, texture coordinates, and normal vectors. For example, two different "face" rows may include indices which point to the same v entry, but different vt entries. Unfortunately, OpenGL's indexing mechanism does not support this level of flexibility—index entries in OpenGL can only point to a particular vertex along with its attributes. This complicates writing an OBJ model loader somewhat, as we cannot simply copy the references in the triangle face entries into an index array. Rather, using OpenGL indexing would require ensuring that entire combinations of v, vt, and vn values for a face entry each have their own references in the index array. A simpler, albeit less efficient, alternative is to create a new vertex for every triangle face entry. We opt for this simpler approach here in the interest of clarity, despite the space-saving advantage of using OpenGL indexing (especially when loading larger models).

The ModelImporter class includes a parseOBJ() function that reads in each line of an OBJ file one-by-one, handling separately the four cases v, vt, vn, and f. In each case, the subsequent numbers on the line are extracted, first by using substring() to skip the initial v, vt, vn, or f character(s), and then using the iterator returned by the split() function to extract each subsequent parameter value, storing them in an ArrayList. As the face (f) entries are processed, the vertices are built, with corresponding entries in parallel arrays for vertex positions, texture coordinates, and normal vectors.

The ModelImporter class is embedded in the ImportedModel class, which simplifies loading and accessing the vertices of an OBJ file by putting the imported vertices into an array of Vertex3D objects. Recall that Vertex3D is defined in graphicslib3D, and that a Vertex3D object includes fields for vertex position, texture coordinates, and normal vectors. The accessors in the ImportedModel class then make this array of Vertex3D objects available to the Java/JOGL application in much the same manner as was done in the Sphere and Torus classes.

Following the ModelImporter and ImportedModel classes is an example sequence of calls for loading an OBJ file, and then transferring the vertex information into a set of VBOs for subsequent rendering.

Figure 6.16 shows a rendered model of the space shuttle, downloaded as an OBJ file from the NASA website [NA16], imported using the code from Program 6.3, and textured using the code from Program 5.1 with the associated NASA texture image file, with anisotropic filtering. This texture image is an example of the use of UV-mapping, where texture coordinates in the model are carefully mapped to particular regions of the texture image. (As mentioned earlier, the details of UV-mapping are outside the scope of this book.)

Figure 6.16 NASA space shuttle model with texture.

Program 6.3 Simple (Limited) OBJ Loader

ImportedModel class

public class ImportedModel
{	private Vertex3D[] vertices;
	private int numVertices;

	public ImportedModel(String filename)
	{	ModelImporter modelImporter = new ModelImporter();
		try
		{	modelImporter.parseOBJ(filename);   // uses the modelImporter to get vertex information
			numVertices = modelImporter.getNumVertices();
			float[] verts = modelImporter.getVertices();
			float[] tcs = modelImporter.getTextureCoordinates();
			float[] normals = modelImporter.getNormals();

			vertices = new Vertex3D[numVertices];
			for (int i = 0; i < vertices.length; i++)
			{	vertices[i] = new Vertex3D();
				vertices[i].setLocation(verts[i*3], verts[i*3+1], verts[i*3+2]);
				vertices[i].setST(tcs[i*2], tcs[i*2+1]);
				vertices[i].setNormal(normals[i*3], normals[i*3+1], normals[i*3+2]);
			}
		} catch (IOException e)
		{	e.printStackTrace();
		}
	}

	public Vertex3D[] getVertices() { return vertices; }   // accessors for the application
	public int getNumVertices() { return numVertices; }    //   to obtain vertex data

	ModelImporter nested class

	private class ModelImporter
	{	// values as read in from the OBJ file
		private ArrayList<Float> vertVals = new ArrayList<Float>();
		private ArrayList<Float> stVals = new ArrayList<Float>();
		private ArrayList<Float> normVals = new ArrayList<Float>();

		// values stored for later use as vertex attributes
		private ArrayList<Float> triangleVerts = new ArrayList<Float>();
		private ArrayList<Float> textureCoords = new ArrayList<Float>();
		private ArrayList<Float> normals = new ArrayList<Float>();

		public void parseOBJ(String filename) throws IOException
		{	InputStream input = ModelImporter.class.getResourceAsStream(filename);
			BufferedReader br = new BufferedReader(new InputStreamReader(input));
			String line;
			while ((line = br.readLine()) != null)
			{	if (line.startsWith("v "))               // vertex position ("v" case)
				{	for (String s : (line.substring(2)).split(" "))
					{	vertVals.add(Float.valueOf(s));      // extract the vertex position values
					}
				}
				else if (line.startsWith("vt"))          // texture coordinates ("vt" case)
				{	for (String s : (line.substring(3)).split(" "))
					{	stVals.add(Float.valueOf(s));        // extract the texture coordinate values
					}
				}
				else if (line.startsWith("vn"))          // vertex normals ("vn" case)
				{	for (String s : (line.substring(3)).split(" "))
					{	normVals.add(Float.valueOf(s));      // extract the normal vector values
					}
				}
				else if (line.startsWith("f "))          // triangle faces ("f" case)
				{	for (String s : (line.substring(2)).split(" "))
					{	String v = s.split("/")[0];          // extract triangle face references
						String vt = s.split("/")[1];
						String vn = s.split("/")[2];

						int vertRef = (Integer.valueOf(v) - 1) * 3;
						int tcRef   = (Integer.valueOf(vt) - 1) * 2;
						int normRef = (Integer.valueOf(vn) - 1) * 3;

						triangleVerts.add(vertVals.get(vertRef));   // build array of vertices
						triangleVerts.add(vertVals.get(vertRef + 1));
						triangleVerts.add(vertVals.get(vertRef + 2));

						textureCoords.add(stVals.get(tcRef));       // ...corresponding texture coordinates
						textureCoords.add(stVals.get(tcRef + 1));

						normals.add(normVals.get(normRef));         // ...and normals
						normals.add(normVals.get(normRef + 1));
						normals.add(normVals.get(normRef + 2));
					}
				}
			}
			input.close();
		}

		// accessors for retrieving the number of vertices, the vertices themselves,
		// and the corresponding texture coordinates and normals (only called once per model)
		public int getNumVertices() { return (triangleVerts.size() / 3); }

		public float[] getVertices()
		{	float[] p = new float[triangleVerts.size()];
			for (int i = 0; i < triangleVerts.size(); i++)
			{	p[i] = triangleVerts.get(i);
			}
			return p;
		}

		// similar accessors for texture coordinates and normal vectors go here
	}
}

Using the ModelImporter ...

myObj = new ImportedModel("shuttle.obj");   // in init()
...
private void setupVertices(GL4 gl)
{	Vertex3D[] vertices = myObj.getVertices();
	numObjVertices = myObj.getNumVertices();

	float[] pvalues = new float[numObjVertices * 3];   // vertex positions
	float[] tvalues = new float[numObjVertices * 2];   // texture coordinates
	float[] nvalues = new float[numObjVertices * 3];   // normal vectors

	for (int i = 0; i < numObjVertices; i++)
	{	pvalues[i*3]   = (float) (vertices[i]).getX();
		pvalues[i*3+1] = (float) (vertices[i]).getY();
		pvalues[i*3+2] = (float) (vertices[i]).getZ();
		tvalues[i*2]   = (float) (vertices[i]).getS();
		tvalues[i*2+1] = (float) (vertices[i]).getT();
		nvalues[i*3]   = (float) (vertices[i]).getNormalX();
		nvalues[i*3+1] = (float) (vertices[i]).getNormalY();
		nvalues[i*3+2] = (float) (vertices[i]).getNormalZ();
	}

	gl.glGenVertexArrays(vao.length, vao, 0);
	gl.glBindVertexArray(vao[0]);
	gl.glGenBuffers(vbo.length, vbo, 0);

	// VBO for vertex locations
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
	FloatBuffer vertBuf = Buffers.newDirectFloatBuffer(pvalues);
	gl.glBufferData(GL_ARRAY_BUFFER, vertBuf.limit()*4, vertBuf, GL_STATIC_DRAW);

	// VBO for texture coordinates
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
	FloatBuffer texBuf = Buffers.newDirectFloatBuffer(tvalues);
	gl.glBufferData(GL_ARRAY_BUFFER, texBuf.limit()*4, texBuf, GL_STATIC_DRAW);

	// VBO for normal vectors
	gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[2]);
	FloatBuffer norBuf = Buffers.newDirectFloatBuffer(nvalues);
	gl.glBufferData(GL_ARRAY_BUFFER, norBuf.limit()*4, norBuf, GL_STATIC_DRAW);
}

in display(): ...

int numVerts = myObj.getVertices().length;
gl.glDrawArrays(GL_TRIANGLES, 0, numVerts);

SUPPLEMENTAL NOTES

Although we discussed the use of DCC tools for creating 3D models, we didn't discuss how to use such tools. There is a wealth of tutorial videos and documentation for all of the popular tools such as Blender and Maya, and such instruction is outside the scope of this text.

The topic of 3D modeling is itself a rich field of study. Our coverage in this chapter has been just a rudimentary introduction, with emphasis on its relationship to OpenGL. Many universities offer entire courses in 3D modeling, and readers interested in learning more about it are encouraged to consult some of the popular resources that offer greater detail (e.g., [BL16], [CH11], [VA12]).

We reiterate that the OBJ importer we presented in this chapter is limited, and can only handle a subset of the features supported by the OBJ format. Although sufficient for our needs, it will fail on some OBJ files. In those cases it would be necessary to first load the model into Blender (or Maya, etc.), and re-export it as an OBJ file that complies with the importer's limitations as described earlier in this chapter.

Exercises

6.1	Modify Program 4.4 so that the "sun," "planet," and "moon" are textured spheres, such as the ones shown in Figure 4.11.

6.2	(PROJECT) Modify your program from Exercise 6.1 so that the imported NASA shuttle object from Figure 6.16 also orbits the "sun." You'll want to experiment with the scale and rotation applied to the shuttle to make it look realistic.

6.3	(RESEARCH & PROJECT) Learn the basics of how to use Blender to create a 3D object of your own. To make full use of Blender with your OpenGL applications, you'll want to learn how to use Blender's UV-unwrapping tools to generate texture coordinates and an associated texture image. You can then export your object as an OBJ file and load it using the code from Program 6.3.

References

[BL16]	Blender, The Blender Foundation, accessed July 2016, https://www.blender.org/.

[CH11]	A. Chopine, 3D Art Essentials: The Fundamentals of 3D Modeling, Texturing, and Animation (Focal Press, 2011).

[CH16]	Computer History Museum, accessed July 2016, http://www.computerhistory.org/revolution/computer-graphics-music-and-art/15/206.

[GL16]	GLUT and OpenGL Utility Libraries, accessed July 2016, https://www.opengl.org/resources/libraries/.

[NA16]	NASA 3D Resources, accessed July 2016, http://nasa3d.arc.nasa.gov/.

[PP07]	P. Baker, Paul's Projects, 2007, accessed July 2016, www.paulsprojects.net.

[VA12]	V. Vaughan, Digital Modeling (New Riders, 2012).

[VE16]	Visible Earth, NASA Goddard Space Flight Center Image, accessed July 2016, http://visibleearth.nasa.gov.


CHAPTER 7

LIGHTING

7.1	Lighting Models
7.2	Lights
7.3	Materials
7.4	ADS Lighting Computations
7.5	Implementing ADS Lighting
7.6	Combining Lighting and Textures

Supplemental Notes
Historical Note

Light affects the appearance of our world in varied and sometimes dramatic ways. When a flashlight shines on an object, we expect it to appear brighter on the side facing the light. The earth on which we live is itself brightly lit where it faces the sun at noon, but as it turns, that daytime brightness gradually fades into evening, until becoming completely dark at midnight.

Objects also respond differently to light. Besides having different colors, objects can have different reflective characteristics. Consider two objects, both green, but where one is made of cloth versus another made of polished steel; the latter will appear more "shiny."

7.1	LIGHTING MODELS

Light is the product of photons being emitted by high-energy sources and subsequently bouncing around until some of the photons reach our eyes. Unfortunately, it is computationally infeasible to emulate this natural process, as it would require simulating and then tracking the movement of a huge number of photons, adding many objects (and matrices) to our scene. What we need is a lighting model.


Lighting models are sometimes called shading models, although in the presence of shader programming, that can be a bit confusing. Sometimes the term reflection model is used, complicating the terminology further. We will try to stick to whichever terminology is simple and most practical.

The most common lighting models today are called "ADS" models, because they are based on three types of reflection labeled A, D, and S:


Ambient reflection simulates a low-level illumination that equally affects everything in the scene.
Diffuse reflection brightens objects to various degrees depending on the light's angle of incidence.
Specular reflection conveys the shininess of an object by strategically placing a highlight of appropriate size on the object's surface where light is reflected most directly towards our eyes.

ADS models can be used to simulate different lighting effects, and a variety of materials.

Figure 7.1 illustrates the ambient, diffuse, and specular contributions of a positional light on a shiny gold torus.

Figure 7.1 ADS lighting contributions.

Recall that a scene is ultimately drawn by having the fragment shader output a color for each pixel on the screen. Using an ADS lighting model requires specifying contributions due to lighting on a pixel's RGBA output value. Factors include:

The type of light source, and its ambient, diffuse, and specular characteristics
The object's material's ambient, diffuse, and specular characteristics
The object's material's specified "shininess"
The angle at which the light hits the object
The angle from which the scene is being viewed

7.2 LIGHTS


There are many types of lights, each with different characteristics and requiring different steps to simulate their effects. Some types include:

Global (usually called "global ambient" because it includes only an ambient component)
Directional (or "distant")
Positional (or "point source")
Spotlight

Global ambient light is the simplest type of light to model. Global ambient light has no source position; the light is equal everywhere, at each pixel on every object in the scene, regardless of where the objects are. Global ambient lighting simulates the real-world phenomenon of light that has bounced around so many times that its source and direction are undeterminable. Global ambient light has only an ambient component, specified as an RGBA value; it has no diffuse or specular components. For example, global ambient light can be defined as follows:

float[] globalAmbient = new float[] { 0.7f, 0.7f, 0.7f, 1.0f };   // RGBA values range from 0 to 1

Alternatively, global ambient light can be specified using the graphicslib3D AmbientLight class, which is available as a singleton:

AmbientLight globalAmbient = AmbientLight.getAmbientLight();
globalAmbient.setRed(0.7f);
globalAmbient.setGreen(0.7f);
globalAmbient.setBlue(0.7f);
globalAmbient.setAlpha(1.0f);

Directional, or distant, light also doesn't have a source location, but it does have a direction. It is useful for situations where the source of the light is so far away that the light rays are effectively parallel, such as light coming from the sun. In many such situations we may only be interested in modeling the light, and not the object that produces the light. The effect of directional light on an object depends on the light's angle of impact; objects are brighter on the side facing a directional light than on a tangential or opposite side. Modeling directional light requires specifying its direction (as a vector), and its ambient, diffuse, and specular characteristics (as RGBA values). A red directional light pointing down the negative Z axis might be specified thusly:

float[] dirLightAmbient  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dirLightDiffuse  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dirLightSpecular = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dirLightDirection = new float[] { 0.0f, 0.0f, -1.0f };

Directional lights are included in graphicslib3D in the class DistantLight, which includes an accessor for direction, as well as accessors for ambient, diffuse, and specular fields. Instantiating a red directional light pointing down the negative Z axis in graphicslib3D can therefore be done thusly:

DistantLight dl1 = new DistantLight();
Vector3D directionVector = new Vector3D(0, 0, -1);
float[] amb  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dif  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spec = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
dl1.setAmbient(amb);
dl1.setDiffuse(dif);
dl1.setSpecular(spec);
dl1.setDirection(directionVector);

A positional light has a specific location in the 3D scene. Light sources that are near the scene, such as lamps, candles, and so forth, are examples. Like directional lights, the effect of a positional light depends on angle of impact; however, its direction is not specified, as it is different for each vertex in our scene. Positional lights may also incorporate attenuation factors, in order to model how their intensity diminishes with distance. As with the other types of lights we have seen, positional lights have ambient, diffuse, and specular properties specified as RGB values. A red positional light at location (5, 2, -3) could for example be specified thusly:

float[] posLightAmbient  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] posLightDiffuse  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] posLightSpecular = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] posLightLocation = new float[] { 5.0f, 2.0f, -3.0f };

Positional lights are included in graphicslib3D in the class PositionalLight, which includes an accessor for position (specified as a Point3D), accessors for constant, linear, and quadratic attenuation factors, as well as accessors for its ambient, diffuse, and specular fields. Instantiating a red positional light at position (5, 2, -3), along with modest attenuation factors, in graphicslib3D can therefore be done thusly (the attenuation factors are described below the code):

PositionalLight pl1 = new PositionalLight();
Point3D plocation = new Point3D(5, 2, -3);
float[] amb  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dif  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spec = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
pl1.setAmbient(amb);
pl1.setDiffuse(dif);
pl1.setSpecular(spec);
pl1.setPosition(plocation);
pl1.setConstantAtt(1.0f);
pl1.setLinearAtt(0.1f);
pl1.setQuadraticAtt(0.1f);

Attenuation factors can be modeled in a variety of ways. As shown above, we provide tunable non-negative parameters for constant, linear, and quadratic attenuation (kc, kl, and kq, respectively). These parameters are then combined, taking into account the distance (d) from the light source:

	attenuationFactor = 1 / (kc + kl*d + kq*d²)

Multiplying this factor by the light intensity causes the intensity to be decreased the greater the distance is to the light source. Note that kc should always be set greater than or equal to 1, so that the attenuation factor will always be in the range (0...1) and approach 0 as the distance d increases.
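This behavior is easy to verify in a few lines of Java (the class and method names below are hypothetical; graphicslib3D performs this computation in the shader, not in a helper like this):

```java
public class AttenuationDemo
{	// attenuationFactor = 1 / (kc + kl*d + kq*d*d)
	static float attenuationFactor(float kc, float kl, float kq, float d)
	{	return 1.0f / (kc + kl * d + kq * d * d);
	}

	public static void main(String[] args)
	{	// with kc = 1, the factor is exactly 1.0 at the light itself...
		System.out.println(attenuationFactor(1.0f, 0.1f, 0.1f, 0.0f));   // 1.0
		// ...and falls toward 0 as d grows: here 1/(1 + 0.1*10 + 0.1*100) = 1/12
		System.out.println(attenuationFactor(1.0f, 0.1f, 0.1f, 10.0f));
	}
}
```

Setting kc below 1 would let the factor exceed 1.0 near the light, brightening rather than attenuating, which is why the text requires kc ≥ 1.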

Spotlights have both a position and a direction. The effect of the spotlight's "cone" can be simulated by specifying a cutoff angle θ between 0° and 90° indicating the width of the light beam, and a falloff exponent to model the variation of intensity across the angle of the beam. As shown in Figure 7.2, we determine the angle φ between the spotlight's direction and a vector from the spotlight to the pixel. We then compute an intensity factor by raising the cosine of φ to the falloff exponent when φ is less than θ (when φ is greater than θ the intensity factor is set to 0). The result is an intensity factor that ranges from 0 to 1. The falloff exponent adjusts the rate at which the intensity factor tends to 0 as the angle φ increases. The intensity factor is then multiplied by the light's intensity to simulate the cone effect.

Figure 7.2 Spotlight components.
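The intensity-factor computation just described can be sketched directly (hypothetical helper class, not part of graphicslib3D; in practice this is done per-pixel in the shader):

```java
public class SpotIntensityDemo
{	// intensityFactor = cos(phi)^falloffExponent when phi < cutoff, else 0
	static float intensityFactor(float phiDeg, float cutoffDeg, float falloffExp)
	{	if (phiDeg >= cutoffDeg) return 0.0f;   // pixel lies outside the cone
		return (float) Math.pow(Math.cos(Math.toRadians(phiDeg)), falloffExp);
	}

	public static void main(String[] args)
	{	System.out.println(intensityFactor( 0.0f, 45.0f, 2.0f));  // 1.0 -- dead center of the beam
		System.out.println(intensityFactor(30.0f, 45.0f, 2.0f));  // cos(30 deg)^2 = 0.75
		System.out.println(intensityFactor(60.0f, 45.0f, 2.0f));  // 0.0 -- outside the cone
	}
}
```

Raising the falloff exponent (say from 2 to 20) makes the beam edge darker relative to the center, sharpening the apparent cone.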

A red spotlight at location (5, 2, -3) pointing down the negative Z axis could be specified thusly:

float[] spotLightAmbient  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spotLightDiffuse  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spotLightSpecular = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spotLightLocation = new float[] { 5.0f, 2.0f, -3.0f };
float[] spotLightDirection = new float[] { 0.0f, 0.0f, -1.0f };
float spotLightCutoff = 45.0f;
float spotLightExponent = 2.0f;

Because spotlights share characteristics of both directional and positional lights, it is a bit tricky settling on a class hierarchy given Java's absence of multiple inheritance. In graphicslib3D, SpotLight is a subclass of PositionalLight. In addition to direction, it also includes accessors for cutoff and exponent. Instantiating the red spotlight described above can be done as follows:

SpotLight sl1 = new SpotLight();
Point3D plocation = new Point3D(5, 2, -3);
Vector3D directionVector = new Vector3D(0, 0, -1);
float[] amb  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] dif  = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
float[] spec = new float[] { 1.0f, 0.0f, 0.0f, 1.0f };
sl1.setAmbient(amb);
sl1.setDiffuse(dif);
sl1.setSpecular(spec);
sl1.setPosition(plocation);
sl1.setDirection(directionVector);
sl1.setCutoffAngle(45.0f);
sl1.setFalloffExponent(2.0f);

Spotlights also can include attenuation factors in the same manner as positional lights. The SpotLight class in graphicslib3D inherits these attributes from PositionalLight, so they are available if desired.

Historically, spotlights have been iconic in computer graphics since Pixar's popular animated short "Luxo Jr." appeared in 1986 [DI16].

7.3 MATERIALS

The "look" of the objects in our scene has so far been handled exclusively by color and texture. The addition of lighting allows us to also consider the reflectance characteristics of the surfaces. By that, we mean how the object interacts with our ADS lighting model. This can be modeled by considering each object to be "made of" a certain material.

Materials can be simulated in an ADS lighting model by specifying four values, three of which we are already familiar with: ambient, diffuse, and specular RGB colors. The fourth is called shininess, which, as we will see, is used to build an appropriate specular highlight for the intended material. ADS and shininess values have been developed for many different types of common materials. For example, "pewter" can be specified thusly:

float[] pewterMatAmbient  = new float[] { .11f, .06f, .11f, 1.0f };
float[] pewterMatDiffuse  = new float[] { .43f, .47f, .54f, 1.0f };
float[] pewterMatSpecular = new float[] { .33f, .33f, .52f, 1.0f };
float pewterMatShininess = 9.85f;

ADS RGBA values for a few other materials are given in Figure 7.3 (from [BA16]).

Sometimes other properties are included in the material properties. Transparency is handled in the RGBA specifications in the fourth (or "alpha") channel, which specifies an opacity; a value of 1.0 represents completely opaque and 0.0 represents completely transparent. For most materials it is simply set to 1.0, although for certain materials a slight transparency plays a role. For example, in Figure 7.3, note that the materials "jade" and "pearl" include a small amount of transparency (values slightly less than 1.0) to add realism.

Emissiveness is also sometimes included in an ADS material specification. This is useful when simulating a material that emits its own light, such as phosphorescent materials.


Figure 7.3 Material ADS coefficients.

In graphicslib3D, the class Material includes accessors for ambient, diffuse, and specular RGBA values (as float arrays), as well as for shininess and emission. Instantiating a "jade" material could therefore be done as follows:

Material jd1 = new Material();
float[] amb  = new float[] { .135f, .2225f, .1575f, .95f };
float[] dif  = new float[] { .54f, .89f, .63f, .95f };
float[] spec = new float[] { .3162f, .3162f, .3162f, .95f };
jd1.setAmbient(amb);
jd1.setDiffuse(dif);
jd1.setSpecular(spec);
jd1.setShininess(12.8f);

In addition, the Material class includes three predefined static materials: GOLD, SILVER, and BRONZE. For example, silver can be accessed with the following: graphicslib3D.Material.SILVER.

Note that the graphicslib3D light and materials classes described in these sections do not actually perform lighting. They merely provide an organized way to specify and store desired light and material properties for elements in a scene. We still need to actually compute the lighting ourselves. This is going to require some serious mathematical processing in our shader code. So let's now dive into the nuts-and-bolts of implementing ADS lighting in our Java/JOGL and GLSL graphics programs.

7.4	ADS LIGHTING COMPUTATIONS


As we draw our scene, recall that each vertex is transformed so as to simulate a 3D world on a 2D screen. Pixel colors are the result of rasterization, as well as texturing and interpolation. We must now incorporate the additional step of adjusting those rasterized pixel colors to effect the lighting and materials in our scene. The basic ADS computation that we need to perform is to determine the reflection intensity (I) for each pixel in our scene. This computation takes the following form:

	I = I_ambient + I_diffuse + I_specular

That is, we need to compute and sum the ambient, diffuse, and specular reflection contributions for each pixel, for each light source. This will of course depend on the type of light(s) in our scene, and the type of material associated with the rendered model.

Ambient contribution is the simplest. It is the product of the specified ambient light and the specified ambient coefficient of the material:

	I_ambient = L_ambient * M_ambient

Keeping in mind that light and material intensities are specified via RGB values, the computation is more precisely:

	I_ambient_red = L_ambient_red * M_ambient_red    (and similarly for the green and blue components)

Diffuse contribution is more complex because it depends on the angle of incidence between the light and the surface. Lambert's Cosine Law (published in 1760) specifies that the amount of light that reflects from a surface is proportional to the cosine of the light's angle of incidence. This can be modeled as follows:

	I_diffuse = L_diffuse * M_diffuse * cos(θ)

As before, the actual computations involve red, green, and blue components.

Determining the angle of incidence θ requires us to (a) find a vector from the pixel being drawn to the light source (or, similarly, a vector opposite the light direction), and (b) find a vector that is normal (perpendicular) to the surface of the object being rendered. Let's denote these vectors L and N, respectively, as shown below in Figure 7.4:

Figure 7.4 Angle of light incidence.

Depending on the nature of the lights in the scene, L could be computed by negating the light direction vector, or by computing a vector from the pixel to the location of the light source. Determining vector N may be trickier; normal vectors may be available for the vertices in the model being rendered, but if the model doesn't include normals, N would need to be estimated geometrically based on the locations of neighboring vertices. For the rest of the chapter, we will assume that the model being rendered includes normal vectors for each vertex (this is common in models constructed in modeling tools such as Maya or Blender).

It turns out that in this case, it isn't necessary to compute θ itself. What we really desire is cos(θ), and recall from Chapter 3 that this can be found using the dot product. Thus, the diffuse contribution can be computed as follows:

	I_diffuse = L_diffuse * M_diffuse * (N · L)

The diffuse contribution is only relevant when the surface is exposed to the light, which occurs when -90° < θ < 90°; that is, when cos(θ) > 0. Thus we must replace the rightmost term above with:

	max((N · L), 0)
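Putting these pieces together, the per-channel diffuse computation can be sketched in Java as follows (hypothetical helper class; assumes N and L are already normalized to unit length, and note that the real computation happens per-pixel in the shader):

```java
public class DiffuseDemo
{	// I_diffuse = L_diffuse * M_diffuse * max(N . L, 0), per RGB channel
	static float[] diffuse(float[] lightDif, float[] matDif, float[] n, float[] l)
	{	float nDotL = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];   // cos(theta) for unit vectors
		float c = Math.max(nDotL, 0.0f);                    // clamp back-facing light to 0
		return new float[] { lightDif[0]*matDif[0]*c,
		                     lightDif[1]*matDif[1]*c,
		                     lightDif[2]*matDif[2]*c };
	}

	public static void main(String[] args)
	{	float[] white = { 1.0f, 1.0f, 1.0f };
		float[] mat   = { 0.5f, 0.5f, 0.5f };
		float[] nrm   = { 0.0f, 0.0f, 1.0f };
		// light head-on: full diffuse contribution (0.5 per channel)
		System.out.println(diffuse(white, mat, nrm, new float[] {0.0f, 0.0f, 1.0f})[0]);
		// light behind the surface: clamped to 0
		System.out.println(diffuse(white, mat, nrm, new float[] {0.0f, 0.0f, -1.0f})[0]);
	}
}
```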

Specular contribution determines whether the pixel being rendered should be brightened because it is part of a "specular highlight." It involves not only the angle of incidence of the light source, but also the angle between the reflection of the light on the surface, and the viewing angle of the "eye" relative to the object's surface:

Figure 7.5 View angle incidence.

In Figure 7.5, R represents the direction of reflection of the light, and V (called the view vector) is a vector from the pixel to the eye. Note that V is the negative of the vector from the eye to the pixel (in camera space, the eye is at the origin). The smaller the angle φ between R and V, the more the eye is on-axis, or "looking into" the reflection, and the more this pixel contributes to the specular highlight (and thus the brighter it should appear).

The manner in which φ is used to compute the specular contribution depends on the desired "shininess" of the object being rendered. Objects that are extremely shiny, such as a mirror, have very small specular highlights; that is, they reflect the incoming light to the eye exactly. Materials that are less shiny have specular highlights that are more "spread out," and thus more pixels are a part of the highlight.

Shininess is generally modeled with a falloff function that expresses how quickly the specular contribution reduces to zero as the angle φ grows. We can use cos(φ) to model falloff, and increase or decrease the shininess by using powers of the cosine function, such as cos(φ), cos²(φ), cos³(φ), cos¹⁰(φ), cos⁵⁰(φ), and so on, as illustrated in Figure 7.6.

Figure 7.6 Shininess modeled as cosine exponent.

Note that the higher the value of the exponent, the faster the falloff, and thus the smaller the specular contribution of pixels with light reflections that are off-axis from the viewing angle.

We call the exponent n, as used in the cosⁿ(φ) falloff function, the shininess factor for a specified material. Note back in Figure 7.3 that the shininess factor for each of the materials listed is specified in the rightmost column.

We now can specify the full specular calculation:

	I_specular = L_specular * M_specular * max((R · V), 0)ⁿ

Note that we use the max() function in a similar manner as we did for the diffuse computation. In this case, we need to ensure that the specular contribution does not ever utilize negative values for cos(φ), which could produce strange artifacts such as "darkened" specular highlights.
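The full specular term, with the max() clamp and the shininess exponent n, can be sketched like this (hypothetical helper class; assumes R and V are unit vectors, and again the real computation belongs in the shader):

```java
public class SpecularDemo
{	// I_specular = L_specular * M_specular * max(R . V, 0)^n, per RGB channel
	static float[] specular(float[] lightSpec, float[] matSpec, float[] r, float[] v, float n)
	{	float rDotV = r[0]*v[0] + r[1]*v[1] + r[2]*v[2];        // cos(phi) for unit vectors
		float f = (float) Math.pow(Math.max(rDotV, 0.0f), n);   // clamped shininess falloff
		return new float[] { lightSpec[0]*matSpec[0]*f,
		                     lightSpec[1]*matSpec[1]*f,
		                     lightSpec[2]*matSpec[2]*f };
	}

	public static void main(String[] args)
	{	float[] white = { 1.0f, 1.0f, 1.0f };
		float[] mat   = { 1.0f, 1.0f, 1.0f };
		float[] r     = { 0.0f, 0.0f, 1.0f };
		// eye exactly on-axis with the reflection: full highlight
		System.out.println(specular(white, mat, r, new float[] {0.0f, 0.0f, 1.0f}, 50.0f)[0]);  // 1.0
		// eye opposite the reflection: clamped to 0, never negative
		System.out.println(specular(white, mat, r, new float[] {0.0f, 0.0f, -1.0f}, 50.0f)[0]); // 0.0
	}
}
```

Without the clamp, rDotV would be -1.0 in the second case, and an odd exponent could yield a negative "darkened" highlight, which is exactly the artifact the max() guards against.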

And of course as before, the actual computations involve red, green, and blue components.

7.5	IMPLEMENTING ADS LIGHTING

The computations described in Section 7.4 have so far been mostly theoretical, as they have assumed that we can perform them for every pixel. This is complicated by the fact that normal (N) vectors are typically available to us only for the vertices that define the models, not for each pixel. Thus we need to either compute normals for each pixel, which could be time-consuming, or find some way of estimating the values that we need to achieve a sufficient effect.

One approach is called "faceted shading" or "flat shading." Here we assume that every pixel in each rendered primitive (i.e., polygon or triangle) has the same lighting value. Thus we need only do the lighting computations for one vertex in each polygon in the model, and then copy those lighting values across the nearby rendered pixels on a per-polygon or per-triangle basis.

Faceted shading is rarely used today, because the resulting images tend to not look very realistic, and because modern hardware makes more accurate computations feasible. An example of a faceted-shaded torus, in which each triangle behaves as a flat reflective surface, is shown in Figure 7.7.

Figure 7.7 Torus with faceted shading.

Although faceted shading can be adequate in some circumstances, usually a better approach is "smooth shading," in which the lighting intensity is computed for each pixel. Smooth shading is feasible because of the parallel processing done on modern graphics cards, and because of the interpolated rendering that takes place in the OpenGL graphics pipeline.

We will examine two popular methods for smooth shading: Gouraud shading and Phong shading.

7.5.1	Gouraud Shading

The French computer scientist Henri Gouraud published a smooth shading algorithm in 1971 that has come to be known as Gouraud shading [GO71]. It is particularly well suited to modern graphics cards, because it takes advantage of the automatic interpolated rendering that is available in 3D graphics pipelines such as in OpenGL. The process for Gouraud shading is as follows:


1.	Determine the color of each vertex, incorporating the lighting computations.
2.	Allow those colors to be interpolated across intervening pixels through the normal rasterization process (which will also in effect interpolate the lighting contributions).

In OpenGL/JOGL, this means that the lighting computations will be done in the vertex shader. The fragment shader will simply be a pass-through, so as to reveal the automatically interpolated lighted colors.

Figure 7.8 outlines the strategy we will use to implement our Gouraud shader in OpenGL, for a scene with a torus and one positional light. The strategy is then implemented in Program 7.1.

Figure 7.8 Implementing Gouraud shading.

Program7.1ToruswithPositionalLightandGouraudShading//previousimportsstillapply–thefollowingadditionalimportswillbe

necessary:

importgraphicslib3D.*;

importgraphicslib3D.light.*;

publicclassCodeextendsJFrameimplementsGLEventListener

{privateGLCanvasmyCanvas;

//creationofm,v,mv,andprojmatricesasbefore.

//declarationsforbuildingshadersandrenderingprogram,asbefore.

//declarationofoneVAO,andtwoVBOs,asbefore.

...

privateTorusmyTorus=newTorus(0.5f,0.2f,48);

privateintnumTorusVertices;

//lightandmaterialdeclarations

privateMaterialcurrentMaterial=Material.GOLD;

privatePositionalLightcurrentLight=newPositionalLight();

privatePoint3DlightLoc=newPoint3D(10.0f,10.0f,10.0f);

privatefloat[]globalAmbient=newfloat[]{0.7f,0.7f,0.7f,1.0f};

//locationoftorusandcamera

privatePoint3DtorusLoc=newPoint3D(0,0,-1);

privatePoint3DcameraLoc=newPoint3D(0,0,1.0f);

Page 166: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

publicCode()

{//constructorsameasbefore

}

publicvoidinit(GLAutoDrawabledrawable)

{GL4gl=(GL4)drawable.getGL();

rendering_program=createShaderProgram();

setupVertices();

}

privatevoidsetupVertices()

{//codetoextractgraphicslib3Dtorusindices,verticesandnormals,

asdescribedbefore.

...

//codetogenerateVAO,bindvertexarray0,andgeneratetwobuffers,

asdescribedbefore.

...

numTorusVertices=indices.length;

...

//puttheTorusverticesintothefirstbuffer

gl.glBindBuffer(GL_ARRAY_BUFFER,vbo[0]);

FloatBuffervertBuf=Buffers.newDirectFloatBuffer(fvalues);

gl.glBufferData(GL_ARRAY_BUFFER,vertBuf.limit()*4,vertBuf,

GL_STATIC_DRAW);

//loadthetorusnormalvectorsintothesecondbuffer

gl.glBindBuffer(GL_ARRAY_BUFFER,vbo[1]);

FloatBuffernorBuf=Buffers.newDirectFloatBuffer(nvalues);

gl.glBufferData(GL_ARRAY_BUFFER,norBuf.limit()*4,norBuf,

GL_STATIC_DRAW);

}

//main(),dispose(),reshape(),shadermodules,andperspective

computationsasbefore.

public void display(GLAutoDrawable drawable)
{  // setup of GL4 object, clearing of depth buffer as in earlier examples.
   ...
   // setup of projection matrix as in earlier examples.
   ...
   gl.glUseProgram(rendering_program);

   // uniforms for model-view, projection, and normal matrices.
   mv_location = gl.glGetUniformLocation(rendering_program, "mv_matrix");
   proj_location = gl.glGetUniformLocation(rendering_program, "proj_matrix");
   n_location = gl.glGetUniformLocation(rendering_program, "norm_matrix");

   // build the MODEL matrix based on the torus location
   m_matrix.setToIdentity();
   m_matrix.translate(torusLoc.getX(), torusLoc.getY(), torusLoc.getZ());
   m_matrix.rotateX(35.0);

   // build the VIEW matrix based on the camera location
   v_matrix.setToIdentity();
   v_matrix.translate(-cameraLoc.getX(), -cameraLoc.getY(), -cameraLoc.getZ());


   // set up lights based on the current light's position
   currentLight.setPosition(lightLoc);
   installLights(v_matrix);

   // build the MODEL-VIEW matrix by concatenating matrices v and m, as before
   ...
   // put the MV, PROJ, and NORMAL matrices into the corresponding uniform variables
   gl.glUniformMatrix4fv(mv_location, 1, false, mv_matrix.getFloatValues(), 0);
   gl.glUniformMatrix4fv(proj_location, 1, false, proj_matrix.getFloatValues(), 0);

   // NOTE: the transformation matrix for the normal vector is the inverse transpose of MV.
   gl.glUniformMatrix4fv(n_location, 1, false, (mv_matrix.inverse()).transpose().getFloatValues(), 0);

   // bind the vertices buffer to vertex attribute #0 in the vertex shader
   gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
   gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
   gl.glEnableVertexAttribArray(0);

   // bind the normals buffer to vertex attribute #1 in the vertex shader
   gl.glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
   gl.glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, 0);
   gl.glEnableVertexAttribArray(1);

   gl.glClear(GL_DEPTH_BUFFER_BIT);
   gl.glEnable(GL_CULL_FACE);
   gl.glFrontFace(GL_CCW);
   gl.glEnable(GL_DEPTH_TEST);
   gl.glDepthFunc(GL_LEQUAL);
   gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);
}

private void installLights(Matrix3D v_matrix)
{  GL4 gl = (GL4) GLContext.getCurrentGL();

   // convert light's position to view space, and save it in a float array
   Point3D lightP = currentLight.getPosition();
   Point3D lightPv = lightP.mult(v_matrix);
   float[] viewspaceLightPos = new float[]
      { (float) lightPv.getX(), (float) lightPv.getY(), (float) lightPv.getZ() };

   // set the current globalAmbient settings
   int globalAmbLoc = gl.glGetUniformLocation(rendering_program, "globalAmbient");
   gl.glProgramUniform4fv(rendering_program, globalAmbLoc, 1, globalAmbient, 0);

   // get the locations of the light and material fields in the shader
   int ambLoc = gl.glGetUniformLocation(rendering_program, "light.ambient");


   int diffLoc = gl.glGetUniformLocation(rendering_program, "light.diffuse");
   int specLoc = gl.glGetUniformLocation(rendering_program, "light.specular");
   int posLoc = gl.glGetUniformLocation(rendering_program, "light.position");
   int MambLoc = gl.glGetUniformLocation(rendering_program, "material.ambient");
   int MdiffLoc = gl.glGetUniformLocation(rendering_program, "material.diffuse");
   int MspecLoc = gl.glGetUniformLocation(rendering_program, "material.specular");
   int MshiLoc = gl.glGetUniformLocation(rendering_program, "material.shininess");

   // set the uniform light and material values in the shader
   gl.glProgramUniform4fv(rendering_program, ambLoc, 1, currentLight.getAmbient(), 0);
   gl.glProgramUniform4fv(rendering_program, diffLoc, 1, currentLight.getDiffuse(), 0);
   gl.glProgramUniform4fv(rendering_program, specLoc, 1, currentLight.getSpecular(), 0);
   gl.glProgramUniform3fv(rendering_program, posLoc, 1, viewspaceLightPos, 0);
   gl.glProgramUniform4fv(rendering_program, MambLoc, 1, currentMaterial.getAmbient(), 0);
   gl.glProgramUniform4fv(rendering_program, MdiffLoc, 1, currentMaterial.getDiffuse(), 0);
   gl.glProgramUniform4fv(rendering_program, MspecLoc, 1, currentMaterial.getSpecular(), 0);
   gl.glProgramUniform1f(rendering_program, MshiLoc, currentMaterial.getShininess());
}
}

Most of the elements of Program 7.1 should be familiar. The graphicslib3D classes for Material, Torus, and PositionalLight are instantiated and configured. Torus vertices and associated normals are loaded into buffers. The display() function is similar to that in previous programs, except that it also sends the light and torus locations to the vertex shader. The installLights() function loads the light and material ADS characteristics into corresponding uniform variables to make them available to the shader code.

One interesting detail is that the transformation matrix MV, used to move vertex positions into view space, doesn't always properly adjust normal vectors into view space. This is because simply applying the MV matrix to the normals doesn't guarantee that they will remain perpendicular to the object surface. The correct transformation matrix is equal to the inverse transpose of MV, as described earlier in the supplemental notes to Chapter 3. In Program 7.1, this additional matrix, named "norm_matrix," is sent to the shaders in a uniform variable.

The variable lightPv contains the light's position in camera space. We only need to compute this once per frame, so we do it in installLights() (called from display()) rather than in the shader.

The shaders are shown ahead. The vertex shader utilizes some notations that we haven't yet seen. Note, for example, the vector addition done at the end of the vertex shader; vector addition was described in Chapter 3, and is available as shown here in GLSL. We will discuss some of the other notations after presenting the shaders.

(Program 7.1, continued)

Vertex Shader
#version 430

layout (location = 0) in vec3 vertPos;
layout (location = 1) in vec3 vertNormal;
out vec4 varyingColor;

struct PositionalLight
{  vec4 ambient;
   vec4 diffuse;
   vec4 specular;
   vec3 position;
};

struct Material
{  vec4 ambient;
   vec4 diffuse;
   vec4 specular;
   float shininess;
};

uniform vec4 globalAmbient;
uniform PositionalLight light;
uniform Material material;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
uniform mat4 norm_matrix;   // for transforming normals

void main(void)
{  vec4 color;
   // convert vertex position to view space
   vec4 P = mv_matrix * vec4(vertPos, 1.0);
   // convert normal to view space
   vec3 N = normalize((norm_matrix * vec4(vertNormal, 1.0)).xyz);
   // calculate view-space light vector (from vertex to light)
   vec3 L = normalize(light.position - P.xyz);
   // view vector is equivalent to the negative of view-space vertex position
   vec3 V = normalize(-P.xyz);
   // R is reflection of -L with respect to surface normal N
   vec3 R = reflect(-L, N);
   // ambient, diffuse, and specular contributions
   vec3 ambient = ((globalAmbient * material.ambient) + (light.ambient * material.ambient)).xyz;
   vec3 diffuse = light.diffuse.xyz * material.diffuse.xyz * max(dot(N, L), 0.0);


   vec3 specular = material.specular.xyz * light.specular.xyz * pow(max(dot(R, V), 0.0), material.shininess);
   // send the color output to the fragment shader
   varyingColor = vec4((ambient + diffuse + specular), 1.0);
   // send the position to the fragment shader, as before
   gl_Position = proj_matrix * mv_matrix * vec4(vertPos, 1.0);
}

Fragment Shader
#version 430

in vec4 varyingColor;
out vec4 fragColor;

// uniforms match those in the vertex shader,
// but aren't used directly in this fragment shader
struct PositionalLight
{  vec4 ambient;
   vec4 diffuse;
   vec4 specular;
   vec3 position;
};

struct Material
{  vec4 ambient;
   vec4 diffuse;
   vec4 specular;
   float shininess;
};

uniform vec4 globalAmbient;
uniform PositionalLight light;
uniform Material material;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
uniform mat4 norm_matrix;

// interpolate lighted color (interpolation of gl_Position is automatic)
void main(void)
{  fragColor = varyingColor;
}

The output of Program 7.1 is shown in Figure 7.9.


Figure 7.9 Torus with Gouraud shading.

The vertex shader contains our first example of using the struct notation. A GLSL "struct" is like a data type; it has a name and a set of fields. When a variable is declared using the name of a struct, it then contains those fields, which are accessed using the "." notation. For example, variable "light" is declared of type "PositionalLight", so we can thereafter refer to its fields light.ambient, light.diffuse, and so forth.

Also note the field selector notation ".xyz", used in several places in the vertex shader. This is a shortcut for converting a vec4 to an equivalent vec3 containing only its first three elements.

The vertex shader is where most of the lighting computations are performed. For each vertex, the appropriate matrix transforms are applied to the vertex position and associated normal vector, and vectors for light direction (L) and reflection (R) are computed. The ADS computations described in Section 7.4 are then performed, resulting in a color for each vertex (called varyingColor in the code). The colors are interpolated as part of the normal rasterization process. The fragment shader is then a simple pass-through. The lengthy list of uniform variable declarations is also present in the fragment shader, but none of them are actually used there.

Note the use of the GLSL functions normalize(), which converts a vector to unit length and is necessary for proper application of the dot product, and reflect(), which computes the reflection of one vector about another.
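To make the behavior of these two GLSL built-ins concrete, here is a minimal plain-Java sketch of them. The class and method names are ours, not part of JOGL or graphicslib3D; reflect() follows the GLSL definition I - 2(N·I)N, which assumes N is unit length.

```java
// Plain-Java sketches of the GLSL built-ins normalize() and reflect().
public class VecDemo {
    static float dot(float[] a, float[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }
    // scale a 3-vector to unit length
    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(dot(v, v));
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }
    // GLSL definition: reflect(I, N) = I - 2 * dot(N, I) * N  (N must be unit length)
    static float[] reflect(float[] i, float[] n) {
        float d = 2.0f * dot(n, i);
        return new float[] { i[0]-d*n[0], i[1]-d*n[1], i[2]-d*n[2] };
    }
    public static void main(String[] args) {
        float[] n = normalize(new float[] {0, 2, 0});  // surface normal, normalized to (0,1,0)
        float[] inc = new float[] {1, -1, 0};          // incoming direction
        float[] r = reflect(inc, n);
        System.out.println(r[0] + " " + r[1] + " " + r[2]); // prints 1.0 1.0 0.0
    }
}
```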

Artifacts are evident in the output torus shown in Figure 7.9. Specular highlights have a blocky, faceted appearance. This artifact is more pronounced if the object is in motion (we can't illustrate that here).

Gouraud shading is susceptible to other artifacts. If the specular highlight is entirely contained within one of the model's triangles (that is, if it doesn't contain at least one of the model vertices), then it may disappear entirely. The specular component is calculated per vertex, so if a model vertex with a specular contribution does not exist, none of the rasterized pixels will include specular lighting either.

7.5.2 Phong Shading

Bui Tuong Phong developed a smooth shading algorithm while a graduate student at the University of Utah, and described it in his 1973 dissertation [PH73] and published it in [PH75]. The structure of the algorithm is similar to the algorithm for Gouraud shading, except that the lighting computations are done per pixel rather than per vertex. Since the lighting computations require a normal vector N and a light vector L, which are only available in the model on a per-vertex basis, Phong shading is often implemented using a clever "trick," whereby N and L are computed in the vertex shader and interpolated during rasterization. Figure 7.10 outlines the strategy:


Figure 7.10 Implementing Phong shading.

The Java/JOGL code is completely unchanged. Some of the computations previously done in the vertex shader are now moved into the fragment shader. The effect of interpolating normal vectors is illustrated in Figure 7.11:

Figure 7.11 Interpolation of normal vectors.

We are now ready to implement our torus with positional lighting, using Phong shading. Most of the code is identical to that used for Gouraud shading. Since the Java/JOGL code is unchanged, we present only the revised vertex and fragment shaders, shown in Program 7.2. Examining the output of Program 7.2, as shown in Figure 7.12, Phong shading corrects the artifacts present in Gouraud shading.

Figure 7.12 Torus with Phong shading.

Program 7.2 Torus with Phong Shading


Vertex Shader
#version 430

layout (location = 0) in vec3 vertPos;
layout (location = 1) in vec3 vertNormal;
out vec3 varyingNormal;     // eye-space vertex normal
out vec3 varyingLightDir;   // vector pointing to the light
out vec3 varyingVertPos;    // vertex position in eye space

// structs and uniforms same as for Gouraud shading
...

void main(void)
{  // output vertex position, light direction, and normal to the rasterizer for interpolation
   varyingVertPos = (mv_matrix * vec4(vertPos, 1.0)).xyz;
   varyingLightDir = light.position - varyingVertPos;
   varyingNormal = (norm_matrix * vec4(vertNormal, 1.0)).xyz;
   gl_Position = proj_matrix * mv_matrix * vec4(vertPos, 1.0);
}

Fragment Shader
#version 430

in vec3 varyingNormal;
in vec3 varyingLightDir;
in vec3 varyingVertPos;
out vec4 fragColor;

// structs and uniforms same as for Gouraud shading
...

void main(void)
{  // normalize the light, normal, and view vectors:
   vec3 L = normalize(varyingLightDir);
   vec3 N = normalize(varyingNormal);
   vec3 V = normalize(-varyingVertPos);
   // compute light reflection vector with respect to N:
   vec3 R = normalize(reflect(-L, N));
   // get the angle between the light and surface normal:
   float cosTheta = dot(L, N);
   // angle between the view vector and reflected light:
   float cosPhi = dot(V, R);
   // compute ADS contributions (per pixel), and combine to build output color:
   vec3 ambient = ((globalAmbient * material.ambient) + (light.ambient * material.ambient)).xyz;
   vec3 diffuse = light.diffuse.xyz * material.diffuse.xyz * max(cosTheta, 0.0);
   vec3 specular = light.specular.xyz * material.specular.xyz * pow(max(cosPhi, 0.0), material.shininess);
   fragColor = vec4((ambient + diffuse + specular), 1.0);
}

Although Phong shading offers better realism than Gouraud shading, it does so while incurring a performance cost. One optimization to Phong shading was proposed by James Blinn in 1977 [BL77], and is referred to as the Blinn-Phong reflection model. It is based on the observation that one of the most expensive computations in Phong shading is determining the reflection vector R.

Blinn observed that the vector R itself actually is not needed; R is only produced as a means of determining the angle φ. It turns out that φ can be found without computing R, by instead computing a vector H that is halfway between L and V. As shown in Figure 7.13, the angle α between H and N is conveniently equal to ½(φ). Although α isn't identical to φ, Blinn showed that reasonable results can be obtained by using α instead of φ.

Figure 7.13 Blinn-Phong reflection.

The "halfway" vector H is most easily determined by finding L + V (see Figure 7.14), after which cos(α) can be found using the dot product H · N.

Figure 7.14 Blinn-Phong computation.
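Summarizing these quantities in equation form (our notation, with s denoting the material shininess exponent and all vectors assumed unit length):

```latex
\mathbf{H} \;=\; \frac{\mathbf{L} + \mathbf{V}}{\left\lVert \mathbf{L} + \mathbf{V} \right\rVert},
\qquad
\cos(\alpha) \;=\; \mathbf{N} \cdot \mathbf{H},
\qquad
\mathrm{specular} \;=\; \mathrm{light}_{spec} \,\cdot\, \mathrm{material}_{spec} \,\cdot\, \bigl(\max(\mathbf{N} \cdot \mathbf{H},\, 0)\bigr)^{s}
```

The max() clamp mirrors the one used in the shaders, preventing surfaces facing away from the light from receiving a negative specular contribution.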

The computations can be done in the fragment shader, or even in the vertex shader (with some tweaks) if necessary for performance. Figure 7.15 shows the torus rendered using Blinn-Phong shading; the quality is largely indistinguishable from Phong shading, with substantial performance cost savings.


Figure 7.15 Torus with Blinn-Phong shading.

Program 7.3 shows the revised vertex and fragment shaders for converting the Phong shading example shown in Program 7.2 to Blinn-Phong shading. As before, there is no change to the Java/JOGL code.

Program 7.3 Torus with Blinn-Phong Shading

Vertex Shader
...
// half-vector "H" is an additional output varying
out vec3 varyingHalfVector;
...

void main(void)
{  // computations same as before, plus the following that computes L+V
   varyingHalfVector = (varyingLightDir + (-varyingVertPos)).xyz;
   // (the rest of the vertex shader is unchanged)
}

Fragment Shader
...
in vec3 varyingHalfVector;
...

void main(void)
{  // note that it is no longer necessary to compute R in the fragment shader
   vec3 L = normalize(varyingLightDir);
   vec3 N = normalize(varyingNormal);
   vec3 V = normalize(-varyingVertPos);
   vec3 H = normalize(varyingHalfVector);
   ...
   // get angle between the normal and the halfway vector
   float cosPhi = dot(H, N);
   // halfway vector H was computed in the vertex shader, and then interpolated.
   vec3 ambient = ((globalAmbient * material.ambient) + (light.ambient * material.ambient)).xyz;


   vec3 diffuse = light.diffuse.xyz * material.diffuse.xyz * max(cosTheta, 0.0);
   vec3 specular = light.specular.xyz * material.specular.xyz * pow(max(cosPhi, 0.0), material.shininess * 3.0);
   // the multiplication by 3.0 at the end is a "tweak" to improve the specular highlight.
   fragColor = vec4((ambient + diffuse + specular), 1.0);
}

Figure 7.16 shows two examples of the effect of Phong shading on more complex externally generated models. The top image shows a rendering of an OBJ model of a dolphin created by Jay Turberville at Studio 522 Productions [TU16]. The bottom image is a rendering of the well-known "Stanford Dragon," the result of a 3D scan of an actual figurine, done in 1996 [ST96]. Both models were rendered using the "gold" material from graphicslib3D. The Stanford dragon is widely used for testing graphics algorithms and hardware because of its size; it contains over 800,000 triangles.

Figure 7.16 External models with Phong shading.

7.6 COMBINING LIGHTING AND TEXTURES


So far, our lighting model has assumed that we are using lights with specified ADS values to illuminate objects made of material that has also been defined with ADS values. However, as we saw in Chapter 5, some objects may instead have surfaces defined by texture images. Therefore, we need a way of combining colors retrieved by sampling a texture and colors produced from a lighting model.

The manner in which we combine lighting and textures depends on the nature of the object and the purpose of its texture. There are several scenarios, a few of which include:

- The texture image very closely reflects the actual appearance of the object's surface.
- The object has both a material and a texture.
- The texture contains shadow or reflection information (covered in Chapters 8 and 10).
- There are multiple lights, and/or multiple textures involved.

Let's consider the first case, where we have a simple textured object and we wish to add lighting to it. One simple way of accomplishing this in the fragment shader is to remove the material specification entirely, and to use the texel color returned from the texture sampler in place of the material ADS values. The following is one such strategy (expressed in pseudocode):

fragColor = textureColor * (ambientLight + diffuseLight) + specularLight

Here the texture color contributes to the ambient and diffuse computation, while the specular color is defined entirely by the light. It is common to set the specular contribution solely based on the light color, especially for metallic or "shiny" surfaces. However, some less shiny surfaces, such as cloth or unvarnished wood (and even a few metals, such as gold), have specular highlights that include the color of the object surface. In those cases, a suitable, slightly modified strategy would be:

fragColor = textureColor * (ambientLight + diffuseLight + specularLight)

There are also cases in which an object has an ADS material that is supplemented by a texture image, such as an object made of silver that has a texture that adds some tarnish to the surface. In those situations, the standard ADS model with both light and material, as described in previous sections, can be combined with the texture color using a weighted sum. For example:

textureColor = texture(sampler, texCoord)
lightColor = (ambLight * ambMaterial) + (diffLight * diffMaterial) + specLight
fragColor = 0.5 * textureColor + 0.5 * lightColor

This strategy for combining lighting, materials, and textures can be extended to scenes involving multiple lights and/or multiple textures. For example:

texture1Color = texture(sampler1, texCoord)
texture2Color = texture(sampler2, texCoord)
light1Color = (ambLight1 * ambMaterial) + (diffLight1 * diffMaterial) + specLight1
light2Color = (ambLight2 * ambMaterial) + (diffLight2 * diffMaterial) + specLight2
fragColor = 0.25 * texture1Color
          + 0.25 * texture2Color
          + 0.25 * light1Color
          + 0.25 * light2Color

Figure 7.17 shows the Studio 522 Dolphin with a UV-mapped texture image (produced by Jay Turberville [TU16]), and the NASA shuttle model we saw earlier in Chapter 5. Both textured models are enhanced with Blinn-Phong lighting, without the inclusion of materials, and with specular highlights that utilize light only. In both cases, the relevant output color computation in the fragment shader is:

vec4 texColor = texture(sampler, texCoord);
fragColor = texColor * (globalAmbient + lightAmb + lightDiff * max(dot(L,N), 0.0))
          + lightSpec * pow(max(dot(H,N), 0.0), matShininess * 3.0);

Note that it is possible for the computation that determines fragColor to produce values greater than 1.0. When that happens, OpenGL clamps the computed value to 1.0.
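This clamping behavior can be illustrated with a small plain-Java sketch (ClampDemo and clampColor are our names, for illustration only; the real clamping happens inside OpenGL):

```java
// Illustration of the per-channel clamping OpenGL applies to out-of-range colors.
public class ClampDemo {
    // limit each color channel to the displayable range [0, 1]
    static float[] clampColor(float[] c) {
        float[] out = new float[c.length];
        for (int i = 0; i < c.length; i++) {
            out[i] = Math.max(0.0f, Math.min(1.0f, c[i]));
        }
        return out;
    }
    public static void main(String[] args) {
        // a fragment color whose red channel overflowed during the lighting sum
        float[] frag = { 1.3f, 0.8f, 0.2f, 1.0f };
        float[] clamped = clampColor(frag);
        System.out.println(clamped[0]); // prints 1.0
    }
}
```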

SUPPLEMENTAL NOTES

The faceted-shaded torus shown in Figure 7.7 was created by adding the "flat" interpolation qualifier to the corresponding normal vector vertex attribute declarations in the vertex and fragment shaders. This instructs the rasterizer not to perform interpolation on the specified variable, and instead to assign the same value for each fragment (taken from the triangle's provoking vertex, which in OpenGL defaults to the last vertex of the triangle). In the Phong shading example, this could be done as follows:

flat out vec3 varyingNormal; in the vertex shader, and flat in vec3 varyingNormal; in the fragment shader.


Figure 7.17 Combining lighting and textures.

An important kind of light source that we haven't discussed is a distributed light or area light, which is a light characterized by having a source that occupies an area rather than being a single point location. A real-world example would be a fluorescent tube-style light commonly found in an office or classroom. Interested readers can find more details about area lights in [MH02].

HISTORICAL NOTE

We took the liberty of over-simplifying some of the terminology in this chapter with respect to the contributions of Gouraud and Phong. Gouraud is credited with Gouraud shading: the notion of generating a smoothly curved surface appearance by computing light intensities at vertices and allowing the rasterizer to interpolate these values (sometimes called "smooth shading"). Phong is credited with Phong shading, another form of smooth shading that instead interpolates normals and computes lighting per pixel. Phong is also credited with pioneering the successful incorporation of specular highlights into smooth shading. For this reason, the ADS lighting model, when applied to computer graphics, is often referred to as the Phong Reflection Model. So, our example of Gouraud shading is, more accurately, Gouraud shading with a Phong reflection model. Since Phong's reflection model has become so ubiquitous in 3D graphics programming, it is common to demonstrate Gouraud shading in the presence of Phong reflection, although it is a bit misleading because Gouraud's original 1971 work did not, for example, include any specular component.

Exercises

7.1 (PROJECT) Modify Program 7.1 so that the light can be positioned by moving the mouse. After doing this, move the mouse around and note the movement of the specular highlight and the appearance of the Gouraud shading artifacts. You may find it convenient to render a point (or small object) at the location of the light source.

7.2 Repeat Exercise 7.1, but applied to Program 7.2. This should only require substituting the shaders for Phong shading into your solution to Exercise 7.1. The improvement from Gouraud to Phong shading should be even more apparent here, when the light is being moved around.

7.3 (PROJECT) Modify Program 7.2 so that it incorporates TWO positional lights, placed in different locations. The fragment shader will need to blend the diffuse and specular contributions of each of the lights. Try using a weighted sum, similar to the one shown in Section 7.6. You can also try simply adding them, and clamping the result so it doesn't exceed the maximum light value.

7.4 (RESEARCH AND PROJECT) Replace the positional light in Program 7.2 with a "spot" light, as described in Section 7.2.

References

[BA16] N.Barradeu,accessedJuly2016,http://www.barradeau.com/nicoptere/dump/materials.html.

[BL77]J.Blinn,“ModelsofLightReflectionforComputerSynthesizedPictures,”Proceedingsofthe4thAnnualConferenceonComputerGraphicsandInteractiveTechniques,1977.

[DI16] LuxoJr.(Pixar–copyrightheldbyDisney),accessedJuly2016,http://www.pixar.com/short_films/Theatrical-Shorts/Luxo-Jr.#.

[GO71] H.Gouraud,“ContinuousShadingofCurvedSurfaces,”IEEETransactionsonComputersC-20,no.6(June1971).

[MH02] T.Akenine-Möller,andE.Haines,Real-TimeRendering,2nded.(A.K.Peters,2002).

[PH73] B.Phong,“IlluminationofComputer-GeneratedImages,”(PhDthesis,UniversityofUtah,1973).

[PH75] B.Phong,“IlluminationforComputerGeneratedPictures,”CommunicationsoftheACM18,no.6(June1975):311–317.

[ST96] StanfordComputerGraphicsLaboratory,accessedJuly2016,http://graphics.stanford.edu/data/3Dscanrep/.


[TU16] J. Turberville, Studio 522 Productions, Scottsdale, AZ, www.studio522.com (dolphin model developed 2016).


CHAPTER 8

SHADOWS

8.1 The Importance of Shadows
8.2 Projective Shadows
8.3 Shadow Volumes
8.4 Shadow Mapping
8.5 A Shadow Mapping Example
8.6 Shadow Mapping Artifacts

Supplemental Notes

8.1 THE IMPORTANCE OF SHADOWS

In Chapter 7, we learned how to add lighting to our 3D scenes. However, we didn't actually add light; instead, we simulated the effects of light on objects, using the ADS model, and modified how we drew those objects accordingly.

The limitations of this approach become apparent when we use it to light more than one object in the same scene. Consider the scene in Figure 8.1, which includes both our brick-textured torus and a ground plane (the ground plane is the top of a giant cube with a grass texture [LU16]).

At first glance our scene may appear reasonable. However, closer examination reveals that there is something very important missing. In particular, it is impossible to discern the distance between the torus and the large textured cube below it. Is the torus floating above the cube, or is it resting on top of the cube?


Figure 8.1 Scene without shadows.

We cannot answer this question because of the lack of shadows in the scene. We expect to see shadows, and our brain uses shadows to help build a more complete mental model of the objects we see and where they are located.

Consider the same scene, shown in Figure 8.2, with shadows incorporated. It is now obvious that the torus is resting on the ground plane in the left example, and floating above it in the right example.

8.2 PROJECTIVE SHADOWS

A variety of interesting methods have been devised for adding shadows to 3D scenes. One method that is well suited to drawing shadows on a ground plane (such as our image in Figure 8.1), and relatively computationally inexpensive, is called projective shadows. Given a point light source position (XL, YL, ZL), an object to render, and a plane on which the object's shadow is to be cast, it is possible to derive a transformation matrix that will convert points (XW, YW, ZW) on the object to corresponding shadow points (XS, 0, ZS) on the plane. The resulting "shadow polygon" is then drawn, typically as a dark object blended with the texture on the ground plane, as illustrated in Figure 8.3.
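As a sketch of one standard derivation (our working, assuming the receiving plane is y = 0 and writing the light position in lowercase as (x_l, y_l, z_l)): intersecting the line from the light through each object point (x_w, y_w, z_w) with the plane gives, by similar triangles,

```latex
x_s = \frac{x_l\,y_w - x_w\,y_l}{y_w - y_l},
\qquad
z_s = \frac{z_l\,y_w - z_w\,y_l}{y_w - y_l}
```

which can be packaged as a single homogeneous matrix applied to \((x_w, y_w, z_w, 1)^T\):

```latex
\begin{pmatrix} x_s\,w \\ 0 \\ z_s\,w \\ w \end{pmatrix}
=
\begin{pmatrix}
-y_l & x_l & 0 & 0 \\
0 & 0 & 0 & 0 \\
0 & z_l & -y_l & 0 \\
0 & 1 & 0 & -y_l
\end{pmatrix}
\begin{pmatrix} x_w \\ y_w \\ z_w \\ 1 \end{pmatrix},
\qquad w = y_w - y_l
```

so the usual perspective divide by w lands every object point on the plane y = 0.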

The advantages of projective shadow casting are that it is efficient and simple to implement. However, it only works on a flat plane; the method can't be used to cast shadows on a curved surface or on other objects. It is still useful for performance-intensive applications involving outdoor scenes, such as in many video games.

Figure 8.2 Lighting with shadows.

Figure 8.3 Projective shadow.

Development of projective shadow transformation matrices is discussed in [BL88], [AS14], and [KS16].

8.3 SHADOW VOLUMES

Another important method, proposed by Crow in 1977, is to identify the spatial volume shadowed by an object, and reduce the color intensity of polygons inside the intersection of the shadow volume with the view volume [CR77]. Figure 8.4 shows a cube in a shadow volume, so the cube would be drawn darker.

Shadow volumes have the advantage of being highly accurate, with fewer artifacts than other methods. However, finding the shadow volume, and then computing whether each polygon is inside of it, is computationally expensive even on modern GPU hardware. Geometry shaders can be used to generate shadow volumes, and the stencil buffer can be used to determine whether a pixel is within the volume. Some graphics cards include hardware support for optimizing certain shadow volume operations.


Figure 8.4 Shadow volume.

8.4 SHADOW MAPPING

One of the most practical and popular methods for casting shadows is called shadow mapping. Although it is not always as accurate as shadow volumes (and is often accompanied by pesky artifacts), shadow mapping is easier to implement, can be used in a wide variety of situations, and enjoys powerful hardware support.

We would be remiss if we failed to clarify our use of the word "easier" in the previous paragraph. Although shadow mapping is simpler than shadow volumes (both conceptually and in practice), it is by no means "easy"! Students often find shadow mapping among the most difficult techniques to implement in a 3D graphics course. Shader programs are by nature difficult to debug, and shadow mapping requires the perfect coordination of several components and shader modules. Be advised that successful implementation of shadow mapping will be greatly facilitated by liberal use of the debugging tools described earlier in Section 2.2.

Shadow mapping is based on a very simple and clever idea: namely, anything that cannot be seen by the light is in shadow. That is, if object #1 blocks the light from reaching object #2, it is the same as the light not being able to "see" object #2.

The reason this idea is so powerful is that we already have a method for determining if something can be "seen": the hidden surface removal algorithm (HSR) using the Z-buffer, as described in Section 2.1.7. So, a strategy for finding shadows is to temporarily move the camera to the location of the light, apply the Z-buffer HSR algorithm, and then use the resulting depth information to find shadows.

Rendering our scene will require two passes: one to render the scene from the point of view of the light (but not actually drawing it to the screen), and a second pass to render it from the point of view of the camera. The purpose of pass one is to generate a Z-buffer from the light's point of view. After completing pass one, we need to retain the Z-buffer and use it to help us generate shadows in pass two. Pass two actually draws the scene.

Our strategy is now becoming more refined:

("pass 1") Render the scene from the light's position. The depth buffer then contains, for each pixel, the distance between the light and the nearest object to it.

Copy the depth buffer to a separate "shadow buffer."

("pass 2") Render the scene normally. For each pixel, look up the corresponding position in the shadow buffer. If the distance to the point being rendered is greater than the value retrieved from the shadow buffer, then the object being drawn at this pixel is farther from the light than the object nearest the light, and therefore this pixel is in shadow.

When a pixel is found to be in shadow, we need to make it darker. One simple and effective way of doing this is to render only its ambient lighting, ignoring its diffuse and specular components.
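The pass-two depth comparison can be sketched in plain Java (ShadowTest and isInShadow are hypothetical names of ours; the real test runs per fragment in GLSL, and practical implementations add a small bias term to avoid false self-shadowing):

```java
// Sketch of the per-pixel shadow test described in the strategy above.
public class ShadowTest {
    // distToLight: depth of the point being rendered, as seen from the light
    // shadowDepth: nearest depth stored in the shadow buffer for this pixel
    // bias: small offset that suppresses false self-shadowing ("shadow acne")
    static boolean isInShadow(float distToLight, float shadowDepth, float bias) {
        return distToLight > shadowDepth + bias;
    }
    public static void main(String[] args) {
        // an occluder at depth 0.40 shadows a surface behind it at depth 0.75
        System.out.println(isInShadow(0.75f, 0.40f, 0.005f)); // prints true
        // the occluder itself is lit, not shadowed
        System.out.println(isInShadow(0.40f, 0.40f, 0.005f)); // prints false
    }
}
```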

The method described above is often called "shadow buffering." The term "shadow mapping" arises when, in the second step, we instead copy the depth buffer into a texture. When a texture object is used in this way, we will refer to it as a shadow texture, and OpenGL has support for shadow textures in the form of a sampler2DShadow type (discussed below). This allows us to leverage the power of hardware support for texture units and sampler variables (i.e., "texture mapping") in the fragment shader to quickly perform the depth lookup in pass 2. Our revised strategy now is:

• (pass 1) as before.
• Copy the depth buffer into a texture.
• (pass 2) as before, except that the shadow buffer is now a shadow texture.

Let's now implement these steps.

8.4.1 Shadow Mapping (PASS ONE) – "Draw" Objects from Light Position

In step one, we first move our camera to the light's position, and then render the scene. Our goal here is not to actually draw the scene on the display, but to complete just enough of the rendering process that the depth buffer is properly filled. Thus it will not be necessary to generate colors for the pixels, and so our first pass will utilize a vertex shader, but will not require a fragment shader.


Figure 8.5 Shadow mapping pass 1 vertex shader.

Of course, moving the camera involves constructing an appropriate view matrix. Depending on the contents of the scene, we will need to decide on an appropriate direction to view the scene from the light. Typically, we would want this direction to be towards the region that would ultimately be rendered in step 3. This is often application specific; in our scenes, we will generally be pointing the camera from the light to the origin.

Here are some of the details handled in pass one:

• Configure the buffer and shadow texture.
• Disable color output.
• Build a look-at matrix from the light towards the objects in view.
• Enable the GLSL pass one shader program, containing only the simple vertex shader shown in Figure 8.5 that expects to receive an MVP matrix. In this case, the MVP matrix will include the object's model matrix M, the look-at matrix computed in the previous step (serving as the view matrix V), and the perspective matrix P. We call this MVP matrix "shadowMVP" because it is based on the point of view of the light rather than the camera. Since the view from the light isn't actually being displayed, the pass one shader program doesn't include a fragment shader.
• For each object, create the shadowMVP matrix, and call glDrawArrays(). It is not necessary to include textures or lighting in pass one, because objects are not rendered to the screen.
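The matrix concatenation described in the last two bullets can be sketched in plain Java. The following is a minimal stand-in for the book's Matrix3D class (all names here are hypothetical) showing the shadowMVP = P(light) * V(light) * M ordering:

```java
// Sketch: building shadowMVP = P_light * V_light * M with plain row-major
// 4x4 arrays (a stand-in for the book's Matrix3D class; names hypothetical).
public class ShadowMVP {
    static double[][] mul(double[][] a, double[][] b) {
        double[][] r = new double[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    static double[][] identity() {
        double[][] m = new double[4][4];
        for (int i = 0; i < 4; i++) m[i][i] = 1.0;
        return m;
    }
    static double[][] translate(double x, double y, double z) {
        double[][] t = identity();
        t[0][3] = x; t[1][3] = y; t[2][3] = z;
        return t;
    }
    public static void main(String[] args) {
        // With P = identity for simplicity: shadowMVP = P * V * M.
        double[][] V = translate(0, 0, -5);   // "camera" moved to the light
        double[][] M = translate(1, 2, 3);    // object's model transform
        double[][] shadowMVP = mul(identity(), mul(V, M));
        // The model-space origin lands at (1, 2, -2) in light space.
        System.out.println(shadowMVP[0][3] + " " + shadowMVP[1][3] + " " + shadowMVP[2][3]);
    }
}
```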

8.4.2 Shadow Mapping (Intermediate Step) – Copying the Z-Buffer to a Texture

OpenGL offers two methods for putting Z-buffer depth data into a texture unit. The first method is to generate an empty shadow texture, and then use the command glCopyTexImage2D() to copy the active depth buffer into the shadow texture.

The second method is to build a "custom framebuffer" back in pass one (rather than use the default Z-buffer), and attach the shadow texture to it using the command glFramebufferTexture(). This command was introduced into OpenGL in version 3.0, to further support shadow mapping. When using this approach, it isn't necessary to "copy" the Z-buffer into a texture, because the buffer already has a texture attached to it, and so the depth information is put into the texture by OpenGL automatically. This is the method we will use in our implementation.

8.4.3 Shadow Mapping (PASS TWO) – Rendering the Scene with Shadows

Much of pass two will resemble what we saw in Chapter 7. Namely, it is here that we render our complete scene and all of the items in it, along with the lighting, materials, and any textures adorning the objects in the scene. We also need to add the necessary code to determine, for each pixel, whether or not it is in shadow.

An important feature of pass two is that it utilizes two MVP matrices. One is the standard MVP matrix that transforms object coordinates into screen coordinates (as seen in most of our previous examples). The other is the shadowMVP matrix that was generated in pass one for use in rendering from the light's point of view; this will now be used in pass two for looking up depth information from the shadow texture.

A complication arises in pass two when we try to look up pixels in a texture map. The OpenGL camera utilizes a [-1,+1] coordinate space, whereas texture maps utilize a [0,1] space. A common solution is to build an additional matrix transform, typically called B, that converts (or "biases," hence the name) from camera space to texture space. Deriving B is fairly simple: a scale by ½ followed by a translate by ½.

The B matrix is as follows (these values match the ones loaded into b in init(), shown later in Program 8.1):

    [ 0.5   0     0     0.5 ]
    [ 0     0.5   0     0.5 ]
    [ 0     0     0.5   0.5 ]
    [ 0     0     0     1.0 ]

B is then concatenated onto the shadowMVP matrix for use in pass two, as follows:

shadowMVP2 = [B] [shadowMVP(pass 1)]
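A small self-contained sketch (plain Java, not the book's Matrix3D class) confirms that B maps the [-1,+1] range to [0,1]:

```java
// Sketch: the bias matrix B (scale by 1/2, then translate by 1/2) maps
// OpenGL's [-1,+1] clip-space range into the [0,1] texture-space range.
public class BiasMatrix {
    static final double[][] B = {
        {0.5, 0.0, 0.0, 0.5},
        {0.0, 0.5, 0.0, 0.5},
        {0.0, 0.0, 0.5, 0.5},
        {0.0, 0.0, 0.0, 1.0}
    };
    // Multiply B by a homogeneous point (x, y, z, w).
    static double[] apply(double[] p) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++)
            for (int k = 0; k < 4; k++)
                r[i] += B[i][k] * p[k];
        return r;
    }
    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(apply(new double[]{-1, -1, -1, 1}))); // [0.0, 0.0, 0.0, 1.0]
        System.out.println(java.util.Arrays.toString(apply(new double[]{ 1,  1,  1, 1}))); // [1.0, 1.0, 1.0, 1.0]
    }
}
```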

Assuming that we use the method whereby a shadow texture has been attached to our custom framebuffer, OpenGL provides some relatively simple tools for determining whether each pixel is in shadow as we draw the objects. Here is a summary of the details handled in pass two:

• Build the "B" transform matrix for converting from light to texture space (actually this is more appropriately done in init()).
• Enable the shadow texture for look-up.
• Enable color output.
• Enable the GLSL pass two rendering program, containing both vertex and fragment shaders.
• Build the MVP matrix for the object being drawn based on the camera position (as normal).
• Build the shadowMVP2 matrix (incorporating the B matrix, as described earlier); the shaders will need it to look up pixel coordinates in the shadow texture.


• Send the matrix transforms to shader uniform variables.
• Enable buffers containing vertices, normal vectors, and texture coordinates (if used), as usual.
• Call glDrawArrays().

In addition to their rendering duties, the vertex and fragment shaders have additional tasks:

• The vertex shader converts vertex positions from camera space to light space, and sends the resulting coordinates to the fragment shader in a vertex attribute so that they will be interpolated. This makes it possible to retrieve the correct values from the shadow texture.
• The fragment shader calls the textureProj() function, which returns a 0 or 1 indicating whether or not the pixel is in shadow (this mechanism is explained later). If it is in shadow, the shader darkens the pixel by not including its diffuse and specular contributions.

Shadow mapping is such a common task that GLSL provides a special type of sampler variable called a sampler2DShadow (as previously mentioned) that can be attached to a shadow texture in the Java/JOGL application. The textureProj() function is used to look up values from a shadow texture, and is similar to the texture() function we saw previously in Chapter 5, except that it uses a vec3 to index the texture rather than the usual vec2. Since a pixel coordinate is a vec4, it is necessary to project it onto 2D texture space in order to look up the depth value in the shadow texture map. As we will see below, textureProj() does all of this for us.
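What textureProj() does for a sampler2DShadow can be sketched in plain Java (a simplification with hypothetical names; real GLSL also applies filtering): perspective-divide the vec4 light-space coordinate, fetch the stored depth at (x/w, y/w), and compare it against z/w.

```java
// Sketch of the projective shadow lookup: divide by w, fetch the stored
// nearest-to-light depth, and apply a GL_LEQUAL comparison.
public class TextureProjSketch {
    // shadowTex stands in for the shadow texture: stored nearest-to-light depths.
    static float textureProj(float[][] shadowTex, float[] coord /* x,y,z,w */) {
        float s   = coord[0] / coord[3];
        float t   = coord[1] / coord[3];
        float ref = coord[2] / coord[3];
        int i = Math.min(shadowTex.length - 1,    Math.max(0, (int)(t * shadowTex.length)));
        int j = Math.min(shadowTex[0].length - 1, Math.max(0, (int)(s * shadowTex[0].length)));
        // GL_LEQUAL comparison: 1.0 = lit, 0.0 = in shadow.
        return ref <= shadowTex[i][j] ? 1.0f : 0.0f;
    }
    public static void main(String[] args) {
        float[][] tex = {{0.5f, 0.5f}, {0.5f, 0.5f}};  // nearest occluder depth = 0.5 everywhere
        System.out.println(textureProj(tex, new float[]{0.5f, 0.5f, 0.4f, 1.0f})); // 1.0 (lit)
        System.out.println(textureProj(tex, new float[]{0.5f, 0.5f, 0.8f, 1.0f})); // 0.0 (shadowed)
    }
}
```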

The remainder of the vertex and fragment shader code implements Blinn-Phong shading. They are shown in Figure 8.6 and Figure 8.7, with the added code for shadow mapping highlighted.

Let's examine more closely how we use OpenGL to perform the depth comparison between the pixel being rendered and the value in the shadow texture. We start in the vertex shader with a vertex coordinate in eye space, which we multiply by shadowMVP2 to produce the corresponding coordinate in "light space" (that is, from the light's point of view). The interpolated (3D) light space coordinates (x, y, z) are used in the fragment shader as follows. The z component represents the distance from the light to the pixel. The (x, y) components are used to retrieve the depth information stored in the (2D) shadow texture. This retrieved value (the distance to the object nearest the light) is compared with z. This comparison produces a "binary" result that tells us whether the pixel we are rendering is farther from the light than the object nearest the light (i.e., whether the pixel is in shadow).


Figure 8.6 Shadow mapping pass 2 vertex shader.

Figure 8.7 Shadow mapping pass 2 fragment shader.


If in OpenGL we use glFramebufferTexture() as described earlier, and we enable depth testing, then using a sampler2DShadow and textureProj() as shown in the fragment shader (Figure 8.7) will do exactly what we need. That is, textureProj() will output either a 0 or a 1 depending on the depth comparison. Based on this value, we can then in the fragment shader omit the diffuse and specular contributions when the pixel is farther from the light than the object nearest the light, effectively creating the shadow when appropriate. An overview is shown in Figure 8.8.

Figure 8.8 Automatic depth comparison.

We are now ready to build our JOGL application to work with the previously described shaders.

8.5 A SHADOW MAPPING EXAMPLE

Consider the scene in Figure 8.9 that includes a torus and a pyramid. A positional light has been placed on the left (note the specular highlights; the pyramid should cast a shadow on the torus).


Figure 8.9 Lighted scene without shadows.

To clarify the development of the example, our first step will be to render pass one to the screen to make sure it is working properly. To do this, we will temporarily add a simple fragment shader (it will not be included in the final version) to pass one that just outputs a constant color (e.g., red); for example:

#version 430
out vec4 fragColor;
void main(void)
{ fragColor = vec4(1.0, 0.0, 0.0, 0.0);
}

Let's assume that the origin of the scene above is situated at the center of the figure, in between the pyramid and the torus. In pass one we place the camera at the light's position (at the left in Figure 8.10) and point it towards (0,0,0). If we then draw the objects in red, it produces the output shown at the right in Figure 8.10. Note the torus near the top; from this vantage point it is partially behind the pyramid.

Figure 8.10 Pass one: Scene (left) from light's point of view (right).

The complete two-pass JOGL code with lighting and shadow mapping is shown in Program 8.1:

Program 8.1 Shadow Mapping

// Much is the same as we have seen before. New sections to support shadows are highlighted.
// The imports necessary for lighting, etc., would be included at the start,
// are the same as before, and are not shown here.

public class Code extends JFrame implements GLEventListener
{	// variables for lights, materials, objects, rendering programs, uniforms,
	// buffers, shader sources, etc., would go here.
	...
	private Torus myTorus = new Torus(0.6f, 0.4f, 48);
	...


	// set location of torus, pyramid, camera, and light
	private Point3D torusLoc = new Point3D(1.6, 0.0, -0.3);
	private Point3D pyrLoc = new Point3D(-1.0, 0.1, 0.3);
	private Point3D cameraLoc = new Point3D(0.0, 0.2, 6.0);
	private Point3D lightLoc = new Point3D(-3.8f, 2.2f, 1.1f);

	// light and camera view matrix transforms are all declared here
	// (m_matrix, v_matrix, etc.) and are of type Matrix3D.
	...

	// shadow-related variables
	private int screenSizeX, screenSizeY;
	private int[] shadow_tex = new int[1];
	private int[] shadow_buffer = new int[1];
	private Matrix3D lightV_matrix = new Matrix3D();
	private Matrix3D lightP_matrix = new Matrix3D();
	private Matrix3D shadowMVP = new Matrix3D();
	private Matrix3D shadowMVP2 = new Matrix3D();
	private Matrix3D b = new Matrix3D();

	public Code()
	{	// The constructor is unchanged from Program 7.3, and so it is not shown here.
		// This example assumes that FPSAnimator() is being used.
	}

	// The init() routine performs the usual calls to compile shaders and initialize objects.
	// It also calls setupShadowBuffers() to instantiate the buffers related to shadow mapping.
	// Lastly, it builds the B matrix for converting from light space to texture space.
	public void init(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		createShaderProgram();
		setupVertices();
		setupShadowBuffers();
		b.setElementAt(0, 0, 0.5); b.setElementAt(0, 1, 0.0);
		b.setElementAt(0, 2, 0.0); b.setElementAt(0, 3, 0.5f);
		b.setElementAt(1, 0, 0.0); b.setElementAt(1, 1, 0.5);
		b.setElementAt(1, 2, 0.0); b.setElementAt(1, 3, 0.5f);
		b.setElementAt(2, 0, 0.0); b.setElementAt(2, 1, 0.0);
		b.setElementAt(2, 2, 0.5); b.setElementAt(2, 3, 0.5f);
		b.setElementAt(3, 0, 0.0); b.setElementAt(3, 1, 0.0);
		b.setElementAt(3, 2, 0.0); b.setElementAt(3, 3, 1.0f);
	}

	public void setupShadowBuffers()
	{	GL4 gl = (GL4) GLContext.getCurrentGL();
		screenSizeX = myCanvas.getWidth();
		screenSizeY = myCanvas.getHeight();

		// create the custom framebuffer
		gl.glGenFramebuffers(1, shadow_buffer, 0);

		// create the shadow texture and configure it to hold depth information.
		// These steps are similar to those in Program 5.2.
		gl.glGenTextures(1, shadow_tex, 0);
		gl.glBindTexture(GL_TEXTURE_2D, shadow_tex[0]);
		gl.glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32,
			screenSizeX, screenSizeY, 0, GL_DEPTH_COMPONENT, GL_FLOAT, null);
		gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
		gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_REF_TO_TEXTURE);
		gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
	}

	// display() manages the setup of the custom framebuffer and the shadow texture
	// in preparation for pass 1 and pass 2 respectively. New shadow-related features are highlighted.
	public void display(GLAutoDrawable drawable)
	{	GL4 gl = (GL4) GLContext.getCurrentGL();

		// set up light and perspective matrix as before
		currentLight.setPosition(lightLoc);
		aspect = (float) myCanvas.getWidth() / (float) myCanvas.getHeight();
		proj_matrix = perspective(50.0f, aspect, 0.1f, 1000.0f);

		// clear the color buffer to the desired background color
		FloatBuffer bg = FloatBuffer.allocate(4);
		bg.put(0, 0.0f); bg.put(1, 0.0f); bg.put(2, 0.0f); bg.put(3, 1.0f);
		gl.glClearBufferfv(GL_COLOR, 0, bg);

		// make the custom framebuffer current, and associate it with the shadow texture
		gl.glBindFramebuffer(GL_FRAMEBUFFER, shadow_buffer[0]);
		gl.glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, shadow_tex[0], 0);

		// disable drawing colors, but enable the depth computation
		gl.glDrawBuffer(GL_NONE);
		gl.glEnable(GL_DEPTH_TEST);

		passOne();

		// restore the default display buffer, and re-enable drawing
		gl.glBindFramebuffer(GL_FRAMEBUFFER, 0);
		gl.glActiveTexture(GL_TEXTURE0);
		gl.glBindTexture(GL_TEXTURE_2D, shadow_tex[0]);
		gl.glDrawBuffer(GL_FRONT);	// drawing only front faces allows back face culling

		passTwo();
	}

	// What follows now are the modules for the first and second passes.
	// They are largely identical to things we have seen before.
	// Shadow-related additions are highlighted.


	public void passOne()
	{	GL4 gl = (GL4) GLContext.getCurrentGL();

		// rendering_program1 contains only the pass one vertex shader
		gl.glUseProgram(rendering_program1);

		// build the light's P and V matrices to look-at the origin
		Point3D origin = new Point3D(0.0, 0.0, 0.0);
		Vector3D up = new Vector3D(0.0, 1.0, 0.0);
		lightV_matrix.setToIdentity();
		lightP_matrix.setToIdentity();
		lightV_matrix = lookAt(currentLight.getPosition(), origin, up);	// vector from light to origin
		lightP_matrix = perspective(50.0f, aspect, 0.1f, 1000.0f);

		// the following blocks of code render the torus into the depth buffer
		m_matrix.setToIdentity();
		m_matrix.translate(torusLoc.getX(), torusLoc.getY(), torusLoc.getZ());
		m_matrix.rotateX(25.0);

		// we are drawing from the light's point of view, so we use the light's P and V matrices
		shadowMVP.setToIdentity();
		shadowMVP.concatenate(lightP_matrix);
		shadowMVP.concatenate(lightV_matrix);
		shadowMVP.concatenate(m_matrix);
		int shadow_location = gl.glGetUniformLocation(rendering_program1, "shadowMVP");
		gl.glUniformMatrix4fv(shadow_location, 1, false, shadowMVP.getFloatValues(), 0);

		// we only need to set up the torus vertices buffer; we don't need its
		// textures or normals for pass one.
		gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[0]);
		gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
		gl.glEnableVertexAttribArray(0);

		gl.glClear(GL_DEPTH_BUFFER_BIT);
		gl.glEnable(GL_CULL_FACE);
		gl.glFrontFace(GL_CCW);
		gl.glEnable(GL_DEPTH_TEST);
		gl.glDepthFunc(GL_LEQUAL);

		gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);

		// repeat for the pyramid (but don't clear the GL_DEPTH_BUFFER_BIT)
		...
		gl.glDrawArrays(GL_TRIANGLES, 0, pyramid.getNumVertices());
	}

	public void passTwo()
	{	GL4 gl = (GL4) GLContext.getCurrentGL();

		// rendering_program2 includes the pass two vertex and fragment shaders
		gl.glUseProgram(rendering_program2);

		// draw the torus; this time we need to include lighting, materials, normals, etc.
		// We also need to provide MVP transforms for BOTH camera space and light space.
		thisMaterial = Material.BRONZE;
		installLights(rendering_program2, v_matrix);
		mv_location = gl.glGetUniformLocation(rendering_program2, "mv_matrix");
		proj_location = gl.glGetUniformLocation(rendering_program2, "proj_matrix");
		n_location = gl.glGetUniformLocation(rendering_program2, "norm_matrix");
		int shadow_location = gl.glGetUniformLocation(rendering_program2, "shadowMVP2");

		// build the MV matrix from the camera's point of view
		m_matrix.setToIdentity(); v_matrix.setToIdentity(); mv_matrix.setToIdentity();
		m_matrix.translate(torusLoc.getX(), torusLoc.getY(), torusLoc.getZ());
		v_matrix.translate(-cameraLoc.getX(), -cameraLoc.getY(), -cameraLoc.getZ());
		mv_matrix.concatenate(v_matrix);
		mv_matrix.concatenate(m_matrix);

		// build the MVP matrix from the light's point of view
		shadowMVP2.setToIdentity();
		shadowMVP2.concatenate(b);
		shadowMVP2.concatenate(lightP_matrix);
		shadowMVP2.concatenate(lightV_matrix);
		shadowMVP2.concatenate(m_matrix);

		// put the MV and PROJ matrices into the corresponding uniforms
		gl.glUniformMatrix4fv(mv_location, 1, false, mv_matrix.getFloatValues(), 0);
		gl.glUniformMatrix4fv(proj_location, 1, false, proj_matrix.getFloatValues(), 0);
		gl.glUniformMatrix4fv(n_location, 1, false,
			(mv_matrix.inverse()).transpose().getFloatValues(), 0);
		gl.glUniformMatrix4fv(shadow_location, 1, false, shadowMVP2.getFloatValues(), 0);

		// set up torus vertices and normal buffers (and texture coordinate buffer if used)
		gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[0]);	// torus vertices
		gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
		gl.glEnableVertexAttribArray(0);
		gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[2]);	// torus normals
		gl.glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, 0);
		gl.glEnableVertexAttribArray(1);

		gl.glClear(GL_DEPTH_BUFFER_BIT);
		gl.glEnable(GL_CULL_FACE);
		gl.glFrontFace(GL_CCW);
		gl.glEnable(GL_DEPTH_TEST);
		gl.glDepthFunc(GL_LEQUAL);

		gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);

		// repeat for the pyramid (but don't clear the GL_DEPTH_BUFFER_BIT)
		...
		gl.glDrawArrays(GL_TRIANGLES, 0, pyramid.getNumVertices());
	}

Program 8.1 shows the relevant portions of the Java/JOGL application that interact with the pass one and pass two shaders previously detailed. Not shown are the usual modules for reading in and compiling the shaders, building the models and their related buffers, installing the positional light's ADS characteristics, and performing the perspective and look-at matrix computations. Those are unchanged from previous examples.

8.6 SHADOW MAPPING ARTIFACTS

Although we have implemented all of the basic requirements for adding shadows to our scene, running Program 8.1 produces mixed results, as shown in Figure 8.11.

The good news is that our pyramid is now casting a shadow on the torus! Unfortunately, this success is accompanied by a severe artifact. There are wavy lines covering many of the surfaces in the scene. This is a common by-product of shadow mapping, and is called "shadow acne," or erroneous self-shadowing.

Figure 8.11 Shadow "acne."

Shadow acne is caused by rounding errors during depth testing. The texture coordinates computed when looking up the depth information in a shadow texture often don't exactly match the actual coordinates. Thus, the lookup may return the depth for a neighboring pixel, rather than the one being rendered. If the distance to the neighboring pixel is farther, then our pixel will appear to be in shadow even if it isn't.

Shadow acne can also be caused by differences in precision between the texture map and the depth computation. This too can lead to rounding errors and the subsequent incorrect assessment of whether or not a pixel is in shadow.

Fortunately, fixing shadow acne is fairly easy. Since shadow acne typically occurs on surfaces that are not in shadow, a simple trick is to move every pixel slightly closer to the light during pass one, and then move them back to their normal positions for pass two. This is usually sufficient to compensate for either type of rounding error. An easy way is to call glPolygonOffset() in the display() function, as shown in Figure 8.12 (highlighted):

Figure 8.12 Combating shadow acne.

Adding these few lines of code to our display() function improves the output of our program considerably, as shown in Figure 8.13. Note also that the inner circle of the torus now displays a small, correctly cast shadow on itself.

Although fixing shadow acne is easy, sometimes the repair causes new artifacts. The "trick" of moving the object before pass one can sometimes cause a gap to appear inside an object's shadow. An example of this is shown in Figure 8.14. This artifact is often called "peter panning," because sometimes it causes the shadow of a resting object to inappropriately separate from the object's base (thus making portions of the object's shadow detach from the rest of the shadow, reminiscent of J. M. Barrie's character Peter Pan [PP16]). Fixing this artifact requires adjusting the glPolygonOffset() parameters. If they are too small, shadow acne can appear; if too large, peter panning happens.
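The effect of those parameters can be quantified. Per the OpenGL specification, glPolygonOffset(factor, units) shifts each fragment's depth by o = m * factor + r * units, where m is the polygon's maximum depth slope and r is the smallest resolvable depth difference. A small sketch (illustrative values, not from the book's code):

```java
// Sketch: the depth offset applied by glPolygonOffset(factor, units),
// per the OpenGL spec: o = m * factor + r * units, where m is the polygon's
// maximum depth slope and r the smallest resolvable depth difference.
public class PolygonOffset {
    static double offset(double factor, double units, double m, double r) {
        return m * factor + r * units;
    }
    public static void main(String[] args) {
        double m = 0.5;               // example depth slope of the polygon
        double r = 1.0 / (1 << 24);   // roughly the resolution of a 24-bit depth buffer
        // Larger factor/units push pass-one depths farther; too little and
        // acne remains, too much and peter panning appears.
        System.out.println(offset(2.0, 4.0, m, r));
    }
}
```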


Figure 8.13 Rendered scene with shadows.

There are many other artifacts that can happen during shadow mapping. For example, shadows can repeat as a result of the region of the scene being rendered in pass one (into the shadow buffer) being different from the region of the scene rendered in pass two (they are from different vantage points). Because of this difference, those portions of the scene rendered in pass two that fall outside the region rendered in pass one will attempt to access the shadow texture using texture coordinates outside of the range (0..1). Recall that the default behavior in this case is GL_REPEAT, which can result in incorrectly duplicated shadows.

One possible solution is to add the following lines of code to setupShadowBuffers(), to set the texture wrap mode to "clamp to edge":

gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
gl.glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

This causes values outside of a texture edge to be clamped to the value at the edge (instead of repeating). Note that this approach can introduce its own artifacts; namely, when a shadow exists at the edge of the shadow texture, clamping to the edge can produce a "shadow bar" extending to the edge of the scene.
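The difference between the two wrap modes can be sketched in plain Java (an illustration of the wrapping arithmetic, not JOGL calls):

```java
// Sketch: how GL_REPEAT vs. GL_CLAMP_TO_EDGE treat an out-of-range
// texture coordinate.
public class WrapModes {
    static double repeat(double s)      { return s - Math.floor(s); }               // GL_REPEAT
    static double clampToEdge(double s) { return Math.max(0.0, Math.min(1.0, s)); } // GL_CLAMP_TO_EDGE
    public static void main(String[] args) {
        double s = 1.25;   // a pass-two lookup that falls outside pass one's region
        System.out.println(repeat(s));      // 0.25 -- wraps around, duplicating shadows
        System.out.println(clampToEdge(s)); // 1.0  -- sticks to the edge value
    }
}
```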

Figure 8.14 "Peter panning."

Another common error is jagged shadow edges. This can happen when the shadow being cast is significantly larger than the shadow buffer can accurately represent. This usually depends on the location of the objects and light(s) in the scene. In particular, it commonly occurs when the light source is relatively distant from the objects involved. An example is shown in Figure 8.15.

Figure 8.15 Jagged shadow edges.

Eliminating jagged shadow edges is not as simple as for the previous artifacts. There are a number of effective approaches available, but these are outside the scope of this textbook. See the supplemental chapter notes for references to advanced techniques for improving shadow mapping.

SUPPLEMENTAL NOTES

In this chapter we have only given the most basic of introductions to the world of shadows in 3D graphics. Even using the basic method of shadow mapping presented here will likely require further study if used in more complex scenes.

For example, when adding shadows to a scene in which some of the objects are textured, it is necessary to ensure that the fragment shader properly distinguishes between the shadow texture and other textures. A simple way of doing this is to bind them to different texture units, such as:

layout (binding=0) uniform sampler2DShadow shTex;
layout (binding=1) uniform sampler2D otherTexture;

Then, the JOGL application can refer to the two samplers by their binding values.

When a scene utilizes multiple lights, multiple shadow textures are necessary, one for each light source. A pass one will need to be performed for each light, with the results blended in pass two.


Figure 8.16 Soft shadow real-world example.

Figure 8.17 Soft shadow effect.

Although we have used perspective projection at each phase of shadow mapping, it is worth noting that orthographic projection is often preferred when the light source is distant and directional, rather than the positional light we utilized.

Finally, the methods presented here are limited to producing hard shadows; these are shadows with sharp edges. However, most shadows that occur in the real world are soft shadows. That is, their edges are blurred to various degrees, such as in the real-world example shown in Figure 8.16. There are many causes of soft shadows, the most common being that light sources are rarely points; more often they occupy some area. This gives rise to the penumbra effect shown in Figure 8.17.

There are many ways of simulating the penumbra effect in soft shadows. For example, the shadow buffer can be sampled multiple times with a bit of randomness incorporated. Such methods incur some amount of overhead that can impact performance.

Generating realistic shadows is a rich and complex area of computer graphics, and many of the available techniques are outside the scope of this text. Readers interested in more detail are encouraged to investigate more specialized resources such as [GP10] and [MI16].


Exercises

8.1 In Program 8.1, experiment with different settings for glPolygonOffset(), and observe the effects on shadow artifacts such as peter panning.

8.2 (PROJECT) Modify Program 8.1 so that the light can be positioned by moving the mouse, similar to Exercise 7.1. You will probably notice that some lighting positions exhibit shadow artifacts, while others look fine.

8.3 (PROJECT) Add animation to Program 8.1, such that either the objects or the light (or both) move around on their own, such as one revolving around the other. The shadow effects will be more pronounced if you add a ground plane to the scene, such as the one illustrated in Figure 8.14.

References

[AS14] E. Angel and D. Shreiner, Interactive Computer Graphics: A Top-Down Approach with WebGL, 7th ed. (Pearson, 2014).

[BL88] J. Blinn, "Me and My (Fake) Shadow," IEEE Computer Graphics and Applications 8, no. 2 (1988).

[CR77] F. Crow, "Shadow Algorithms for Computer Graphics," Proceedings of SIGGRAPH '77 11, no. 2 (1977).

[GP10] GPU Pro (series), ed. Wolfgang Engel (A. K. Peters, Ltd., 2010–2016).

[KS16] J. Kessenich, G. Sellers, and D. Shreiner, OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 4.5 with SPIR-V, 9th ed. (Addison-Wesley, 2016).

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).

[MI16] Common Techniques to Improve Shadow Depth Maps (Microsoft Corp., 2016), accessed July 2016, https://msdn.microsoft.com/en-us/library/windows/desktop/ee416324(v=vs.85).aspx.

[PP16] Peter Pan, Wikipedia page, accessed July 2016, https://en.wikipedia.org/wiki/Peter_Pan.

1 The stencil buffer is a third buffer (along with the color buffer and the Z-buffer) accessible through OpenGL. The stencil buffer is not described in this textbook.


CHAPTER 9

SKY AND BACKGROUNDS

9.1 Skyboxes
9.2 Skydomes
9.3 Implementing a Skybox
9.4 Environment Mapping

Supplemental Notes

The realism in an outdoor 3D scene can often be improved by generating a realistic effect at the horizon. As we look beyond our nearby buildings and trees, we are accustomed to seeing large distant objects such as clouds, mountains, or the sun (or at night, the moon and stars). However, adding such objects to our scene as individual models may come at an unacceptable performance cost. A skybox or skydome provides a relatively simple way of efficiently generating a convincing horizon.

9.1 SKYBOXES

The concept of a skybox is a remarkably clever and simple one:

1. Instantiate a cube object.
2. Texture the cube with the desired environment.
3. Position the cube so it surrounds the camera.

We already know how to do all of these steps. There are a few additional details, however.

• How do we make the texture for our horizon?

A cube has six faces, and we will need to texture all of them. One way is to use six image files, and six texture units. Another common (and efficient) way is to use a single image that contains textures for all six faces, such as shown in Figure 9.1.

Figure 9.1 Six-faced skybox texture cube map.

An image that can texture all six faces of a cube with a single texture unit is an example of a texture cube map. The six portions of the cube map correspond to the top, bottom, front, back, and two sides of the cube. When "wrapped" around the cube, it acts as a horizon for a camera placed inside the cube, as shown below in Figure 9.2.

Figure 9.2 Cube map with camera placed inside.

Texturing the cube with a texture cube map requires specifying appropriate texture coordinates. Figure 9.3 shows the distribution of texture coordinates that are in turn assigned to each of the cube vertices.


Figure 9.3 Cube map texture coordinates.

• How do we make the skybox appear "distant"?

Another important factor in building a skybox is ensuring that the texture appears as a distant horizon. At first, one might assume this would require making the skybox very large. However, it turns out that this isn't desirable because it would stretch and distort the texture. Instead, it is possible to make the skybox appear very large (and thus distant) by using the following two-part trick:

• Disable depth testing, and render the skybox first (re-enabling depth testing when rendering the other objects in the scene).
• Move the skybox with the camera (if the camera moves).

By drawing the skybox first with depth testing disabled, the depth buffer will still be filled completely with 1.0's (i.e., maximally far away). Thus all other objects in the scene will be fully rendered; that is, none of the other objects will be blocked by the skybox. This causes the walls of the skybox to appear farther away than every other object, regardless of the actual size of the skybox. The actual skybox cube itself can be quite small, as long as it is moved along with the camera whenever the camera moves. Figure 9.4 shows viewing a simple scene (actually just a brick-textured torus) from inside a skybox.
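The effect of this trick can be sketched with a tiny software "depth buffer" (a plain Java illustration with hypothetical names, not JOGL code):

```java
// Sketch: why drawing the skybox first with depth testing disabled makes it
// appear infinitely distant. A 3-pixel software depth buffer stands in for
// OpenGL's; 1.0 means maximally far away.
public class SkyboxDepthTrick {
    // Draws one "frame": skybox first (depth test OFF, depths untouched),
    // then a single object fragment with depth testing ON.
    static String render(double objectDepth, int pixel) {
        double[] depth = {1.0, 1.0, 1.0};        // cleared depth buffer
        String[] color = {"sky", "sky", "sky"};  // skybox colors written; depth stays 1.0
        if (objectDepth <= depth[pixel]) {       // depth test ON for the object
            color[pixel] = "torus";
            depth[pixel] = objectDepth;
        }
        return String.join(",", color);
    }
    public static void main(String[] args) {
        // The object always passes the depth test against the skybox,
        // no matter how small the skybox cube actually is.
        System.out.println(render(0.4, 1));   // sky,torus,sky
    }
}
```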


Figure 9.4 Viewing a scene from inside a skybox.

• How do we construct the texture cube map?

Building a texture cube map image, from artwork or photos, requires care to avoid "seams" at the cube face junctions, and to create proper perspective so that the skybox will appear realistic and undistorted. Many tools exist for assisting in this regard: Terragen, Autodesk 3ds Max, Blender, and Adobe Photoshop have tools for building or working with cube maps. There are also many websites offering a variety of off-the-shelf cube maps, some for a price, some for free.

9.2 SKYDOMES

Another way of building a horizon effect is to use a skydome. The basic idea is the same as for a skybox, except that instead of using a textured cube, we use a textured sphere (or half of a sphere). As was done for the skybox, we render the skydome first (with depth testing disabled), and keep the camera positioned at the center of the skydome. (The skydome texture in Figure 9.5 was made using Terragen [TE16].)

Skydomes have some advantages over skyboxes. For example, they are less susceptible to distortion and seams (although spherical distortion at the poles must be accounted for in the texture image). One disadvantage of a skydome is that a sphere or dome is a more complex model than a cube, with many more vertices, and a potentially varying number of vertices depending on the desired precision.
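The cost difference described above can be made concrete with a little arithmetic. The sketch below compares a cube skybox (always 12 triangles, or 36 vertices when drawn as independent triangles) with a generic slices-and-stacks sphere tessellation; the counting scheme here is an illustrative assumption, not the book's exact sphere model:

```java
public class SkyGeometryCost {
    // A cube skybox is always 12 triangles (2 per face, 6 faces), drawn as 36 vertices.
    static int cubeVertexCount() { return 6 * 2 * 3; }

    // A slices-and-stacks sphere with the given precision covers prec*prec quads,
    // i.e., 2 triangles each, so 6 indexed vertices per quad (a generic assumption).
    static int sphereIndexCount(int prec) { return prec * prec * 6; }

    public static void main(String[] args) {
        System.out.println(cubeVertexCount());      // cube: 36
        System.out.println(sphereIndexCount(48));   // sphere at prec=48: 13824
    }
}
```

Even a modest precision makes the sphere hundreds of times heavier than the cube, which is part of why skyboxes remain the more common choice.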

Figure 9.5 Skydome with camera placed inside.

When using a skydome to represent an outdoor scene, it is usually combined with a ground plane or some sort of terrain. When using a skydome to represent a scene in space, such as a star field, it is often more practical to use a sphere, such as shown below in Figure 9.6 (a dashed line has been added for clarity in visualizing the sphere).


Figure 9.6 Skydome of stars using a sphere (star field from [BO01]).

9.3 IMPLEMENTING A SKYBOX

Despite the advantages of a skydome, skyboxes are still more common. They are also better supported in OpenGL, which is advantageous when doing environment mapping (covered later in this chapter). For these reasons, we will focus on skybox implementation.

There are two methods of implementing a skybox: building a simple one from scratch, and using the cube map facilities in OpenGL. Each has its advantages, so we will cover them both.

9.3.1 Building a Skybox from Scratch

We have already covered almost everything needed to build a simple skybox. A cube model was presented in Chapter 4; we can assign the texture coordinates as shown earlier in this chapter in Figure 9.3. We saw how to read in textures, and how to position objects in 3D space. We will see how to easily enable and disable depth testing (it's a single line of code).

Program 9.1 shows the code organization for our simple skybox, with a scene consisting of just a single textured torus. The texture coordinate assignments and the calls that enable/disable the depth buffer are highlighted.

Program 9.1 Simple Skybox

// imports as before
...
public class Code extends JFrame implements GLEventListener
{  // all variable declarations, constructor, and init() same as before
   ...
   public void display(GLAutoDrawable drawable)
   {  // clear color and depth buffers, and create projection and camera view matrix as before
      ...
      gl.glUseProgram(rendering_program);
      // prepare to draw the skybox first; the M matrix places the skybox at the camera location
      m_matrix.setToIdentity();
      m_matrix.translate(camera.getLocation().getX(),
            camera.getLocation().getY(),
            camera.getLocation().getZ());
      // build the MODEL-VIEW matrix
      mv_matrix.setToIdentity();
      mv_matrix.concatenate(v_matrix);
      mv_matrix.concatenate(m_matrix);
      // put MV and PROJ matrices into uniforms, as before
      ...
      // set up buffer containing vertices
      gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[2]);
      gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
      gl.glEnableVertexAttribArray(0);
      // set up buffer containing texture coordinates
      gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[3]);
      gl.glVertexAttribPointer(1, 2, GL_FLOAT, false, 0, 0);
      gl.glEnableVertexAttribArray(1);
      // activate the skybox texture
      gl.glActiveTexture(GL_TEXTURE0);
      gl.glBindTexture(GL_TEXTURE_2D, textureID2);
      gl.glEnable(GL_CULL_FACE);
      gl.glFrontFace(GL_CCW);   // cube has CW winding order, but we are viewing its interior
      gl.glDisable(GL_DEPTH_TEST);
      gl.glDrawArrays(GL_TRIANGLES, 0, 36);   // draw the skybox without depth testing
      gl.glEnable(GL_DEPTH_TEST);
      // now draw desired scene objects as before
      ...
      gl.glDrawArrays(...);   // as before for scene objects
   }
   private void setupVertices()
   {  // cube_vertices defined same as before
      // cube texture coordinates for the skybox:
      float[] cube_texture_coord =
      {  .25f, .66f, .25f, .33f, .50f, .33f,   // front triangles
         .50f, .33f, .50f, .66f, .25f, .66f,
         .50f, .33f, .75f, .33f, .50f, .66f,   // right triangles
         .75f, .33f, .75f, .66f, .50f, .66f,
         .75f, .33f, 1.0f, .33f, .75f, .66f,   // back triangles
         1.0f, .33f, 1.0f, .66f, .75f, .66f,
         0.0f, .33f, .25f, .33f, 0.0f, .66f,   // left triangles
         .25f, .33f, .25f, .66f, 0.0f, .66f,
         .25f, 0.0f, .50f, 0.0f, .50f, .33f,   // bottom triangles
         .50f, .33f, .25f, .33f, .25f, 0.0f,
         .25f, .66f, .50f, .66f, .50f, 1.0f,   // top triangles
         .50f, 1.0f, .25f, 1.0f, .25f, .66f
      };
      // set up buffers for cube and scene objects as usual
   }
   // modules for loading shaders, textures, etc. as before
}

Standard texturing shaders are now used for all objects in the scene, including the cube map:

Vertex Shader:
#version 430
layout (location=0) in vec3 position;
layout (location=1) in vec2 tex_coord;
out vec2 tc;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding=0) uniform sampler2D s;
void main(void)
{  tc = tex_coord;
   gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
}

Fragment Shader:
#version 430
in vec2 tc;
out vec4 fragColor;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding=0) uniform sampler2D s;
void main(void)
{  fragColor = texture(s, tc);
}

The output of Program 9.1 is shown in Figure 9.7, for each of two different cube map textures.

As mentioned earlier, skyboxes are susceptible to image distortion and seams. Seams are lines that are sometimes visible where two texture images meet, such as along the edges of the cube. Figure 9.8 shows an example of a seam in the upper part of the image that is an artifact of running the previous Program 9.1. Avoiding seams requires careful construction of the cube map image, and assignment of precise texture coordinates. There exist tools for reducing seams along image edges (such as [GI16]); this topic is outside the scope of this book.

Figure 9.7 Simple skybox results.

9.3.2 Using OpenGL Cube Maps

Another way to build a skybox is to use an OpenGL texture cube map. OpenGL cube maps are a bit more complex than the simple approach we saw in the previous section. There are advantages, however, to using OpenGL cube maps, such as seam reduction and support for environment mapping.

Figure 9.8 Skybox "seam" artifact.


Texture cube maps are similar to the 3D textures that we will study later, in that they are accessed using three texture coordinates, often labeled (s,t,r), rather than two. Another unique characteristic of OpenGL texture cube maps is that the images in them are oriented with texture coordinate (0,0,0) at the upper left (rather than the usual lower left) of the texture image; this is often a source of confusion.

Whereas the method shown in Program 9.1 reads in a single image for texturing the cube map, the loadCubeMap() function shown in Program 9.2 reads in six separate cube face image files. The cube map is generated first as a JOGL Texture of type GL_TEXTURE_CUBE_MAP. It then reads the image files by using the newTextureData() function in the JOGL TextureIO class, which results in six objects of type TextureData. Since the texture is specified as being a cube map, it has six sub-images. Each sub-image is associated with one of the six TextureData objects containing the image file data, using the updateImage() function in the JOGL Texture class. The OpenGL texture cube map object is then extracted from the JOGL texture object using getTextureObject(). The completed OpenGL texture cube map is referenced by an int identifier. As was the case for shadow mapping, artifacts along a border can be reduced by setting the texture wrap mode to "clamp to edge"; in this case, it can help reduce seams. Note that this is set for all three texture coordinates: s, t, and r.

The init() function now includes a call to enable GL_TEXTURE_CUBE_MAP_SEAMLESS, which tells OpenGL to attempt to blend adjoining edges of the cube to reduce or eliminate seams. In display(), the cube's vertices are sent down the pipeline as before, but this time it is unnecessary to send the cube's texture coordinates. As we will see, this is because an OpenGL texture cube map usually uses the cube's vertex positions as its texture coordinates. The view matrix is "flipped" on the Y and Z axes (with a scale function) to account for the reversed orientation described above. After disabling depth testing, the cube is drawn. Depth testing is then re-enabled for the rest of the scene.

The texture is accessed in the fragment shader with a special type of sampler called a samplerCube. In a texture cube map, the value returned from the sampler is the texel "seen" from the origin as viewed along the direction vector (s,t,r). As a result, we can usually simply use the incoming interpolated vertex positions as the texture coordinates. In the vertex shader, we assign the cube vertex positions into the outgoing texture coordinate attribute, so that they will be interpolated when they reach the fragment shader. Note also in the vertex shader that we convert the incoming view matrix to 3x3, and then back to 4x4. This "trick" effectively removes the translation component, while retaining the rotation (recall that translation values are found in the fourth column of a transformation matrix). This fixes the cube map at the default camera location, while still allowing the synthetic camera to "look around."
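The mat4(mat3(v_matrix)) trick can be mimicked on the CPU side to see exactly what it does: keeping only the upper-left 3x3 of a column-major 4x4 view matrix zeroes the translation column while leaving the rotation intact. A minimal sketch using a plain float array (the helper below is illustrative, not JOGL's Matrix3D API):

```java
public class ViewRotOnly {
    // Keep the upper-left 3x3 of a column-major 4x4 matrix and rebuild the rest
    // as identity; equivalent to GLSL's mat4(mat3(m)).
    static float[] rotationOnly(float[] m) {
        float[] r = m.clone();
        r[3] = 0f; r[7] = 0f; r[11] = 0f;    // bottom row of the first three columns
        r[12] = 0f; r[13] = 0f; r[14] = 0f;  // 4th column holds the translation in column-major order
        r[15] = 1f;
        return r;
    }

    public static void main(String[] args) {
        // identity rotation combined with a translation of (5, -2, 7)
        float[] view = {
            1, 0, 0, 0,
            0, 1, 0, 0,
            0, 0, 1, 0,
            5, -2, 7, 1
        };
        float[] fixed = rotationOnly(view);
        System.out.println(fixed[12] + " " + fixed[13] + " " + fixed[14]); // 0.0 0.0 0.0
        System.out.println(fixed[0]);  // rotation part untouched: 1.0
    }
}
```

Applying this "rotation-only" view matrix to the skybox is what keeps it centered on the camera no matter where the camera travels.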

Program 9.2 OpenGL Cube Map Skybox

Java/JOGL application:
public void init(GLAutoDrawable drawable)
{  // same as before, plus the following:
   ...
   textureID2 = loadCubeMap();
   gl.glEnable(GL_TEXTURE_CUBE_MAP_SEAMLESS);
}
public void display(GLAutoDrawable drawable)
{  // clear color and depth buffers, projection and camera view matrix as before
   ...
   // draw cube map first; note that it requires a different rendering program
   gl.glUseProgram(rendering_program_cube_map);
   // put the V matrix into the corresponding uniform
   cubeV_matrix = (Matrix3D) v_matrix.clone();
   cubeV_matrix.scale(1.0, -1.0, -1.0);   // flip view matrix for reversed axes in OpenGL cube map
   v_location = gl.glGetUniformLocation(rendering_program_cube_map, "v_matrix");
   gl.glUniformMatrix4fv(v_location, 1, false, cubeV_matrix.getFloatValues(), 0);
   // put the P matrix into the corresponding uniform
   int ploc = gl.glGetUniformLocation(rendering_program_cube_map, "p_matrix");
   gl.glUniformMatrix4fv(ploc, 1, false, proj_matrix.getFloatValues(), 0);
   // set up vertices buffer for cube (buffer for texture coordinates not necessary)
   gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[2]);
   gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
   gl.glEnableVertexAttribArray(0);
   // make the cube map the active texture
   gl.glActiveTexture(GL_TEXTURE0);
   gl.glBindTexture(GL_TEXTURE_CUBE_MAP, textureID2);
   // disable depth testing, and then draw the cube map
   gl.glEnable(GL_CULL_FACE);
   gl.glFrontFace(GL_CCW);
   gl.glDisable(GL_DEPTH_TEST);
   gl.glDrawArrays(GL_TRIANGLES, 0, 36);
   gl.glEnable(GL_DEPTH_TEST);
   // draw remainder of the scene
   ...
}
private int loadCubeMap()
{  GL4 gl = (GL4) GLContext.getCurrentGL();
   GLProfile glp = gl.getGLProfile();
   Texture tex = new Texture(GL_TEXTURE_CUBE_MAP);
   try
   {  // get images from image files
      TextureData topFile = TextureIO.newTextureData(glp, new File("code/top.jpg"), false, "jpg");
      TextureData leftFile = TextureIO.newTextureData(glp, new File("code/left.jpg"), false, "jpg");
      TextureData fntFile = TextureIO.newTextureData(glp, new File("code/center.jpg"), false, "jpg");
      TextureData rightFile = TextureIO.newTextureData(glp, new File("code/right.jpg"), false, "jpg");
      TextureData bkFile = TextureIO.newTextureData(glp, new File("code/back.jpg"), false, "jpg");
      TextureData botFile = TextureIO.newTextureData(glp, new File("code/bottom.jpg"), false, "jpg");
      // attach textures to each face of the active texture
      tex.updateImage(gl, rightFile, GL_TEXTURE_CUBE_MAP_POSITIVE_X);
      tex.updateImage(gl, leftFile, GL_TEXTURE_CUBE_MAP_NEGATIVE_X);
      tex.updateImage(gl, botFile, GL_TEXTURE_CUBE_MAP_POSITIVE_Y);
      tex.updateImage(gl, topFile, GL_TEXTURE_CUBE_MAP_NEGATIVE_Y);
      tex.updateImage(gl, fntFile, GL_TEXTURE_CUBE_MAP_POSITIVE_Z);
      tex.updateImage(gl, bkFile, GL_TEXTURE_CUBE_MAP_NEGATIVE_Z);
   } catch (IOException | GLException e) { e.printStackTrace(); }
   int textureID = tex.getTextureObject();
   // any texture parameter settings go here, such as:
   gl.glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
   gl.glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
   gl.glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
   return textureID;
}

Vertex shader:
#version 430
layout (location=0) in vec3 position;
out vec3 tc;
uniform mat4 v_matrix;
uniform mat4 p_matrix;
layout (binding=0) uniform samplerCube samp;
void main(void)
{  tc = position;   // texture coordinates are simply the vertex coordinates
   mat4 vrot_matrix = mat4(mat3(v_matrix));   // removes translation from view matrix
   gl_Position = p_matrix * vrot_matrix * vec4(position, 1.0);
}

Fragment shader:
#version 430
in vec3 tc;
out vec4 fragColor;
uniform mat4 v_matrix;
uniform mat4 p_matrix;
layout (binding=0) uniform samplerCube samp;
void main(void)
{  fragColor = texture(samp, tc);
}

9.4 ENVIRONMENT MAPPING

When we looked at lighting and materials, we considered the "shininess" of objects. However, we never modeled very shiny objects, such as a mirror or something made out of chrome. Such objects don't just have small specular highlights; they actually reflect their surroundings. When we look at them, we see things in the room, or sometimes even our own reflection. The ADS lighting model doesn't provide a way of simulating this effect.

Texture cube maps, however, offer a relatively simple way to simulate reflective surfaces, at least partially. The trick is to use the cube map to texture the reflective object itself¹. Doing this so that it appears realistic requires finding texture coordinates that correspond to which part of the surrounding environment we should see reflected in the object from our vantage point.

Figure 9.9 illustrates the strategy of using a combination of the view vector and the normal vector to calculate a reflection vector, which is then used to look up a texel from the cube map. The reflection vector can thus be used to access the texture cube map directly. When the cube map performs this function, it is referred to as an environment map.

Figure 9.9 Environment mapping overview.

We computed reflection vectors earlier when we studied Blinn-Phong lighting. The concept here is similar, except that now we are using the reflection vector to look up a value from a texture map. This technique is called environment mapping, or reflection mapping. If the cube map is implemented using the second method we described (in Section 9.3.2; that is, as an OpenGL GL_TEXTURE_CUBE_MAP), then OpenGL can perform the environment mapping lookup in the same manner as was done for texturing the cube map itself. We use the view vector and the surface normal to compute a reflection of the view vector off the object's surface. The reflection vector can then be used to sample the texture cube map image directly. The lookup is facilitated by the OpenGL samplerCube; recall from the previous section that the samplerCube is indexed by a view direction vector. The reflection vector is thus well suited for looking up the desired texel.
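GLSL's reflect(I, N) is defined as I - 2*dot(N, I)*N, so the lookup vector used for environment mapping can be checked with ordinary arithmetic. A small sketch with hand-rolled vector helpers (illustrative only, not part of the book's code):

```java
public class ReflectDemo {
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

    // Same formula GLSL's reflect() uses: I - 2*dot(N, I)*N, with N assumed unit length.
    static float[] reflect(float[] i, float[] n) {
        float d = 2f * dot(n, i);
        return new float[] { i[0] - d*n[0], i[1] - d*n[1], i[2] - d*n[2] };
    }

    public static void main(String[] args) {
        // Simplest sanity check: a ray pointing straight down onto an
        // upward-facing surface bounces straight back up.
        float[] incident = { 0f, -1f, 0f };
        float[] normal   = { 0f,  1f, 0f };
        float[] r = reflect(incident, normal);
        System.out.println(r[0] + " " + r[1] + " " + r[2]); // 0.0 1.0 0.0
    }
}
```

In the shader, this reflected direction is handed directly to the samplerCube, which treats it as the (s,t,r) lookup vector.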

The implementation requires a relatively small amount of additional code. Program 9.3 shows the changes that would be made to the display() and init() functions and the relevant shaders for rendering a "reflective" torus using environment mapping. The changes are highlighted. It is worth noting that if Blinn-Phong lighting is present, many of these additions would likely already be present. The only truly new section of code is in the fragment shader (in the main() method).

In fact, it might at first appear that the highlighted code in Program 9.3 (i.e., the yellow sections) isn't really new at all. Indeed, we have seen nearly identical code before, when we studied lighting. However, in this case, the normal and reflection vectors are used for an entirely different purpose. Previously they were used to implement the ADS lighting model. Here they are instead used to compute texture coordinates for environment mapping. We highlighted these lines of code so that the reader can more easily track the use of normals and reflection computations for this new purpose.

The result, showing an environment-mapped "chrome" torus within a cube map of storm clouds, is shown in Figure 9.10.

Figure 9.10 Example of environment mapping to create a reflective torus.

Program 9.3 Environment Mapping

public void display(GLAutoDrawable drawable)
{  // the code for drawing the cube map is unchanged
   ...
   // the changes are all in drawing the torus:
   gl.glUseProgram(rendering_program);
   // uniform locations for matrix transforms, including the transform for normals
   mv_location = gl.glGetUniformLocation(rendering_program, "mv_matrix");
   proj_location = gl.glGetUniformLocation(rendering_program, "proj_matrix");
   n_location = gl.glGetUniformLocation(rendering_program, "norm_matrix");
   // build the MODEL matrix, as before
   m_matrix.setToIdentity();
   m_matrix.translate(torusLocX, torusLocY, torusLocZ);
   // build the MODEL-VIEW matrix, as before
   mv_matrix.setToIdentity();
   mv_matrix.concatenate(v_matrix);
   mv_matrix.concatenate(m_matrix);
   // the normals transform is now included in the uniforms:
   gl.glUniformMatrix4fv(mv_location, 1, false, mv_matrix.getFloatValues(), 0);
   gl.glUniformMatrix4fv(proj_location, 1, false, proj_matrix.getFloatValues(), 0);
   gl.glUniformMatrix4fv(n_location, 1, false,
         ((mv_matrix.inverse()).transpose()).getFloatValues(), 0);
   // activate the torus vertices buffer, as before
   gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[0]);
   gl.glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
   gl.glEnableVertexAttribArray(0);
   // we need to activate the torus normals buffer:
   gl.glBindBuffer(GL_ARRAY_BUFFER, bufferIDs[2]);
   gl.glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, 0);
   gl.glEnableVertexAttribArray(1);
   // the torus texture is now the cube map
   gl.glActiveTexture(GL_TEXTURE0);
   gl.glBindTexture(GL_TEXTURE_CUBE_MAP, textureID2);
   // drawing the torus is otherwise unchanged
   gl.glClear(GL_DEPTH_BUFFER_BIT);
   gl.glEnable(GL_CULL_FACE);
   gl.glFrontFace(GL_CCW);
   gl.glDepthFunc(GL_LEQUAL);
   gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);
}

Vertex shader:
#version 430
layout (location=0) in vec3 position;
layout (location=1) in vec3 normal;
out vec3 varyingNormal;
out vec3 varyingVertPos;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
uniform mat4 norm_matrix;
layout (binding=0) uniform samplerCube tex_map;
void main(void)
{  varyingVertPos = (mv_matrix * vec4(position, 1.0)).xyz;
   varyingNormal = (norm_matrix * vec4(normal, 1.0)).xyz;
   gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
}

Fragment shader:
#version 430
in vec3 varyingNormal;
in vec3 varyingVertPos;
out vec4 fragColor;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
uniform mat4 norm_matrix;
layout (binding=0) uniform samplerCube tex_map;
void main(void)
{  vec3 r = reflect(normalize(-varyingVertPos), normalize(varyingNormal));
   fragColor = texture(tex_map, r);
}

Although two sets of shaders are required for this scene (one set for the cube map and another set for the torus), only the shaders used to draw the torus are shown in Program 9.3. This is because the shaders used for rendering the cube map are unchanged from Program 9.2. The changes made to Program 9.2, resulting in Program 9.3, are summarized as follows:

in init():

• A buffer of normals for the torus is created (actually done in setupVertices(), called by init()).
• The buffer of texture coordinates for the torus is no longer needed.

in display():

• The matrix for transforming normals (dubbed "norm_matrix" in Chapter 7) is created and linked to the associated uniform variable.
• The torus normals buffer is activated.
• The texture cube map is activated as the texture for the torus (rather than the "brick" texture).

in the vertex shader:

• The normal vectors and norm_matrix are added.
• The transformed vertex and normal vector are output in preparation for computing the reflection vector, similar to what was done for lighting and shadows.

in the fragment shader:

• The reflection vector is computed in a similar way to what was done for lighting.
• The output color is retrieved from the texture (now the cube map), with the lookup texture coordinate now being the reflection vector.

The resulting rendering shown in Figure 9.10 is an excellent example of how a simple trick can achieve a powerful illusion. By simply painting the background on an object, we have made the object look "metallic," when no such ADS material modeling has been done at all. It has also given the appearance that light is reflecting off of the object (there even seem to be specular highlights), even though no ADS lighting whatsoever has been incorporated into the scene!

SUPPLEMENTAL NOTES

A major limitation of environment mapping, as presented in this chapter, is that it is only capable of constructing objects that reflect the cube map. Other objects rendered in the scene are not reflected in the reflection-mapped object. Depending on the nature of the scene, this might or might not be acceptable. If other objects are present that must be reflected in a mirror or chrome object, other methods must be used. A common approach utilizes the stencil buffer (mentioned earlier in Chapter 8), and is described in various web tutorials ([OV12], [NE14], and [GR16], for example), but is outside the scope of this text.

We didn’t include an implementation of skydomes, although they are in some waysarguably simpler than skyboxes and they can be less susceptible to distortion. Evenenvironmentmappingissimpler—atleastthemath—buttheOpenGLsupportforcubemapsoftenmakesskyboxesmorepractical.

Of the topics covered in the later sections of this textbook, skyboxes and skydomes are arguably among the simplest conceptually. However, getting them to look convincing can consume a lot of time. We have dealt only briefly with some of the issues that can arise (such as seams), but depending on the texture image files used, other issues can occur, requiring additional repair. This is especially true when the scene is animated, or when the camera can be moved interactively.

We also glossed over the generation of usable and convincing texture cube map images. There are excellent tools for doing this, one of the most popular being Terragen [TE16]. Several of the cube maps in this chapter were made using Terragen, including those in Figure 9.7, made by the authors.

Exercises

9.1 (PROJECT) In Program 9.2, add the ability to move the camera around with the mouse. To do this, you will need to utilize the code you developed earlier in Exercise 4.2 for constructing a view matrix. You'll also need to assign mouse or keyboard actions to functions that move the camera forward and backward, and functions that rotate the camera on one or more of its axes (you'll need to write these functions too). After doing this, you should be able to "fly around" in your scene, noting that the skybox always appears to remain at the distant horizon.

9.2 (PROJECT) Add animation to Program 9.3 so that one (or more) environment-mapped object(s) in the scene rotate or tumble. The simulated reflectivity of the object should be apparent as the skybox texture moves on the object's surface.

9.3 (PROJECT) Modify Program 9.3 so that the object in the scene blends environment mapping with a texture. Use a weighted sum in the fragment shader, as described in Chapter 7.

9.4 (RESEARCH & PROJECT) Learn the basics of how to use Terragen [TE16] to create a simple cube map. This generally entails making a "world" with the desired terrain and atmospheric patterns (in Terragen), and then positioning Terragen's synthetic camera to save six images representing the views front, back, right, left, top, and bottom. Use your images in Program 9.2 and Program 9.3 to see their appearance as cube maps, and with environment mapping. The free version of Terragen is quite sufficient for this exercise.

References

[BO01] P. Bourke, "Representing Star Fields," June 2001, accessed July 2016, http://paulbourke.net/miscellaneous/starfield/.

[GI16] GNU Image Manipulation Program, accessed July 2016, http://www.gimp.org.

[GR16] OpenGL Resources, "Planar Reflections and Refractions Using the Stencil Buffer," accessed July 2016, https://www.opengl.org/archives/resources/code/samples/advanced/advanced97/notes/node90.html.

[NE14] NeHe Productions, "Clipping and Reflections Using the Stencil Buffer," 2014, accessed July 2016, http://nehe.gamedev.net/tutorial/clipping__reflections_using_the_stencil_buffer/17004/.

[OV12] A. Overvoorde, "Depth and Stencils," 2012, accessed July 2016, https://open.gl/depthstencils.

[TE16] Terragen, Planetside Software, LLC, accessed July 2016, http://planetside.co.uk/.

¹ This can also be done with a skydome.


CHAPTER 10

ENHANCING SURFACE DETAIL

10.1 Bump Mapping
10.2 Normal Mapping
10.3 Height Mapping

Supplemental Notes

Suppose we want to model an object with an irregular surface, like the bumpy surface of an orange, the wrinkled surface of a raisin, or the cratered surface of the moon. How would we do it? So far, we have learned two potential methods: (a) we could model the entire irregular surface, which would often be impractical (a highly cratered surface would require a huge number of vertices), or (b) we could apply a texture-map image of the irregular surface to a smooth version of the object. The second option is often effective. However, if the scene includes lights, and the lights (or camera angle) move, it quickly becomes obvious that the object is statically textured (and smooth), because the light and dark areas on the texture wouldn't change, as they would if the object were actually bumpy.

In this chapter we are going to explore several related methods for using lighting effects to make objects appear to have realistic surface texture, even if the underlying object model is smooth. We will start by examining bump mapping and normal mapping, which can add considerable realism to the objects in our scenes when it would be too computationally expensive to include tiny surface details in the object models. We will also look at ways of actually perturbing the vertices in a smooth surface through height mapping, which is useful for generating terrain (among other uses).

10.1 BUMP MAPPING

In Chapter 7, we saw how surface normals are critical to creating convincing lighting effects. Light intensity at a pixel is determined largely by the reflection angle, taking into account the light source location, camera location, and the normal vector at the pixel. Thus, we can avoid generating detailed vertices corresponding to a bumpy or wrinkled surface if we can find a way of generating the corresponding normals.

Figure 10.1 illustrates the concept of modified normals corresponding to a single "bump."

Figure 10.1 Perturbed normal vectors for bump mapping.

Thus, if we want to make an object look as though it has bumps (or wrinkles, craters, etc.), one way is to compute the normals that would exist on such a surface. Then when the scene is lit, the lighting would produce the desired illusion. This was first proposed by Blinn in 1978 [BL78], and became practical with the advent of the capability of performing per-pixel lighting computations in a fragment shader.

An example is illustrated in the vertex and fragment shaders shown in Program 10.1, which produces a torus with a "golf ball" surface as shown in Figure 10.2. The code is almost identical to Program 7.2 that we saw previously in Chapter 7. The only significant change is in the fragment shader: the incoming interpolated normal vectors (named "varyingNormal" in the original program) are altered with bumps calculated using a sine wave function in the X, Y, and Z axes applied to the original (untransformed) vertices of the torus model. Note that the vertex shader therefore now needs to pass these untransformed vertices down the pipeline.

Altering the normals in this manner, with a mathematical function computed at runtime, is called procedural bump mapping.

Program 10.1 Procedural Bump Mapping

Vertex Shader:
#version 430
// same as Phong shading, but add this output vertex attribute:
out vec3 originalVertex;
...
void main(void)
{  // include this pass-through of original vertex for interpolation:
   originalVertex = vertPos;
   ...
}

Fragment Shader:
#version 430
// same as Phong shading, but add this input vertex attribute:
in vec3 originalVertex;
...
void main(void)
{  ...
   // add the following to perturb the incoming normal vector:
   float a = 0.25;    // a controls height of bumps
   float b = 100.0;   // b controls width of bumps
   float x = originalVertex.x;
   float y = originalVertex.y;
   float z = originalVertex.z;
   N.x = varyingNormal.x + a*sin(b*x);   // perturb incoming normal using sine function
   N.y = varyingNormal.y + a*sin(b*y);
   N.z = varyingNormal.z + a*sin(b*z);
   N = normalize(N);
   // lighting computations and output fragColor (unchanged) now utilize the perturbed normal N
   ...
}

Figure 10.2 Procedural bump mapping example.

10.2 NORMAL MAPPING

An alternative to bump mapping is to replace the normals using a lookup table. This allows us to construct bumps for which there is no mathematical function, such as the bumps corresponding to the craters on the moon. A common way of doing this is called normal mapping.


To understand how this works, we start by noting that a vector can be stored to reasonable precision in three bytes, one for each of the X, Y, and Z components. This makes it possible to store normals in a color image file, with the R, G, and B components corresponding to X, Y, and Z. RGB values in an image are stored in bytes and are usually interpreted as values in the range [0…1], whereas vectors can have positive or negative component values. If we restrict normal vector components to the range [-1…+1], a simple conversion to enable storing a normal vector N as a pixel in an image file is:

R = (Nx + 1) / 2
G = (Ny + 1) / 2
B = (Nz + 1) / 2
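This [-1…+1] to [0…1] mapping and its inverse are simple enough to check directly. A sketch in plain Java, using floats in [0,1] rather than actual image bytes:

```java
public class NormalEncode {
    // Map a normal component from [-1, +1] into RGB's [0, 1] range.
    static float encode(float n) { return (n + 1f) / 2f; }

    // Inverse mapping, used when a shader reads the normal map back.
    static float decode(float c) { return c * 2f - 1f; }

    public static void main(String[] args) {
        // The "straight up" normal (0, 0, 1) stores as the bluish pixel (0.5, 0.5, 1).
        System.out.println(encode(0f) + " " + encode(0f) + " " + encode(1f)); // 0.5 0.5 1.0
        System.out.println(decode(0.5f)); // 0.0 -- round-trips back
    }
}
```

A shader sampling a normal map applies the decode step to each fetched component before using the result as a normal.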

Normal mapping utilizes an image file (called a normal map) that contains normals corresponding to a desired surface appearance in the presence of lighting. In a normal map, the vectors are represented relative to an arbitrary plane X-Y, with their X and Y components representing deviations from "vertical," and their Z component set to 1. A vector strictly perpendicular to the X-Y plane (i.e., with no deviation) would be represented [0,0,1], whereas non-perpendicular vectors would have non-zero X and/or Y components. We use the above formulae to convert to RGB space; for example, [0,0,1] would be stored as [.5,.5,1], since actual offsets range over [-1…+1], but RGB values range over [0…1].

We can make use of such a normal map through yet another clever application of texture units: instead of storing colors in the texture unit, we store the desired normal vectors. We can then use a sampler to look up the value in the normal map for a given fragment, and then, rather than applying the returned value to the output pixel color (as we did in texture mapping), we instead use it as the normal vector.

One example of such a normal map image file is shown in Figure 10.3. It was generated by applying the GIMP normal mapping plugin [GI16] to a texture from Luna [LU16]. Normal map image files are not intended for viewing; we show this one to point out that such images end up being largely blue. This is because every entry in the image file has a B value of 1 (maximum blue), making the image appear "bluish" if viewed.

Figure 10.3 Normal mapping image file example [ME11].


Figure 10.4 Normal mapping examples.

Figure 10.4 shows two different normal map image files (both are built out of textures from Luna [LU16]), and the result of applying them to a sphere in the presence of Blinn-Phong lighting.

Normalvectorsretrievedfromanormalmapcannotbeutilizeddirectly,becausetheyaredefined relative to an arbitraryX-Y plane as described above, without taking into accounttheirpositionontheobjectandtheirorientationincameraspace.Ourstrategyforaddressingthiswillbetobuildatransformationmatrixforconvertingthenormalsintocameraspace,asfollows.

Ateachvertexonanobject,weconsideraplanethat is tangenttotheobject.Theobjectnormal at that vertex is perpendicular to this plane.We define twomutually perpendicularvectors in that plane, also perpendicular to the normal, called the tangent and bitangent(sometimescalledthebinormal).Constructingourdesiredtransformationmatrixrequiresthatourmodelsincludeatangentvectorforeachvertex(thebitangentcanbebuiltbycomputingthecrossproductofthetangentandthenormal).Ifthemodeldoesnotalreadyhavetangentvectorsdefined,theycouldbecomputed.Inthecaseofaspheretheycanbecomputedexactly,asshowninthefollowingmodificationstoProgram6.1fromChapter6:

...
for (int i=0; i<=prec; i++)
{  for (int j=0; j<=prec; j++)
   {  float y = (float) cos(toRadians(180 - i*180/(prec+1)));
      float x = (float) cos(toRadians(j*360/(prec+1))) * (float) abs(cos(asin(y)));
      float z = (float) sin(toRadians(j*360/(prec+1))) * (float) abs(cos(asin(y)));
      vertices[i*(prec+1)+j].setLocation(new Point3D(x, y, z));

      // calculate tangent vector
      if (((x==0) && (y==1) && (z==0)) || ((x==0) && (y==-1) && (z==0)))
      {  tangent = new Vector3D(0, 0, -1);    // if north or south pole, set tangent to -Z axis
      }
      else
      {  tangent = (new Vector3D(0,1,0)).cross(new Vector3D(x,y,z));    // otherwise, calculate tangent
      }
      vertices[i*(prec+1)+j].setTangent(tangent);    // store the tangent in vertex object

      ...    // remaining computations are unchanged
}  }

For models that don't lend themselves to exact analytic derivation of surface tangents, the tangents can be approximated, for example by drawing vectors from each vertex to the next as they are constructed (or loaded). Note that such an approximation can lead to tangent vectors that are not strictly perpendicular to the corresponding vertex normals. Implementing normal mapping that works across a variety of models therefore needs to take this possibility into account (our solution will).

The tangent vectors are sent from a buffer (VBO) to a vertex attribute in the vertex shader, as is done for the vertices, texture coordinates, and normals. The vertex shader then processes them the same way as the normal vectors, applying the inverse-transpose of the MV matrix, and forwarding the result down the pipeline for interpolation by the rasterizer and ultimately into the fragment shader. This step converts the normal and tangent vectors into camera space, after which we construct the bitangent using the cross product.

Once we have the normal, tangent, and bitangent vectors in camera space, we can use them to construct a matrix (called the "TBN" matrix, after its components) which transforms the normals retrieved from the normal map into their corresponding orientation in camera space relative to the surface of the object.

In the fragment shader, the computing of the new normal is done in the calcNewNormal() function. The computation in the third line of the function (the one containing dot(tangent, normal)) ensures that the tangent vector is perpendicular to the normal vector. A cross product between the new tangent and the normal produces the bitangent.
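The re-orthogonalization step just described (subtracting out the component of the tangent that lies along the normal, then normalizing) can be mirrored on the CPU side. The following standalone Java sketch uses plain float arrays rather than the book's Vector3D class; the helper names are ours:

```java
// Gram-Schmidt step: make an approximate tangent exactly perpendicular to the normal.
// Uses bare float[3] vectors; helper names are illustrative, not from the book's library.
public class TangentFixup {
    static float dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

    static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(dot(v, v));
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }

    // tangent' = normalize(tangent - dot(tangent, normal) * normal)
    public static float[] orthogonalize(float[] tangent, float[] normal) {
        float d = dot(tangent, normal);
        float[] t = { tangent[0] - d*normal[0], tangent[1] - d*normal[1], tangent[2] - d*normal[2] };
        return normalize(t);
    }

    public static void main(String[] args) {
        float[] n = { 0f, 1f, 0f };      // normal
        float[] t = { 1f, 0.3f, 0f };    // slightly skewed approximate tangent
        float[] fixed = orthogonalize(t, n);
        System.out.println(fixed[0] + " " + fixed[1] + " " + fixed[2]);  // (1, 0, 0)
    }
}
```

This is exactly the correction that makes approximated tangents usable with the TBN construction that follows.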

We then create TBN as a 3×3 mat3 matrix. The mat3 constructor takes three vectors and generates a matrix containing the first vector in the first column, the second vector in the second column, and the third in the third column (compare building a view matrix from camera space—see Figure 3.13).

The shader uses the fragment's texture coordinates to extract the normal map entry corresponding to the current fragment. The sampler variable "normMap" is used for this, and in this case is bound to texture unit 0 (note: the Java/JOGL application must therefore have attached the normal map image to texture unit 0). To convert the color components from the stored range [0…1] to their original range [-1…+1] we multiply by 2.0 and subtract 1.0.

The TBN matrix is then applied to the resulting normal to produce the final normal for the current pixel. The rest of the shader is identical to the fragment shader used for Phong lighting. The fragment shader is shown in Program 10.2, and is based on a version by Etay Meiri [ME11].

Program 10.2 also contains our first example of a GLSL function (aside from "main"). As in the C language, functions must be defined before (or "above") where they are called, or


else a forward declaration must be provided. In this example a forward declaration isn't required because the function has been defined above the call to it.

A variety of tools exist for developing normal map images. Some image editing tools, such as GIMP [GI16] and Photoshop [PH16], have such capabilities. Such tools analyze the edges in an image, inferring peaks and valleys, and producing a corresponding normal map.

Figure 10.5 shows a texture map of the surface of the moon created by Hastings-Trew [HT16] based on NASA satellite data. The corresponding normal map was generated by applying the GIMP normal map plugin [GP16] to a black and white reduction also created by Hastings-Trew.

Figure 10.5 Moon, texture and normal map.

Program 10.2 Normal Mapping Fragment Shader

#version 430

in vec3 varyingLightDir;
in vec3 varyingVertPos;
in vec3 varyingNormal;
in vec3 varyingTangent;
in vec3 varyingHalfVector;
in vec3 originalVertex;
in vec2 tc;

out vec4 fragColor;

layout (binding=0) uniform sampler2D normMap;

// remaining uniforms same as before
...

vec3 calcNewNormal()
{  vec3 normal = normalize(varyingNormal);
   vec3 tangent = normalize(varyingTangent);
   tangent = normalize(tangent - dot(tangent, normal) * normal);  // tangent is perpendicular to normal
   vec3 bitangent = cross(tangent, normal);
   mat3 tbn = mat3(tangent, bitangent, normal);  // TBN matrix to convert to camera space
   vec3 retrievedNormal = texture(normMap, tc).xyz;
   retrievedNormal = retrievedNormal * 2.0 - 1.0;  // convert from RGB space
   vec3 newNormal = tbn * retrievedNormal;
   newNormal = normalize(newNormal);
   return newNormal;
}

void main(void)
{  // normalize the light, normal, and view vectors:
   vec3 L = normalize(varyingLightDir);
   vec3 V = normalize(-varyingVertPos);
   vec3 N = calcNewNormal();

   // get the angle between the light and surface normal:
   float cosTheta = dot(L, N);

   // compute half vector for Blinn optimization:
   vec3 H = varyingHalfVector;

   // angle between the view vector and reflected light:
   float cosPhi = dot(H, N);

   // compute ADS contributions (per pixel):
   fragColor = globalAmbient * material.ambient
      + light.ambient * material.ambient
      + light.diffuse * material.diffuse * max(cosTheta, 0.0)
      + light.specular * material.specular * pow(max(cosPhi, 0.0), material.shininess);
}

Figure 10.6 shows a sphere with the moon surface rendered in two different ways: on the left, simply textured with the original texture map; on the right, textured with the image normal map (for reference). Normal mapping has not been applied in either case. As realistic as the textured "moon" is, close examination reveals that the texture image was apparently taken when the moon was being lit from the left, because ridge shadows are cast to the right (most clearly evident in the crater at the bottom center). If we were to add lighting to this scene with Phong shading, and then animate the scene by moving the moon, the camera, or the light, those shadows would not change as we would expect them to.

Figure 10.6 Sphere textured with moon texture (left) and normal map (right).


Furthermore, as the light source moves (or as the camera moves), we would expect many specular highlights to appear on the ridges. But a plain textured sphere such as at the left of Figure 10.6 would produce only one specular highlight, corresponding to what would appear on a smooth sphere, which would look very unrealistic. Incorporation of the normal map can improve the realism of lighting on objects such as this considerably.

If we use normal mapping on the sphere (rather than applying the texture), we obtain the results shown in Figure 10.7. Although not as realistic (yet) as standard texturing, it now does respond to lighting changes. The first image is lit from the left, and the second is lit from the right. Note the blue and yellow arrows showing the change in diffuse lighting around ridges and the movement of specular highlights.

Figure 10.7 Normal map lighting effects on moon.

Figure 10.8 shows the effect of combining normal mapping with standard texturing, in the presence of Phong lighting. The image of the moon is enhanced with diffuse-lit regions and specular highlights that respond to the movement of the light source (or camera or object movement). Lighting in the two images is from the left and right sides, respectively.

Figure 10.8 Texturing plus normal mapping, with lighting from the left and right.

Our program now requires two textures—one for the texture and one for the normal map—and thus two samplers. The fragment shader blends the texture color with the color produced by the lighting computation as shown in Program 10.3, with adjustable weights highlighted.


Program 10.3 Texturing Plus Normal Map

// variables and structs as in previous fragment shader, plus:
layout (binding=0) uniform sampler2D s0;   // normal map
layout (binding=1) uniform sampler2D s1;   // texture

void main(void)
{  // computations same as before, until:
   vec3 N = calcNewNormal();
   vec4 texel = texture(s1, tc);   // standard texture
   ...
   // reflection computations as before, then blend results:
   fragColor = 0.6 * texel
      + 0.4 * (globalAmbient * material.ambient
      + light.ambient * material.ambient
      + light.diffuse * material.diffuse * max(cosTheta, 0.0)
      + light.specular * material.specular * pow(max(cosPhi, 0.0), material.shininess));
}

Interestingly, normal mapping can benefit from mipmapping, because the same "aliasing" artifacts that we saw in Chapter 5 for texturing also occur when using a texture image for normal mapping. Figure 10.9 shows a normal-mapped moon, with and without mipmapping. Although not easily shown in a still image, the sphere at the left (not mipmapped) has shimmering artifacts around its perimeter.

Figure 10.9 Normal mapping artifacts, corrected with mipmapping.

Anisotropic filtering (AF) works even better, reducing sparkling artifacts while preserving detail, as illustrated in Figure 10.10. A version combining equal parts texture and lighting with normal mapping and AF is shown alongside, in Figure 10.11.


Figure 10.10 Normal mapping with AF.

Figure 10.11 Texturing + normal mapping w/ AF.

The results are imperfect. Shadows appearing in the original texture image will still show on the rendered result, regardless of lighting. Also, while normal mapping can affect diffuse and specular effects, it cannot cast shadows. Therefore, this method is best used when the surface features are small.

10.3 HEIGHT MAPPING

We now extend the concept of normal mapping—where a texture image is used to perturb normals—to instead perturb the vertex locations themselves. Actually modifying an object's geometry in this way has certain advantages, such as making the surface features visible along the object's edge, and enabling the features to respond to shadow-mapping. It can also facilitate building terrain, as we will see.

A practical approach is to use a texture image to store height values, which can then be used to raise (or lower) vertex locations. An image that contains height information is called a height map, and using a height map to alter an object's vertices is called height mapping. Height maps usually encode height information as grayscale colors: (0,0,0) (black) = low height, and (1,1,1) (white) = high height. This makes it easy to create height maps algorithmically, or by using a "paint" program. The higher the image contrast, the greater the variation in height expressed by the map. These concepts are illustrated in Figure 10.12 (showing a randomly generated map), and Figure 10.13 (showing a map with an organized pattern).

Figure 10.12 Height map examples.

Figure 10.13 Height map interpretation.
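To illustrate how easily a height map can be created algorithmically, here is a minimal Java sketch that fills a grayscale grid with random values, in the spirit of the randomly generated map of Figure 10.12. The array-based representation is our own simplification; a real application would write the values to an image file or texture:

```java
import java.util.Random;

// Builds a tiny grayscale height map as a 2D array of floats in [0,1].
// 0.0 = low (black), 1.0 = high (white), matching the convention described above.
public class HeightMapGen {
    public static float[][] randomMap(int width, int height, long seed) {
        Random rnd = new Random(seed);
        float[][] map = new float[height][width];
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                map[y][x] = rnd.nextFloat();   // random gray level
        return map;
    }

    public static void main(String[] args) {
        float[][] map = randomMap(4, 4, 42L);
        System.out.println("corner height: " + map[0][0]);
    }
}
```

Higher-contrast maps (values pushed toward 0.0 and 1.0) would express greater height variation, as noted above.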

The usefulness of altering vertex locations depends on the model being altered. Vertex manipulation is easily done in the vertex shader, and when there is a high level of detail in the model vertices (such as in a sphere with sufficiently high precision), this approach can work well. However, when the underlying number of vertices is small (such as the corners of a cube), rendering the object's surface relies on vertex interpolation in the rasterizer to fill in the detail. When there are very few vertices available in the vertex shader to perturb, the heights of many pixels would be interpolated rather than retrieved from the height map, leading to poor surface detail. Vertex manipulation in the fragment shader is, of course, impossible because by then the vertices have been rasterized into pixel locations.

Program 10.4 shows a vertex shader that moves the vertices "outwards" (i.e., in the direction of the surface normal), by multiplying the vertex normal by the value retrieved from the height map and then adding that product to the vertex position.


Program 10.4 Height Mapping in Vertex Shader

#version 430

layout (location=0) in vec3 vertPos;
layout (location=1) in vec3 vertNormal;
layout (location=2) in vec2 texCoord;

out vec2 tc;

uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding=0) uniform sampler2D t;   // for texture
layout (binding=1) uniform sampler2D h;   // for height map

void main(void)
{  // "p" is the vertex position altered by the height map.
   // Since the height map is grayscale, any of the color components can be
   // used (we use "r"). Dividing by 5.0 is to adjust the height.
   vec4 p = vec4(vertPos, 1.0) + vec4((vertNormal * ((texture(h, texCoord).r) / 5.0f)), 0.0f);
   tc = texCoord;
   gl_Position = proj_matrix * mv_matrix * p;
}

Figure 10.14 shows a simple height map (top left) created by scribbling in a paint program. A white square is also drawn in the height map image. A green-tinted version of the height map (bottom left) is used as a texture. When the height map is applied to a rectangular 100 × 100 grid model using the shader shown in Program 10.4, it produces a sort of "terrain" (shown on the right). Note how the white square results in the precipice at the right.

Figure 10.14 Terrain, height-mapped in the vertex shader.

Figure 10.15 shows another example of doing height mapping in a vertex shader. This time the height map is an outline of the countries of the world [HT16]. It is applied to a sphere textured with a blue-tinted version of the height map (see top left—note the original black and white version is not shown), and lit with Blinn-Phong shading using a normal map (shown at the lower left) built using the tool SS_Bump_Generator [SS16]. The sphere precision was increased to 500 to ensure enough vertices to render the detail. Note how the raised vertices affect not only the lighting, but also the silhouette edges.

Figure 10.15 Vertex shader-based height mapping, applied to a sphere.

The rendered examples shown in Figure 10.14 and Figure 10.15 work acceptably because the two models (grid and sphere) have a sufficient number of vertices to sample the height map values. That is, they each have a fairly large number of vertices, and the height map is relatively coarse and adequately sampled at a low resolution. However, close inspection still reveals the presence of resolution artifacts, such as along the bottom left edge of the raised box at the right of the terrain in Figure 10.14. The reason that the sides of the raised box don't appear perfectly square, and include gradations in color, is that the 100 × 100 resolution of the underlying grid cannot align perfectly with the white box in the height map, and the resulting rasterization of texture coordinates produces artifacts along the sides.

The limitations of doing height mapping in the vertex shader are further exposed when trying to apply it with a more demanding height map. Consider the moon image shown back in Figure 10.5. Normal mapping did an excellent job of capturing the detail in the image (as shown previously in Figure 10.9 and Figure 10.11), and since it is grayscale, it would seem natural to try applying it as a height map. However, vertex-shader-based height mapping would be inadequate for this task, because the number of vertices sampled in the vertex shader (even for a sphere with precision = 500) is small compared to the fine level of detail in the image. By contrast, normal mapping was able to capture the detail impressively, because the normal map is sampled in the fragment shader, at the pixel level.

We will revisit height mapping later in Chapter 12 when we discuss methods for generating a greater number of vertices in a tessellation shader.

SUPPLEMENTAL NOTES


One of the fundamental limitations of bump or normal mapping is that, while they are capable of providing the appearance of surface detail in the interior of a rendered object, the silhouette (outer boundary) doesn't show any such detail (it remains smooth). Height mapping, if used to actually modify vertex locations, fixes this deficiency, but has its own limitations. As we will see later in this book, sometimes a geometry or tessellation shader can be used to increase the number of vertices, making height mapping more practical and more effective.

We have taken the liberty of simplifying some of the bump and normal mapping computations. More accurate and/or more efficient solutions are available for critical applications [BN12].

Exercises

10.1 Experiment with Program 10.1 by modifying the settings and/or computations in the fragment shader, and observing the results.

10.2 Using a paint program, generate your own height map and use it in Program 10.4. See if you can identify locations where detail is missing as the result of the vertex shader being unable to adequately sample the height map. You will probably find it useful to also texture the terrain with your height map image file as shown in Figure 10.14 (or with some sort of pattern that exposes the surface structure, such as a grid), so that you can see the hills and valleys of the resulting terrain.

10.3 (PROJECT) Add lighting to Program 10.4, so that the surface structure of the height-mapped terrain is further exposed.

10.4 (PROJECT) Add shadow-mapping to your code from Exercise 10.3 so that your height-mapped terrain casts shadows.

References

[BL78] J. Blinn, "Simulation of Wrinkled Surfaces," Computer Graphics 12, no. 3 (1978): 286–292.

[BN12] E. Bruneton and F. Neyret, "A Survey of Non-Linear Pre-Filtering Methods for Efficient and Accurate Surface Shading," IEEE Transactions on Visualization and Computer Graphics 18, no. 2 (2012).

[GI16] GNU Image Manipulation Program, accessed July 2016, http://www.gimp.org.

[GP16] GIMP Plugin Registry, normalmap plugin, accessed July 2016, http://registry.gimp.org/node/69.

[HT16] J. Hastings-Trew, JHT's Planetary Pixel Emporium, accessed July 2016, http://planetpixelemporium.com/.

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).


[ME11] E. Meiri, OGLdev tutorial 26, 2011, accessed July 2016, http://ogldev.atspace.co.uk/index.html.

[PH16] Adobe Photoshop, accessed July 2016, http://www.photoshop.com.

[SS16] SSBumpGenerator, accessed July 2016, http://ssbump-generator.yolasite.com/.


CHAPTER 11

PARAMETRIC SURFACES

11.1 Quadratic Bézier Curves
11.2 Cubic Bézier Curves
11.3 Quadratic Bézier Surfaces
11.4 Cubic Bézier Surfaces

Supplemental Notes

While working at the Renault corporation in the 1950s and 1960s, Pierre Bézier developed software systems for designing automobile bodies. His programs utilized mathematical systems of equations developed earlier by Paul de Casteljau, who was working for the competing Citroën automobile manufacturer [BE72, DC63]. The de Casteljau equations describe curves using just a few scalar parameters, and are accompanied by a clever recursive algorithm dubbed "de Casteljau's algorithm" for generating the curves to arbitrary precision. Now known as "Bézier curves" and "Bézier surfaces," these methods are commonly used to efficiently model many kinds of curved 3D objects.

11.1 QUADRATIC BÉZIER CURVES

A quadratic Bézier curve is defined by a set of parametric equations that specify a particular curved shape using three control points, each of which is a point in 2D space.1 Consider, for example, the set of three points [p0, p1, p2] shown below in Figure 11.1:


Figure 11.1 Control points for a Bézier curve.

By introducing a parameter t, we can build a system of parametric equations that define a curve. The t represents a fraction of the distance along the line segment connecting one control point to the next control point. Values for t are within the range [0..1] for points along the segment. Figure 11.2 shows one such value, t = 0.75, applied to the lines connecting p0-p1 and p1-p2, respectively. Doing this defines two new points p01(t) and p12(t) along the two original lines. We repeat this process for the line segment connecting the two new points p01(t) and p12(t), producing point P(t) where t = 0.75 along the line p01(t)-p12(t). P(t) is one of the points on the resulting curve, and for this reason is denoted with a capital P.

Figure 11.2 Points at parametric position t = 0.75.

Collecting many points P(t) for various values of t generates a curve, as shown in Figure 11.3. The more parameter values for t that are sampled, the more points P(t) are generated, and the smoother the resulting curve.

Figure 11.3 Building a quadratic Bézier curve.

The analytic definition for a quadratic Bézier curve can now be derived. First, we note that an arbitrary point p on the line segment pa-pb connecting two points pa and pb can be represented in terms of the parameter t as follows:

p(t) = pa + t(pb − pa) = (1 − t)pa + t·pb

Using this, we find the points p01 and p12 (points on p0-p1 and p1-p2 respectively) as follows:

p01(t) = (1 − t)p0 + t·p1
p12(t) = (1 − t)p1 + t·p2


Similarly, a point on the connecting line segment between these points would be:

P(t) = (1 − t)p01(t) + t·p12(t)

Substituting the definitions of p12 and p01 gives:

P(t) = (1 − t)[(1 − t)p0 + t·p1] + t[(1 − t)p1 + t·p2]

Factoring and combining terms then gives:

P(t) = (1 − t)²p0 + 2t(1 − t)p1 + t²p2

or,

P(t) = B0(t)p0 + B1(t)p1 + B2(t)p2

where:

B0(t) = (1 − t)²,  B1(t) = 2t(1 − t),  B2(t) = t²

Thus, we find any point on the curve by a weighted sum of the control points. The weighting function B is often called a "blending function" (although the name "B" actually derives from Sergei Bernstein [BE16], who first characterized this family of polynomials). Note that the blending functions are all quadratic in form, which is why the resulting curve is called a quadratic Bézier curve.
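The weighted-sum definition translates directly into code. This Java sketch (our own, using simple x/y arrays for the 2D control points) evaluates a point on a quadratic Bézier curve from its three blending functions:

```java
// Evaluate a quadratic Bézier curve P(t) = B0(t)p0 + B1(t)p1 + B2(t)p2,
// with B0 = (1-t)^2, B1 = 2t(1-t), B2 = t^2. Points are double[]{x, y}.
public class QuadBezier {
    public static double[] point(double[] p0, double[] p1, double[] p2, double t) {
        double b0 = (1 - t) * (1 - t);
        double b1 = 2 * t * (1 - t);
        double b2 = t * t;
        return new double[] {
            b0 * p0[0] + b1 * p1[0] + b2 * p2[0],
            b0 * p0[1] + b1 * p1[1] + b2 * p2[1]
        };
    }

    public static void main(String[] args) {
        double[] p = point(new double[]{0,0}, new double[]{1,2}, new double[]{2,0}, 0.5);
        System.out.println(p[0] + ", " + p[1]);   // midpoint of the curve: (1.0, 1.0)
    }
}
```

Note that at t = 0 and t = 1 the weights collapse to the endpoints p0 and p2, so the curve always passes through its first and last control points.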

11.2 CUBIC BÉZIER CURVES

We now extend our model to four control points, resulting in a cubic Bézier curve as shown in Figure 11.4. Cubic Bézier curves are capable of defining a much richer set of shapes than are quadratic curves, which are limited to concave shapes.

Figure 11.4 Building a cubic Bézier curve.


As for the quadratic case, we can derive an analytic definition for cubic Bézier curves:

p01(t) = (1 − t)p0 + t·p1
p12(t) = (1 − t)p1 + t·p2
p23(t) = (1 − t)p2 + t·p3
p01-12(t) = (1 − t)p01(t) + t·p12(t)
p12-23(t) = (1 − t)p12(t) + t·p23(t)

A point on the curve would then be:

P(t) = (1 − t)p01-12(t) + t·p12-23(t)

Substituting the definitions of p12-23 and p01-12 and collecting terms yields:

P(t) = B0(t)p0 + B1(t)p1 + B2(t)p2 + B3(t)p3

where:

B0(t) = (1 − t)³,  B1(t) = 3t(1 − t)²,  B2(t) = 3t²(1 − t),  B3(t) = t³

There are many different techniques for rendering Bézier curves. One approach is to iterate through successive values of t, starting at 0.0 and ending at 1.0, using a fixed increment. For instance, if the increment is 0.1, then we could use a loop with t values 0.0, 0.1, 0.2, 0.3, and so on. For each value of t, the corresponding point on the Bézier curve would be computed, and a series of line segments connecting the successive points would be drawn, as described in the algorithm in Figure 11.5:


Figure 11.5 Iterative algorithm for rendering Bézier curves.
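The iterative approach amounts to stepping t from 0 to 1 by a fixed increment and evaluating the cubic blending functions at each step. A Java sketch (identifiers ours) that collects the successive curve points which would then be joined by line segments:

```java
import java.util.ArrayList;
import java.util.List;

// Iteratively sample a cubic Bézier curve at fixed increments of t,
// producing the list of points that would be connected by line segments.
public class CubicBezierIter {
    // evaluate P(t) from the four cubic blending functions; points are double[]{x, y}
    public static double[] point(double[][] p, double t) {
        double u = 1 - t;
        double b0 = u*u*u, b1 = 3*t*u*u, b2 = 3*t*t*u, b3 = t*t*t;
        return new double[] {
            b0*p[0][0] + b1*p[1][0] + b2*p[2][0] + b3*p[3][0],
            b0*p[0][1] + b1*p[1][1] + b2*p[2][1] + b3*p[3][1]
        };
    }

    public static List<double[]> sample(double[][] controls, int steps) {
        List<double[]> pts = new ArrayList<>();
        for (int i = 0; i <= steps; i++)
            pts.add(point(controls, (double) i / steps));
        return pts;
    }

    public static void main(String[] args) {
        double[][] c = { {0,0}, {0,1}, {1,1}, {1,0} };
        List<double[]> pts = sample(c, 10);   // increment of 0.1, as in the example above
        System.out.println(pts.size() + " points, starting at (" + pts.get(0)[0] + "," + pts.get(0)[1] + ")");
    }
}
```

The smaller the increment, the more points are generated and the smoother the resulting polyline approximation.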

Another approach is to use de Casteljau's algorithm to recursively subdivide the curve in half, where t = ½ at each recursive step. Figure 11.6 shows the left side subdivision into new cubic control points (q0, q1, q2, q3) shown in green, as derived by de Casteljau (a full derivation can be found in [AS14]).

Figure 11.6 Subdividing a cubic Bézier curve.

The algorithm is shown in Figure 11.7. It subdivides the curve segments in half repeatedly, until each curve segment is sufficiently straight that further subdivision produces no tangible benefit. In the limiting case (as the control points are generated closer and closer together), the curve segment itself is effectively the same as a straight line between the first and last control points (p0 and p3). Determining whether a curve segment is "straight enough" can therefore be done by comparing the distance from the first control point to the last control point, versus the sum of the lengths of the three lines connecting the four control points:

D1 = |p1 − p0| + |p2 − p1| + |p3 − p2|
D2 = |p3 − p0|


Then, if D1 − D2 is less than a sufficiently small tolerance, there is no point in further subdivision.

Figure 11.7 Recursive subdivision algorithm for Bézier curves.

An interesting property of the de Casteljau algorithm is that it is possible to generate all of the points on the curve without actually using the previously described blending functions. Also, note that the center point at p(½) is "shared"; that is, it is both the rightmost control point in the left subdivision, and the leftmost control point in the right subdivision. It can be computed either using the blending functions at t = ½, or by using the formula (q2 + r1)/2, as derived by de Casteljau.

As a side note, we point out that the subdivide() function shown in Figure 11.7 assumes that the incoming parameters p, q, and r are "reference" parameters (such as Java objects), and hence the computations in the function modify the actual parameters in the calls from the drawBezierCurve() function listed above it.
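The subdivision scheme can be sketched in Java as follows (identifiers and structure ours, not the book's Figure 11.7 code): the straightness test compares the summed control-polygon length against the endpoint distance, and the midpoint formulas are de Casteljau's, including the shared point at t = ½:

```java
import java.util.ArrayList;
import java.util.List;

// Recursive de Casteljau subdivision of a cubic Bézier curve (2D points as double[]{x,y}).
public class BezierSubdiv {
    static double[] mid(double[] a, double[] b) { return new double[]{ (a[0]+b[0])/2, (a[1]+b[1])/2 }; }
    static double dist(double[] a, double[] b) { return Math.hypot(a[0]-b[0], a[1]-b[1]); }

    // "straight enough": control-polygon length minus chord length below a tolerance
    static boolean straightEnough(double[][] p, double tol) {
        double d1 = dist(p[0], p[1]) + dist(p[1], p[2]) + dist(p[2], p[3]);
        double d2 = dist(p[0], p[3]);
        return (d1 - d2) < tol;
    }

    public static void subdivide(double[][] p, double tol, List<double[]> out) {
        if (straightEnough(p, tol)) { out.add(p[0]); return; }   // emit segment start point
        double[] q0 = p[0];
        double[] q1 = mid(p[0], p[1]);
        double[] m  = mid(p[1], p[2]);
        double[] q2 = mid(q1, m);
        double[] r3 = p[3];
        double[] r2 = mid(p[2], p[3]);
        double[] r1 = mid(m, r2);
        double[] shared = mid(q2, r1);                // the shared point on the curve at t = 1/2
        subdivide(new double[][]{q0, q1, q2, shared}, tol, out);
        subdivide(new double[][]{shared, r1, r2, r3}, tol, out);
    }

    public static void main(String[] args) {
        List<double[]> pts = new ArrayList<>();
        double[][] c = { {0,0}, {0,1}, {1,1}, {1,0} };
        subdivide(c, 0.01, pts);
        pts.add(c[3]);                                // close the polyline with the final endpoint
        System.out.println(pts.size() + " polyline points");
    }
}
```

Unlike the book's version, this sketch passes fresh arrays rather than mutating reference parameters, but the midpoint arithmetic is the same.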

11.3 QUADRATIC BÉZIER SURFACES

Whereas Bézier curves define curved lines (in 2D or 3D space), Bézier surfaces define curved surfaces in 3D space. Extending the concepts we saw in curves to surfaces requires extending our system of parametric equations from one parameter to two parameters. For Bézier curves, we called that parameter t. For Bézier surfaces, we will refer to the parameters as u and v. Whereas our curves were comprised of points P(t), our surfaces will be comprised of points P(u,v), as shown in Figure 11.8.

Figure 11.8 Parametric surface.

For quadratic Bézier surfaces, there are three control points on each axis, u and v, for a total of nine control points. Figure 11.9 shows an example of a set of nine control points (typically called a control point "mesh") in blue, and the corresponding curved surface (in red).

Figure 11.9 Quadratic Bézier control mesh and corresponding surface.

The nine control points in the mesh are labeled pij, where i and j represent the indices in the u and v directions respectively. Each set of three adjacent control points, such as (p00, p01, p02), defines a Bézier curve. Points P(u,v) on the surface are then defined as a sum of two blending functions, one in the u direction and one in the v direction. The form of the two blending functions for building Bézier surfaces then follows from the methodology given previously for Bézier curves:

B0(u) = (1 − u)²,  B1(u) = 2u(1 − u),  B2(u) = u²  (and similarly for Bj(v))


The points P(u,v) comprising the Bézier surface are then generated by summing the product of each control point pij and the ith and jth blending functions evaluated at parametric values u and v respectively:

P(u,v) = Σi Σj Bi(u) Bj(v) pij,  for i, j = 0, 1, 2
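The double sum over control points and blending functions translates directly into two nested loops. A Java sketch (our own notation, with the control mesh as a 3×3 grid of 3D points):

```java
// Evaluate a point on a quadratic Bézier surface:
// P(u,v) = sum_i sum_j Bi(u) * Bj(v) * p[i][j], with quadratic blending functions.
public class QuadBezierSurface {
    static double blend(int i, double t) {
        switch (i) {
            case 0:  return (1 - t) * (1 - t);
            case 1:  return 2 * t * (1 - t);
            default: return t * t;
        }
    }

    // p is a 3x3 mesh of control points, each double[]{x, y, z}
    public static double[] point(double[][][] p, double u, double v) {
        double[] result = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) {
                double w = blend(i, u) * blend(j, v);
                for (int k = 0; k < 3; k++)
                    result[k] += w * p[i][j][k];
            }
        return result;
    }

    public static void main(String[] args) {
        // a flat 3x3 mesh in the z = 0 plane spanning [0,2] x [0,2]
        double[][][] mesh = new double[3][3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                mesh[i][j] = new double[]{ i, j, 0 };
        double[] center = point(mesh, 0.5, 0.5);
        System.out.println(center[0] + ", " + center[1] + ", " + center[2]);  // (1.0, 1.0, 0.0)
    }
}
```

Because the blending functions sum to 1 at any parameter value, a flat control mesh produces a flat surface, as the example output shows.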

The set of generated points that comprise a Bézier surface is sometimes called a patch. The term "patch" can sometimes be confusing, as we will see later when we study tessellation shaders (useful for actually implementing Bézier surfaces). There, it is the grid of control points that is typically called a "patch."

11.4 CUBIC BÉZIER SURFACES

Moving from quadratic to cubic surfaces requires utilizing a larger mesh—4×4 rather than 3×3. Figure 11.10 shows an example of a 16-control-point mesh (in blue), and the corresponding curved surface (in red).

Figure 11.10 Cubic Bézier control mesh and corresponding surface.

As before, we can derive the formula for points P(u,v) on the surface by combining the associated blending functions for cubic Bézier curves:

P(u,v) = Σi Σj Bi(u) Bj(v) pij,  for i, j = 0, 1, 2, 3


where:

B0(t) = (1 − t)³,  B1(t) = 3t(1 − t)²,  B2(t) = 3t²(1 − t),  B3(t) = t³  (for t = u or v)

Rendering Bézier surfaces can also be done with recursive subdivision, by alternately splitting the surface in half along each dimension, as shown in Figure 11.11. Each subdivision produces four new control point meshes, each containing 16 points which define one quadrant of the surface.

Figure 11.11 Recursive subdivision for Bézier surfaces.

Similar methods exist for rendering Bézier surfaces as we saw previously for Bézier curves, such as by using recursive subdivision [AS14]. When rendering Bézier curves, we stopped subdividing when the curve was "straight enough." For Bézier surfaces, we stop recursing when the surface is "flat enough." One way of doing this is to ensure that all of the recursively generated points in a sub-quadrant control mesh are within some small allowable distance from a plane defined by three of the four corner points of that mesh. The distance d between a point (x,y,z) and a plane (A,B,C,D) is:

d = |Ax + By + Cz + D| / √(A² + B² + C²)


If d is less than some sufficiently small tolerance, then we stop subdividing, and simply use the four corner control points of the sub-quadrant mesh to draw two triangles.
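The point-to-plane distance test can be sketched in Java as follows (our own helpers; the plane is given by the coefficients A, B, C, D of Ax + By + Cz + D = 0):

```java
// Distance from a point (x,y,z) to the plane Ax + By + Cz + D = 0,
// used to decide whether a sub-quadrant control mesh is "flat enough."
public class PlaneDistance {
    public static double distance(double x, double y, double z,
                                  double a, double b, double c, double d) {
        return Math.abs(a*x + b*y + c*z + d) / Math.sqrt(a*a + b*b + c*c);
    }

    // true if every control point in the mesh lies within tol of the plane
    public static boolean flatEnough(double[][] points, double a, double b, double c, double d,
                                     double tol) {
        for (double[] p : points)
            if (distance(p[0], p[1], p[2], a, b, c, d) > tol) return false;
        return true;
    }

    public static void main(String[] args) {
        // point (0,0,5) against the plane z = 0  (A=0, B=0, C=1, D=0)
        System.out.println(distance(0, 0, 5, 0, 0, 1, 0));   // 5.0
    }
}
```

In the subdivision algorithm, flatEnough would be called with the 16 points of a sub-quadrant mesh and a plane fitted to three of its corner points.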

The tessellation stage of the OpenGL pipeline offers an attractive alternative approach for rendering Bézier surfaces based on the iterative algorithm in Figure 11.5. The strategy is to have the tessellator generate a large grid of vertices, then use the blending functions to reposition those vertices onto the Bézier surface as specified by the cubic Bézier control points. We implement this in Chapter 12.

SUPPLEMENTAL NOTES

This chapter focused on the mathematical fundamentals of parametric Bézier curves and surfaces. We have deferred presenting an implementation of any of them in OpenGL, because an appropriate vehicle for this is the tessellation stage, covered in the next chapter. We also skipped some of the derivations, such as for the recursive subdivision algorithm.

In 3D graphics, there are many advantages to using Bézier curves for modeling objects. First, those objects can, in theory, be scaled arbitrarily and still retain smooth surfaces without "pixelating." Second, many objects made up of complex curves can be stored much more efficiently as sets of Bézier control points, rather than using thousands of vertices.

Bézier curves have many real-world applications besides computer graphics and automobiles. They can also be found in the design of bridges, such as in the Chords Bridge in Jerusalem [CB16]. Similar techniques are used for building TrueType fonts, which as a result can be scaled to any arbitrary size, or zoomed in to any degree of closeness, while always retaining smooth edges.

Exercises

11.1 Quadratic Bézier curves are limited to defining curves that are wholly "concave" or "convex." Describe (or draw) an example of a curve that bends in a manner that is neither wholly concave nor convex, and thus could not possibly be approximated by a quadratic Bézier curve.

11.2 Using a pen or pencil, draw an arbitrary set of four points on a piece of paper, number them from 1 to 4 in any order, and then try to draw an approximation of the cubic Bézier curve defined by those four ordered control points. Then rearrange the numbering of the control points (i.e., their order, but without changing their positions) and redraw the new resulting cubic Bézier curve. There are numerous online tools for drawing Bézier curves you can use to check your approximation.


References

[AS14] E. Angel and D. Shreiner, Interactive Computer Graphics: A Top-Down Approach with WebGL, 7th ed. (Pearson, 2014).

[CB16] Chords Bridge, Wikipedia, accessed July 2016, https://en.wikipedia.org/wiki/Chords_Bridge.

[BE16] S. Bernstein, Wikipedia, accessed July 2016, https://en.wikipedia.org/wiki/Sergei_Natanovich_Bernstein.

[BE72] P. Bézier, Numerical Control: Mathematics and Applications (London: John Wiley & Sons, 1972).

[DC63] P. de Casteljau, Courbes et surfaces à pôles, technical report (Paris: A. Citroën, 1963).

1 Of course, a curve can exist in 3D space. However, a quadratic curve lies entirely within a 2D plane.


CHAPTER 12

TESSELLATION

12.1 Tessellation in OpenGL
12.2 Tessellation for Bézier Surfaces
12.3 Tessellation for Terrain/Height Maps
12.4 Controlling Level of Detail (LOD)

Supplemental Notes

The English language term "tessellation" refers to a large class of design activities in which tiles of various geometric shapes are arranged adjacently to form patterns, generally on a flat surface. The purpose can be artistic or practical, with examples dating back thousands of years [TS16].

In 3D graphics, tessellation refers to something a little bit different, but no doubt inspired by its classical counterpart. Here, tessellation refers to the generation and manipulation of large numbers of triangles for rendering complex shapes and surfaces, preferably in hardware. Tessellation is a rather recent addition to the OpenGL core, not appearing until 2010 with version 4.0.1

12.1 TESSELLATION IN OPENGL

OpenGL support for hardware tessellation is made available through three pipeline stages:

1. the tessellation control shader
2. the tessellator
3. the tessellation evaluation shader

The first and third stages are programmable; the intervening second stage is not. In order to use tessellation, the programmer generally provides both a control shader and an evaluation shader.

The tessellator (its full name is tessellation primitive generator, or TPG) is a hardware-supported engine that produces fixed grids of triangles.2 The control shader allows us to configure what sort of triangle mesh the tessellator is to build. The evaluation shader then lets us manipulate the grid in various ways. The manipulated triangle mesh is then the source of vertices that proceed through the pipeline. Recall from Figure 2.2 that tessellation sits in the pipeline between the vertex and geometry shader stages.

Let's start with an application that simply uses the tessellator to create a triangle mesh of vertices, and then displays it without any manipulation. For this, we will need the following modules:

1. Java/JOGL application: Creates a camera and associated mvp matrix. The view (v) and projection (p) matrices orient the camera; the model (m) matrix can be used to modify the location and orientation of the grid.

2. Vertex shader: Essentially does nothing in this example; the vertices will be generated in the tessellator.

3. Tessellation Control Shader (TCS): Specifies the grid for the tessellator to build.

4. Tessellation Evaluation Shader (TES): Applies the mvp matrix to the vertices in the grid.

5. Fragment Shader: Simply outputs a fixed color for every pixel.

We first list the entire application code, and then discuss its elements in detail. Even a simple example such as this one is fairly complex, so many of the code elements will require explanation.

Program 12.1 Basic Tessellator Mesh

Java/JOGL application

  private int createShaderProgram()
  { // same as before, but now includes TCS and TES
    ...
    int[] tescCompiled = new int[1];
    int[] teseCompiled = new int[1];
    tcShaderSource = readShaderSource("tessC.shader");
    teShaderSource = readShaderSource("tessE.shader");
    int tessCShader = gl.glCreateShader(GL_TESS_CONTROL_SHADER);
    int tessEShader = gl.glCreateShader(GL_TESS_EVALUATION_SHADER);
    gl.glShaderSource(tessCShader, tcShaderSource.length, tcShaderSource, null, 0);
    gl.glShaderSource(tessEShader, teShaderSource.length, teShaderSource, null, 0);
    gl.glCompileShader(tessCShader);
    gl.glCompileShader(tessEShader);
    ...
    gl.glAttachShader(rendering_program, tessCShader);
    gl.glAttachShader(rendering_program, tessEShader);
    ...
  }

  public void display(GLAutoDrawable drawable)
  { ...
    gl.glUseProgram(rendering_program);
    ...
    // set up mvp matrix for camera
    ...
    // parameters for drawing like before; draw as follows:
    gl.glPatchParameteri(GL_PATCH_VERTICES, 1);
    gl.glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    gl.glDrawArrays(GL_PATCHES, 0, 1);
  }

Vertex Shader

  #version 430
  uniform mat4 mvp;
  void main(void) { }

Tessellation Control Shader

  #version 430
  uniform mat4 mvp;
  layout (vertices = 1) out;
  void main(void)
  { gl_TessLevelOuter[0] = 6;
    gl_TessLevelOuter[1] = 6;
    gl_TessLevelOuter[2] = 6;
    gl_TessLevelOuter[3] = 6;
    gl_TessLevelInner[0] = 12;
    gl_TessLevelInner[1] = 12;
  }

Tessellation Evaluation Shader

  #version 430
  uniform mat4 mvp;
  layout (quads, equal_spacing, ccw) in;
  void main(void)
  { float u = gl_TessCoord.x;
    float v = gl_TessCoord.y;
    gl_Position = mvp * vec4(u, 0, v, 1);
  }

Fragment Shader

  #version 430
  out vec4 color;
  uniform mat4 mvp;
  void main(void)
  { color = vec4(1.0, 1.0, 0.0, 1.0);  // yellow
  }

The resulting output mesh is shown in Figure 12.1.

Figure 12.1 Tessellator triangle mesh output.

As can be seen in Figure 12.1, the tessellator produces a mesh of vertices defined by two parameters: inner level and outer level. In this case, the inner level is 12 and the outer level is 6—the outer edges of the grid are divided into 6 segments, while the lines spanning the interior are divided into 12 segments.

The specific relevant new constructs in Program 12.1 are highlighted. Let’s start bydiscussingthefirstportion—theJava/JOGLcode.

Compiling the twonewshaders isdoneexactly thesameasfor thevertexandfragmentshaders. They are then attached to the same rendering program, and the linking call isunchanged. The only new items are the constants for specifying the type of shader beinginstantiated—thenewconstantsarenamed:

GL_TESS_CONTROL_SHADER

GL_TESS_EVALUATION_SHADER

Note the new items in the display() function. The glDrawArrays() call now specifies GL_PATCHES. When using tessellation, vertices sent from the Java/JOGL application into the pipeline (i.e., in a VBO) aren't rendered, but are usually control points, such as those we saw for Bézier curves. A set of control points is called a "patch," and in those sections of the code using tessellation, GL_PATCHES is the only allowable primitive. The number of vertices in a patch is specified in the call to glPatchParameteri(). In this particular example, there aren't any control points being sent, but we are still required to specify at least 1. Similarly, in the glDrawArrays() call we indicate a start value of 0 and a vertex count of 1, even though we aren't actually sending any vertices from the JOGL program.

The call to glPolygonMode() specifies how the mesh should be rasterized. The default is GL_FILL. Shown in the code is GL_LINE, which as we saw in Figure 12.1 caused only connecting lines to be rasterized (so we could see the grid itself that was produced by the tessellator). If we change that line of code to GL_FILL (or comment it out, resulting in the default behavior GL_FILL), we get the version shown in Figure 12.2.

Now let's work our way through the four shaders. As indicated earlier, the vertex shader has little to do, since the Java/JOGL application isn't providing any vertices. All it contains is a uniform declaration, to match the other shaders. In any case, it is a requirement that all shader programs include a vertex shader.

Figure 12.2 Tessellated mesh rendered with GL_FILL.

The Tessellation Control Shader specifies the topology of the triangle mesh that the tessellator is to produce. Six "level" parameters are set—two "inner" and four "outer" levels—by assigning values to the reserved words named gl_TessLevelxxx. This is for tessellating a large rectangular grid of triangles, called a quad.3 The levels tell the tessellator how to subdivide the grid when forming triangles, and are arranged as shown in Figure 12.3.


Figure 12.3 Tessellation levels.

Note the line in the control shader that says:

layout (vertices = 1) out;

This is related to the prior GL_PATCHES discussion and specifies the number of vertices per "patch" being passed from the vertex shader to the control shader (and "out" to the evaluation shader). In this particular program there are none, but we still must specify at least one, because it also affects how many times the control shader executes. Later this value will reflect the number of control points, and must match the value in the glPatchParameteri() call in the Java/JOGL application.

Next let's look at the Tessellation Evaluation Shader. It starts with a line of code that says:

layout (quads, equal_spacing, ccw) in;

This may at first appear to be related to the "out" layout statement in the control shader, but actually they are unrelated. Rather, this line is where we instruct the tessellator to generate vertices so they are arranged in a large rectangle (a "quad"). It also specifies the subdivisions (inner and outer) to be of equal length (later we will see a use for subdivisions of unequal length). The "ccw" parameter specifies the winding order in which the tessellated grid vertices are generated (in this case, counter-clockwise).

The vertices generated by the tessellator are then sent to the evaluation shader. Thus, the evaluation shader may receive vertices both from the control shader (typically as control points), and from the tessellator (the tessellated grid). In Program 12.1, vertices are only received from the tessellator.

The evaluation shader executes once for each vertex produced by the tessellator. The vertex location is accessible using the built-in variable gl_TessCoord. The tessellated grid is oriented such that it lies in the X-Z plane, and therefore gl_TessCoord's X and Y components are applied at the grid's X and Z coordinates. The grid coordinates, and thus the values of gl_TessCoord, range from 0.0 to 1.0 (this will be handy later when computing texture coordinates). The evaluation shader then uses the mvp matrix to orient each vertex (this was done in the vertex shader in examples from earlier chapters).
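To build intuition for what the evaluation shader receives, the following standalone Java sketch (not part of the book's JOGL code; the class and method names are invented for illustration) generates the kind of (u,v) pairs in [0,1] that equal_spacing quad tessellation supplies as gl_TessCoord values. The TES in Program 12.1 maps each such pair to (x, z) in the X-Z plane.

```java
// Hypothetical CPU-side emulation of the (u,v) grid that equal_spacing quad
// tessellation feeds to the evaluation shader. Each gl_TessCoord component
// ranges from 0.0 to 1.0; the TES in Program 12.1 uses (u,v) as (x,z).
public class TessCoordGrid {
    // Returns an (level+1) x (level+1) grid of {u, v} coordinates for a
    // uniform tessellation level, analogous to the vertices the TES sees.
    public static float[][] makeGrid(int level) {
        float[][] coords = new float[(level + 1) * (level + 1)][2];
        int i = 0;
        for (int row = 0; row <= level; row++) {
            for (int col = 0; col <= level; col++) {
                coords[i][0] = (float) col / level;  // u, applied as X
                coords[i][1] = (float) row / level;  // v, applied as Z
                i++;
            }
        }
        return coords;
    }
}
```

This is only an approximation for intuition; the exact set and ordering of vertices emitted by the hardware tessellator (especially with differing inner and outer levels) is determined by the OpenGL tessellation rules, not by this simple loop.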

Finally, the fragment shader simply outputs a constant color yellow for each pixel. We can, of course, also use it to apply a texture or lighting to our scene as we saw in previous chapters.

12.2 TESSELLATION FOR BÉZIER SURFACES

Let's now extend our program so that it turns our simple rectangular grid into a Bézier surface. The tessellated grid should give us plenty of vertices for sampling the surface (and we can increase the inner/outer subdivision levels if we want more). What we now need is to send control points through the pipeline, and then use those control points to perform the computations to convert the tessellated grid into the desired Bézier surface.

Assuming that we wish to build a cubic Bézier surface, we will need 16 control points. We could send them from the Java side in a VBO, or we could hardcode them in the vertex shader. Figure 12.4 shows an overview of the process, with the control points coming from the Java side.

Figure 12.4 Overview of tessellation for Bézier surfaces.

Now is a good time to explain a bit more precisely how the tessellation control shader (TCS) works. Similar to the vertex shader, the TCS executes once per incoming vertex. Also, recall from Chapter 4 that OpenGL provides a built-in variable called gl_VertexID which holds a counter that indicates which invocation of the vertex shader is currently executing. A similar built-in variable called gl_InvocationID exists for the tessellation control shader.

A powerful feature of tessellation is that the TCS (and also the TES) shader has access to all of the control point vertices simultaneously, in arrays. At first, it may seem confusing that the TCS executes once per vertex, when each invocation has access to all of the vertices. It is also counterintuitive that the tessellation levels are specified in assignment statements which are redundantly set at each TCS invocation. Although all of this may seem odd, it is done this way because the tessellation architecture is designed so that TCS invocations can run in parallel.

OpenGL provides several built-in variables for use in the TCS and TES shaders. Ones that we have already mentioned are gl_InvocationID, and of course gl_TessLevelInner and gl_TessLevelOuter. Here are some more details and descriptions of some of the most useful built-in variables:

Tessellation Control Shader (TCS) built-in variables:


gl_in[ ] – an array containing each of the incoming control point vertices—one array element per incoming vertex. Particular vertex attributes can be accessed as fields using the "." notation. One built-in attribute is gl_Position—thus, the position of incoming vertex "i" is accessed as gl_in[i].gl_Position.

gl_out[ ] – an array for sending outgoing control point vertices to the TES—one array element per outgoing vertex. Particular vertex attributes can be accessed as fields using the "." notation. One built-in attribute is gl_Position—thus, the position of outgoing vertex "i" is accessed as gl_out[i].gl_Position.

gl_InvocationID – an integer ID counter indicating which invocation of the TCS is currently executing. One common use is for passing through vertex attributes; for example, passing the current invocation's vertex position from the TCS to the TES would be done as follows:

  gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;

Tessellation Evaluation Shader (TES) built-in variables:

gl_in[ ] – an array containing each of the incoming control point vertices—one element per incoming vertex. Particular vertex attributes can be accessed as fields using the "." notation. One built-in attribute is gl_Position—thus, incoming vertex positions are accessed as gl_in[xxx].gl_Position.

gl_Position – output position of a tessellated grid vertex, possibly modified in the TES. It is important to note that gl_Position and gl_in[xxx].gl_Position are different—gl_Position is the position of an output vertex that originated in the tessellator, while gl_in[xxx].gl_Position is a control point vertex position coming into the TES from the TCS.

It is important to note that input and output control point vertex attributes in the TCS are arrays. By contrast, input control point vertices and vertex attributes in the TES are arrays, but output vertices are scalars. Also, it is easy to become confused as to which vertices are for control points, and which are tessellated and then moved to form the resulting surface. To summarize, all vertex inputs and outputs to the TCS are control points, whereas in the TES, gl_in[ ] holds incoming control points, gl_TessCoord holds incoming tessellated grid points, and gl_Position holds output surface vertices for rendering.

Our tessellation control shader now has two tasks: specifying the tessellation levels, and passing the control points through from the vertex shader to the evaluation shader. The evaluation shader can then modify the locations of the grid points (the gl_TessCoords) based on the Bézier control points.

Program 12.2 shows all four shaders—vertex, TCS, TES, and fragment—for specifying a control point patch, generating a flat tessellated grid of vertices, repositioning those vertices on the curved surface specified by the control points, and painting the resulting surface with a texture image. It also shows the relevant portion of the Java/JOGL application, specifically in the display() function. In this example, the control points originate in the vertex shader (they are hardcoded there), rather than entering the OpenGL pipeline from the Java/JOGL application. Additional code details follow after the code listing.


Program 12.2 Tessellation for Bézier Surface

Vertex Shader

  #version 430
  out vec2 texCoord;
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  void main(void)
  { // this time the vertex shader defines and sends out control points:
    const vec4 vertices[] =
      vec4[] (vec4(-1.0,  0.5, -1.0, 1.0), vec4(-0.5, 0.5, -1.0, 1.0),
              vec4( 0.5,  0.5, -1.0, 1.0), vec4( 1.0, 0.5, -1.0, 1.0),
              vec4(-1.0,  0.0, -0.5, 1.0), vec4(-0.5, 0.0, -0.5, 1.0),
              vec4( 0.5,  0.0, -0.5, 1.0), vec4( 1.0, 0.0, -0.5, 1.0),
              vec4(-1.0,  0.0,  0.5, 1.0), vec4(-0.5, 0.0,  0.5, 1.0),
              vec4( 0.5,  0.0,  0.5, 1.0), vec4( 1.0, 0.0,  0.5, 1.0),
              vec4(-1.0, -0.5,  1.0, 1.0), vec4(-0.5, 0.3,  1.0, 1.0),
              vec4( 0.5,  0.3,  1.0, 1.0), vec4( 1.0, 0.3,  1.0, 1.0));
    // compute an appropriate texture coordinate for the current vertex,
    // shifted from (-1...+1) to (0...1)
    texCoord = vec2((vertices[gl_VertexID].x + 1.0) / 2.0,
                    (vertices[gl_VertexID].z + 1.0) / 2.0);
    gl_Position = vertices[gl_VertexID];
  }

Tessellation Control Shader

  #version 430
  in vec2 texCoord[];           // the texture coords output from the vertex shader as scalars arrive in an array
  out vec2 texCoord_TCSout[];   // and are then passed through to the evaluation shader
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  layout (vertices = 16) out;   // there are 16 control points per patch
  void main(void)
  { int TL = 32;                // tessellation levels are all set to this value
    if (gl_InvocationID == 0)
    { gl_TessLevelOuter[0] = TL;  gl_TessLevelOuter[2] = TL;
      gl_TessLevelOuter[1] = TL;  gl_TessLevelOuter[3] = TL;
      gl_TessLevelInner[0] = TL;  gl_TessLevelInner[1] = TL;
    }
    // forward the texture and control points to the TES
    texCoord_TCSout[gl_InvocationID] = texCoord[gl_InvocationID];
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
  }

Tessellation Evaluation Shader

  #version 430
  layout (quads, equal_spacing, ccw) in;
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  in vec2 texCoord_TCSout[];    // texture coordinate array coming in
  out vec2 texCoord_TESout;     // scalars going out one at a time
  void main(void)
  { vec3 p00 = (gl_in[0].gl_Position).xyz;
    vec3 p10 = (gl_in[1].gl_Position).xyz;
    vec3 p20 = (gl_in[2].gl_Position).xyz;
    vec3 p30 = (gl_in[3].gl_Position).xyz;
    vec3 p01 = (gl_in[4].gl_Position).xyz;
    vec3 p11 = (gl_in[5].gl_Position).xyz;
    vec3 p21 = (gl_in[6].gl_Position).xyz;
    vec3 p31 = (gl_in[7].gl_Position).xyz;
    vec3 p02 = (gl_in[8].gl_Position).xyz;
    vec3 p12 = (gl_in[9].gl_Position).xyz;
    vec3 p22 = (gl_in[10].gl_Position).xyz;
    vec3 p32 = (gl_in[11].gl_Position).xyz;
    vec3 p03 = (gl_in[12].gl_Position).xyz;
    vec3 p13 = (gl_in[13].gl_Position).xyz;
    vec3 p23 = (gl_in[14].gl_Position).xyz;
    vec3 p33 = (gl_in[15].gl_Position).xyz;
    float u = gl_TessCoord.x;
    float v = gl_TessCoord.y;
    // cubic Bezier basis functions
    float bu0 = (1.0-u) * (1.0-u) * (1.0-u);  // (1-u)^3
    float bu1 = 3.0 * u * (1.0-u) * (1.0-u);  // 3u(1-u)^2
    float bu2 = 3.0 * u * u * (1.0-u);        // 3u^2(1-u)
    float bu3 = u * u * u;                    // u^3
    float bv0 = (1.0-v) * (1.0-v) * (1.0-v);  // (1-v)^3
    float bv1 = 3.0 * v * (1.0-v) * (1.0-v);  // 3v(1-v)^2
    float bv2 = 3.0 * v * v * (1.0-v);        // 3v^2(1-v)
    float bv3 = v * v * v;                    // v^3
    // output the position of this vertex in the tessellated patch
    vec3 outputPosition =
        bu0 * (bv0*p00 + bv1*p01 + bv2*p02 + bv3*p03)
      + bu1 * (bv0*p10 + bv1*p11 + bv2*p12 + bv3*p13)
      + bu2 * (bv0*p20 + bv1*p21 + bv2*p22 + bv3*p23)
      + bu3 * (bv0*p30 + bv1*p31 + bv2*p32 + bv3*p33);
    gl_Position = mvp * vec4(outputPosition, 1.0f);
    // output the interpolated texture coordinates
    vec2 tc1 = mix(texCoord_TCSout[0], texCoord_TCSout[3], gl_TessCoord.x);
    vec2 tc2 = mix(texCoord_TCSout[12], texCoord_TCSout[15], gl_TessCoord.x);
    vec2 tc = mix(tc2, tc1, gl_TessCoord.y);
    texCoord_TESout = tc;
  }

Fragment Shader

  #version 430
  in vec2 texCoord_TESout;
  out vec4 color;
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  void main(void)
  { color = texture(tex_color, texCoord_TESout);
  }

Java/JOGL application

  // This time we also pass a texture to paint the surface.
  // Load the texture in init() as usual, then enable it in display()
  public void display(GLAutoDrawable drawable)
  { ...
    gl.glActiveTexture(GL_TEXTURE0);
    gl.glBindTexture(GL_TEXTURE_2D, textureID);
    gl.glFrontFace(GL_CCW);
    gl.glPatchParameteri(GL_PATCH_VERTICES, 16);  // number of vertices per patch = 16
    gl.glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    gl.glDrawArrays(GL_PATCHES, 0, 16);  // total number of patch vertices: 16 x 1 patch = 16
  }

The vertex shader now specifies 16 control points (the "patch" vertices) representing a particular Bézier surface. In this example they are all normalized to the range [-1…+1]. The vertex shader also uses the control points to determine texture coordinates appropriate for the tessellated grid, with values in the range [0...1]. It is important to reiterate that the vertices output from the vertex shader are not vertices that will be rasterized, but instead are Bézier control points. When using tessellation, patch vertices are never rasterized—only tessellated vertices proceed to rasterization.
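The shift from the control points' (-1…+1) range to (0…1) texture coordinates is ordinary linear rescaling; a plain-Java restatement of the mapping used in the vertex shader (the TexCoordShift class is a hypothetical helper invented here, not part of the book's code):

```java
// The vertex shader in Program 12.2 derives a texture coordinate from a
// control point's X or Z value by shifting the range (-1...+1) to (0...1).
public class TexCoordShift {
    public static float shift(float coord) {
        return (coord + 1.0f) / 2.0f;  // -1 maps to 0.0, 0 to 0.5, +1 to 1.0
    }
}
```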

The control shader still specifies the inner and outer tessellation levels. It now has the additional responsibility of forwarding the control points and texture coordinates to the evaluation shader. Note that the tessellation levels only need to be specified once, and therefore that step is done only during the 0th invocation (recall that the TCS runs once per vertex—thus there are 16 invocations in this example). For convenience, we have specified 32 subdivisions for each tessellation level.

Next, the evaluation shader performs all of the Bézier surface computations. The large block of assignment statements at the beginning of main() extracts the control points from the incoming gl_Position's of each incoming gl_in (note that these correspond to the control shader's gl_out variable). The weights for the blending functions are then computed using the grid points coming in from the tessellator, resulting in a new outputPosition, to which the model-view-projection matrix is then applied, producing an output gl_Position for each grid point, forming the Bézier surface.
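The blending-function arithmetic performed by the TES can be checked on the CPU. The following Java sketch (class and method names are invented for illustration) evaluates a point on a cubic Bézier patch using the same basis functions as Program 12.2:

```java
// CPU-side sketch of the cubic Bezier evaluation the TES performs.
// p[i][j] is an {x,y,z} control point corresponding to pij in the shader;
// bu and bv are the same cubic Bernstein basis functions.
public class BezierPatch {
    public static float[] evaluate(float[][][] p, float u, float v) {
        float[] bu = { (1-u)*(1-u)*(1-u), 3*u*(1-u)*(1-u), 3*u*u*(1-u), u*u*u };
        float[] bv = { (1-v)*(1-v)*(1-v), 3*v*(1-v)*(1-v), 3*v*v*(1-v), v*v*v };
        float[] out = new float[3];
        for (int i = 0; i < 4; i++)          // u direction
            for (int j = 0; j < 4; j++)      // v direction
                for (int k = 0; k < 3; k++)  // x, y, z components
                    out[k] += bu[i] * bv[j] * p[i][j][k];
        return out;
    }
}
```

A useful sanity check: at (u,v) = (0,0) the result is exactly p00, and at (1,1) it is p33, because the basis functions sum to 1 and all but one vanish at the corners.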

It is also necessary to create texture coordinates. The vertex shader only provided one for each control point location. But it isn't the control points that are being rendered—we ultimately need texture coordinates for the much larger number of tessellated grid points. There are various ways of doing this—here we linearly interpolate them using GLSL's handy mix() function. The mix() function expects three parameters: (a) starting point, (b) ending point, and (c) interpolation value, which ranges from 0 to 1. It returns the value between the starting and ending point corresponding to the interpolation value. Since the tessellated grid coordinates also range from 0 to 1, they can be used directly for this purpose.
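GLSL's mix(a, b, t) computes a + t*(b - a). A minimal Java equivalent for 2D coordinates, handy for checking the bilinear interpolation by hand (the Mix class is a hypothetical helper, not book code):

```java
// Java restatement of GLSL mix() for 2-component vectors:
// linear interpolation a + t*(b - a), with t in [0,1].
public class Mix {
    public static float[] mix(float[] a, float[] b, float t) {
        return new float[] { a[0] + t * (b[0] - a[0]),
                             a[1] + t * (b[1] - a[1]) };
    }
}
```

The TES applies this twice: once along each of two opposite patch edges (using gl_TessCoord.x), then once between those two results (using gl_TessCoord.y), yielding a bilinearly interpolated texture coordinate.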

This time in the fragment shader, rather than outputting a single color, standard texturing is applied. The texture coordinates, in the attribute texCoord_TESout, are those that were produced in the evaluation shader. The changes to the JOGL program are similarly straightforward—note that a patch size of 16 is now specified. The resulting output is shown in Figure 12.5 (a tile texture is applied [LU16]).

Figure 12.5 Tessellated Bézier surface.

12.3 TESSELLATION FOR TERRAIN/HEIGHT MAPS

Recall that performing height mapping in the vertex shader can suffer from an insufficient number of vertices to render the desired detail. Now that we have a way to generate lots of vertices, let's go back to Hastings-Trew's moon surface texture map (from [HT16]), and use it as a height map by raising tessellated vertices to produce moon surface detail. As we will see, this has the advantages of achieving vertex geometry that better matches the moon image, along with improved silhouette (edge) detail.

Our strategy is to modify Program 12.1, placing a tessellated grid in the X-Z plane, and using height mapping to set the Y coordinate of each tessellated grid point. To do this, a patch isn't needed, because we can hardcode the location of the tessellated grid, so we will specify the required minimum of 1 vertex per patch in glDrawArrays() and glPatchParameteri() as was done in Program 12.1. Hastings-Trew's moon texture image is used both for color and as the height map.

We generate vertex and texture coordinates in the evaluation shader by mapping the tessellated grid's gl_TessCoord values to appropriate ranges for vertices and textures.4 The evaluation shader also is where the height mapping is performed, by adding a fraction of the color component of the moon texture to the Y component of the output vertex. The changes to the shaders are shown in Program 12.3.

Program 12.3 Simple Tessellated Terrain

Vertex Shader

  #version 430
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  void main(void) { }

Tessellation Control Shader

  ...
  layout (vertices = 1) out;  // no control points are necessary for this application
  void main(void)
  { int TL = 32;
    if (gl_InvocationID == 0)
    { gl_TessLevelOuter[0] = TL;  gl_TessLevelOuter[2] = TL;
      gl_TessLevelOuter[1] = TL;  gl_TessLevelOuter[3] = TL;
      gl_TessLevelInner[0] = TL;  gl_TessLevelInner[1] = TL;
    }
  }

Tessellation Evaluation Shader

  ...
  out vec2 tes_out;
  uniform mat4 mvp;
  layout (binding = 0) uniform sampler2D tex_color;
  void main(void)
  { // map the tessellated grid vertices from (0..1) onto the desired vertices (-0.5...+0.5)
    vec4 tessellatedPoint = vec4(gl_TessCoord.x - 0.5, 0.0, gl_TessCoord.y - 0.5, 1.0);
    // map the tessellated grid vertices as texture coordinates by "flipping" the Y values vertically.
    // Vertex coords have (0,0) at upper left; texture coords have (0,0) at lower left.
    vec2 tc = vec2(gl_TessCoord.x, 1.0 - gl_TessCoord.y);
    // The image is grayscale, so either component (R, G, or B) can serve as height offset.
    tessellatedPoint.y += (texture(tex_color, tc).r) / 40.0;  // scaled-down color values
    // convert the height-map raised point to eye space
    gl_Position = mvp * tessellatedPoint;
    tes_out = tc;
  }

Fragment Shader

  ...
  in vec2 tes_out;
  out vec4 color;
  layout (binding = 0) uniform sampler2D tex_color;
  void main(void)
  { color = texture(tex_color, tes_out);
  }

The fragment shader is the same as the one for Program 12.2, and simply outputs the color based on the texture image. The Java/JOGL application is essentially unchanged—it loads the texture (serving as both the texture and height map) and enables a sampler for it. Figure 12.6 shows the texture image (on the left) and the final output of this first attempt, which unfortunately does not yet achieve proper height mapping.

Figure 12.6 Tessellated terrain – failed first attempt, with insufficient number of vertices.

The first results are severely flawed. Although we can now see silhouette detail on the far horizon, the bumps there don't correspond to the actual detail in the texture map. Recall that in a height map, white is supposed to mean "high," and black is supposed to mean "low." The area at the upper right, in particular, shows large hills that bear no relation to the light and dark colors in the image.

The cause of this problem is the resolution of the tessellated grid. The maximum number of vertices that can be generated by the tessellator is hardware dependent, and a maximum value of at least 64 for each tessellation level is all that is required for compliance with the OpenGL standard. Our program specified a single tessellated grid with inner and outer tessellation levels of 32, so we generated about 32*32, or just over 1000 vertices, which is insufficient to reflect the detail in the image accurately. This is especially apparent along the upper right (enlarged in the figure)—the edge detail is only sampled at 32 points along the horizon, producing large, random-looking hills. Even if we increased the tessellation values to 64, the total of 64*64 or just over 4000 vertices would still be woefully inadequate to do height mapping using the moon image.
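As a rough sanity check of these counts (assuming, for estimation only, that a uniformly tessellated quad at level N yields on the order of (N+1)*(N+1) vertices; the exact figure depends on the tessellation rules and spacing mode):

```java
// Back-of-the-envelope vertex counts for the discussion above.
// gridVertices(N) approximates the vertices in one quad tessellated at
// level N with equal spacing; the true count is implementation defined.
public class TessCounts {
    public static int gridVertices(int level) {
        return (level + 1) * (level + 1);
    }
}
```

With this estimate, level 32 gives roughly a thousand vertices and level 64 roughly four thousand, matching the "just over 1000" and "just over 4000" figures in the text, while 64 x 64 instanced patches at level 32 (next section) give several million.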

One way to increase the number of vertices is by using instancing, which we saw in Chapter 4. Our strategy will be to have the tessellator generate grids, and use instancing to repeat this many times. In the vertex shader we build a patch defined by four vertices, one for each corner of a tessellated grid. In our Java/JOGL application we change the glDrawArrays() call to glDrawArraysInstanced(). There, we specify a grid of 64 by 64 patches, each of which contains a tessellated mesh with levels of size 32. This will give us a total of 64*64*32*32, or over 4 million vertices.

The vertex shader starts by specifying four texture coordinates (0,0), (1,0), (0,1), and (1,1). When using instancing, recall that the vertex shader has access to an integer variable gl_InstanceID, which holds a counter corresponding to the glDrawArraysInstanced() call that is currently being processed. We use this ID value to distribute the locations of the individual patches within the larger grid. The patches are positioned in rows and columns, the first patch at location (0,0), the next at (1,0), the next at (2,0), and so on, and the final patch in the first column at (63,0). The next column has patches at (0,1), (1,1), and so forth up to (63,1). The final column has patches at (0,63), (1,63), and so on up to (63,63). The X coordinate for a given patch is the instance ID modulo 64, and the Y coordinate is the instance ID divided by 64 (with integer division). The shader then scales the coordinates back down to the range (0..1).
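The instance-to-patch arithmetic can be verified in isolation. This small Java helper (PatchLayout is a name invented here) mirrors the modulo/division logic just described:

```java
// Mirrors how Program 12.4's vertex shader places each instance:
// patch (x, y) comes from gl_InstanceID, then is normalized to (0..1).
public class PatchLayout {
    public static int[] patchXY(int instanceID) {
        return new int[] { instanceID % 64, instanceID / 64 };
    }
    public static float[] normalized(int instanceID) {
        int[] xy = patchXY(instanceID);
        return new float[] { xy[0] / 64.0f, xy[1] / 64.0f };
    }
}
```

For example, instance 0 lands at (0,0), instance 64 starts the next column at (0,1), and instance 4095 (the last of 64*64) lands at (63,63).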

The control shader is unchanged, except that it passes through the vertices and texture coordinates.

Next, the evaluation shader takes the incoming tessellated grid vertices (specified by gl_TessCoord) and moves them into the coordinate range specified by the incoming patch. It does the same for the texture coordinates. It also applies height mapping in the same way as was done in Program 12.3. The fragment shader is unchanged.

The changes to each of the components are shown in Program 12.4. The result is shown in Figure 12.7. Note that the highs and lows now correspond much more closely to light and dark sections of the image.


Program 12.4 Instanced Tessellated Terrain

Java/JOGL application

  // same as for Bezier surface example, with these changes:
  gl.glPatchParameteri(GL_PATCH_VERTICES, 4);
  gl.glDrawArraysInstanced(GL_PATCHES, 0, 4, 64*64);

Vertex Shader

  ...
  out vec2 tc;
  ...
  void main(void)
  { vec2 patchTexCoords[] = vec2[] (vec2(0,0), vec2(1,0), vec2(0,1), vec2(1,1));
    // compute an offset for coordinates based on which instance this is
    int x = gl_InstanceID % 64;
    int y = gl_InstanceID / 64;
    // tex coords are distributed across 64 patches, normalized to (0..1). Flip Y coords.
    tc = vec2((x + patchTexCoords[gl_VertexID].x) / 64.0,
              (64 - y + patchTexCoords[gl_VertexID].y) / 64.0);
    // vertex locations are the same as texture coordinates, except they range from -0.5 to +0.5.
    gl_Position = vec4(tc.x - 0.5, 0.0, (1.0 - tc.y) - 0.5, 1.0);  // Also un-flip the Y's.
  }

Tessellation Control Shader

  ...
  layout (vertices = 4) out;
  in vec2 tc[];
  out vec2 tcs_out[];
  ...
  void main(void)
  { // tessellation level specification the same as the previous example
    ...
    tcs_out[gl_InvocationID] = tc[gl_InvocationID];
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
  }

TessellationEvaluationShader...

invec2tcs_out[];

outvec2tes_out;

voidmain(void)

Page 263: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

{//mapthetexturecoordinatesontothesub-gridspecifiedbytheincoming

controlpoints

vec2tc=vec2(tcs_out[0].x+(gl_TessCoord.x)/64.0,tcs_out[0].y+(1.0-

gl_TessCoord.y)/64.0);

//mapthetessellatedgridontothesub-gridspecifiedbytheincoming

controlpoints

vec4tessellatedPoint=vec4(gl_in[0].gl_Position.x+gl_TessCoord.x/64.0,

0.0,

gl_in[0].gl_Position.z+gl_TessCoord.y/64.0,1.0);

//addtheheightfromtheheightmaptothevertex:

tessellatedPoint.y+=(texture(tex_height,tc).r)/40.0;

gl_Position=mvp*tessellatedPoint;

tes_out=tc;

}
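To see concretely what the vertex shader's per-instance arithmetic produces, here is a small CPU-side Java sketch. It is a hypothetical helper class (not part of the book's code) that mirrors the texture-coordinate computation, with `instanceID` and `vertexID` standing in for `gl_InstanceID` and `gl_VertexID`:

```java
// Sketch (not from the book): CPU-side mirror of the vertex shader's
// per-instance tex-coord arithmetic, showing which grid cell an instance covers.
public class PatchGrid {
    // corner offsets matching patchTexCoords[] in the shader, indexed by gl_VertexID
    static final float[][] CORNERS = { {0,0}, {1,0}, {0,1}, {1,1} };

    // returns {s, t} for a given instance ID and vertex ID, as in the shader
    static float[] texCoord(int instanceID, int vertexID) {
        int x = instanceID % 64;          // column of this patch in the 64x64 grid
        int y = instanceID / 64;          // row of this patch
        float s = (x + CORNERS[vertexID][0]) / 64.0f;
        float t = (64 - y + CORNERS[vertexID][1]) / 64.0f;   // flipped Y, as in the shader
        return new float[] { s, t };
    }

    public static void main(String[] args) {
        float[] tc = texCoord(65, 0);     // instance 65 = column 1, row 1
        System.out.printf("s=%.6f t=%.6f%n", tc[0], tc[1]);
    }
}
```

Running this for instance 65 shows the lower-left corner of the patch one cell in and one row down, confirming how the modulus and division spread the 4096 instances over the grid.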

Now that we have achieved height mapping, we can work on improving it and incorporating lighting. One challenge is that our vertices do not yet have normal vectors associated with them. Another challenge is that simply using the texture image as a height map has produced an overly "jagged" result, in this case because not all grayscale variation in the texture image is due to height. For this particular texture map, it so happens that Hastings-Trew has already produced an improved height map that we can use [HT16]. It is shown in Figure 12.8 (on the left).

Figure 12.7 Tessellated terrain, second attempt, with instancing.


Figure 12.8 Moon surface: height map [HT16] and normal map.

To create normals, we could compute them on the fly, by generating the heights of neighboring vertices (or neighboring texels in the height map), building vectors connecting them, and using a cross product to find the normal. This requires some tuning, depending on the precision of the scene (and/or the height map image). Here we have instead used the GIMP "normal map" plugin [GP16] to generate a normal map based on Hastings-Trew's height map, shown in Figure 12.8 (on the right).
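As a sketch of the on-the-fly approach just described, the following Java fragment (a hypothetical helper, with the grid layout and cell spacing as assumptions) derives a normal at one grid cell from neighboring height samples using a cross product:

```java
// Sketch (hypothetical helper, not the book's code): derive a terrain normal at
// grid cell (x, z) by sampling neighboring heights and crossing the two tangents.
public class HeightNormals {
    // cross product of two 3-vectors
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]
        };
    }

    static double[] normalize(double[] v) {
        double len = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new double[] { v[0]/len, v[1]/len, v[2]/len };
    }

    // height[z][x] holds sampled heights; cellSize is the spacing between samples
    static double[] normalAt(double[][] height, int x, int z, double cellSize) {
        // tangent vectors toward the +x and +z neighbors
        double[] dx = { cellSize, height[z][x+1] - height[z][x], 0.0 };
        double[] dz = { 0.0, height[z+1][x] - height[z][x], cellSize };
        return normalize(cross(dz, dx));   // order chosen so flat terrain yields +y
    }

    public static void main(String[] args) {
        double[][] flat = { {0, 0}, {0, 0} };
        double[] n = normalAt(flat, 0, 0, 1.0);
        System.out.printf("(%.1f, %.1f, %.1f)%n", n[0], n[1], n[2]);  // flat ground -> (0, 1, 0)
    }
}
```

The same idea works per-texel to bake a normal map offline, which is essentially what the GIMP plugin does.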

Most of the changes to our code are now simply to implement the standard methods for Phong shading:

Java/JOGL application
We load and activate an additional texture to hold the normal map. We also add code to specify the lighting and materials as we have done in previous applications.

Vertex shader
The only additions are declarations for lighting uniforms and the sampler for the normal map. Lighting code customarily done in the vertex shader is moved to the tessellation evaluation shader, because the vertices aren't generated until the tessellation stage.

Tessellation Control shader
The only additions are declarations for lighting uniforms and the sampler for the normal map.

Tessellation Evaluation shader
The preparatory code for Phong lighting is now placed in the evaluation shader:

varyingVertPos = (mv_matrix * position).xyz;
varyingLightDir = light.position - varyingVertPos;

Fragment shader
The typical code sections, described previously, for computing Phong (or Blinn-Phong) lighting are done here, as well as the code to extract normals from the normal map. The lighting result is then combined with the texture image with a weighted sum.


The final result, with height and normal mapping, and Phong lighting, is shown in Figure 12.9. The terrain now responds to lighting. In this example, a positional light has been placed to the left of center in the image on the left, and to the right of center in the image on the right.

Figure 12.9 Tessellated terrain with normal map and lighting (light source positioned at left, and at right, respectively).

Although the response to the movement of the light is difficult to tell from a still picture, the reader should be able to discern the diffuse lighting changes, and that specular highlights on the peaks are very different in the two images. This is of course more obvious when the camera or the light source is moving. The results are still imperfect, because the original texture that is incorporated in the output includes shadows that will appear on the rendered result, regardless of lighting.

12.4 CONTROLLING LEVEL OF DETAIL (LOD)

Using instancing to generate millions of vertices in real time, as in Program 12.4, is likely to place a load on even a well-equipped modern computer. Fortunately, the strategy of dividing the terrain into separate patches, as we have done to increase the number of generated grid vertices, also affords us a nice mechanism for reducing that load.

Of the millions of vertices being generated, many of them aren't necessary. Vertices in patches that are close to the camera are important because we expect to discern detail in nearby objects. However, the further the patches are from the camera, the less likely there will even be enough pixels in the rasterization to warrant the number of vertices we are generating!

Changing the number of vertices in a patch based on the distance from the camera is a technique called level of detail, or LOD. Sellers et al. describe a way of controlling LOD in instanced tessellation [SW15], by modifying the control shader. Program 12.5 shows a simplified version of Sellers' approach. The strategy is to use the patch's perceived size to determine the values of its tessellation levels. Since the tessellated grid for a patch will eventually be placed within the square defined by the four control points entering the control shader, we can use the locations of the control points relative to the camera to determine how many vertices should be generated for the patch. The steps are as follows:

1. Calculate the screen locations of the four control points by applying the MVP matrix to them.

2. Calculate the lengths of the sides of the square (i.e., the width and height) defined by the control points (in screen space). Note that even though the four control points form a square, these side lengths can differ because the perspective matrix has been applied.

3. Scale the lengths' values by a tunable constant, depending on the precision needed for the tessellation levels (based on the amount of detail in the height map).

4. Add 1 to the scaled length values, to avoid the possibility of specifying a tessellation level of 0 (which would result in no vertices being generated).

5. Set the tessellation levels to the corresponding calculated width and height values.
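The five steps can be sketched in plain Java as follows. This is a hypothetical helper, not the book's code: `subdivisions` stands in for the tunable constant of step 3, and the screen-space corner positions are assumed to come from step 1.

```java
// Sketch (hypothetical): compute tessellation "width" and "height" levels
// from the screen-space corners of a patch, following steps 2-5 above.
public class LodLevels {
    // p0, p1, p2 are screen-space {x, y} positions of three patch corners
    static double[] levels(double[] p0, double[] p1, double[] p2, double subdivisions) {
        // step 2: perceived side lengths; steps 3-4: scale and add 1
        double width  = Math.hypot(p2[0] - p0[0], p2[1] - p0[1]) * subdivisions + 1.0;
        double height = Math.hypot(p1[0] - p0[0], p1[1] - p0[1]) * subdivisions + 1.0;
        return new double[] { width, height };   // step 5: feed these to gl_TessLevel*
    }

    public static void main(String[] args) {
        // a distant patch appears smaller on screen, so it receives lower levels
        double[] near = levels(new double[]{0,0}, new double[]{0,0.5},  new double[]{0.5,0},  16.0);
        double[] far  = levels(new double[]{0,0}, new double[]{0,0.05}, new double[]{0.05,0}, 16.0);
        System.out.println(near[0] + " vs " + far[0]);
    }
}
```

The "+ 1.0" guarantees a level of at least 1 even for a patch that projects to a single point, which is exactly the concern raised in step 4.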

Recall that in our instanced example we are not creating just one grid, but 64*64 of them. So the five steps listed above are performed for each patch. Thus, the level of detail varies from patch to patch.

All of the changes are in the control shader and are shown in Program 12.5, with the generated output following in Figure 12.10. Note that the variable gl_InvocationID refers to which vertex in the patch is being processed (not which patch is being processed). Therefore, the LOD computation, which tells the tessellator how many vertices to generate, occurs during the processing of the 0th vertex in each patch.

Program 12.5 Tessellation Level of Detail (LOD)

Tessellation Control Shader

...
void main(void)
{  float subdivisions = 16.0;   // tunable constant based on density of detail in height map
   if (gl_InvocationID == 0)
   {  // determine control point positions in screen space
      vec4 p0 = normalize(mvp * gl_in[0].gl_Position);
      vec4 p1 = normalize(mvp * gl_in[1].gl_Position);
      vec4 p2 = normalize(mvp * gl_in[2].gl_Position);

      // perceived "width" and "height" of the tessellated grid
      float width  = length(p2.xy - p0.xy) * subdivisions + 1.0;
      float height = length(p1.xy - p0.xy) * subdivisions + 1.0;

      // set tessellation levels based on perceived side lengths
      gl_TessLevelOuter[0] = height;
      gl_TessLevelOuter[1] = width;
      gl_TessLevelOuter[2] = height;
      gl_TessLevelOuter[3] = width;
      gl_TessLevelInner[0] = width;
      gl_TessLevelInner[1] = height;
   }
   // forward texture coordinates and control points to TES as before
   tcs_out[gl_InvocationID] = tc[gl_InvocationID];
   gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}

Applying these control shader changes to the instanced (but not lighted) version of our scene from Figure 12.7, and replacing the height map with Hastings-Trew's improved version shown in Figure 12.8, produces the improved scene, with more realistic horizon detail, shown in Figure 12.10.

Figure 12.10 Tessellated moon with controlled level of detail (LOD).

In this example it is also useful to change the layout specifier in the evaluation shader from:

layout (quads, equal_spacing) in;

to:

layout (quads, fractional_even_spacing) in;

The reason for this modification is difficult to illustrate in still images. In an animated scene, as a tessellated object moves through 3D space, if LOD is used it is sometimes possible to see the changes in tessellation levels on the surface of the object as wiggling artifacts called "popping." Changing from equal spacing to fractional spacing reduces this effect by making the grid geometry of adjacent patch instances more similar, even if they differ in level of detail. (See Exercises 12.2 and 12.3.)
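The difference between the two spacing modes can be sketched numerically. The following Java fragment is a simplified illustration of the OpenGL spacing rules: equal spacing rounds a requested level up to the nearest integer, while fractional-even spacing rounds up to the nearest even integer (the gradual shrinking of the two partial segments in the fractional modes is omitted here):

```java
// Sketch (simplified): how a requested tessellation level maps to a segment
// count under each spacing mode. The fractional modes additionally ease in
// two partial segments, which is what suppresses "popping"; that part is omitted.
public class SpacingModes {
    static int equalSpacing(double level) {
        return (int) Math.ceil(level);              // round up to nearest integer
    }

    static int fractionalEvenSpacing(double level) {
        int n = (int) Math.ceil(level / 2.0) * 2;   // round up to nearest even integer
        return Math.max(n, 2);
    }

    public static void main(String[] args) {
        // as an animated level crosses 4.0 -> 5.2, equal spacing jumps from
        // 4 to 6 full-size segments between frames, a visible pop
        System.out.println(equalSpacing(4.0) + " -> " + equalSpacing(5.2));
        System.out.println(fractionalEvenSpacing(4.0) + " -> " + fractionalEvenSpacing(5.2));
    }
}
```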

Employing LOD can dramatically reduce the load on the system. For example, an animated scene is less likely to appear jerky or to lag than it would be without controlling LOD.

Applying this simple LOD technique to the version that includes Phong shading (i.e., Program 12.4) is a bit trickier. This is because the changes in LOD between adjacent patch instances can in turn cause sudden changes to the associated normal vectors, causing popping artifacts in the lighting! As always, there are tradeoffs and compromises to consider when constructing a complex 3D scene.

SUPPLEMENTAL NOTES

Combining tessellation with LOD is particularly useful in real-time virtual reality applications that require both complex detail for realism and frequent object movement and/or changes in camera position, such as in computer games. In this chapter we have illustrated the use of tessellation and LOD for real-time terrain generation, although it can also be applied in other areas such as in displacement mapping for 3D models (where tessellated vertices are added to the surface of a model and then moved so as to add detail). It is also useful in Computer Aided Design applications.

Sellers et al. extend the LOD technique (shown in Program 12.5) further than we have presented, by also eliminating vertices in patches that are behind the camera (they do this by setting their inner and outer levels to zero) [SW15]. This is an example of a culling technique, and it is a very useful one because of the load that instanced tessellation can still place on the system.

Exercises

12.1 Modify Program 12.1 to experiment with various values for inner and outer tessellation levels, and observe the resulting rendered mesh.

12.2 Modify Program 12.1 by changing the layout specifier in the evaluation shader from equal_spacing to fractional_even_spacing, as shown in Section 12.4. Observe the effect on the generated mesh.

12.3 Test Program 12.5 with the layout specifier in the evaluation shader set to equal_spacing, and then to fractional_even_spacing, as described in Section 12.4. Observe the effects on the rendered surface as the camera moves. You should be able to observe popping artifacts in the first case, which are mostly alleviated in the second case.

12.4 (PROJECT) Modify Program 12.3 to utilize a height map of your own design (you could use the one you built previously in Exercise 10.2). Then add lighting and shadow-mapping so that your tessellated terrain casts shadows. This is a complex exercise, because some of the code in the first and second shadow-mapping passes will need to be moved to the evaluation shader.

References


[GP16] GIMP Plugin Registry, normal map plugin, accessed July 2016, http://registry.gimp.org/node/69.

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).

[HT16] J. Hastings-Trew, JHT's Planetary Pixel Emporium, accessed July 2016, http://planetpixelemporium.com/.

[SW15] G. Sellers, R. Wright Jr., and N. Haemel, OpenGL SuperBible: Comprehensive Tutorial and Reference, 7th ed. (Addison-Wesley, 2015).

[TS16] Tessellation, Wikipedia, accessed July 2016, https://en.wikipedia.org/wiki/Tessellation.

1 Although the GLU toolset previously included a tessellation utility called gluTess. In 2001, Radeon released the first commercial graphics card with tessellation support, but there were few tools able to take advantage of it.
2 Or lines, but we will focus on triangles.
3 The tessellator is also capable of building a triangular grid of triangles, but that isn't covered in this textbook.
4 In some applications the texture coordinates are produced externally, such as when tessellation is being used to provide additional vertices for an imported model. In such cases, the provided texture coordinates would need to be interpolated.


CHAPTER 13

GEOMETRY SHADERS

13.1 Per-Primitive Processing in OpenGL
13.2 Altering Primitives
13.3 Deleting Primitives
13.4 Adding Primitives

Supplemental Notes

Immediately following tessellation in the OpenGL pipeline is the geometry stage. Here, the programmer has the option of including a geometry shader. This stage actually pre-dates tessellation; it became part of the OpenGL core at version 3.2 (in 2009).

Like tessellation, geometry shaders enable the programmer to manipulate groups of vertices, in ways that are impossible to do in a vertex shader. In some cases, a task might be accomplished using either a tessellation shader or a geometry shader, as their capabilities overlap in some ways.

13.1 PER-PRIMITIVE PROCESSING IN OPENGL

The geometry shader stage is situated between tessellation and rasterization, within the segment of the pipeline devoted to primitive processing (refer back to Figure 2.2). Whereas vertex shaders enable the manipulation of one vertex at a time, and fragment shaders enable the manipulation of one fragment (essentially one pixel) at a time, geometry shaders enable manipulation of one primitive at a time.

Recall that primitives are the basic building blocks in OpenGL for drawing objects. Only a few types of primitives are available; we will focus primarily on geometry shaders that manipulate triangles. Thus, when we say that a geometry shader can manipulate one primitive at a time, we usually mean that the shader has access to all three vertices of a triangle at a time. Geometry shaders allow you to:

Access all vertices in a primitive at once, and
Output the same primitive unchanged, or
Output the same primitive changed, or
Output additional primitives, or
Delete the primitive (not output it at all).

Similar to the tessellation evaluation shader, incoming vertex attributes are accessible in a geometry shader as arrays. However, in a geometry shader, incoming attribute arrays are indexed only up to the primitive size. For example, if the primitives are triangles, then the available indices are 0, 1, and 2. Accessing the vertices themselves is done using the predefined array gl_in, as follows:

gl_in[2].gl_Position   // position of the 3rd vertex

Also similar to the tessellation evaluation shader, the geometry shader's output vertex attributes are all scalars. That is, the output is a stream of individual vertices (their positions, and other attribute variables, if any) that form primitives.

There is a layout qualifier used to set the primitive input/output types and the output size.

The special GLSL command EmitVertex() specifies that a vertex is to be output. The special GLSL command EndPrimitive() indicates the completion of building a particular primitive.

The built-in variable gl_PrimitiveIDIn is available and holds the ID of the current primitive. The ID numbers start at 0 and count up to the number of primitives minus 1.

Three common categories of operations that we will explore are:

altering primitives
deleting primitives
adding primitives

13.2 ALTERING PRIMITIVES

Geometry shaders are convenient for changing the shape of an object, when that change can be effected through isolated changes to the primitives (typically triangles).

Consider, for example, the torus we rendered previously in Figure 7.12. Suppose that torus represented an inner tube (such as for a tire), and we want to "inflate" it. Simply applying a scale factor in the Java/JOGL code won't accomplish this, because its fundamental shape wouldn't change. Giving it the appearance of being "inflated" requires also making the inner hole smaller as the torus stretches into the empty center space.


One way of doing this would be to add the surface normal vector to each vertex. While it is true that this could be done in the vertex shader, let's do it in the geometry shader, for practice. Program 13.1 shows the GLSL geometry shader code. The other modules are the same as for Program 7.3, with a few minor changes: the fragment shader input names now need to reflect the geometry shader outputs (for example, varyingNormal becomes varyingNormalG), and the Java/JOGL application needs to compile the geometry shader and attach it to the shader program prior to linking. The new shader is specified as being a geometry shader as follows:

int gShader = gl.glCreateShader(GL_GEOMETRY_SHADER);

Program 13.1 Geometry Shader: Altering Vertices

#version 430

layout (triangles) in;

in vec3 varyingNormal[];       // inputs from the vertex shader
in vec3 varyingLightDir[];
in vec3 varyingHalfVector[];

out vec3 varyingNormalG;       // outputs through the rasterizer to the fragment shader
out vec3 varyingLightDirG;
out vec3 varyingHalfVectorG;

layout (triangle_strip, max_vertices=3) out;

// matrices and lighting uniforms same as before
...
void main(void)
{  // move vertices along the normal, and pass through the other vertex attributes unchanged
   for (int i = 0; i < 3; i++)
   {  gl_Position = proj_matrix *
         (gl_in[i].gl_Position + normalize(vec4(varyingNormal[i], 1.0)) * 0.4);
      varyingNormalG = varyingNormal[i];
      varyingLightDirG = varyingLightDir[i];
      varyingHalfVectorG = varyingHalfVector[i];
      EmitVertex();
   }
   EndPrimitive();
}

Note in Program 13.1 that the input variables corresponding to the output variables from the vertex shader are declared as arrays. This provides the programmer a mechanism for accessing each of the vertices in the triangle primitive and their attributes using the indices 0, 1, and 2. We wish to move those vertices outward along their surface normals. Both the vertices and the normals have already been transformed to view space with the mv matrix, in the vertex shader. We add the normal to each of the incoming vertex positions (gl_in[...].gl_Position), and then apply the projection matrix to the result, producing each output gl_Position.

Note the use of the GLSL call EmitVertex() that specifies when we have finished computing the output gl_Position and its associated vertex attributes and are ready to output a vertex. The EndPrimitive() call specifies that we have completed the definition of a set of vertices comprising a primitive (in this case, a triangle). The result is shown in Figure 13.1.

Figure 13.1 "Inflated" torus with vertices altered by geometry shader.

The geometry shader includes two layout qualifiers. The first specifies the input primitive type and must be compatible with the primitive type in the Java-side glDrawArrays() call. The options are:

geometry shader          compatible OpenGL primitives               # vertices
input primitive          sent from glDrawArrays()                   per invocation

points                   GL_POINTS                                  1
lines                    GL_LINES, GL_LINE_STRIP                    2
lines_adjacency          GL_LINES_ADJACENCY,                        4
                         GL_LINE_STRIP_ADJACENCY
triangles                GL_TRIANGLES, GL_TRIANGLE_STRIP,           3
                         GL_TRIANGLE_FAN
triangles_adjacency      GL_TRIANGLES_ADJACENCY,                    6
                         GL_TRIANGLE_STRIP_ADJACENCY

The various OpenGL primitive types (including "strip" and "fan" types) were described in Chapter 4. "Adjacency" types were introduced in OpenGL for use with geometry shaders, and allow access to vertices adjacent to the primitive. We don't use them in this book, but they are listed for completeness.

The output primitive type must be points, line_strip, or triangle_strip. Note that the output layout qualifier also specifies the maximum number of vertices the shader outputs in each invocation.

This particular alteration to the torus could have been done more easily in the vertex shader. However, suppose that instead of moving each vertex outward along its own surface normal, we wished instead to move each triangle outward along its surface normal, in effect "exploding" the torus triangles outward. The vertex shader cannot do that, because computing a normal for the triangle requires averaging the vertex normals of all three triangle vertices, and the vertex shader only has access to the vertex attributes of one vertex in the triangle at a time. We can, however, do this in the geometry shader, because the geometry shader does have access to all three vertices in each triangle. We average them to compute a surface normal for the triangle, then add that averaged normal to each of the vertices in the triangle primitive. Figure 13.2, Figure 13.3, and Figure 13.4 show the averaging of the surface normals, the modified geometry shader main() code, and the resulting output, respectively.
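The normal-averaging step can be sketched on the CPU with a small Java fragment (a hypothetical helper class, not the book's code):

```java
// Sketch (hypothetical helper): average the three vertex normals of a triangle
// to get a single per-face direction, as the "exploding" geometry shader does.
public class FaceNormal {
    static float[] averagedNormal(float[] n0, float[] n1, float[] n2) {
        float x = (n0[0] + n1[0] + n2[0]) / 3.0f;
        float y = (n0[1] + n1[1] + n2[1]) / 3.0f;
        float z = (n0[2] + n1[2] + n2[2]) / 3.0f;
        float len = (float) Math.sqrt(x*x + y*y + z*z);
        return new float[] { x/len, y/len, z/len };   // renormalize the average
    }

    public static void main(String[] args) {
        float[] n = averagedNormal(new float[]{1,0,0}, new float[]{0,1,0}, new float[]{0,0,1});
        System.out.printf("%.4f %.4f %.4f%n", n[0], n[1], n[2]);
    }
}
```

Renormalizing after averaging matters: the average of three unit vectors is generally shorter than unit length, and the geometry shader scales its offset by this direction.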

Figure 13.2 Applying averaged triangle surface normal to triangle vertices.

The appearance of the "exploded" torus can be improved by ensuring that the inside of the torus is also visible (normally those triangles are culled by OpenGL because they are "back-facing"). One way of doing this is to render the torus twice, once in the normal manner, and once with winding order reversed (reversing the winding order effectively switches which faces are front-facing and which are back-facing). We also send a flag to the shaders (in a uniform) to disable diffuse and specular lighting on the back-facing triangles, to make them less prominent. The changes to the code are as follows:


Figure 13.3 Modified geometry shader for "exploding" the torus.

Figure 13.4 "Exploded" torus.

changes to display() function:

...
// draw front-facing triangles - enable lighting
gl.glUniform1i(enable_lighting_location, 1);
gl.glFrontFace(GL_CCW);
gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);

// draw back-facing triangles - disable lighting
gl.glUniform1i(enable_lighting_location, 0);
gl.glFrontFace(GL_CW);
gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);

modification to fragment shader:

...
if (enableLighting == 1)
{  fragColor = ...   // when rendering front faces, use normal lighting computations
}
else   // when rendering back faces, enable only the ambient lighting component
{  fragColor = globalAmbient * material.ambient + light.ambient * material.ambient;
}

The resulting "exploded" torus, including back faces, is shown in Figure 13.5.

13.3 DELETING PRIMITIVES

A common use for geometry shaders is to build richly ornamental objects out of simple ones, by judiciously deleting some of the primitives. For example, removing some of the triangles from our torus can turn it into a sort of complex latticed structure that would be more difficult to model from scratch. A geometry shader that does this is shown in Program 13.2, and the output is shown in Figure 13.6.

Figure 13.5 "Exploded" torus including back faces.

Program 13.2 Geometry: Delete Primitives

// inputs, outputs, and uniforms as before
...
void main(void)
{  if (mod(gl_PrimitiveIDIn, 3) != 0)
   {  for (int i = 0; i < 3; i++)
      {  gl_Position = proj_matrix * gl_in[i].gl_Position;
         varyingNormalG = varyingNormal[i];
         varyingLightDirG = varyingLightDir[i];
         varyingHalfVectorG = varyingHalfVector[i];
         EmitVertex();
      }
   }
   EndPrimitive();
}


No other changes to the code are necessary. Note the use of the mod function: all vertices are passed through, except those in the first of every three primitives, which are ignored. Here too, rendering the back-facing triangles can improve realism, as shown in Figure 13.7.
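The effect of the mod test can be mirrored on the CPU (a hypothetical sketch, not the book's code), making it easy to see which primitive IDs survive:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch (hypothetical, mirroring Program 13.2): a primitive survives only
// when its ID is not a multiple of 3, as the shader's mod test decides.
public class PrimitiveFilter {
    static List<Integer> surviving(int primitiveCount) {
        List<Integer> kept = new ArrayList<>();
        for (int id = 0; id < primitiveCount; id++)
            if (id % 3 != 0) kept.add(id);   // IDs 0, 3, 6, ... are deleted
        return kept;
    }

    public static void main(String[] args) {
        System.out.println(surviving(7));    // [1, 2, 4, 5]
    }
}
```

Changing the modulus or the comparison produces different lattice densities, which is one easy way to experiment with the effect.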

Figure 13.6 Geometry shader: primitive deletion.

Figure 13.7 Primitive deletion showing back faces.

13.4 ADDING PRIMITIVES

Perhaps the most interesting and powerful use of geometry shaders is for adding additional vertices and/or primitives to a model being rendered. This makes it possible to do such things as increase the detail in an object to improve height mapping, or to change the shape of an object completely.

Consider the following example, where we change each triangle in the torus to a tiny pyramid.

Our strategy, similar to our previous "exploded" torus example, is illustrated in Figure 13.8. The vertices of an incoming triangle primitive are used to define the base of a pyramid. The walls of the pyramid are constructed of those vertices, and of a new point (called the "spike point") computed by averaging the normals of the original vertices:


Figure 13.8 Converting triangles to pyramids.

The geometry shader in Program 13.3 does this for each triangle primitive in the torus. For each incoming triangle, it outputs three triangle primitives, for a total of nine vertices.

Note the setAttribValues() function in Program 13.3. Its purpose is to assign appropriate values to the outgoing vertex attributes for each vertex in the primitive. In this example, it simply copies the incoming vertex attributes for the nth incoming vertex into each of the three vertices in the corresponding outgoing nth face of the constructed tiny pyramid. The particular attributes are for facilitating lighting, and the fact that the outgoing normal for each pyramid face has been set to an original torus normal represents a simplification; actual normals for the pyramid sides would be considerably different.

Program 13.3 Geometry: Add Primitives

...
// variables and uniforms as before, except:
layout (triangle_strip, max_vertices=9) out;
...
void setAttribValues(int n)
{  varyingNormalG = varyingNormal[n];
   varyingLightDirG = varyingLightDir[n];
   varyingVertPosG = varyingVertPos[n];
}

void main(void)
{  // find the endpoints of the surface normals (the 0.1 is a scale factor):
   vec4 sp0 = gl_in[0].gl_Position + vec4(varyingNormal[0], 1.0) * 0.1;
   vec4 sp1 = gl_in[1].gl_Position + vec4(varyingNormal[1], 1.0) * 0.1;
   vec4 sp2 = gl_in[2].gl_Position + vec4(varyingNormal[2], 1.0) * 0.1;

   // average the endpoints of the surface normals:
   float x = (sp0.x + sp1.x + sp2.x) / 3.0;
   float y = (sp0.y + sp1.y + sp2.y) / 3.0;
   float z = (sp0.z + sp1.z + sp2.z) / 3.0;

   // the averaged endpoint forms the additional pyramid vertex:
   vec4 spikePoint = vec4(x, y, z, 1.0);

   // triangle #1
   gl_Position = proj_matrix * gl_in[0].gl_Position;
   setAttribValues(0);  EmitVertex();
   gl_Position = proj_matrix * gl_in[1].gl_Position;
   setAttribValues(1);  EmitVertex();
   gl_Position = proj_matrix * spikePoint;
   setAttribValues(2);  EmitVertex();  EndPrimitive();

   // triangle #2
   gl_Position = proj_matrix * gl_in[2].gl_Position;
   setAttribValues(2);  EmitVertex();
   gl_Position = proj_matrix * gl_in[0].gl_Position;
   setAttribValues(0);  EmitVertex();
   gl_Position = proj_matrix * spikePoint;
   setAttribValues(1);  EmitVertex();  EndPrimitive();

   // triangle #3
   gl_Position = proj_matrix * gl_in[1].gl_Position;
   setAttribValues(1);  EmitVertex();
   gl_Position = proj_matrix * gl_in[2].gl_Position;
   setAttribValues(2);  EmitVertex();
   gl_Position = proj_matrix * spikePoint;
   setAttribValues(0);  EmitVertex();  EndPrimitive();
}
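The spike-point arithmetic at the top of the shader can be mirrored in plain Java (a hypothetical helper; here the normal is treated as a pure 3D direction, dropping the homogeneous component for simplicity):

```java
// Sketch (hypothetical, mirroring Program 13.3's spike-point math): offset each
// base vertex along its normal by `scale`, then average the three offset points.
public class SpikePoint {
    static float[] spikePoint(float[][] verts, float[][] normals, float scale) {
        float[] sp = new float[3];
        for (int v = 0; v < 3; v++)
            for (int c = 0; c < 3; c++)
                sp[c] += (verts[v][c] + normals[v][c] * scale) / 3.0f;
        return sp;
    }

    public static void main(String[] args) {
        float[][] tri = { {0,0,0}, {1,0,0}, {0,0,1} };   // triangle in the y=0 plane
        float[][] nrm = { {0,1,0}, {0,1,0}, {0,1,0} };   // all normals point up
        float[] sp = spikePoint(tri, nrm, 0.1f);
        // the spike point is the base centroid lifted by 0.1 along y
        System.out.printf("%.4f %.4f %.4f%n", sp[0], sp[1], sp[2]);
    }
}
```

For a flat triangle with identical normals the spike point is simply the centroid pushed out along that normal, which matches the intuition of Figure 13.8.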

The resulting output is shown in Figure 13.9. This rendering uses Phong shading with specular highlighting disabled, because we have not computed surface normals for the newly added primitives (specular highlights would appear noticeably unrealistic in the absence of sufficiently detailed normals). Of course, the result could be improved by calculating the proper normals for each of the newly created triangle primitives, and by incorporating shadow mapping (also not done here).

Figure 13.9 Geometry shader: primitive addition.

Careful application of this technique can enable the simulation of spikes, thorns, hair, and other fine surface protrusions, as well as the reverse, such as indentations and craters ([DV14], [TR13], and [KS16]).

SUPPLEMENTAL NOTES


One of the appeals of geometry shaders is that they are relatively easy to use. Although many applications for which geometry shaders are used could be achieved using tessellation, the mechanism of geometry shaders often makes them easier to implement and debug. Of course, the relative fit of geometry versus tessellation depends on the particular application.

Exercises

13.1 Modify Program 13.1 so that it moves each vertex slightly towards the center of its primitive triangle. The result should look similar to the exploded torus in Figure 13.5, but without the overall change in torus size.

13.2 Modify Program 13.2 so that it deletes one of the other vertices in each triangle primitive, and observe the effect on the resulting rendered torus.

13.3 (RESEARCH & PROJECT) Modify Program 13.3 so that it adds outward-facing line segments rather than triangles.

References

[DV14] J. de Vries, Learn OpenGL, 2014, accessed July 2016, http://www.learnopengl.com/.

[KS16] J. Kessenich, G. Sellers, and D. Shreiner, OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 4.5 with SPIR-V, 9th ed. (Addison-Wesley, 2016).

[TR13] P. Trettner, Prototype Grass (blog), 2013, accessed July 2016, https://upvoid.com/devblog/2013/02/prototype-grass/.


CHAPTER 14

OTHER TECHNIQUES

14.1 Fog
14.2 Compositing/Blending/Transparency
14.3 User-Defined Clipping Planes
14.4 3D Textures
14.5 Noise
14.6 Noise Application - Marble
14.7 Noise Application - Wood
14.8 Noise Application - Clouds
14.9 Noise Application - Special Effects

Supplemental Notes

In this chapter we explore a variety of techniques utilizing the tools we have learned throughout the book. Some we will develop fully, while for others we will offer a more cursory description. Graphics programming is a huge field, and this chapter is by no means comprehensive, but rather an introduction to just a few of the creative effects that have been developed over the years.

14.1 FOG

Usually when people think of fog, they think of early misty mornings with low visibility. In truth, atmospheric haze (such as fog) is more common than most of us think. The majority of the time, there is some degree of haze in the air, and although we have become accustomed to seeing it, we don't usually realize it is there. So we can enhance the realism in our outdoor scenes by introducing fog, even if only a small amount.


Fog also can enhance the sense of depth. When close objects have better clarity than distant objects, it is one more visual cue that our brains can use to decipher the topography of a 3D scene.

Figure 14.1 Fog: blending based on distance.

There are a variety of methods for simulating fog, from very simple ones to sophisticated models that include light scattering effects. However, even very simple approaches can be effective. One such method is to blend the actual pixel color with another color (the "fog" color, typically gray or bluish-gray, also used for the background color), based on the distance the object is from the eye.

Figure 14.1 illustrates the concept. The eye (camera) is shown at the left, and two red objects are placed in the view frustum. The cylinder is closer to the eye, so it is mostly its original color (red); the cube is further from the eye, so it is mostly fog color. For this simple implementation, virtually all of the computations can be performed in the fragment shader.

Program 14.1 shows the relevant code for a very simple fog algorithm that uses a linear blend from object color to fog color based on the distance from the camera to the pixel. Specifically, this example adds fog to the height mapping example from Program 10.4.

Program 14.1 Simple Fog Generation

Vertex (or Tessellation Evaluation) shader

...
out vec3 vertEyeSpacePos;
...
// compute vertex position in eye space, without perspective, and send it to the fragment shader.
// (the variable "p" is the height-mapped vertex, as described earlier in Program 10.4)
vertEyeSpacePos = (mv * p).xyz;

Fragment shader

...
in vec3 vertEyeSpacePos;
out vec4 fragColor;
...
void main(void)
{  vec4 fogColor = vec4(0.7, 0.8, 0.9, 1.0);   // bluish gray
   float fogStart = 0.0;
   float fogEnd = 1.5;

   // the distance from the camera to the vertex in eye space is simply the length of a
   // vector to that vertex, because the camera is at (0,0,0) in eye space.
   float dist = length(vertEyeSpacePos);
   float fogFactor = clamp((fogEnd - dist) / (fogEnd - fogStart), 0.0, 1.0);
   fragColor = mix(fogColor, texture(t, tc), fogFactor);
}

The variable fogColor specifies a color for the fog. The variables fogStart and fogEnd specify the range (in eye space) over which the output color transitions from object color to fog color, and can be tuned to meet the needs of the scene. The percentage of fog mixed with the object color is calculated in the variable fogFactor, which is the ratio of how close the vertex is to fogEnd to the total length of the transition region. The GLSL clamp() function is used to restrict this ratio to being between the values 0.0 and 1.0. The GLSL mix() function then returns a weighted average of fog color and object color, based on the value of fogFactor. Figure 14.2 shows the addition of fog to a scene with height-mapped terrain. (A rocky texture from [LU16] has also been applied.)
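To see the arithmetic concretely, here is a small standalone Java sketch (ours, not book code) that mirrors the shader's fog computation on the CPU. The helper names computeFogFactor() and mix() are illustrative; they imitate the GLSL clamp() and mix() calls:

```java
// CPU sketch of the linear fog blend used in Program 14.1 (illustrative helpers, not book code).
public class FogDemo {
    // ratio of remaining visibility, clamped to [0,1] like GLSL clamp()
    static double computeFogFactor(double dist, double fogStart, double fogEnd) {
        double f = (fogEnd - dist) / (fogEnd - fogStart);
        return Math.max(0.0, Math.min(1.0, f));
    }
    // GLSL-style mix: x*(1-a) + y*a, applied per channel
    static double[] mix(double[] x, double[] y, double a) {
        double[] r = new double[x.length];
        for (int i = 0; i < x.length; i++) r[i] = x[i] * (1.0 - a) + y[i] * a;
        return r;
    }
    public static void main(String[] args) {
        double[] fogColor = {0.7, 0.8, 0.9, 1.0};   // bluish gray
        double[] texColor = {1.0, 0.0, 0.0, 1.0};   // a red surface texel
        double fogFactor = computeFogFactor(0.75, 0.0, 1.5);  // halfway into the fog range
        double[] outColor = mix(fogColor, texColor, fogFactor);
        System.out.println(fogFactor);    // 0.5 : half object color, half fog
        System.out.println(outColor[0]);  // red channel is about 0.85 = 0.7*0.5 + 1.0*0.5
    }
}
```

With fogStart = 0.0 and fogEnd = 1.5 as in Program 14.1, a fragment at eye-space distance 0.75 lands exactly in the middle of the transition, so the output is an equal blend of fog color and texture color.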

Figure 14.2 Fog example.

14.2 COMPOSITING/BLENDING/TRANSPARENCY

We have already seen a few examples of blending—in the supplementary notes for Chapter 7, and just above in our implementation of fog. However, we haven't yet seen how to utilize the blending (or compositing) capabilities that follow after the fragment shader, during pixel operations (recall the pipeline sequence shown in Figure 2.2). It is there that transparency is handled, which we look at now.

Throughout this book we have made frequent use of the vec4 data type, to represent 3D points and vectors in a homogeneous coordinate system. You may have noticed that we also frequently use a vec4 to store color information, where the first three values consist of red, green, and blue, and the fourth element is—what?

The fourth element in a color is called the alpha channel, and specifies the opacity of the color. Opacity is a measure of how non-transparent the pixel color is. An alpha value of 0 means "no opacity," or completely transparent. An alpha value of 1 means "fully opaque," not at all transparent. In a sense, the "transparency" of a color is 1 - α, where α is the value of the alpha channel.

Recall from Chapter 2 that pixel operations utilize the Z-buffer, which achieves hidden surface removal by replacing an existing pixel color when another object's location at that pixel is found to be closer. We actually have more control over this process—we may choose to blend the two pixels.

When a pixel is being rendered, it is called the "source" pixel. The pixel already in the frame buffer (presumably rendered from a previous object) is called the "destination" pixel. OpenGL provides many options for deciding which of the two pixels, or what sort of combination of them, ultimately is placed in the frame buffer. Note that the pixel operations step is not a programmable stage—so the OpenGL tools for configuring the desired compositing are found in the Java/JOGL application, rather than in a shader.

The two OpenGL functions for controlling compositing are glBlendEquation(mode) and glBlendFunc(srcFactor, destFactor). Figure 14.3 shows an overview of the compositing process.

Figure 14.3 OpenGL compositing overview.

The compositing process works as follows:

1. The source and destination pixels are multiplied by the source factor and destination factor, respectively. The source and destination factors are specified in the glBlendFunc() function call.

2. The specified blend equation is then used to combine the modified source and destination pixels to produce a new destination color. The blend equation is specified in the glBlendEquation() call.
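These two steps can be modeled in plain Java (a CPU sketch of ours, not JOGL) to see how a factor/equation pair combines two RGBA pixels. Here the default factors, GL_ONE for the source and GL_ZERO for the destination, combined with GL_FUNC_ADD, simply reproduce the source pixel:

```java
// CPU model of the two fixed-function compositing steps: scale the source and
// destination by their factor vectors, then combine them with the blend equation
// (GL_FUNC_ADD shown here). Colors and factors are RGBA stored as double[4].
// All names here are illustrative, not JOGL API calls.
public class CompositeDemo {
    // step 1: multiply a pixel by its factor, component-wise
    static double[] scale(double[] c, double[] f) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) r[i] = c[i] * f[i];
        return r;
    }
    // step 2: the GL_FUNC_ADD blend equation
    static double[] funcAdd(double[] s, double[] d) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) r[i] = s[i] + d[i];
        return r;
    }
    public static void main(String[] args) {
        double[] src = {1.0, 0.0, 0.0, 0.75};   // incoming (source) pixel
        double[] dst = {0.0, 1.0, 0.0, 1.0};    // pixel already in the framebuffer
        double[] GL_ONE  = {1, 1, 1, 1};        // default srcFactor
        double[] GL_ZERO = {0, 0, 0, 0};        // default dstFactor
        // with the defaults, the source pixel simply replaces the destination
        double[] result = funcAdd(scale(src, GL_ONE), scale(dst, GL_ZERO));
        System.out.println(java.util.Arrays.toString(result));  // [1.0, 0.0, 0.0, 0.75]
    }
}
```

Substituting other factor vectors from the tables in this section into scale() models the corresponding glBlendFunc() settings.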

The most common options for glBlendFunc() parameters (i.e., srcFactor and destFactor) are:

glBlendFunc() parameter          resulting srcFactor or destFactor

GL_ZERO                          (0, 0, 0, 0)
GL_ONE                           (1, 1, 1, 1)
GL_SRC_COLOR                     (Rsrc, Gsrc, Bsrc, Asrc)
GL_ONE_MINUS_SRC_COLOR           (1, 1, 1, 1) - (Rsrc, Gsrc, Bsrc, Asrc)
GL_DST_COLOR                     (Rdest, Gdest, Bdest, Adest)
GL_ONE_MINUS_DST_COLOR           (1, 1, 1, 1) - (Rdest, Gdest, Bdest, Adest)
GL_SRC_ALPHA                     (Asrc, Asrc, Asrc, Asrc)
GL_ONE_MINUS_SRC_ALPHA           (1, 1, 1, 1) - (Asrc, Asrc, Asrc, Asrc)
GL_DST_ALPHA                     (Adest, Adest, Adest, Adest)
GL_ONE_MINUS_DST_ALPHA           (1, 1, 1, 1) - (Adest, Adest, Adest, Adest)
GL_CONSTANT_COLOR                (RblendColor, GblendColor, BblendColor, AblendColor)
GL_ONE_MINUS_CONSTANT_COLOR      (1, 1, 1, 1) - (RblendColor, GblendColor, BblendColor, AblendColor)
GL_CONSTANT_ALPHA                (AblendColor, AblendColor, AblendColor, AblendColor)
GL_ONE_MINUS_CONSTANT_ALPHA      (1, 1, 1, 1) - (AblendColor, AblendColor, AblendColor, AblendColor)
GL_ALPHA_SATURATE                (f, f, f, 1), where f = min(Asrc, 1)

Those options that indicate a "blendColor" (GL_CONSTANT_COLOR, etc.) require an additional call to glBlendColor() to specify a constant color that will be used to compute the blend function result. There are a few additional blend functions that aren't listed above, not described here.

The possible options for the glBlendEquation() parameter (i.e., mode) are:

GL_FUNC_ADD                  result = source RGBA + destination RGBA
GL_FUNC_SUBTRACT             result = source RGBA - destination RGBA
GL_FUNC_REVERSE_SUBTRACT     result = destination RGBA - source RGBA
GL_MIN                       result = min(source RGBA, destination RGBA)
GL_MAX                       result = max(source RGBA, destination RGBA)

The glBlendFunc() defaults are GL_ONE (1.0) for srcFactor, and GL_ZERO (0.0) for destFactor. The default for glBlendEquation() is GL_FUNC_ADD. Thus, by default, the source pixel is untouched (multiplied by one), the destination pixel is scaled to zero, and the two are added—meaning that the source pixel becomes the frame buffer color.

There are also the commands glEnable(GL_BLEND) and glDisable(GL_BLEND), which can be used to tell OpenGL to apply the specified blending, or to ignore it.

We won't illustrate the effects of all of the options here, but will walk through some illustrative examples.

Suppose we specify the following settings in the Java/JOGL application:

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);

Compositing would proceed as follows:

1. The source pixel is scaled by its alpha value.
2. The destination pixel is scaled by 1 - srcAlpha (the source transparency).
3. The pixel values are added together.

For example, if the source pixel is red, with 75% opacity: [1, 0, 0, 0.75], and the destination pixel contains completely opaque green: [0, 1, 0, 1], then the result placed in the frame buffer would be:

srcPixel * srcAlpha = [0.75, 0, 0, 0.5625]
destPixel * (1 - srcAlpha) = [0, 0.25, 0, 0.25]
resulting pixel = [0.75, 0.25, 0, 0.8125]

That is, mostly red, with some green, and mostly solid. The overall effect of the settings is to let the destination show through by an amount corresponding to the source pixel's transparency. In this example, the pixel in the frame buffer is green, and the incoming pixel is red with 25% transparency (75% opacity). So some green is allowed to show through the red.
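These numbers can be verified with a few lines of Java (an illustrative CPU check of ours, not book code) implementing src * alpha + dst * (1 - alpha) per channel:

```java
// Verifying the worked example: GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA with GL_FUNC_ADD.
// blend() is an illustrative helper, not a JOGL call.
public class AlphaBlendCheck {
    static double[] blend(double[] src, double[] dst) {
        double a = src[3];                 // source alpha
        double[] out = new double[4];
        for (int i = 0; i < 4; i++)
            out[i] = src[i] * a + dst[i] * (1.0 - a);   // add the two scaled pixels
        return out;
    }
    public static void main(String[] args) {
        double[] src = {1.0, 0.0, 0.0, 0.75};   // red, 75% opaque
        double[] dst = {0.0, 1.0, 0.0, 1.0};    // opaque green
        double[] result = blend(src, dst);
        System.out.println(java.util.Arrays.toString(result));  // [0.75, 0.25, 0.0, 0.8125]
    }
}
```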


It turns out that these settings for blend function and blend equation work well in many cases. Let's apply them to a practical example in a scene containing two 3D models: a torus, and a pyramid in front of the torus. Figure 14.4 shows such a scene, on the left with an opaque pyramid, and on the right with the pyramid's alpha value set to 0.8. Lighting has been added.

For many applications—such as creating a flat "window" as part of a model of a house—this simple implementation of transparency may be sufficient. However, in the example shown in Figure 14.4, there is a fairly obvious inadequacy. Although the pyramid model is now effectively transparent, an actual transparent pyramid should reveal not only the objects behind it, but also its own back surfaces.

Actually, the reason that the back faces of the pyramid did not appear is because we enabled back-face culling. A reasonable idea might be to disable back-face culling while drawing the pyramid. However, this often produces other artifacts, as shown in Figure 14.5 (on the left). The problem with simply disabling back-face culling is that the effects of blending depend on the order that surfaces are rendered (because that determines the source and destination pixels), and we don't always have control over the rendering order. It is generally advantageous to render opaque objects first, as well as objects that are in the back (such as the torus) before any transparent objects. This also holds true for the surfaces of the pyramid, and in this case the reason that the two triangles comprising the base of the pyramid appear different is that one of them was rendered before the front of the pyramid, and one was rendered after. Artifacts such as this are sometimes called "ordering" artifacts, and can manifest in transparent models because we cannot always predict the order that their triangles will be rendered.

Figure 14.4 Pyramid with alpha = 1.0 (left), and alpha = 0.8 (right).

We can solve the problem in our pyramid example by rendering the front and back faces separately, ourselves, starting with the back faces. Program 14.2 shows the code for doing this. We specify the alpha value for the pyramid by passing it to the shader program in a uniform variable, then apply it in the fragment shader by substituting the specified alpha into the computed output color.

Note also that for lighting to work properly, we must flip the normal vector when rendering the back faces. We accomplish this by sending a flag to the vertex shader, where we then flip the normal vector.

Program 14.2 Two-Pass Blending for Transparency

JOGL/Java application - in display(), for rendering the pyramid:

...
gl.glEnable(GL_CULL_FACE);
...
gl.glEnable(GL_BLEND);                                    // configure blend settings
gl.glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
gl.glBlendEquation(GL_FUNC_ADD);

gl.glCullFace(GL_FRONT);                                  // render pyramid back faces first
gl.glProgramUniform1f(rendering_program2, alpha_location, 0.3f);   // back faces set to very transparent
gl.glProgramUniform1f(rendering_program2, flip_location, -1.0f);   // flip normals on back faces
gl.glDrawArrays(GL_TRIANGLES, 0, pyramid.getNumVertices());

gl.glCullFace(GL_BACK);                                   // then render pyramid front faces
gl.glProgramUniform1f(rendering_program2, alpha_location, 0.8f);   // front faces are slightly transparent
gl.glProgramUniform1f(rendering_program2, flip_location, 1.0f);    // don't flip normals on front faces
gl.glDrawArrays(GL_TRIANGLES, 0, pyramid.getNumVertices());

gl.glDisable(GL_BLEND);

Vertex shader:

...
if (flipNormal < 0) varyingNormal = -varyingNormal;
...

Fragment shader:

...
fragColor = globalAmbient * material.ambient + ... etc.   // same as for Blinn-Phong lighting
fragColor = vec4(fragColor.xyz, alpha);                   // replace alpha value with the one sent in the uniform variable

The result of this "two-pass" solution is shown in Figure 14.5, on the right.

Although it works well here, the two-pass solution shown in Program 14.2 is not always adequate. For example, some more complex models may have hidden surfaces that are front-facing, and if such an object were made transparent, our algorithm would fail to render those hidden front-facing portions of the model. Alec Jacobson describes a 5-pass sequence that works in a large number of cases [JA12].


Figure 14.5 Transparency and back faces: ordering artifacts (left) and two-pass correction (right).

14.3 USER-DEFINED CLIPPING PLANES

OpenGL includes the capability to specify clipping planes beyond those defined by the view frustum. One use for a user-defined clipping plane is to slice a model. This makes it possible to create complex shapes by starting with a simple model, and slicing sections off of it.

A clipping plane is defined according to the standard mathematical definition of a plane:

ax + by + cz + d = 0

where a, b, c, and d are parameters defining a particular plane in 3D space with X, Y, and Z axes. The parameters represent a vector (a, b, c) normal to the plane, and a distance d from the origin to the plane. Such a plane can be specified in the vertex shader using a vec4, as follows:

vec4 clip_plane = vec4(0.0, 0.0, -1.0, 0.2);

This would correspond to the plane: (0.0)x + (0.0)y + (-1.0)z + 0.2 = 0

The clipping can then be achieved, also in the vertex shader, by using the built-in GLSL variable gl_ClipDistance[], as in the following example:

gl_ClipDistance[0] = dot(clip_plane.xyz, vertPos) + clip_plane.w;

In this example, vertPos refers to the vertex position coming into the vertex shader in a vertex attribute (such as in a VBO), and clip_plane was defined above. We then compute the signed distance from the clipping plane to the incoming vertex (shown in Chapter 3), which is either zero if the vertex is on the plane, or is negative or positive depending on which side of the plane the vertex lies. The subscript on the gl_ClipDistance array enables multiple clipping distances (i.e., multiple planes) to be defined. The maximum number of user clipping planes that can be defined depends on the graphics card's OpenGL implementation.
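The same signed-distance computation can be checked on the CPU. The following Java sketch (ours, with an illustrative helper name) evaluates dot(n, p) + d for the plane used above; a result of zero means the point lies on the plane, and the sign tells which side it is on (negative values are the clipped side):

```java
// Signed distance from a plane (a, b, c, d) to a point, as assigned to gl_ClipDistance.
// clipDistance() is an illustrative helper, not a GLSL or JOGL function.
public class ClipPlaneDemo {
    // dot(normal, p) + d : zero on the plane, sign indicates the side
    static double clipDistance(double[] plane, double[] p) {
        return plane[0] * p[0] + plane[1] * p[1] + plane[2] * p[2] + plane[3];
    }
    public static void main(String[] args) {
        double[] clipPlane = {0.0, 0.0, -1.0, 0.2};   // the plane -z + 0.2 = 0
        System.out.println(clipDistance(clipPlane, new double[]{0, 0, 0.2}));   // 0.0 : on the plane
        System.out.println(clipDistance(clipPlane, new double[]{0, 0, 1.0}));   // -0.8 : this side is clipped
    }
}
```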


User-defined clipping must then be enabled in the Java/JOGL application. There are built-in OpenGL identifiers GL_CLIP_DISTANCE0, GL_CLIP_DISTANCE1, and so on, corresponding to each gl_ClipDistance[] array element. The 0th user-defined clipping plane can be enabled, for example, as follows:

gl.glEnable(GL_CLIP_DISTANCE0);

Figure 14.6 Clipping a torus.

Figure 14.7 Clipping with back faces.

Applying the above steps to our lighted torus results in the output shown in Figure 14.6, in which the front half of the torus has been clipped. (A rotation has also been applied to provide a clearer view.)

It may appear that the bottom portion of the torus has also been clipped, but that is because the inside faces of the torus were not rendered. When clipping reveals the inside surfaces of a shape, it is necessary to render them as well, or the model will appear incomplete (as it does in Figure 14.6).

Rendering the inner surfaces requires making a second call to glDrawArrays(), with the winding order reversed. Additionally, it is necessary to reverse the surface normal vector when rendering the back-facing triangles (as was done in the previous section). The relevant modifications to the Java application and the vertex shader are shown in Program 14.3, with the output shown in Figure 14.7.


Program 14.3 Clipping with Back Faces

Java/JOGL application:

public void display(GLAutoDrawable drawable)
{  ...
   int flipLocation = gl.glGetUniformLocation(rendering_program, "flipNormal");
   ...
   gl.glEnable(GL_CLIP_DISTANCE0);

   // normal drawing of external faces
   gl.glUniform1i(flipLocation, 0);
   gl.glFrontFace(GL_CCW);
   gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);

   // rendering of back faces with normals reversed
   gl.glUniform1i(flipLocation, 1);
   gl.glFrontFace(GL_CW);
   gl.glDrawArrays(GL_TRIANGLES, 0, numTorusVertices);
}

Vertex shader:

...
vec4 clip_plane = vec4(0.0, 0.0, -1.0, 0.1);
uniform int flipNormal;    // flag for inverting the normal
...
void main(void)
{  ...
   if (flipNormal == 1) varyingNormal = -varyingNormal;
   ...
   gl_ClipDistance[0] = dot(clip_plane.xyz, vertPos) + clip_plane.w;
   ...
}

14.4 3D TEXTURES

Whereas 2D textures contain image data indexed by two variables, 3D textures contain the same type of image data, but in a 3D structure that is indexed by three variables. The first two dimensions still represent width and height in the texture map; the third dimension represents depth.

Because the data in a 3D texture is stored in a similar manner as for 2D textures, it is tempting to think of a 3D texture as a sort of 3D "image." However, we generally don't refer to 3D texture source data as a 3D image, because there are no commonly used image file formats for this sort of structure (i.e., there is nothing akin to a 3D JPEG, at least not one that is truly 3-dimensional). Instead, we suggest thinking of a 3D texture as a sort of substance into which we will submerge (or "dip") the object being textured, resulting in the object's surface points obtaining their colors from the corresponding locations in the texture. Alternatively, it can be useful to imagine that the object is being "carved" out of the 3D texture "cube," much like a sculptor carves a figure out of a single solid block of marble.

OpenGL has support for 3D texture objects. In order to use them, we need to learn how to build the 3D texture, and how to use it to texture an object.

Unlike 2D textures, which can be built from standard image files, 3D textures are usually generated procedurally. As was done previously for 2D textures, we decide on a resolution—that is, the number of texels in each dimension. Depending on the colors in the texture, we may build a 3-dimensional array containing those colors. Alternatively, if the texture holds a "pattern" that could be utilized with various colors, we might instead build an array that holds the pattern, such as with 0s and 1s.

For example, we can build a 3D texture that represents horizontal stripes by filling an array with 0s and 1s corresponding to the desired stripe pattern. Suppose that the desired resolution of the texture is 200×200×200 texels, and the texture is comprised of alternating stripes that are each 10 texels high. A simple function that builds such a structure by filling an array with appropriate 0s and 1s in a nested loop (assuming in this case that the width, height, and depth variables are each set to 200), would be:

void generate3Dpattern()
{  for (int x = 0; x < texWidth; x++)
   {  for (int y = 0; y < texHeight; y++)
      {  for (int z = 0; z < texDepth; z++)
         {  if ((y / 10) % 2 == 0)
               tex3Dpattern[x][y][z] = 0.0;
            else
               tex3Dpattern[x][y][z] = 1.0;
}  }  }  }

The pattern stored in the tex3Dpattern array is illustrated in Figure 14.8, with the 0s rendered in blue, and the 1s rendered in yellow.

Texturing an object with the above striped pattern requires the following steps:

1. Generating the pattern as already shown
2. Using the pattern to fill a byte array of desired colors
3. Loading the byte array into a texture object
4. Deciding on appropriate 3D texture coordinates for the object vertices
5. Texturing the object in the fragment shader using an appropriate sampler

Texture coordinates for 3D textures range from 0 to 1, in the same manner as for 2D textures.

Interestingly, step #4 (determining 3D texture coordinates) is usually a lot simpler than one might initially suspect. In fact, it is usually simpler than for 2D textures! This is because, in the case of 2D textures, since a 3D object was being textured with a 2D image, we needed to decide how to "flatten" the 3D object's vertices (such as by UV-mapping) to create texture coordinates. But when 3D texturing, both the object and the texture are of the same dimensionality (three). In most cases, we want the object to reflect the texture pattern, as if it were "carved" out of it (or dipped into it). So the vertex locations themselves serve as the texture coordinates! Usually all that is necessary is to apply some simple scaling to ensure that the object's vertex location coordinates map to the 3D texture coordinate range (0…1).
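The scaling step can be sketched in Java (an illustrative helper of ours, not book code); it is the same originalPosition / 2.0 + 0.5 mapping that the fragment shader in Program 14.4 applies, taking model coordinates in (-1…+1) to texture coordinates in (0…1):

```java
// Mapping model-space vertex coordinates in (-1…+1) to 3D texture coordinates in (0…1).
// toTexCoord() is an illustrative helper mirroring the shader expression v/2.0 + 0.5.
public class TexCoordDemo {
    static double[] toTexCoord(double[] v) {
        return new double[] { v[0] / 2.0 + 0.5, v[1] / 2.0 + 0.5, v[2] / 2.0 + 0.5 };
    }
    public static void main(String[] args) {
        // a corner of the model maps to a corner of the texture cube
        System.out.println(java.util.Arrays.toString(toTexCoord(new double[]{-1, -1, -1})));  // [0.0, 0.0, 0.0]
        // interior points map proportionally
        System.out.println(java.util.Arrays.toString(toTexCoord(new double[]{1, 0, 0.5})));   // [1.0, 0.5, 0.75]
    }
}
```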

Figure 14.8 Striped 3D texture pattern.

Since we are generating the 3D texture procedurally, we need a way of constructing an OpenGL texture map out of generated data. The process for loading data into a texture is similar to what we saw earlier in Section 5.12. In this case, we fill a 3D array with color values, then copy them into a texture object.

Program 14.4 shows the various components for achieving all of the steps listed above, in order to texture an object with blue and yellow horizontal stripes from a procedurally built 3D texture. The desired pattern is built in the generate3Dpattern() function, which stores the pattern in an array named "tex3Dpattern". The "image" data is then built in the function fillDataArray(), which fills a 3D array with byte data corresponding to the RGBA color components R, G, B, and A, each in the range (0…255), according to the pattern. Those values are then copied into a texture object in the load3DTexture() function.

Program 14.4 3D Texturing, Striped Pattern

Java/JOGL application:

...
private int texHeight = 200;
private int texWidth = 200;
private int texDepth = 200;
private double[][][] tex3Dpattern = new double[texWidth][texHeight][texDepth];

...

// fill a byte array with RGBA blue/yellow values corresponding to the pattern built by generate3Dpattern()
private void fillDataArray(byte data[])
{  for (int i = 0; i < texWidth; i++)
   {  for (int j = 0; j < texHeight; j++)
      {  for (int k = 0; k < texDepth; k++)
         {  if (tex3Dpattern[i][j][k] == 1.0)
            {  // yellow color
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 0] = (byte) 255;   // red
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 1] = (byte) 255;   // green
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 2] = (byte) 0;     // blue
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 3] = (byte) 255;   // alpha
            }
            else
            {  // blue color
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 0] = (byte) 0;     // red
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 1] = (byte) 0;     // green
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 2] = (byte) 255;   // blue
               data[i*(texWidth*texHeight*4) + j*(texHeight*4) + k*4 + 3] = (byte) 255;   // alpha
}  }  }  }  }

// build 3D pattern of stripes
void generate3Dpattern()
{  for (int x = 0; x < texWidth; x++)
   {  for (int y = 0; y < texHeight; y++)
      {  for (int z = 0; z < texDepth; z++)
         {  if ((y / 10) % 2 == 0)
               tex3Dpattern[x][y][z] = 0.0;
            else
               tex3Dpattern[x][y][z] = 1.0;
}  }  }  }

// load the sequential byte data array into a texture object
private int load3DTexture()
{  GL4 gl = (GL4) GLContext.getCurrentGL();
   byte[] data = new byte[texWidth * texHeight * texDepth * 4];
   fillDataArray(data);

   ByteBuffer bb = Buffers.newDirectByteBuffer(data);

   int[] textureIDs = new int[1];
   gl.glGenTextures(1, textureIDs, 0);
   int textureID = textureIDs[0];

   gl.glBindTexture(GL_TEXTURE_3D, textureID);
   gl.glTexStorage3D(GL_TEXTURE_3D, 1, GL_RGBA8, texWidth, texHeight, texDepth);
   gl.glTexSubImage3D(GL_TEXTURE_3D, 0, 0, 0, 0,
                      texWidth, texHeight, texDepth, GL_RGBA,
                      GL_UNSIGNED_INT_8_8_8_8_REV, bb);
   gl.glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   return textureID;
}

public void init(GLAutoDrawable drawable)
{  ...
   generate3Dpattern();   // the 3D pattern and texture is only loaded once, so it is done from init()
   textureID = load3DTexture();
}

public void display(GLAutoDrawable drawable)
{  ...
   gl.glActiveTexture(GL_TEXTURE0);
   gl.glBindTexture(GL_TEXTURE_3D, textureID);
   gl.glDrawArrays(GL_TRIANGLES, 0, numObjVertices);
}

Vertex shader:

...
out vec3 originalPosition;   // the original model vertices will be used for texture coordinates
...
layout (binding = 0) uniform sampler3D s;

void main(void)
{  originalPosition = position;   // pass original model coordinates for use as 3D texture coordinates
   gl_Position = proj_matrix * mv_matrix * vec4(position, 1.0);
}

Fragment shader:

...
in vec3 originalPosition;   // receive original model coordinates for use as 3D texture coordinates
out vec4 fragColor;
...
layout (binding = 0) uniform sampler3D s;

void main(void)
{  fragColor = texture(s, originalPosition / 2.0 + 0.5);   // vertices are (-1…+1), tex coords are (0…1)
}

In the Java/JOGL application, the load3DTexture() function is similar to the Java AWT loadTexture() function shown earlier in Program 5.2. As before, it expects the image data to be formatted as a sequence of bytes corresponding to RGBA color components. The function fillDataArray() does this, applying the RGBA values for yellow and blue corresponding to the striped pattern built by the generate3Dpattern() function and held in the tex3Dpattern array. Note also the specification of texture type GL_TEXTURE_3D in the display() function.

Since we wish to use the object's vertex locations as texture coordinates, we pass them through from the vertex shader to the fragment shader. The fragment shader then scales them so that they are mapped into the range (0, 1), as is standard for texture coordinates. Finally, 3D textures are accessed via a sampler3D uniform, which takes three parameters instead of two. We use the vertex's original X, Y, and Z coordinates, scaled to the correct range, to access the texture. The result is shown in Figure 14.9.

Figure 14.9 Dragon object with 3D striped texture.

More complex patterns can be generated by modifying generate3Dpattern(). Figure 14.10 shows a simple change that converts the striped pattern to a 3D checkerboard. The resulting effect is then shown in Figure 14.11. It is worth noting that the effect is very different than would be the case if the dragon's surface had been textured with a 2D checkerboard texture pattern. (See Exercise 14.2.)

Figure 14.10 Generating a checkerboard 3D texture pattern.

14.5 NOISE

Many natural phenomena can be simulated using randomness, or noise. One common technique, Perlin Noise, is named after Ken Perlin, who in 1997 received an Academy Award for developing a practical way to generate and use 2D and 3D noise. The method described here is based on Perlin's.

Figure 14.11 Dragon with 3D checkerboard texture.

There are many applications of noise in graphics scenes. A few common examples are clouds, terrain, wood grain, minerals (such as veins in marble), smoke, fire, flames, planetary surfaces, and random movements. In this section, we focus on generating 3D textures containing noise, and then subsequent sections illustrate using the noise data to generate complex materials such as marble and wood, and to simulate animated cloud textures for use with a cubemap or skydome. Spatial data (e.g., 2D or 3D) that contains noise is sometimes called a noise map.

We start by constructing a 3D texture map out of random data. This can be done using the functions shown in the previous section, with a few modifications. First, we replace the generate3Dpattern() function from Program 14.4 with the following simpler generateNoise() function:

double[][][] noise = new double[noiseWidth][noiseHeight][noiseDepth];
Random random = new Random();
...
void generateNoise()
{  for (int x = 0; x < noiseWidth; x++)
   {  for (int y = 0; y < noiseHeight; y++)
      {  for (int z = 0; z < noiseDepth; z++)
         {  noise[x][y][z] = random.nextDouble();   // returns a double in the range (0…1)
}  }  }  }

Next, the fillDataArray() function from Program 14.4 is modified so that it copies the noise data into the byte array in preparation for loading into a texture object, as follows:

private void fillDataArray(byte data[])
{  for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) (noise[i][j][k] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) (noise[i][j][k] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) (noise[i][j][k] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

The rest of Program 14.4 for loading data into a texture object and applying it to a model is unchanged. We can view this 3D noise map by applying it to our simple cube model, as shown in Figure 14.12. In this example, noiseHeight = noiseWidth = noiseDepth = 256.

This is a 3D noise map, although it isn't a very useful one. As is, it is just too noisy to have very many practical applications. To make more practical, tunable noise patterns, we will replace the fillDataArray() function with different noise-producing procedures.

Figure 14.12 Cube textured with 3D noise data.

Suppose that we fill the data array by "zooming in" to a small subsection of the noise map illustrated above in Figure 14.12, using indexes made smaller by integer division. The modification to the fillDataArray() function is shown below. The resulting 3D texture can be made more or less "blocky" depending on the "zooming" factor used to divide the index. In Figure 14.13, the textures show the result of zooming in by dividing the indices by zoom factors 8, 16, and 32 (left to right, respectively).

private void fillDataArray(byte data[])
{  int zm = 8;   // zoom factor
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) (noise[i/zm][j/zm][k/zm] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) (noise[i/zm][j/zm][k/zm] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) (noise[i/zm][j/zm][k/zm] * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

Figure 14.13 "Blocky" 3D noise maps with various "zooming in" factors.

The "blockiness" within a given noise map can be smoothed by interpolating from each discrete grayscale color value to the next one. That is, for each small "block" within a given 3D texture, we set each texel color within the block by interpolating from its color to its neighboring blocks' colors. The interpolation code is shown below in the function smoothNoise(), along with the modified fillDataArray() function. The resulting "smoothed" textures (for zooming factors 2, 4, 8, 16, 32, and 64—left to right, top to bottom) then follow in Figure 14.14. Note that the zoom factor is now a double, because we need the fractional component to determine the interpolated grayscale values for each texel.

private void fillDataArray(byte data[])
{  double zm = 32.0;   // zoom factor
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) (smoothNoise(i/zm, j/zm, k/zm) * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) (smoothNoise(i/zm, j/zm, k/zm) * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) (smoothNoise(i/zm, j/zm, k/zm) * 255);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

double smoothNoise(double x1, double y1, double z1)
{  // fraction of x1, y1, z1 (% of distance from current block to next block, for this texel)
   double fractX = x1 - (int) x1;
   double fractY = y1 - (int) y1;
   double fractZ = z1 - (int) z1;

   // the indices for neighboring pixels in the X, Y, and Z directions
   int x2 = ((int) x1 + noiseWidth - 1) % noiseWidth;
   int y2 = ((int) y1 + noiseHeight - 1) % noiseHeight;
   int z2 = ((int) z1 + noiseDepth - 1) % noiseDepth;

   // smooth the noise by interpolating the grayscale intensity along all three axes
   double value = 0.0;
   value += fractX     * fractY     * fractZ     * noise[(int)x1][(int)y1][(int)z1];
   value += fractX     * (1-fractY) * fractZ     * noise[(int)x1][(int)y2][(int)z1];
   value += (1-fractX) * fractY     * fractZ     * noise[(int)x2][(int)y1][(int)z1];
   value += (1-fractX) * (1-fractY) * fractZ     * noise[(int)x2][(int)y2][(int)z1];
   value += fractX     * fractY     * (1-fractZ) * noise[(int)x1][(int)y1][(int)z2];
   value += fractX     * (1-fractY) * (1-fractZ) * noise[(int)x1][(int)y2][(int)z2];
   value += (1-fractX) * fractY     * (1-fractZ) * noise[(int)x2][(int)y1][(int)z2];
   value += (1-fractX) * (1-fractY) * (1-fractZ) * noise[(int)x2][(int)y2][(int)z2];
   return value;
}

Figure 14.14 Smoothing of 3D textures, at various zooming levels.

The smoothNoise() function computes a grayscale value for each texel in the smoothed version of a given noise map by computing a weighted average of the eight grayscale values surrounding the texel in the corresponding original "blocky" noise map. That is, it averages the color values at the eight vertices of the small "block" the texel is in. The weights for each of these "neighbor" colors are based on the texel's distance to each of its neighbors, normalized to the range (0…1).
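The eight weights used by smoothNoise() are trilinear interpolation weights; for any fractional position they sum to 1, so the smoothed value is a true weighted average that never leaves the range of the surrounding corner values. A quick Java check (ours, with an illustrative helper name):

```java
// The eight corner weights used by smoothNoise() are trilinear interpolation weights.
// weights() is an illustrative helper listing them in the same order as the book's code.
public class TrilinearWeights {
    static double[] weights(double fx, double fy, double fz) {
        return new double[] {
            fx     * fy     * fz,
            fx     * (1-fy) * fz,
            (1-fx) * fy     * fz,
            (1-fx) * (1-fy) * fz,
            fx     * fy     * (1-fz),
            fx     * (1-fy) * (1-fz),
            (1-fx) * fy     * (1-fz),
            (1-fx) * (1-fy) * (1-fz)
        };
    }
    public static void main(String[] args) {
        double sum = 0.0;
        for (double w : weights(0.25, 0.5, 0.75)) sum += w;
        System.out.println(sum);   // 1.0 : the weighted average stays within the corner values
    }
}
```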

Next, smoothed noise maps of various zooming factors are combined. A new noise map is created in which each of its texels is formed by another weighted average, this time based on the sum of the texels at the same location in each of the "smoothed" noise maps, with the zoom factor serving as the weight. The effect has been dubbed "turbulence," although it is really more closely related to the harmonics produced by summing various waveforms. A new turbulence() function, and a modified version of fillDataArray() that specifies a noise map that sums zoom levels 1 through 32 (the ones that are powers of two), is shown below, followed by an image of a cube textured with the resulting noise map.

private double turbulence(double x, double y, double z, double maxZoom)
{  double sum = 0.0, zoom = maxZoom;
   while (zoom >= 1.0)             // the last pass is when zoom = 1
   {  // compute weighted sum of smoothed noise maps
      sum = sum + smoothNoise(x / zoom, y / zoom, z / zoom) * zoom;
      zoom = zoom / 2.0;           // for each zoom factor that is a power of two
   }
   sum = 128.0 * sum / maxZoom;    // guarantees RGB < 256 for maxZoom values up to 64
   return sum;
}

private void fillDataArray(byte data[])
{  double maxZoom = 32.0;
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] =
               (byte) turbulence(i, j, k, maxZoom);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] =
               (byte) turbulence(i, j, k, maxZoom);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] =
               (byte) turbulence(i, j, k, maxZoom);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] =
               (byte) 255;
}  }  }  }

3D noise maps, such as the one shown in Figure 14.15, can be used for a wide variety of imaginative applications. In the next sections, we will use them to generate marble, wood, and clouds. The distribution of the noise can be adjusted by various combinations of zoom-in levels.

Figure 14.15 3D texture map with combined "turbulence" noise.


14.6 NOISE APPLICATION – MARBLE

By modifying the noise map and adding Phong lighting with an appropriate ADS material as described previously in Figure 7.3, we can make the dragon model appear to be made of a marble-like stone.

We start by generating a striped pattern somewhat similar to the "stripes" example from earlier in this chapter. The new stripes differ from the previous ones, first because they are diagonal, and also because they are produced by a sine wave and therefore have blurry edges. We then use the noise map to perturb those lines, storing them as grayscale values. The changes to the fillDataArray() function are as follows:

private void fillDataArray(byte data[])
{  double veinFrequency = 2.0;
   double turbPower = 1.5;
   double maxZoom = 64.0;
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  // note the casts to double -- integer division would always produce 0 here
            double xyzValue = (double) i / noiseWidth + (double) j / noiseHeight
                            + (double) k / noiseDepth
                            + turbPower * turbulence(i, j, k, maxZoom) / 256.0;
            double sineValue = Math.abs(Math.sin(xyzValue * 3.14159 * veinFrequency));
            Color c = new Color((float) sineValue, (float) sineValue, (float) sineValue);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) c.getRed();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) c.getGreen();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) c.getBlue();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

The variable veinFrequency is used to adjust the number of stripes, maxZoom adjusts the zoom factor used when generating the turbulence, and turbPower adjusts the amount of perturbation in the stripes (setting it to zero leaves the stripes unperturbed). Since the same sine wave value is used for all three (RGB) color components, the final color stored in the image data array is grayscale. Figure 14.16 shows the resulting texture map for various values of turbPower (0.0, 0.5, 1.0, and 1.5, left to right).
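As a quick sanity check of the striping math, the unperturbed case (turbPower set to zero) can be computed in isolation. The following standalone sketch is not part of the program; the loop bounds and sampling stride are assumptions chosen just for the demonstration. It confirms that the diagonal sine-stripe value always lands in the 0..1 grayscale range:

```java
// Standalone sketch of the unperturbed marble stripe computation
// (turbPower = 0), using the same formula as fillDataArray() above.
public class MarbleStripeDemo {
    public static void main(String[] args) {
        int noiseWidth = 256, noiseHeight = 256, noiseDepth = 256;
        double veinFrequency = 2.0;
        boolean inRange = true;
        for (int i = 0; i < noiseWidth; i += 64)
            for (int j = 0; j < noiseHeight; j += 64)
                for (int k = 0; k < noiseDepth; k += 64) {
                    // diagonal position along all three axes, in 0..3
                    double xyzValue = (double) i / noiseWidth
                                    + (double) j / noiseHeight
                                    + (double) k / noiseDepth;
                    double sineValue =
                        Math.abs(Math.sin(xyzValue * 3.14159 * veinFrequency));
                    inRange &= (sineValue >= 0.0 && sineValue <= 1.0);
                }
        System.out.println(inRange);
    }
}
```

Because |sin| never exceeds 1, the stripes can be used directly as grayscale intensities; the turbulence term merely shifts where along the diagonal each texel samples the sine wave.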


Figure 14.16 Building 3D "marble" noise maps.

Since we expect marble to have a shiny appearance, we incorporate Phong shading to make a "marble" textured object look convincing. Program 14.5 summarizes the code for generating a marble dragon. The vertex and fragment shaders are the same as used for Phong shading, except that we also pass through the original vertex coordinates for use as 3D texture coordinates (as described earlier). The fragment shader builds a weighted sum to combine the noise result with the lighting result.

Program 14.5 Building a Marble Dragon

Java/JOGL application:
...
public void init(GLAutoDrawable drawable)
{  ...
   // material ADS settings for use in Phong shading
   float[] matAmbient  = new float[] { 0.5f, 0.5f, 0.5f, 1.0f };
   float[] matDiffuse  = new float[] { 0.5f, 0.5f, 0.5f, 1.0f };
   float[] matSpecular = new float[] { 1.0f, 0.5f, 0.5f, 1.0f };
   thisMaterial.setAmbient(matAmbient);
   thisMaterial.setDiffuse(matDiffuse);
   thisMaterial.setSpecular(matSpecular);
   thisMaterial.setShininess(50.0f);
}

private void fillDataArray(byte data[])
{  double veinFrequency = 2.0;
   double turbPower = 3.0;
   double maxZoom = 64.0;
   // remainder is as shown above for building the marble noise map
   ...
}

Vertex shader:  // unchanged from Program 14.4

Fragment shader:
...
void main(void)
{  ...
   fragColor =
      0.5 * (globalAmbient * material.ambient + light.ambient * material.ambient
             + light.diffuse * material.diffuse * max(cosTheta, 0.0)
             + light.specular * material.specular * pow(max(cosPhi, 0.0), material.shininess))
      + 0.5 * texture(s, originalPosition / 3.0 + 0.5);
      // model vertices are (-1.5 … +1.5), texture coordinates are (0 … 1)
}

By modifying the ADS material values specified in init(), we can simulate different marble or jade-like stones. Figure 14.17 shows two examples, the first using the ADS material values shown in Program 14.5, and the second utilizing the "jade" ADS colors shown previously in Figure 7.3.
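For instance, swapping in jade-like values is a one-for-one replacement of the arrays in init(). The specific numbers below are the widely circulated "jade" coefficients from the classic OpenGL material tables (the kind of values Figure 7.3 is based on); treat them as a reasonable starting point rather than the exact values from this program:

```java
// Hypothetical jade ADS material values (from the classic OpenGL material
// tables), as a drop-in replacement for the marble values in init().
public class JadeMaterialDemo {
    public static void main(String[] args) {
        float[] jadeAmbient  = { 0.135f,  0.2225f, 0.1575f, 1.0f };
        float[] jadeDiffuse  = { 0.54f,   0.89f,   0.63f,   1.0f };
        float[] jadeSpecular = { 0.3162f, 0.3162f, 0.3162f, 1.0f };
        float jadeShininess  = 12.8f;
        // the dominant diffuse channel is green, which gives the stone its tint
        System.out.println(jadeDiffuse[1] > jadeDiffuse[0]
                        && jadeDiffuse[1] > jadeDiffuse[2]);
    }
}
```

Note that the jade shininess is much lower than the 50.0 used for marble, producing a softer, waxier highlight.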

14.7 NOISE APPLICATION – WOOD

Creating a "wood" texture can be done in a similar way as was done in the previous "marble" example. Trees grow in rings, and it is these rings that produce the "grain" we see in objects made of wood. As trees grow, environmental stresses create variations in the rings, which we also see in the grain.

We start by building a procedural "rings" 3D texture map, similar to the "checkerboard" from earlier in this chapter. We then use the noise map to perturb those rings, inserting dark and light brown colors into the ring texture map. By adjusting the number of rings, and the degree to which we perturb the rings, we can simulate wood with various types of grain. Shades of brown can be made by combining similar amounts of red and green, with less blue. We then apply Phong shading with a low level of "shininess."

Figure 14.17 Dragon textured with 3D marble noise maps.

We can generate rings encircling the Z-axis in our 3D texture map by modifying the fillDataArray() function, using trigonometry to specify values for X and Y that are equidistant from the Z axis. We use a sine wave to repeat this process cyclically, raising and lowering the red and green components equally based on this sine wave, to produce the varying shades of brown. The variable sineValue holds the exact shade, which can be adjusted by slightly offsetting one or the other (in this case increasing the red by 80, and the green by 30). We can create more (or fewer) rings by adjusting the value of xyPeriod. The resulting texture is shown in Figure 14.18.

private void fillDataArray(byte data[])
{  double xyPeriod = 30.0;
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  double xValue = (i - (double) noiseWidth / 2.0) / (double) noiseWidth;
            double yValue = (j - (double) noiseHeight / 2.0) / (double) noiseHeight;
            double distanceFromZ = Math.sqrt(xValue * xValue + yValue * yValue);
            double sineValue = 128.0 * Math.abs(Math.sin(2.0 * xyPeriod * distanceFromZ * 3.14159));
            Color c = new Color((int) (80 + (int) sineValue), (int) (30 + (int) sineValue), 30);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) c.getRed();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) c.getGreen();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) c.getBlue();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

The wood rings in Figure 14.18 are a good start, but they don't look very realistic: they are too perfect. To improve this, we use the noise map (more specifically, the turbulence) to perturb the distance value so that the rings have slight variations. The computation is modified as follows:

double distanceValue = Math.sqrt(xValue * xValue + yValue * yValue)
                     + turbPower * turbulence(i, j, k, maxZoom) / 256.0;

Again, the variable turbPower adjusts how much turbulence is applied (setting it to 0.0 results in the unperturbed version shown in Figure 14.18), and maxZoom specifies the zoom value (typically 32). Figure 14.19 shows the resulting wood textures for turbPower values 0.05, 1.0, and 2.0 (left to right).


Figure 14.18 Creating rings for 3D wood texture.

We can now apply the 3D wood texture map to a model. The realism of the texture can be further enhanced by applying a rotation to the originalPosition vertex locations used for texture coordinates; this is because most items carved out of wood don't perfectly align with the orientation of the rings. To accomplish this, we send an additional rotation matrix to the shaders for rotating the texture coordinates. We also add Phong shading, with appropriate wood-color ADS values, and a modest level of shininess. The complete additions and changes for creating a "wood dolphin" are shown in Program 14.6.

Figure 14.19 "Wood" 3D texture maps with rings perturbed by noise map.

Program 14.6 Creating a Wood Dolphin

Java/JOGL application:

public void init(GLAutoDrawable drawable)
{  ...
   float[] brownAmbient  = new float[] { 0.5f, 0.35f, 0.15f, 1.0f };
   float[] brownDiffuse  = new float[] { 0.5f, 0.35f, 0.15f, 1.0f };
   float[] brownSpecular = new float[] { 0.5f, 0.35f, 0.15f, 1.0f };
   thisMaterial.setAmbient(brownAmbient);
   thisMaterial.setDiffuse(brownDiffuse);
   thisMaterial.setSpecular(brownSpecular);
   thisMaterial.setShininess(15.0f);

   texRot.setToIdentity();
   // rotation to be applied to texture coordinates – adds additional grain variation
   texRot.rotateX(35.0f);  texRot.rotateY(35.0f);  texRot.rotateZ(35.0f);
}

private void fillDataArray(byte data[])
{  double xyPeriod = 40.0;
   double turbPower = 0.1;
   double maxZoom = 32.0;
   for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  double xValue = (i - (double) noiseWidth / 2.0) / (double) noiseWidth;
            double yValue = (j - (double) noiseHeight / 2.0) / (double) noiseHeight;
            double distanceValue = Math.sqrt(xValue * xValue + yValue * yValue)
                                 + turbPower * turbulence(i, j, k, maxZoom) / 256.0;
            double sineValue = 128.0 * Math.abs(Math.sin(2.0 * xyPeriod * distanceValue * 3.14159));
            Color c = new Color((int) (80 + (int) sineValue), (int) (30 + (int) sineValue), 30);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) c.getRed();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) c.getGreen();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) c.getBlue();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

public void display(GLAutoDrawable drawable)
{  ...
   t_location = gl.glGetUniformLocation(rendering_program, "texRot");
   gl.glUniformMatrix4fv(t_location, 1, false, texRot.getFloatValues(), 0);
   ...
}

Vertex shader:
...
uniform mat4 texRot;
...
void main(void)
{  ...
   originalPosition = vec3(texRot * vec4(position, 1.0)).xyz;
   ...
}

Fragment shader:
...
uniform mat4 texRot;
...
void main(void)
{  ...
   // combine lighting with 3D texturing
   fragColor =
      0.5 * (...)
      + 0.5 * texture(s, originalPosition / 2.0 + 0.5);
}

Figure 14.20 Dolphin textured with "wood" 3D noise map.

The resulting 3D textured wood dolphin is shown in Figure 14.20.

There is one additional detail in the fragment shader worth noting. Since we are rotating the model within the 3D texture, it is sometimes possible for this to cause the vertex positions to move beyond the required (0,1) range of texture coordinates as a result of the rotation. If this were to happen, we could adjust for this possibility by dividing the original vertex positions by a larger number (such as 4.0 rather than 2.0), and then adding a slightly larger number (such as 0.6) to center it in the texture space.
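To see why this works, here is a quick back-of-the-envelope sketch (standalone, not part of the program). The overshoot value of roughly ±1.4 is an assumption made for the demonstration: a rotated coordinate near ±1 can grow by up to a factor of √3 in the worst case, and dividing by 4.0 and adding 0.6 keeps the mapped texture coordinate comfortably inside (0,1):

```java
// Standalone sketch: mapping rotated vertex coordinates into the (0,1)
// texture-coordinate range, as described above.
public class TexCoordRangeDemo {
    // the adjusted mapping: divide by 4.0 instead of 2.0, add 0.6 instead of 0.5
    static double map(double v) { return v / 4.0 + 0.6; }

    public static void main(String[] args) {
        // rotated unit-ish model coordinates, including overshoot to about +/-1.4
        double[] samples = { -1.4, -1.0, 0.0, 1.0, 1.4 };
        boolean allInRange = true;
        for (double v : samples) {
            double tc = map(v);
            allInRange &= (tc > 0.0 && tc < 1.0);
        }
        System.out.println(allInRange);
    }
}
```

With these constants, even a -1.4 input maps to 0.25 and a +1.4 input maps to 0.95, so no lookup falls outside the texture.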

14.8 NOISE APPLICATION – CLOUDS

The "turbulence" noise map built earlier in Figure 14.15 already looks a bit like clouds. Of course, it isn't the right color, so we start by changing it from grayscale to an appropriate mix of light blue and white. A straightforward way of doing this is to assign a color with a maximum value of 1.0 for the blue component, and varying (but equal) values between 0.0 and 1.0 for the red and green components depending on the values in the noise map. The new fillDataArray() function follows:

private void fillDataArray(byte data[])
{  for (int i = 0; i < noiseWidth; i++)
   {  for (int j = 0; j < noiseHeight; j++)
      {  for (int k = 0; k < noiseDepth; k++)
         {  float brightness = 1.0f - (float) turbulence(i, j, k, 32) / 256.0f;
            Color c = new Color(brightness, brightness, 1.0f, 1.0f);
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 0] = (byte) c.getRed();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 1] = (byte) c.getGreen();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 2] = (byte) c.getBlue();
            data[i*(noiseWidth*noiseHeight*4) + j*(noiseHeight*4) + k*4 + 3] = (byte) 255;
}  }  }  }

The resulting blue version of the noise map can now be used to texture a skydome. Recall that a skydome is a sphere or half-sphere that is textured, rendered with depth-testing disabled, and placed so that it surrounds the camera (similar to a skybox).

One way of building the skydome would be to texture it in the same way as we have for other 3D textures, using the vertex coordinates as texture coordinates. However, in this case, it turns out that using the skydome's 2D texture coordinates instead produces patterns that look more like clouds, because the spherical distortion slightly stretches the texture map horizontally. We can grab a 2D slice from the noise map by setting the third dimension in the GLSL texture() call to a constant value. Assuming that the skydome's texture coordinates have been sent to the OpenGL pipeline in a vertex attribute, in the standard way, the following fragment shader textures it with a 2D slice of the noise map:

#version 430
in vec2 tc;
out vec4 fragColor;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding = 0) uniform sampler3D s;

void main(void)
{  fragColor = texture(s, vec3(tc.x, tc.y, 0.5));   // constant value in place of tc.z
}

The resulting textured skydome is shown in Figure 14.21. Although the camera is usually placed inside the skydome, we have rendered it here with the camera outside, so that the effect on the dome itself can be seen. The current noise map leads to "misty-looking" clouds.

Figure 14.21 Skydome textured with misty clouds.

Although our misty clouds look nice, we would like to be able to shape them; that is, make them more or less hazy. One way of doing this is to modify the turbulence() function so that it uses an exponential, such as a logistic function,2 to make the clouds look more "distinct." The modified turbulence() function is shown in Program 14.7, along with an associated logistic() function. The complete Program 14.7 also incorporates the smoothNoise(), fillDataArray(), and generateNoise() functions described earlier.

Program 14.7 Cloud Texture Generation

private double turbulence(double x, double y, double z, double size)
{  double value = 0.0, initialSize = size, cloud_quant;
   while (size >= 0.9)
   {  value = value + smoothNoise(x / size, y / size, z / size) * size;
      size = size / 2.0;
   }
   cloud_quant = 110.0;   // tunable quantity of clouds
   value = value / initialSize;
   value = 256.0 * logistic(value * 128.0 - cloud_quant);
   return value;
}

private double logistic(double x)
{  double k = 0.2;   // tunable haziness of clouds, produces more or less distinct cloud boundaries
   return (1.0 / (1.0 + Math.pow(2.718, -k * x)));
}

The logistic function causes the colors to tend more towards white or blue, rather than values in-between, producing the visual effect of there being more distinct cloud boundaries. The variable cloud_quant adjusts the relative amount of white (versus blue) in the noise map, which in turn leads to more (or fewer) generated white regions (i.e., distinct clouds) when the logistic function is applied. The resulting skydome, now with more distinct cloud formations, is shown in Figure 14.22.
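The "squashing" behavior is easy to verify in isolation. In this standalone sketch (using the same k = 0.2 as Program 14.7, with sample inputs chosen only for illustration), values well below the cloud_quant threshold map to nearly 0 (blue sky) and values well above it map to nearly 1 (white cloud), while only a narrow band in the middle produces intermediate shades:

```java
// Standalone sketch of the logistic "squashing" used for cloud shaping.
public class LogisticDemo {
    static double logistic(double x) {
        double k = 0.2;   // haziness; larger k means sharper cloud boundaries
        return 1.0 / (1.0 + Math.pow(2.718, -k * x));
    }

    public static void main(String[] args) {
        // far below the threshold -> almost 0; far above -> almost 1
        System.out.println(logistic(-40.0) < 0.01);
        System.out.println(logistic(40.0) > 0.99);
        // exactly at the threshold the output is the midpoint
        System.out.println(logistic(0.0) == 0.5);
    }
}
```

Raising k pushes more inputs onto the asymptotes, which is why a larger k yields crisper cloud edges, while a smaller k restores the hazy look.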

Figure 14.22 Skydome with exponential cloud texture.

Lastly, real clouds aren't static. To enhance the realism of our clouds, we should animate them by (a) making them move or "drift" over time and (b) gradually changing their form as they drift.

One simple way of making the clouds "drift" is to slowly rotate the skydome. This isn't a perfect solution, as real clouds tend to drift in a straight direction rather than rotating around the observer. However, if the rotation is slow and the clouds are simply for decorating a scene, the effect is likely to be adequate.


Having the clouds gradually change form as they drift may at first seem tricky. However, given the 3D noise map we have used to texture the clouds, there is actually a very simple and clever way of achieving the effect. Recall that although we constructed a 3D texture noise map for clouds, we have so far only used one "slice" of it, in conjunction with the skydome's 2D texture coordinates (we set the "Z" coordinate of the texture lookup to a constant value). The rest of the 3D texture has so far gone unused.

Our trick will be to replace the texture lookup's constant "Z" coordinate with a variable that changes gradually over time. That is, as we rotate the skydome, we gradually increment the depth variable, causing the texture lookup to use a different slice. Recall that when we built the 3D texture map, we applied smoothing to the color changes along all three axes. So, neighboring slices from the texture map are very similar, but slightly different. Thus, by gradually changing the "Z" value in the texture() call, the appearance of the clouds will gradually change.

The code changes to cause the clouds to slowly move and change over time are shown in Program 14.8.

Program 14.8 Animating the Cloud Texture

Java/JOGL application:

private double rotAmt = 0.0;   // Y-axis rotation amount to make clouds appear to drift
private float depth = 0.01f;   // depth lookup for 3D noise map, to make clouds gradually change
...
public void display(GLAutoDrawable drawable)
{  ...
   // gradually rotate the skydome
   m_matrix.setToIdentity();
   m_matrix.translate(objLoc.getX(), objLoc.getY(), objLoc.getZ());
   rotAmt = rotAmt + 0.2;
   m_matrix.rotateY(rotAmt);

   // gradually alter the third texture coordinate to make the clouds change
   dloc = gl.glGetUniformLocation(program, "d");
   depth = depth + 0.00005f;
   if (depth >= 0.99f) depth = 0.01f;   // wrap around when we get to the end of the texture map
   gl.glUniform1f(dloc, depth);
   ...
}

Fragment shader:

#version 430
in vec2 tc;
out vec4 fragColor;
uniform mat4 mv_matrix;
uniform mat4 proj_matrix;
layout (binding = 0) uniform sampler3D s;
uniform float d;

void main(void)
{  fragColor = texture(s, vec3(tc.x, tc.y, d));   // gradually-changing "d" replaces previous constant
}

While we cannot show the effect of gradually drifting and changing clouds in a single still image, Figure 14.23 shows such changes in a series of snapshots of the 3D generated clouds as they drift across the skydome from right to left.

Figure 14.23 3D clouds changing while drifting.

14.9 NOISE APPLICATION – SPECIAL EFFECTS

Noise textures can be used for a variety of special effects. In fact, there are so many possible uses that their applicability is limited only by one's imagination.

One very simple special effect that we will demonstrate here is a dissolve effect. This is where we make an object appear to gradually dissolve into small particles, until it eventually disappears. Given a 3D noise texture, this effect can be achieved with very little additional code. Note that this example only works if animation is incorporated, such as with the use of FPSAnimator.

To facilitate the dissolve effect, we introduce the GLSL discard command. This command is only legal in the fragment shader, and when executed, it causes the fragment shader to discard the current fragment (meaning not render it).

Our strategy is a simple one. In the Java/JOGL application, we create a fine-grained noise texture map identical to the one shown back in Figure 14.12, and also a float variable counter that gradually increases over time. This variable is then sent down the shader pipeline in a uniform variable, and the noise map is also placed in a texture map with an associated sampler. The fragment shader then accesses the noise texture using the sampler; in this case we use the returned noise value to determine whether or not to discard the fragment. We do this by comparing the grayscale noise value against the counter, which serves as a sort of "threshold" value. Because the threshold is gradually changing over time, we can set it up so that gradually more and more fragments are discarded. The result is that the object appears to gradually dissolve. Program 14.9 shows the relevant code sections, which are added to the earth-rendered sphere from Program 6.1. The generated output is shown in Figure 14.24.

Program 14.9 Dissolve Effect Using the discard Command

Java/JOGL application:

private float threshold = 0.0f;   // gradually increasing threshold for retaining/discarding fragments
...
in display():
...
t_location = gl.glGetUniformLocation(rendering_program, "t");
threshold = threshold + .002f;
gl.glUniform1f(t_location, threshold);
...
gl.glActiveTexture(GL_TEXTURE0);
gl.glBindTexture(GL_TEXTURE_3D, textureID);        // noise texture
gl.glActiveTexture(GL_TEXTURE1);
gl.glBindTexture(GL_TEXTURE_2D, textureIDearth);   // earth texture
...
gl.glDrawArrays(GL_TRIANGLES, 0, numSphereVertices);

Fragment shader:

#version 430
in vec2 tc;        // texture coordinates for this fragment
in vec3 origPos;   // original vertex positions in the model, for accessing the 3D texture
...
layout (binding = 0) uniform sampler3D n;   // sampler for noise texture
layout (binding = 1) uniform sampler2D e;   // sampler for earth texture
...
uniform float t;   // threshold for retaining or discarding the fragment

void main(void)
{  float noise = texture(n, origPos).x;   // retrieve the noise value for this fragment
   if (noise > t)   // if the noise value is greater than the current threshold value,
   {  fragColor = texture(e, tc);   // render the fragment using the earth texture
   }
   else
   {  discard;   // otherwise, discard the fragment (do not render it)
   }
}

Figure 14.24 Dissolve effect with discard shader.

The discard command should, if possible, be used sparingly, because it can incur a performance penalty. This is because its presence makes it more difficult for OpenGL to optimize Z-buffer depth testing.

SUPPLEMENTAL NOTES

In this chapter, we used Perlin noise to generate clouds, and to simulate both wood and a marble-like stone from which we rendered a dragon and a dolphin. People have found many other uses for Perlin noise. For example, it can be used to create fire and smoke [KE16, AF14], build realistic bump maps [GR05], and has been used to generate terrain in the video game Minecraft [PE11].

The noise maps generated in this chapter are based on procedures outlined by Lode Vandevenne [VA04]. There remain some deficiencies in our 3D cloud generation. The texture is not seamless, so at the 360° point there is a noticeable vertical line. (This is also why we started the depth variable in Program 14.8 at 0.01 rather than at 0.0, to avoid encountering the seam in the Z dimension of the noise map.) Simple methods exist for removing the seams [AS04], if needed. Another issue is at the northern peak of the skydome, where the spherical distortion in the skydome causes a pincushion effect.

The clouds we implemented in this chapter also fail to model some important aspects of real clouds, such as the way that they scatter the sun's light. Real clouds also tend to be more white on the top and grayer at the bottom. Our clouds also don't achieve the 3D "fluffy" look that many actual clouds have.

Similarly, more comprehensive models exist for generating fog, such as the one described by Kilgard and Fernando [KF03].

While perusing the OpenGL documentation, the reader might notice that GLSL includes some noise functions named noise1(), noise2(), noise3(), and noise4(), that are described as taking an input seed and producing Gaussian-like stochastic output. We didn't use these functions in this chapter because, as of this writing, most vendors have not implemented them. For example, many NVIDIA cards currently return 0 for these functions, regardless of the input seed.

Exercises

14.1 Modify Program 14.2 to gradually increase the alpha value of an object, causing it to gradually fade out and eventually disappear.

14.2 Modify Program 14.3 to clip the torus along the horizontal, creating a circular "trough."

14.3 Modify Program 14.4 (the version including the modification in Figure 14.10 that produces a 3D cubed texture), so that it instead textures a simpler curved object, such as the Studio 522 dolphin. Then observe the results. Many people when first observing the result (such as that shown on the dragon, but even on simpler objects) believe that there is some error in the program. Unexpected surface patterns can result from "carving" an object out of 3D textures, even simple ones.

14.4 The simple sine wave used to define the wood "rings" (shown in Figure 14.18) generates rings in which the light and dark areas are of equal width. Experiment with modifications to the associated fillDataArray() function with the goal of making the dark rings narrower in width than the light rings. Then observe the effects on the resulting wood-textured object.

14.5 (PROJECT) Incorporate the logistic function (from Program 14.7) into the marble dragon from Program 14.5, and experiment with the settings to create more distinct veins.

14.6 Modify Program 14.9 to incorporate the zooming, smoothing, turbulence, and logistic steps described in the prior sections. Observe the changes in the resulting dissolve effect.

References

[AF14] S. Abraham and D. Fussell, "Smoke Brush," Proceedings of the Workshop on Non-Photorealistic Animation and Rendering (NPAR '14), 2014, accessed July 2016, https://www.cs.utexas.edu/~theshark/smokebrush.pdf.

[AS04] D. Astle, "Simple Clouds Part 1," gamedev.net, 2004, accessed July 2016, http://www.gamedev.net/page/resources/_/technical/game-programming/simple-clouds-part-1-r2085.

[GR05] S. Green, "Implementing Improved Perlin Noise," GPU Gems 2, NVIDIA, 2005, accessed July 2016, http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter26.html.

[JA12] A. Jacobson, "Cheap Tricks for OpenGL Transparency," 2012, accessed July 2016, http://www.alecjacobson.com/weblog/?p=2750.

[KE16] B. Kaskosz, D. Ensley, and D. Gries, "Billowing Fire with Perlin Noise and Filters – Flash AS3 Effect," Flash & Math, accessed July 2016, http://www.flashandmath.com/about/.

[KF03] M. Kilgard and R. Fernando, "Advanced Topics," The Cg Tutorial (Addison-Wesley, 2003), accessed July 2016, http://http.developer.nvidia.com/CgTutorial/cg_tutorial_chapter09.html.

[LU16] F. Luna, Introduction to 3D Game Programming with DirectX 12, 2nd ed. (Mercury Learning, 2016).

[PE11] M. Persson, "Terrain Generation, Part 1," The Word of Notch (blog), Mar 9, 2011, accessed July 2016, http://notch.tumblr.com/post/3746989361/terrain-generation-part-1.

[VA04] L. Vandevenne, "Texture Generation Using Random Noise," Lode's Computer Graphics Tutorial, 2004, accessed July 2016, http://lodev.org/cgtutor/randomnoise.html.

1 The Technical Achievement Award, given by the Academy of Motion Picture Arts and Sciences.
2 A "logistic" (or "sigmoid") function has an S-shaped curve with asymptotes on both ends. Common examples are the hyperbolic tangent and f(x) = 1/(1 + e^(-x)). They are also sometimes called "squashing" functions.

Page 317: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

Index

3Dtexturescomplexpatterns,315Java/JOGLapplication,314stripedpattern,311,312f,312–314

A

Addingprimitives,294–297ADSlightingmodel,158,163computations

ambientcontribution,165diffusecontribution,166–167specularcontribution,167–168

implementingfacetedshading,169smoothshading.seeSmoothshading

Aliasing,115,116Alphachannel,302Alteringprimitives,288–293Ambientreflection,158Animation,30–32Anisotropicfiltering(AF),120normalmapping,241,242f

ApplicationProgrammingInterfaces(APIs),2Aspectratio,53Attenuationfactors,160,161

Page 318: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

B

Back-faceculling,97–99Bernstein,Sergei,251Béziercurve,249,249fBéziersurfaces,137,249cubic,256–258quadratic,254–256tessellation

controlshader,270evaluationshader,270–272fragmentshader,272Java/JOGLapplication,272mix()function,273overviewof,267,267fTCS,268TES,268–269vertexshader,272,273

Bézier,Pierre,249Bilinearfiltering,118Bitangentvectors,235,236Blender,144,145,146Blendingfunction,251Blending.SeeTransparencyBlinn,James,180Blinn-Phongshading,180Bronze,164Buffer,9,16Bumpmappingperturbednormalvectorsfor,232,232fproceduralbumpmapping,232,233f

C

Cameraspace.SeeEyespaceCamera,49,50Clippingplane,308–310Clouds,327–332Colorbuffer,21,22Compositing.SeeTransparencyComputeraideddesign,56Concatenation,39Crossproduct,47Crow,Frank,191Cubemap

Page 319: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

cameraplacedinside,212,212fsix-facedskyboxtexture,211,212ftexturecoordinates,212,213fusingopenGL,219-223

Culling.SeeBack-faceculling

D

deCasteljau,Paul,249deCasteljau’salgorithm,253Debugging,26,27Deletingprimitives,293–294Depthbuffer.SeeZ-bufferDepth-fighting,94Diffusereflection,158DigitalContentCreation(DCC),144Directbuffer,67Directionallight,159–160Dissolveeffect,332Distantlight.SeeDirectionallight3Dmodelsloadingexternallyproducedmodels

Blender,145,146DCC-createdmodel,144ImportedModelclasses,149ModelImporterclass,149OBJfiles,145,147

OpenGLindexinginnerandoutervariables,139torus,137-138VBO,139

proceduralmodelsobjects,typesof,130spherevertex,132,132f,136vertices,133

Dotproduct,45–47

E

Emissiveness,164Enhancingsurfacedetail,231–247EnvironmentmappingBlinn-Phonglighting,224–225overview,224,224f

Page 320: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

reflectionvector,224shaders,setsof,227shininess,223texturecubemaps,223

Erroneousself-shadowing.SeeShadowacneEulerangles,42Euler ’sTheorem,42Eyespace,49

F

Facetedshading,169Farclippingplane,53Fieldofview,53Flatshading.SeeFacetedshadingFOG,299–301FPSAnimator,30,31Fragmentshader,20–21Framerate,30,76Frame,30Frustum,53Full-sceneanti-aliasing(FSAA),126

G

Geometryshadersaddingprimitives,294–297alteringprimitives,288–293deletingprimitives,293–294OpenGL,per-primitiveprocessingin,287–288

GLCanvas,9,30Globalambientlight,159GLSL,2buildingmatrixtransforms,57–59error-detectingmodules

checkOpenGLError(),23,24printProgramLog,24printShaderLog,24

GLSLshadercode,files,27–28shaderlanguages,8texturemapping,104

GNUImageManipulationProgram(GIMP),234,237Gold,164Gouraudshading,169–177

Page 321: C G P O GL WITH JAVAdl.booktolearn.com/ebooks2/computer/graphics/9781683920274_co… · 5.9 Anisotropic Filtering 5.10 Wrapping and Tiling 5.11 Perspective Distortion 5.12 Loading

Gouraud,Henri,169Graphicscard(GPU),1Graphicspipeline.SeeOpenGLpipelineGraphicsprogrammingfourlanguagesandlibraries

    graphicslib3D, 3
    Java, 2
    JOGL, 3
    OpenGL/GLSL, 2

  installation and configuration
    graphicslib3D, 4–5
    Java, 3
    JOGL, 4
    OpenGL/GLSL, 4

graphicslib3D
  building scale matrices, 41
  languages and libraries, 3
  rotation matrices, 42
  torus, 140
  installation and configuration, 4–5

H

Hard shadows, 209
Height mapping
  defining, 242
  interpretation, 242, 243f
  terrain, 244, 244f
  vertex manipulation, 243
  vertex-shader-based, 244, 245, 245f, 246

Hidden surface removal algorithm (HSR), 21–22, 192
Hierarchical model, 88
Homogeneous notation, 36
HSR. See Hidden surface removal

I

Identity matrix, 37
Immediate mode, 2
Indexing, 137
Instancing, 81–84

J


Jade, 163, 164
Java Development Kit (JDK), 3
Java OpenGL (JOGL)
  animator classes, 30
  overview of, 8, 8f
  vertices, 29

Java
  installation and configuration, 3
  languages and libraries, 2

JogAmp, 3
JOGL
  installation and configuration, 4
  languages and libraries, 3
  OpenGL texture object, 105
  texture mapping, 104

L

Layout qualifier, 66
Level of detail (LOD), 281–284
Lighting
  ADS lighting computations

    ambient contribution, 165
    diffuse contribution, 166–167
    specular contribution, 167–168

  ADS model, 158
  combining lighting and textures, 182–184
  directional/distant light, 159–160
  global ambient, 159
  implementing ADS lighting

    faceted shading, 169
    smooth shading. See Smooth shading

  materials, 163–165
  positional, 160–161
  reflection model, 158
  shading models, 158
  shadows, 190f
  spotlights, 161–163
  types of, 159

Lightweight Java Game Library (LWJGL), xiv
Linear filtering, 118
Local space. See Model space
LOD. See Level of detail
Logistic function, 330


Look-at matrix, 56–57
Luxo Jr., 163

M

Managing 3D graphics data
  back-face culling, 97–99
  buffers, types of, 67
  combating Z-fighting artifacts, 94–95
  3D cube

    3D cube program, 72–74
    display() function, 76
    fragment shader, 77, 78
    frame rate, 76
    translate() and rotate() functions, 80
    varyingColor, 79
    vertex_positions, 75

  matrix stacks
    display() function, 90
    hierarchical models, 88
    planet, 92
    simple solar system, 92–93
    sun's rotation, 90–91
    view matrices, 89

  model-view and perspective matrices
    defining, 69
    model matrix, 70
    perspective matrix, 70
    view matrix, 70

  primitives
    line, 96
    patch, 97
    point, 96
    triangle, 95–96

  rendering multiple different models, 85–87
  rendering multiple objects, 80–84
  uniform variables, 67–68
  vertex attributes

    interpolation of, 68–69
    vertex shader, 66

Marble, 322
Marble noise maps, 321–323
Mathematical foundations
  3D coordinate systems, 36


  building matrix transforms, GLSL functions for, 57–59
  eye space and synthetic camera, 49–52
  local and world space, 48
  look-at matrix, 56–57
  matrices

    addition, 38
    identity matrix, 37
    inverse, 39
    multiplication, 38–39
    transformation, 40–44
    transpose, 37–38

  points, 36
  projection matrices

    orthographic projection, 54–56
    perspective projection, 52–54

  vectors
    cross product, 47
    dot product, 45–47

Materials, 163–165
Matrices
  addition, 38
  identity matrix, 37
  inverse, 39
  look-at matrix, 56–57
  multiplication, 38–39
  projection matrices

    orthographic projection, 54–56
    perspective projection, 52–54

  transformation
    rotation, 42–44
    scale, 41–42
    translation, 40–41

  transpose, 37–38
Matrix multiplication, 38–39
Matrix stacks
  display() function, 90
  hierarchical models, 88
  planet, 92
  simple solar system, 92–93
  sun's rotation, 90–91
  view matrices, 89

Maya, 144
Mipmapping
  aliasing artifacts, 115, 116


  brick texture, 119
  minification, 118
  minification technique, 118
  trilinear filtering, 118, 118f, 119

Modeling, 129–155
Model matrix, 48
Model space, 48
Model-view matrix, 67, 85

N

NASA, 149
Near clipping plane. See Projection plane
Newell, Martin, 137
Noise. See Perlin noise
Normal mapping
  AF, 241, 242f
  Blinn-Phong lighting, 235
  fragment shader, 236
  image file, 234
  mipmapping, 241
  moon surface rendered, 239
  object normal, 235
  tangent and bitangent vectors, 235, 236
  TBN matrix, 237
  texture units, application of, 234

Normal vector, 132, 224
NVIDIA, 335

O

OBJ. See Wavefront OBJ
Opacity, 302
OpenGL Architecture Review Board (ARB), 2
OpenGL camera, 50, 195
OpenGL extension, 120
OpenGL pipeline
  error-detecting modules, 23, 25–26

    enabling debugging causes, 26
    enabling tracing causes, 26

  geometry shader, 18–19
  hardware side, 7
  Java/JOGL application
    C pointer, 12


    color buffer, 11–12
    Direct3D acceleration, 10
    GL4, 11
    GLCanvas, 9, 30

  overview of, 8, 9f
  pixel operations, 21–22
  rasterization, 19–20
  shader stages, 8–9
  software side, 7
  tessellation, 17–18
  vertex and fragment shaders, 20–21

    glShaderSource(), 15
    glUseProgram, 15
    pixel processing, 17
    primitives, 12
    RGB color, 16
    VAOs, 16
    vshaderSource and fshaderSource, 15

OpenGL Shading Language. See GLSL
OpenGL
  glGenVertexArrays() and glGenBuffers(), 65
  per-primitive processing in, 287–288
  primitive types, 95
  VAOs, 65

Orthographic projection, 54–56, 209

P

Parametric surfaces
  cubic Bézier curves

    analytic definition for, 251
    de Casteljau's algorithm, 253
    recursive subdivision algorithm, 253, 254f, 254

  cubic Bézier surfaces, 256–258
  quadratic Bézier curve, 249–251
  quadratic Bézier surfaces, 254–256

Patch, 265
Pearl, 163
Perlin noise
  3D noise data, cube textured with, 317, 317f
  noise application

    clouds texture, 327–332
    special effects, 332–334
    wood texture, 323–327


  marble noise maps, 321–323
  smoothed noise maps, 320
  zooming factors, 318

Perlin, Ken, 316
Perspective correction, 123
Perspective distortion, 123–124
Perspective matrix, 53
Perspective transform, 53
Peter panning, 206, 207f
Pewter, 163
Phong shading
  Blinn-Phong shading, 180–181
  external models with, 181, 182f
  implementing Phong shading, 177, 177f
  Stanford Dragon, 181–182

Phong, Bui Tuong, 177
Photoshop, 214, 237
Pipeline. See OpenGL pipeline
Pixar, 163
Point, 36
Popping, 283
Positional light, 160–161
Primitive assembly, 18
Primitives, 12
Procedural bump mapping, 232, 233f
Projection matrices
  orthographic projection, 54–56
  perspective projection, 52–54

Projection plane, 49, 53, 55
Projective shadows, 190–191

Q

Quadratic Bézier curve, 249–251
Quaternion, 44, 60

R

Rasterization, 19–20
Reflection mapping. See Environment mapping
Reflection model, 158
Right-hand rule, 47


S

Sampler variable, 110
Sampler2D, 110
Sampler3D, 314, 315
SamplerCube, 220
Scale matrix, 41–42
Shader programs, 1, 2
Shading models, 158
Shadow acne, 205
Shadow buffering, 193
Shadow mapping artifacts
  jagged shadow edges, 207, 208f
  peter panning, 206, 207f
  shadow acne, 205
  shadow bar, 207

Shadow texture, 193
Shadow volumes, 191–192, 191f
Shadows
  importance of, 189–190
  mapping

    artifacts, 205–208
    HSR algorithm, 192
    JOGL code, 200
    light position, draw objects from, 193–194
    lighted scene, 199, 199f
    rendering scene, 195–199
    shadow buffering, 193
    shadow textures, 193
    strategy, 193
    Z-buffer, texture, 194–195

  projective shadows, 190–191
  volumes, 191–192, 191f

Shininess, 163, 168
Silver, 164
Skyboxes
  concept of, 211
  implementing

    texture coordinate, 216
    texture images, 218
    using OpenGL cube maps, 219–223

  texture cube map, 212
Skydomes
  advantages, 214
  disadvantage, 214


  using sphere, 215, 215f
Smooth shading
  Gouraud shading, 169–177
  Phong shading, 177–182

Soft shadows, 209
Specular highlight, 167, 177, 239, 296
Specular reflection, 158
Spotlights, 161–163
Stanford dragon, 181
Star field, 215, 215f
Stencil buffer, 192
Studio 522 dolphin, 184

T

Tangent vectors, 235, 236
Teapot, Utah, 137
Terragen, 214
Terrain, 244
Tessellation control shader (TCS), 267, 268
Tessellation evaluation shader (TES), 268–269
Tessellation levels, 266
Tessellation primitive generator (TPG), 262
Tessellation, 258
  controlling LOD, 281–284
  Bézier surfaces

    control shader, 270
    evaluation shader, 270–272
    fragment shader, 272
    Java/JOGL application, 272
    mix() function, 273
    overview of, 267, 267f
    TCS, 268
    TES, 268–269
    vertex shader, 272, 273

  OpenGL
    fragment shader, 267
    inner level and outer level, 264, 266
    modules, 262
    patch, 265, 266
    pipeline stages, 261
    triangle mesh output, 264, 264f

  terrain/height maps
    control shader, 278


    evaluation shader, 278–279
    fragment shader, 275
    Java/JOGL application, 277
    Phong shading, 280
    vertex shader, 274–275, 277, 278

Texels, 106
Texture coordinates
  3D model, 106
  constructing, 108–109
  cube model, 107
  curved geometric shapes, 108
  values, 110
  vertex shader, 106

Texture cube map. See Cube map
Texture image, 104, 124
Texture mapping, 193
  anisotropic filtering (AF), 120
  creating texture object, 108
  Java AWT classes, 124–126
  JOGL/GLSL, 104
  mipmapping

    aliasing artifacts, 115, 116
    brick texture, 119
    minification technique, 118
    trilinear filtering, 118, 118f, 119

  OpenGL texture object, 105
  perspective distortion, 123–124
  pyramid's model, 111
  sampler variables and texture units, 110–111
  texture coordinates

    3D model, 106
    constructing, 108–109
    cube model, 107
    curved geometric shapes, 108
    values, 110
    vertex shader, 106

  type, 105
  wrapping and tiling, 121–122

Texture object, 108
Texture units, 103, 110–111
Tiling. See Wrapping
Tracing, 26
Translation matrix, 40–41
Transparency


  alpha channel, 302
  compositing process, 303
  glBlendEquation() parameter, 304
  glBlendFunc() parameters, 303
  Java/JOGL application, 304
  opacity, 302
  Z-buffer, 302

Trilinear filtering, 118, 118f, 119
Turberville, Jay, 181
Turbulence, 320, 327

U

Uniform sampler variable, 110
Uniform variable, 31, 67–68
User-defined clipping planes, 308–310
UV-mapping, 108, 150

V

VAO, 65. See Vertex Array Objects
VBO, 64. See Vertex Buffer Object
Vecmath, 3
Vectors
  cross product, 47
  dot product, 45–47

Vertex Array Objects (VAOs), 16
Vertex attributes
  interpolation of, 68–69
  vertex shader, 12, 66

Vertex Buffer Object (VBO), 64, 139
Vertex, 16, 17
View matrix, 70
View space. See Eye space
Viewing transform matrix, 50

W

Wavefront OBJ, 145
Winding order, 97
Wireframe rendering, 20f
World space, 48
Wood, 323
Wrapping, 121–122


Z

Z-buffer algorithm, 22, 22f, 94
Z-buffer, 22, 94, 95, 302
Z-fighting, 94