
Table of Contents

1.1 Introduction
1.2 First steps
1.3 The Game Loop
1.4 A brief about coordinates
1.5 Rendering
1.6 More on Rendering
1.7 Transformations
1.8 Textures
1.9 Camera
1.10 Loading more complex models
1.11 Let there be light
1.12 Let there be even more light
1.13 HUD
1.14 Sky Box and some optimizations
1.15 Height Maps
1.16 Terrain Collisions
1.17 Fog
1.18 Normal Mapping
1.19 Shadows
1.20 Animations
1.21 Particles
1.22 Instanced Rendering
1.23 Audio
1.24 3D Object picking
1.25 HUD revisited - NanoVG
1.26 Optimizations
1.27 Cascaded Shadow Maps
1.28 Assimp


3D Game Development with LWJGL 3

This online book will introduce the main concepts required to write a 3D game using the LWJGL 3 library.

LWJGL is a Java library that provides access to native APIs used in the development of graphics (OpenGL), audio (OpenAL) and parallel computing (OpenCL) applications. This library leverages the high performance of native OpenGL applications while using the Java language.

My initial goal was to learn the techniques involved in writing a 3D game using OpenGL. All the information required was out there on the Internet, but it was not organized, and sometimes it was hard to find, or even incomplete or misleading.

I started to collect some materials, develop some examples, and decided to organize that information in the form of a book.

Source Code

The source code of the samples of this book is available on GitHub.

The source code for the book itself is also published on GitHub.

License

The book is licensed under Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).

The source code for the book is licensed under Apache v2.0.

Support

If you like the book please rate it with a star and share it. If you want to contribute, you can make a donation:

Or if you prefer Bitcoin: 1Kwe78faWarzGTsWXtdGvjjbS9RmW1j3nb.

Comments are welcome.


Suggestions and corrections are more than welcome (and if you do like it please rate it with a star). Please send them using the discussion forum, and make the corrections you consider appropriate in order to improve the book.

Author

Antonio Hernández Bejarano

Special Thanks

To all the readers that have contributed with corrections, improvements and ideas.


First steps

In this book we will learn the principal techniques involved in developing 3D games. We will develop our samples in Java and we will use the Lightweight Java Game Library (LWJGL). The LWJGL library enables access to low-level APIs (Application Programming Interfaces) such as OpenGL.

LWJGL is a low level API that acts like a wrapper around OpenGL. If your idea is to start creating 3D games in a short period of time, maybe you should consider other alternatives like JMonkeyEngine. By using this low level API you will have to go through many concepts and write lots of lines of code before you see the results. The benefit of doing it this way is that you will get a much better understanding of 3D graphics, and also you will have better control.

As said in the previous paragraphs, we will be using Java for this book. We will be using Java 8, so you need to download the Java SDK from Oracle's pages. Just choose the installer that suits your Operating System and install it. This book assumes that you have a moderate understanding of the Java language.

The source code that accompanies this book has been developed using the NetBeans IDE. You can download the latest version of that IDE from https://netbeans.org/. In order to execute NetBeans you only need the Java SE version, but remember to download the version that corresponds to your JDK version (32 bits or 64 bits).

For building our samples we will be using Maven. Maven is already integrated in NetBeans and you can directly open the different samples from NetBeans. Just open the folder that contains the chapter sample and NetBeans will detect that it is a Maven project.


Maven builds projects based on an XML file named pom.xml (Project Object Model) which manages project dependencies (the libraries you need to use) and the steps to be performed during the build process. Maven follows the principle of convention over configuration; that is, if you stick to the standard project structure and naming conventions, the configuration file does not need to explicitly say where source files are or where compiled classes should be located.

This book does not intend to be a Maven tutorial, so please look for information about it on the web in case you need it. The source code folder defines a parent project which defines the plugins to be used and collects the versions of the libraries employed.

LWJGL 3.1 introduced some changes in the way that the project is built. Now the base code is much more modular, and we can be more selective in the packages that we want to use instead of using a giant monolithic jar file. This comes at a cost: you now need to carefully specify the dependencies one by one. But the download page includes a handy script that generates the pom file for you. In our case, we will just be using the GLFW and OpenGL bindings. You can check what the pom file looks like in the source code.

The LWJGL platform dependency already takes care of unpacking native libraries for your platform, so there's no need to use other plugins (such as mavennatives). We just need to set up three profiles to set a property that will configure the LWJGL platform. The profiles will set up the correct values of that property for the Windows, Linux and Mac OS families.

Firststeps

7

<profiles>
    <profile>
        <id>windows-profile</id>
        <activation>
            <os>
                <family>Windows</family>
            </os>
        </activation>
        <properties>
            <native.target>natives-windows</native.target>
        </properties>
    </profile>
    <profile>
        <id>linux-profile</id>
        <activation>
            <os>
                <family>Linux</family>
            </os>
        </activation>
        <properties>
            <native.target>natives-linux</native.target>
        </properties>
    </profile>
    <profile>
        <id>OSX-profile</id>
        <activation>
            <os>
                <family>mac</family>
            </os>
        </activation>
        <properties>
            <native.target>natives-osx</native.target>
        </properties>
    </profile>
</profiles>

Inside each project, the LWJGL platform dependency will use the correct property established in the profile for the current platform.

<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl-platform</artifactId>
    <version>${lwjgl.version}</version>
    <classifier>${native.target}</classifier>
</dependency>

Besides that, every project generates a runnable jar (one that can be executed by typing java -jar name_of_the_jar.jar). This is achieved by using the maven-jar-plugin, which creates a jar with a MANIFEST.MF file with the correct values. The most important attribute for that file is Main-Class, which sets the entry point for the program. In addition, all the dependencies are set as entries in the Class-Path attribute for that file. In order to execute it on another computer, you just need to copy the main jar file and the lib directory (with all the jars included there), which are located under the target directory.

The jars that contain LWJGL classes also contain the native libraries. LWJGL will also take care of extracting them and adding them to the path where the JVM will look for libraries.

Chapter 1 source code is taken directly from the getting started sample in the LWJGL site (http://www.lwjgl.org/guide). You will see that we are not using Swing or JavaFX as our GUI library. Instead of that we are using GLFW, which is a library to handle GUI components (windows, etc.) and events (key presses, mouse movements, etc.) with an OpenGL context attached in a straightforward way. Previous versions of LWJGL provided a custom GUI API but, for LWJGL 3, GLFW is the preferred windowing API.

The sample source code is very well documented and straightforward, so we won't repeat the comments here.

If you have your environment correctly set up you should be able to execute it and see a window with a red background.

The source code of this book is published on GitHub.


The Game Loop

In this chapter we will start developing our game engine by creating our game loop. The game loop is the core component of every game. It is basically an endless loop which is responsible for periodically handling user input, updating the game state and rendering to the screen.

The following snippet shows the structure of a game loop:

while (keepOnRunning) {
    handleInput();
    updateGameState();
    render();
}

So, is that all? Are we finished with game loops? Well, not yet. The above snippet has many pitfalls. First of all, the speed that the game loop runs at will be different depending on the machine it runs on. If the machine is fast enough the user will not even be able to see what is happening in the game. Moreover, that game loop will consume all the machine resources.

Thus, we need the game loop to try running at a constant rate independently of the machine it runs on. Let us suppose that we want our game to run at a constant rate of 50 Frames Per Second (FPS). Our game loop could be something like this:

double secsPerFrame = 1.0d / 50.0d;

while (keepOnRunning) {
    double now = getTime();
    handleInput();
    updateGameState();
    render();
    sleep(now + secsPerFrame - getTime());
}

This game loop is simple and could be used for some games, but it also presents some problems. First of all, it assumes that our update and render methods fit in the available time we have in order to render at a constant rate of 50 FPS (that is, secsPerFrame, which is equal to 20 ms).


Besides that, our computer may be prioritizing other tasks that prevent our game loop from executing for a certain period of time. So, we may end up updating our game state at very variable time steps which are not suitable for game physics.

Finally, sleep accuracy may range to tenths of a second, so we are not even updating at a constant frame rate even if our update and render methods take no time. So, as you see, the problem is not so simple.

On the Internet you can find tons of variants for game loops. In this book we will use a not too complex approach that can work well in many situations. So let us move on and explain the basis for our game loop. The pattern used here is usually called a Fixed Step Game Loop.

First of all we may want to control separately the period at which the game state is updated and the period at which the game is rendered to the screen. Why do we do this? Well, updating our game state at a constant rate is more important, especially if we use some physics engine. On the contrary, if our rendering is not done in time it makes no sense to render old frames while processing our game loop. We have the flexibility to skip some frames.

Let us have a look at how our game loop looks:

double secsPerUpdate = 1.0d / 30.0d;
double previous = getTime();
double steps = 0.0;
while (true) {
    double loopStartTime = getTime();
    double elapsed = loopStartTime - previous;
    previous = loopStartTime;
    steps += elapsed;

    handleInput();

    while (steps >= secsPerUpdate) {
        updateGameState();
        steps -= secsPerUpdate;
    }

    render();
    sync(loopStartTime);
}

With this game loop we update our game state at fixed steps. But how do we control that we do not exhaust the computer's resources by rendering continuously? This is done in the sync method:


private void sync(double loopStartTime) {
    float loopSlot = 1f / 50;
    double endTime = loopStartTime + loopSlot;
    while (getTime() < endTime) {
        try {
            Thread.sleep(1);
        } catch (InterruptedException ie) {
        }
    }
}

So what are we doing in the above method? In summary, we calculate how many seconds our game loop iteration should last (which is stored in the loopSlot variable) and we wait for that amount of time, taking into consideration the time we spent in our loop. But instead of doing a single wait for the whole available time period we do small waits. This will allow other tasks to run and will avoid the sleep accuracy problems we mentioned before. Then, what we do is:

1. Calculate the time at which we should exit this wait method and start another iteration of our game loop (which is the variable endTime).

2. Compare the current time with that end time and wait just one millisecond if we have not reached that time yet.
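The fixed-step pattern can be verified without any windowing code. The following self-contained sketch is our own illustration, not part of the book's engine: it drives the loop with a fake clock, and uses an update period of 1/32 s (instead of the book's 1/30 s) plus a simulated frame cost of 0.125 s purely so the floating-point arithmetic is exact. It shows how the inner while loop catches up by running several updates for a single render:

```java
public class FixedStepDemo {
    static double fakeTime = 0.0;   // simulated clock, in seconds
    static int updates = 0;
    static int renders = 0;

    static double getTime() { return fakeTime; }

    public static void main(String[] args) {
        double secsPerUpdate = 1.0d / 32.0d;   // exact in binary
        double previous = getTime();
        double steps = 0.0;
        for (int frame = 0; frame < 2; frame++) {
            double loopStartTime = getTime();
            double elapsed = loopStartTime - previous;
            previous = loopStartTime;
            steps += elapsed;
            // run as many fixed updates as the elapsed time requires
            while (steps >= secsPerUpdate) {
                updates++;
                steps -= secsPerUpdate;
            }
            renders++;
            // pretend this frame took 0.125 s: four update periods
            fakeTime += 0.125;
        }
        System.out.println(updates + " updates, " + renders + " renders");
        // prints "4 updates, 2 renders": the slow frame triggered 4 catch-up updates
    }
}
```

Even though the second frame was slow, the game state still advanced in exact 1/32 s increments; only the render rate dropped.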

Now it is time to structure our code base in order to start writing our first version of our Game Engine. But before doing that we will talk about another way of controlling the rendering rate. In the code presented above, we are doing micro-sleeps in order to control how much time we need to wait. But we can choose another approach in order to limit the frame rate: we can use v-sync (vertical synchronization). The main purpose of v-sync is to avoid screen tearing. What is screen tearing? It's a visual effect that is produced when we update the video memory while it's being rendered. The result will be that part of the image will represent the previous image and the other part will represent the updated one. If we enable v-sync we won't send an image to the GPU while it is being rendered onto the screen.

When we enable v-sync we are synchronizing to the refresh rate of the video card, which in the end will result in a constant frame rate. This is done with the following line:

glfwSwapInterval(1);

With that line we are specifying that we must wait, at least, one screen update before drawing to the screen. In fact, we are not directly drawing to the screen. We instead store the information in a buffer and we swap it with this method:

glfwSwapBuffers(windowHandle);


So, if we enable v-sync we achieve a constant frame rate without performing the micro-sleeps to check the available time. Besides that, the frame rate will match the refresh rate of our graphics card. That is, if it's set to 60 Hz (60 times per second), we will have 60 Frames Per Second. We can scale down that rate by setting a number higher than 1 in the glfwSwapInterval method (if we set it to 2, we would get 30 FPS).
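The arithmetic behind that last remark can be written down directly. This tiny helper is our own illustration (not part of the book's code base) of how the effective frame rate relates to the monitor refresh rate and the swap interval:

```java
public class SwapIntervalDemo {

    // with v-sync on, the effective frame rate is the monitor refresh
    // rate divided by the interval passed to glfwSwapInterval
    static int fps(int refreshRateHz, int swapInterval) {
        return refreshRateHz / swapInterval;
    }

    public static void main(String[] args) {
        System.out.println(fps(60, 1)); // prints 60
        System.out.println(fps(60, 2)); // prints 30
    }
}
```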

Let's get back to reorganizing the source code. First of all we will encapsulate all the GLFW window initialization code in a class named Window, allowing some basic parameterization of its characteristics (such as title and size). That Window class will also provide a method to detect key presses which will be used in our game loop:

public boolean isKeyPressed(int keyCode) {
    return glfwGetKey(windowHandle, keyCode) == GLFW_PRESS;
}

The Window class, besides providing the initialization code, also needs to be aware of resizing. So it needs to set up a callback that will be invoked whenever the window is resized. The callback will receive the width and height, in pixels, of the framebuffer (the rendering area, in this sample, the display area). If you want the width and height of the framebuffer in screen coordinates you may use the glfwSetWindowSizeCallback method. Screen coordinates don't necessarily correspond to pixels (for instance, on a Mac with a Retina display). Since we are going to use that information when performing some OpenGL calls, we are interested in pixels, not in screen coordinates. You can get more information in the GLFW documentation.

// Setup resize callback
glfwSetFramebufferSizeCallback(windowHandle, (window, width, height) -> {
    Window.this.width = width;
    Window.this.height = height;
    Window.this.setResized(true);
});

We will also create a Renderer class which will handle our game render logic. By now, it will just have an empty init method and another method to clear the screen with the configured clear color:

public void init() throws Exception {
}

public void clear() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}


Then we will create an interface named IGameLogic which will encapsulate our game logic. By doing this we will make our game engine reusable across different titles. This interface will have methods to get the input, to update the game state and to render game-specific data.

public interface IGameLogic {

    void init() throws Exception;

    void input(Window window);

    void update(float interval);

    void render(Window window);
}

Then we will create a class named GameEngine which will contain our game loop code. This class will implement the Runnable interface since the game loop will be run inside a separate thread:

public class GameEngine implements Runnable {

    //..[Removed code]..

    private final Thread gameLoopThread;

    public GameEngine(String windowTitle, int width, int height, boolean vSync, IGameLogic gameLogic) throws Exception {
        gameLoopThread = new Thread(this, "GAME_LOOP_THREAD");
        window = new Window(windowTitle, width, height, vSync);
        this.gameLogic = gameLogic;
        //..[Removed code]..
    }

The vSync parameter allows us to select if we want to use v-sync or not. You can see we create a new Thread which will execute the run method of our GameEngine class, which will contain our game loop:


public void start() {
    gameLoopThread.start();
}

@Override
public void run() {
    try {
        init();
        gameLoop();
    } catch (Exception excp) {
        excp.printStackTrace();
    }
}

Our GameEngine class provides a start method which just starts our Thread so the run method will be executed asynchronously. That method will perform the initialization tasks and will run the game loop until our window is closed. It is very important to initialize GLFW inside the thread that is going to update it later. Thus, in that init method our Window and Renderer instances are initialized.

In the source code you will see that we created other auxiliary classes such as Timer (which provides utility methods for calculating elapsed time) that are used by our game loop logic.
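The book does not list the Timer class at this point, so here is a minimal sketch of what such a helper could look like. The method names are our assumption, not necessarily the ones used in the repository:

```java
public class Timer {

    private double lastLoopTime;

    public void init() {
        lastLoopTime = getTime();
    }

    // current time in seconds, with nanosecond resolution
    public double getTime() {
        return System.nanoTime() / 1_000_000_000.0;
    }

    // seconds elapsed since the previous call (or since init)
    public float getElapsedTime() {
        double time = getTime();
        float elapsedTime = (float) (time - lastLoopTime);
        lastLoopTime = time;
        return elapsedTime;
    }

    public double getLastLoopTime() {
        return lastLoopTime;
    }
}
```

The game loop calls getElapsedTime once per iteration and accumulates the result into the steps variable used by the fixed-step loop shown earlier.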

Our GameEngine class just delegates the input and update methods to the IGameLogic instance. In the render method it also delegates to the IGameLogic instance and updates the window.

protected void input() {
    gameLogic.input(window);
}

protected void update(float interval) {
    gameLogic.update(interval);
}

protected void render() {
    gameLogic.render(window);
    window.update();
}

Our starting point, the class that contains the main method, will just create a GameEngine instance and start it.


public class Main {

    public static void main(String[] args) {
        try {
            boolean vSync = true;
            IGameLogic gameLogic = new DummyGame();
            GameEngine gameEng = new GameEngine("GAME", 600, 480, vSync, gameLogic);
            gameEng.start();
        } catch (Exception excp) {
            excp.printStackTrace();
            System.exit(-1);
        }
    }
}

At the end we only need to create our game logic class, which for this chapter will be a simple one. It will just increase or decrease the clear color of the window whenever the user presses the up or down key. The render method will just clear the window with that color.


public class DummyGame implements IGameLogic {

    private int direction = 0;

    private float color = 0.0f;

    private final Renderer renderer;

    public DummyGame() {
        renderer = new Renderer();
    }

    @Override
    public void init() throws Exception {
        renderer.init();
    }

    @Override
    public void input(Window window) {
        if (window.isKeyPressed(GLFW_KEY_UP)) {
            direction = 1;
        } else if (window.isKeyPressed(GLFW_KEY_DOWN)) {
            direction = -1;
        } else {
            direction = 0;
        }
    }

    @Override
    public void update(float interval) {
        color += direction * 0.01f;
        if (color > 1) {
            color = 1.0f;
        } else if (color < 0) {
            color = 0.0f;
        }
    }

    @Override
    public void render(Window window) {
        if (window.isResized()) {
            glViewport(0, 0, window.getWidth(), window.getHeight());
            window.setResized(false);
        }
        window.setClearColor(color, color, color, 0.0f);
        renderer.clear();
    }
}

In the render method we get notified when the window has been resized so we can update the viewport, relocating the center of the coordinate system to the center of the resized window.


The class hierarchy that we have created will help us to separate our game engine code from the code of a specific game. Although it may not seem necessary at this moment, we need to isolate generic tasks that every game will use from the state logic, artwork and resources of a specific game in order to reuse our game engine. In later chapters we will need to restructure this class hierarchy as our game engine gets more complex.

Threading issues

If you try to run the source code provided above in OSX you will get an error like this:

Exception in thread "GAME_LOOP_THREAD" java.lang.ExceptionInInitializerError

What does this mean? The answer is that some functions of the GLFW library cannot be called in a Thread which is not the main Thread. We are doing the initialization, including window creation, in the init method of the GameEngine class. That method gets called in the run method of the same class, which is invoked by a new Thread instead of the one that's used to launch the program.

This is a constraint of the GLFW library and basically it implies that we should avoid the creation of new Threads for the game loop. We could try to create all the window-related stuff in the main thread, but then we would not be able to render anything. The problem is that OpenGL calls need to be performed in the same Thread in which its context was created.

On Windows and Linux platforms, although we are not using the main thread to initialize the GLFW stuff, the samples will work. The problem is with OSX, so we need to change the source code of the start method of the GameEngine class to support that platform like this:

public void start() {
    String osName = System.getProperty("os.name");
    if (osName.contains("Mac")) {
        gameLoopThread.run();
    } else {
        gameLoopThread.start();
    }
}

What we are doing is just ignoring the game loop thread when we are in OSX and executing the game loop code directly in the main Thread. This is not a perfect solution but it will allow you to run the samples on Mac. Other solutions found in the forums (such as executing the JVM with the -XstartOnFirstThread flag) seem not to work.


In the future it may be interesting to explore if LWJGL provides other GUI libraries to check if this restriction applies to them. (Many thanks to Timo Bühlmann for pointing out this issue.)

Platform Differences (OSX)

You will be able to run the code described above on Windows or Linux, but we still need to do some modifications for OSX. As it's stated in the GLFW documentation:

The only OpenGL 3.x and 4.x contexts currently supported by OSX are forward-compatible, core profile contexts. The supported versions are 3.2 on 10.7 Lion and 3.3 and 4.1 on 10.9 Mavericks. In all cases, your GPU needs to support the specified OpenGL version for context creation to succeed.

So, in order to support features explained in later chapters, we need to add these lines to the Window class before the window is created:

glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

This will make the program use the highest OpenGL version possible between 3.2 and 4.1. If those lines are not included, a legacy version of OpenGL is used.


A brief about coordinates

In this chapter we will talk a little bit about coordinates and coordinate systems, trying to introduce some fundamental mathematical concepts in a simple way to support the techniques and topics that we will address in subsequent chapters. We will assume some simplifications which may sacrifice preciseness for the sake of legibility.

We locate objects in space by specifying their coordinates. Think about a map. You specify a point on a map by stating its latitude and longitude. With just a pair of numbers a point is precisely identified. That pair of numbers are the point's coordinates (things are a little bit more complex in reality, since a map is a projection of a non-perfect ellipsoid, the Earth, so more data is needed, but it's a good analogy).

A coordinate system is a system which employs one or more numbers, that is, one or more coordinates, to uniquely specify the position of a point. There are different coordinate systems (Cartesian, polar, etc.) and you can transform coordinates from one system to another. We will use the Cartesian coordinate system.

In the Cartesian coordinate system, for two dimensions, a coordinate is defined by two numbers that measure the signed distance to two perpendicular axes, x and y.

Continuing with the map analogy, coordinate systems define an origin. For geographic coordinates the origin is set to the point where the equator and the zero meridian cross. Depending on where we set the origin, coordinates for a specific point are different. A coordinate system may also define the orientation of the axes. In the previous figure, the x coordinate increases as long as we move to the right and the y coordinate increases as we move upwards. But we could also define an alternative Cartesian coordinate system with a different axis orientation in which we would obtain different coordinates.

As you can see, we need to define some arbitrary parameters, such as the origin and the axis orientation, in order to give the appropriate meaning to the pair of numbers that constitute a coordinate. We will refer to that coordinate system with the set of arbitrary parameters as the coordinate space. In order to work with a set of coordinates we must use the same coordinate space. The good news is that we can transform coordinates from one space to another just by performing translations and rotations.
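As a quick 2D illustration (our own example, not from the book), expressing a point in a coordinate space whose origin is translated and whose axes are rotated boils down to a subtraction followed by the standard rotation formula:

```java
public class CoordinateSpaceDemo {

    // express point (px, py) in a space whose origin sits at (ox, oy)
    // and whose axes are rotated by 'angle' radians relative to ours
    static double[] toSpace(double px, double py, double ox, double oy, double angle) {
        double tx = px - ox;                       // translation to the new origin
        double ty = py - oy;
        double c = Math.cos(angle), s = Math.sin(angle);
        return new double[]{                       // rotation into the new axes
            c * tx + s * ty,
            -s * tx + c * ty
        };
    }

    public static void main(String[] args) {
        // the point (2, 1) seen from a space with origin (1, 1)
        // whose axes are rotated 90 degrees counterclockwise
        double[] p = toSpace(2, 1, 1, 1, Math.PI / 2);
        System.out.printf("(%.1f, %.1f)%n", p[0], p[1]);
        // prints (0.0, -1.0): the point lies on the new frame's negative y axis
    }
}
```

The same point gets different numbers in each space, but the transformation between the spaces is completely mechanical.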

If we are dealing with 3D coordinates we need an additional axis, the z axis. 3D coordinates will be formed by a set of three numbers (x, y, z).


As in 2D Cartesian coordinate spaces, we can change the orientation of the axes in 3D coordinate spaces as long as the axes are perpendicular. The next figure shows another 3D coordinate space.

3D coordinate spaces can be classified into two types: left handed and right handed. How do you know which type it is? Take your hand and form an "L" between your thumb and your index finger; the middle finger should point in a direction perpendicular to the other two. The thumb should point in the direction where the x axis increases, the index finger should point where the y axis increases, and the middle finger should point where the z axis increases. If you are able to do that with your left hand, then it's left handed; if you need to use your right hand, it's right handed.
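The hand rule has a numeric counterpart (our own addition, not from the book): the sign of the scalar triple product x · (y × z) of the three axis directions reveals the handedness. It is positive for a right-handed basis and negative for a left-handed one:

```java
public class HandednessDemo {

    // scalar triple product a . (b x c)
    static double triple(double[] a, double[] b, double[] c) {
        double cx = b[1] * c[2] - b[2] * c[1];
        double cy = b[2] * c[0] - b[0] * c[2];
        double cz = b[0] * c[1] - b[1] * c[0];
        return a[0] * cx + a[1] * cy + a[2] * cz;
    }

    public static void main(String[] args) {
        double[] x = {1, 0, 0};
        double[] y = {0, 1, 0};
        double[] zRight = {0, 0, 1};   // z towards the viewer
        double[] zLeft  = {0, 0, -1};  // z into the screen
        System.out.println(triple(x, y, zRight) > 0 ? "right handed" : "left handed");
        System.out.println(triple(x, y, zLeft) > 0 ? "right handed" : "left handed");
        // prints "right handed" then "left handed"
    }
}
```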


2D coordinate spaces are all equivalent, since by applying a rotation we can transform from one to another. 3D coordinate spaces, on the contrary, are not all equal. You can only transform from one to another by applying rotation if they both have the same handedness, that is, if both are left handed or both are right handed.

Now that we have defined some basic topics, let's talk about some commonly used terms when dealing with 3D graphics. When we explain in later chapters how to render 3D models we will see that we use different 3D coordinate spaces. That is because each of those coordinate spaces has a context, a purpose. A set of coordinates is meaningless unless it refers to something. When you examine these coordinates (40.438031, -3.676626) they may say something to you or not. But if I say that they are geographic coordinates (latitude and longitude) you will see that they are the coordinates of a place in Madrid.

When we load 3D objects we will get a set of 3D coordinates. Those coordinates are expressed in a 3D coordinate space which is called object coordinate space. When the graphics designers are creating those 3D models they don't know anything about the 3D scene that the model will be displayed in, so they can only define the coordinates using a coordinate space that is only relevant for the model.

When we are drawing a 3D scene, all of our 3D objects will be relative to the so-called world space coordinate space. We will need to transform from 3D object space to world space coordinates. Some objects will need to be rotated, stretched or enlarged and translated in order to be displayed properly in the 3D scene.

We will also need to restrict the range of the 3D space that is shown, which is like moving a camera through our 3D space. Then we will need to transform world space coordinates to camera or view space coordinates. Finally these coordinates need to be transformed to screen coordinates, which are 2D, so we need to project 3D view coordinates to a 2D screen coordinate space.


The following picture shows OpenGL coordinates (the z axis is perpendicular to the screen), where coordinates range between -1 and +1.

Don't worry if you don't have a clear understanding of all these concepts. They will be revisited in the next chapters with practical examples.


Rendering

In this chapter we will learn the processes that take place while rendering a scene using OpenGL. If you are used to older versions of OpenGL, that is, the fixed-function pipeline, you may end this chapter wondering why it needs to be so complex. You may end up thinking that drawing a simple shape to the screen should not require so many concepts and lines of code. Let me give some advice to those of you that think that way: it is actually simpler and much more flexible. You only need to give it a chance. Modern OpenGL lets you think about one problem at a time and it lets you organize your code and processes in a more logical way.

The sequence of steps that ends up drawing a 3D representation onto your 2D screen is called the graphics pipeline. The first versions of OpenGL employed a model called the fixed-function pipeline. This model employed a set of steps in the rendering process which defined a fixed set of operations. The programmer was constrained to the set of functions available for each step. Thus, the effects and operations that could be applied were limited by the API itself (for instance, "set fog" or "add light", but the implementation of those functions was fixed and could not be changed).

The graphics pipeline was composed of these steps:


OpenGL 2.0 introduced the concept of the programmable pipeline. In this model, the different steps that compose the graphics pipeline can be controlled or programmed by using a set of specific programs called shaders. The following picture depicts a simplified version of the OpenGL programmable pipeline:

The rendering starts taking as its input a list of vertices in the form of Vertex Buffers. But, what is a vertex? A vertex is a data structure that describes a point in 2D or 3D space. And how do you describe a point in 3D space? By specifying its x, y and z coordinates. And what is a Vertex Buffer? A Vertex Buffer is another data structure that packs all the vertices that need to be rendered, by using vertex arrays, and makes that information available to the shaders in the graphics pipeline.

Those vertices are processed by the vertex shader, whose main purpose is to calculate the projected position of each vertex into screen space. This shader can also generate other outputs related to colour or texture, but its main goal is to project the vertices onto the screen, that is, to generate dots.

The geometry processing stage connects the vertices that are transformed by the vertex shader to form triangles. It does so by taking into consideration the order in which the vertices were stored and grouping them using different models. Why triangles? A triangle is like the basic work unit for graphics cards. It's a simple geometric shape that can be combined and transformed to construct complex 3D scenes. This stage can also use a specific shader to group the vertices.


The rasterization stage takes the triangles generated in the previous stages, clips them and transforms them into pixel-sized fragments.

Those fragments are used during the fragment processing stage by the fragment shader to generate pixels, assigning them the final color that gets written into the framebuffer. The framebuffer is the final result of the graphics pipeline. It holds the value of each pixel that should be drawn to the screen.

Keep in mind that 3D cards are designed to parallelize all the operations described above. The input data can be processed in parallel in order to generate the final scene.

So let's start writing our first shader program. Shaders are written using the GLSL language (OpenGL Shading Language), which is based on ANSI C. First we will create a file named "vertex.vs" (the extension is for Vertex Shader) under the resources directory with the following content:

#version 330

layout (location=0) in vec3 position;

void main()
{
    gl_Position = vec4(position, 1.0);
}

The first line is a directive that states the version of the GLSL language we are using. The following table relates the GLSL version, the OpenGL version that matches it and the directive to use (Wikipedia: https://en.wikipedia.org/wiki/OpenGL_Shading_Language#Versions).


GLSL Version    OpenGL Version    Shader Preprocessor
1.10.59         2.0               #version 110
1.20.8          2.1               #version 120
1.30.10         3.0               #version 130
1.40.08         3.1               #version 140
1.50.11         3.2               #version 150
3.30.6          3.3               #version 330
4.00.9          4.0               #version 400
4.10.6          4.1               #version 410
4.20.11         4.2               #version 420
4.30.8          4.3               #version 430
4.40            4.4               #version 440
4.50            4.5               #version 450

The second line specifies the input format for this shader. Data in an OpenGL buffer can be whatever we want; that is, the language does not force you to pass a specific data structure with a predefined semantic. From the point of view of the shader it is expecting to receive a buffer with data. It can be a position, a position with some additional information, or whatever we want. The vertex shader just receives an array of floats. When we fill the buffer, we define the buffer chunks that are going to be processed by the shader.

So, first we need to get that chunk into something that's meaningful to us. In this case we are saying that, starting from position 0, we are expecting to receive a vector composed of 3 attributes (x, y, z).

The shader has a main block like any other C program, which in this case is very simple. It is just returning the received position in the output variable gl_Position without applying any transformation. You may now be wondering why the vector of three attributes has been converted into a vector of four attributes (vec4). This is because gl_Position is expecting the result in vec4 format since it uses homogeneous coordinates. That is, it's expecting something in the form (x, y, z, w), where w represents an extra dimension. Why add another dimension? In later chapters you will see that most of the operations we need to do are based on vectors and matrices. Some of those operations cannot be combined if we do not have that extra dimension. For instance, we could not combine rotation and translation operations. (If you want to learn more about this, the extra dimension allows us to combine affine and linear transformations. You can learn more about this by reading the excellent book "3D Math Primer for Graphics and Game Development", by Fletcher Dunn and Ian Parberry.)
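To see why the w component matters, here is a worked example of ours (not from the book): a translation cannot be expressed as a 3x3 matrix times (x, y, z), but with homogeneous coordinates it becomes an ordinary 4x4 matrix product, so it composes with rotations and scalings by plain matrix multiplication:

```java
public class HomogeneousDemo {

    // multiply a 4x4 matrix (row-major) by a 4-component vector
    static double[] mul(double[][] m, double[] v) {
        double[] r = new double[4];
        for (int i = 0; i < 4; i++) {
            for (int j = 0; j < 4; j++) {
                r[i] += m[i][j] * v[j];
            }
        }
        return r;
    }

    public static void main(String[] args) {
        // translation by (10, 0, 0) encoded in the last column
        double[][] translate = {
            {1, 0, 0, 10},
            {0, 1, 0, 0},
            {0, 0, 1, 0},
            {0, 0, 0, 1}
        };
        // the point (1, 2, 3) with w = 1 so the translation applies
        double[] p = mul(translate, new double[]{1, 2, 3, 1});
        System.out.printf("(%.0f, %.0f, %.0f)%n", p[0], p[1], p[2]);
        // prints (11, 2, 3)
    }
}
```

Note that a direction vector, with w = 0, would pass through the same matrix unchanged, which is exactly the behaviour we want for directions.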


Let us now have a look at our first fragment shader. We will create a file named "fragment.fs" (the extension is for Fragment Shader) under the resources directory with the following content:

#version 330

out vec4 fragColor;

void main()
{
    fragColor = vec4(0.0, 0.5, 0.5, 1.0);
}

The structure is quite similar to our vertex shader. In this case we will set a fixed colour for each fragment. The output variable is defined in the second line as a vec4 named fragColor. Now that we have our shaders created, how do we use them? This is the sequence of steps we need to follow:

1. Create an OpenGL program.
2. Load the vertex and fragment shader code files.
3. For each shader, create a new shader and specify its type (vertex, fragment).
4. Compile the shader.
5. Attach the shader to the program.
6. Link the program.

At the end the shader will be loaded in the graphics card and we can use it by referencing an identifier, the program identifier.

package org.lwjglb.engine.graph;

import static org.lwjgl.opengl.GL20.*;

public class ShaderProgram {

    private final int programId;

    private int vertexShaderId;

    private int fragmentShaderId;

    public ShaderProgram() throws Exception {
        programId = glCreateProgram();
        if (programId == 0) {
            throw new Exception("Could not create Shader");
        }
    }


    public void createVertexShader(String shaderCode) throws Exception {
        vertexShaderId = createShader(shaderCode, GL_VERTEX_SHADER);
    }

    public void createFragmentShader(String shaderCode) throws Exception {
        fragmentShaderId = createShader(shaderCode, GL_FRAGMENT_SHADER);
    }

    protected int createShader(String shaderCode, int shaderType) throws Exception {
        int shaderId = glCreateShader(shaderType);
        if (shaderId == 0) {
            throw new Exception("Error creating shader. Type: " + shaderType);
        }

        glShaderSource(shaderId, shaderCode);
        glCompileShader(shaderId);

        if (glGetShaderi(shaderId, GL_COMPILE_STATUS) == 0) {
            throw new Exception("Error compiling Shader code: " + glGetShaderInfoLog(shaderId, 1024));
        }

        glAttachShader(programId, shaderId);

        return shaderId;
    }

    public void link() throws Exception {
        glLinkProgram(programId);
        if (glGetProgrami(programId, GL_LINK_STATUS) == 0) {
            throw new Exception("Error linking Shader code: " + glGetProgramInfoLog(programId, 1024));
        }

        if (vertexShaderId != 0) {
            glDetachShader(programId, vertexShaderId);
        }
        if (fragmentShaderId != 0) {
            glDetachShader(programId, fragmentShaderId);
        }

        glValidateProgram(programId);
        if (glGetProgrami(programId, GL_VALIDATE_STATUS) == 0) {
            System.err.println("Warning validating Shader code: " + glGetProgramInfoLog(programId, 1024));
        }
    }

    public void bind() {
        glUseProgram(programId);
    }

Rendering

30

publicvoidunbind(){

glUseProgram(0);

}

publicvoidcleanup(){

unbind();

if(programId!=0){

glDeleteProgram(programId);

}

}

}

The constructor of the ShaderProgram class creates a new program in OpenGL and provides methods to add vertex and fragment shaders. Those shaders are compiled and attached to the OpenGL program. When all shaders are attached the link method should be invoked, which links all the code and verifies that everything has been done correctly.

Once the shader program has been linked, the compiled vertex and fragment shaders can be freed up (by calling glDetachShader).

Regarding verification, this is done through the glValidateProgram call. This method is used mainly for debugging purposes and should be removed when your game reaches production stage. It tries to validate whether the shader is correct given the current OpenGL state. This means that validation may fail in some cases even if the shader is correct, because the current state is not complete enough to run the shader (some data may not have been uploaded yet). So, instead of failing, we just print a warning message to the standard error output.

ShaderProgram also provides methods to activate this program for rendering (bind) and to stop using it (unbind). Finally, it provides a cleanup method to free all the resources when they are no longer needed.

Since we have a cleanup method, let us change our IGameLogic interface to add a cleanup method:

void cleanup();

This method will be invoked when the game loop finishes, so we need to modify the run method of the GameEngine class:


@Override
public void run() {
    try {
        init();
        gameLoop();
    } catch (Exception excp) {
        excp.printStackTrace();
    } finally {
        cleanup();
    }
}

Now we can use our shaders in order to display a triangle. We will do this in the init method of our Renderer class. First of all, we create the shader program:

public void init() throws Exception {
    shaderProgram = new ShaderProgram();
    shaderProgram.createVertexShader(Utils.loadResource("/vertex.vs"));
    shaderProgram.createFragmentShader(Utils.loadResource("/fragment.fs"));
    shaderProgram.link();
}

We have created a utility class which by now provides a method to retrieve the contents of a file from the classpath. This method is used to retrieve the contents of our shaders.

Now we can define our triangle as an array of floats. We create a single float array which will define the vertices of the triangle. As you can see there's no structure in that array. As it is right now, OpenGL cannot know the structure of that data. It's just a sequence of floats:

float[] vertices = new float[]{
     0.0f,  0.5f, 0.0f,
    -0.5f, -0.5f, 0.0f,
     0.5f, -0.5f, 0.0f
};

The following picture depicts the triangle in our coordinate system.


Now that we have our coordinates, we need to store them in our graphics card and tell OpenGL about the structure. We will now introduce two important concepts: Vertex Array Objects (VAOs) and Vertex Buffer Objects (VBOs). If you get lost in the next code fragments, remember that in the end what we are doing is sending the data that models the objects we want to draw to the graphics card memory. When we store it we get an identifier that serves us later to refer to it while drawing.

Let us first start with Vertex Buffer Objects (VBOs). A VBO is just a memory buffer stored in the graphics card memory that holds vertex data. This is where we will transfer our array of floats that models a triangle. As we said before, OpenGL does not know anything about our data structure. In fact it can hold not just coordinates but other information, such as texture coordinates, colour, etc. A Vertex Array Object (VAO) is an object that references one or more VBOs, which are usually called attribute lists. Each attribute list can hold one type of data: position, colour, texture coordinates, etc. You are free to store whatever you want in each slot.

A VAO is like a wrapper that groups a set of definitions for the data that is going to be stored in the graphics card. When we create a VAO we get an identifier. We use that identifier to render it and the elements it contains, using the definitions we specified during its creation.

So let us continue coding our example. The first thing we must do is store our array of floats in a FloatBuffer. This is mainly due to the fact that we must interface with the OpenGL library, which is C-based, so we must transform our array of floats into something that can be managed by the library.

FloatBuffer verticesBuffer = MemoryUtil.memAllocFloat(vertices.length);
verticesBuffer.put(vertices).flip();


We use the MemoryUtil class to create the buffer in off-heap memory so that it's accessible by the OpenGL library. After we have stored the data (with the put method) we need to reset the position of the buffer to position 0 with the flip method (that is, we say that we've finished writing to it). Remember that Java objects are allocated in a space called the heap, a large chunk of memory reserved in the JVM's process memory. Memory stored in the heap cannot be accessed by native code (JNI, the mechanism that allows calling native code from Java, does not allow that). The only way of sharing memory data between Java and native code is by directly allocating memory in Java.
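The put/flip sequence can be seen in isolation with a plain java.nio buffer, a standalone sketch that is independent of LWJGL's MemoryUtil but follows the same Buffer semantics:

```java
import java.nio.FloatBuffer;

public class FlipDemo {
    public static void main(String[] args) {
        float[] vertices = {0.0f, 0.5f, 0.0f};
        // Allocate a buffer sized to our data; position starts at 0
        FloatBuffer buf = FloatBuffer.allocate(vertices.length);
        buf.put(vertices);                   // position advances to 3
        System.out.println(buf.position());  // 3
        buf.flip();                          // limit = 3, position = 0: ready for reading
        System.out.println(buf.position());  // 0
        System.out.println(buf.remaining()); // 3 floats available to a consumer
    }
}
```

Without the flip call a consumer would start reading at position 3 and see no data, which is a classic source of "nothing is drawn" bugs.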

If you come from previous versions of LWJGL it's important to stress a few topics. You may have noticed that we do not use the utility class BufferUtils to create the buffers; instead we use the MemoryUtil class. This is because BufferUtils was not very efficient and has been maintained only for backwards compatibility. Instead, LWJGL 3 proposes two methods for buffer management:

- Auto-managed buffers, that is, buffers that are automatically collected by the Garbage Collector. These buffers are mainly used for short-lived operations, or for data that is transferred to the GPU and does not need to be present in the process memory. This is achieved by using the org.lwjgl.system.MemoryStack class.
- Manually managed buffers. In this case we need to carefully free them once we are finished. These buffers are intended for long-lived operations or for large amounts of data. This is achieved by using the MemoryUtil class.

You can consult the details here: https://blog.lwjgl.org/memory-management-in-lwjgl-3/.

In this case, our data is sent to the GPU, so we could use auto-managed buffers. But since, later on, we will use them to hold potentially large volumes of data, we will need to manually manage them. This is the reason why we are using the MemoryUtil class and, thus, why we are freeing the buffer in a finally block. In the next chapters we will learn how to use auto-managed buffers.

Now we need to create the VAO and bind it.

vaoId = glGenVertexArrays();
glBindVertexArray(vaoId);

Then we need to create the VBO, bind it and put the data into it.

vboId = glGenBuffers();
glBindBuffer(GL_ARRAY_BUFFER, vboId);
glBufferData(GL_ARRAY_BUFFER, verticesBuffer, GL_STATIC_DRAW);
memFree(verticesBuffer);


Now comes the most important part. We need to define the structure of our data and store it in one of the attribute lists of the VAO. This is done with the following line:

glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);

The parameters are:

- index: Specifies the location where the shader expects this data.
- size: Specifies the number of components per vertex attribute (from 1 to 4). In this case, we are passing 3D coordinates, so it should be 3.
- type: Specifies the type of each component in the array, in this case a float.
- normalized: Specifies whether the values should be normalized or not.
- stride: Specifies the byte offset between consecutive generic vertex attributes. (We will explain it later.)
- offset: Specifies the offset of the first component in the buffer.
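To give the stride and offset parameters some meaning in advance: they become relevant when several attributes share one VBO. As a hypothetical layout (not the one used here, which keeps stride and offset at 0), if position (3 floats) and colour (3 floats) were interleaved per vertex, the byte values would be:

```java
public class StrideDemo {
    public static void main(String[] args) {
        final int FLOAT_BYTES = 4;       // same as Float.BYTES
        int positionComponents = 3;      // x, y, z
        int colourComponents = 3;        // r, g, b
        // Each vertex record holds position then colour, back to back
        int stride = (positionComponents + colourComponents) * FLOAT_BYTES;
        int positionOffset = 0;                              // positions start the record
        int colourOffset = positionComponents * FLOAT_BYTES; // colours follow the position
        System.out.println(stride);       // 24
        System.out.println(colourOffset); // 12
        // Hypothetical calls for that layout:
        // glVertexAttribPointer(0, 3, GL_FLOAT, false, stride, positionOffset);
        // glVertexAttribPointer(1, 3, GL_FLOAT, false, stride, colourOffset);
    }
}
```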

After we are finished with our VBO we can unbind it and the VAO (bind them to 0):

// Unbind the VBO
glBindBuffer(GL_ARRAY_BUFFER, 0);

// Unbind the VAO
glBindVertexArray(0);

Once this has been completed we must free the off-heap memory that was allocated by the FloatBuffer. This is done by manually calling memFree, as Java garbage collection will not clean up off-heap allocations:

if (verticesBuffer != null) {
    MemoryUtil.memFree(verticesBuffer);
}

That's all the code that should be in our init method. Our data is already in the graphics card, ready to be used. We only need to modify our render method to use it in each render step during our game loop:


public void render(Window window) {
    clear();

    if (window.isResized()) {
        glViewport(0, 0, window.getWidth(), window.getHeight());
        window.setResized(false);
    }

    shaderProgram.bind();

    // Bind to the VAO
    glBindVertexArray(vaoId);
    glEnableVertexAttribArray(0);

    // Draw the vertices
    glDrawArrays(GL_TRIANGLES, 0, 3);

    // Restore state
    glDisableVertexAttribArray(0);
    glBindVertexArray(0);

    shaderProgram.unbind();
}

As you can see, we just clear the window, bind the shader program, bind the VAO, draw the vertices stored in the VBO associated to the VAO and restore the state. That's it.

We also added a cleanup method to our Renderer class which frees acquired resources:

public void cleanup() {
    if (shaderProgram != null) {
        shaderProgram.cleanup();
    }

    glDisableVertexAttribArray(0);

    // Delete the VBO
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDeleteBuffers(vboId);

    // Delete the VAO
    glBindVertexArray(0);
    glDeleteVertexArrays(vaoId);
}

And that's all! If you followed the steps carefully you will see something like this.


Our first triangle! You may think that this will not make it into the top ten games list, and you would be totally right. You may also think that this has been too much work for drawing a boring triangle. But keep in mind that we are introducing key concepts and preparing the base infrastructure to do more complex things. Please be patient and continue reading.


More on Rendering

In this chapter we will continue talking about how OpenGL renders things. In order to tidy up our code a little bit, let's create a new class called Mesh which, taking as an input an array of positions, creates the VBO and VAO objects needed to load that model into the graphics card.

package org.lwjglb.engine.graph;

import java.nio.FloatBuffer;

import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL20.*;
import static org.lwjgl.opengl.GL30.*;

import org.lwjgl.system.MemoryUtil;

public class Mesh {

    private final int vaoId;

    private final int vboId;

    private final int vertexCount;

    public Mesh(float[] positions) {
        FloatBuffer verticesBuffer = null;
        try {
            verticesBuffer = MemoryUtil.memAllocFloat(positions.length);
            vertexCount = positions.length / 3;
            verticesBuffer.put(positions).flip();

            vaoId = glGenVertexArrays();
            glBindVertexArray(vaoId);

            vboId = glGenBuffers();
            glBindBuffer(GL_ARRAY_BUFFER, vboId);
            glBufferData(GL_ARRAY_BUFFER, verticesBuffer, GL_STATIC_DRAW);
            glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);

            glBindBuffer(GL_ARRAY_BUFFER, 0);
            glBindVertexArray(0);
        } finally {
            if (verticesBuffer != null) {
                MemoryUtil.memFree(verticesBuffer);
            }
        }
    }

    public int getVaoId() {
        return vaoId;
    }

    public int getVertexCount() {
        return vertexCount;
    }

    public void cleanUp() {
        glDisableVertexAttribArray(0);

        // Delete the VBO
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        glDeleteBuffers(vboId);

        // Delete the VAO
        glBindVertexArray(0);
        glDeleteVertexArrays(vaoId);
    }
}

We will create our Mesh instance in our DummyGame class, removing the VAO and VBO code from the Renderer init method. Our render method in the Renderer class will now also accept a Mesh instance to render. The cleanup method will also be simplified, since the Mesh class already provides one for freeing VAO and VBO resources.


public void render(Mesh mesh) {
    clear();

    if (window.isResized()) {
        glViewport(0, 0, window.getWidth(), window.getHeight());
        window.setResized(false);
    }

    shaderProgram.bind();

    // Draw the mesh
    glBindVertexArray(mesh.getVaoId());
    glEnableVertexAttribArray(0);
    glDrawArrays(GL_TRIANGLES, 0, mesh.getVertexCount());

    // Restore state
    glDisableVertexAttribArray(0);
    glBindVertexArray(0);

    shaderProgram.unbind();
}

public void cleanup() {
    if (shaderProgram != null) {
        shaderProgram.cleanup();
    }
}

One important thing to note is this line:

glDrawArrays(GL_TRIANGLES, 0, mesh.getVertexCount());

Our Mesh counts the number of vertices by dividing the length of the position array by 3 (since we are passing X, Y and Z coordinates). Now that we can render more complex shapes, let us try to render a quad. A quad can be constructed by using two triangles, as shown in the next figure.


As you can see, each of the two triangles is composed of three vertices. The first one is formed by the vertices V1, V2 and V4 (the orange one) and the second one is formed by the vertices V4, V2 and V3 (the green one). Vertices are specified in counter-clockwise order, so the float array to be passed will be [V1, V2, V4, V4, V2, V3]. Thus, the init method in our DummyGame class will be:

@Override
public void init() throws Exception {
    renderer.init();
    float[] positions = new float[]{
        -0.5f,  0.5f, 0.0f,
        -0.5f, -0.5f, 0.0f,
         0.5f,  0.5f, 0.0f,
         0.5f,  0.5f, 0.0f,
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
    };
    mesh = new Mesh(positions);
}
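As a side check, the counter-clockwise winding mentioned above can be verified numerically: the z component of the cross product of the first triangle's edges is positive for CCW order. This is a standalone sketch, not part of the engine code:

```java
public class WindingCheck {
    public static void main(String[] args) {
        // First triangle of the quad: V1, V2, V4 (x and y only; z is constant)
        float[] v1 = {-0.5f,  0.5f};
        float[] v2 = {-0.5f, -0.5f};
        float[] v4 = { 0.5f,  0.5f};
        // z component of (v2 - v1) x (v4 - v1)
        float crossZ = (v2[0] - v1[0]) * (v4[1] - v1[1])
                     - (v2[1] - v1[1]) * (v4[0] - v1[0]);
        // Positive means the vertices are listed in counter-clockwise order
        System.out.println(crossZ > 0); // true
    }
}
```

Winding order matters later on, because OpenGL can be told to cull faces based on it.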

Now you should see a quad rendered like this:


Are we done yet? Unfortunately not. The code above still presents some issues. We are repeating coordinates to represent the quad. We are passing the V2 and V4 coordinates twice. With this small shape it may not seem a big deal, but imagine a much more complex 3D model: we would be repeating the coordinates many times. Keep in mind also that now we are just using three floats to represent the position of a vertex. But later on we will need more data to represent the texture, etc. Also take into consideration that in more complex shapes the number of vertices shared between triangles can be even higher, as in the figure below (where a vertex can be shared between six triangles).

In the end we would need much more memory because of that duplicated information, and this is where Index Buffers come to the rescue. For drawing the quad we only need to specify each vertex once this way: V1, V2, V3, V4. Each vertex has a position in the array. V1 has position 0, V2 has position 1, etc.:

V1  V2  V3  V4
 0   1   2   3

Then we specify the order in which those vertices should be drawn by referring to their positions:

 0   1   3   3   1   2
V1  V2  V4  V4  V2  V3
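To put numbers on the savings, here is a quick count for the quad (a standalone sketch; note that the per-vertex size will grow once colours, texture coordinates and normals are added, which is when indexing really pays off):

```java
public class IndexSavings {
    public static void main(String[] args) {
        int floatsPerVertex = 3; // just x, y, z for now
        int bytesPerFloat = 4;
        int bytesPerInt = 4;

        // Without indices: 6 vertices (V2 and V4 duplicated)
        int unindexed = 6 * floatsPerVertex * bytesPerFloat;

        // With indices: 4 unique vertices plus 6 integer indices
        int indexed = 4 * floatsPerVertex * bytesPerFloat + 6 * bytesPerInt;

        System.out.println(unindexed); // 72 bytes
        System.out.println(indexed);   // 72 bytes: a wash for this tiny shape,
                                       // but a clear win as vertices get bigger
    }
}
```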

So we need to modify our Mesh class to accept another parameter, an array of indices, and now the number of vertices to draw will be the length of that indices array.


public Mesh(float[] positions, int[] indices) {
    vertexCount = indices.length;

After we have created the VBO that stores the positions, we need to create another VBO which will hold the indices. So we rename the identifier that holds the positions VBO (posVboId) and create a new one for the index VBO (idxVboId). The process of creating that VBO is similar, but the type is now GL_ELEMENT_ARRAY_BUFFER.

idxVboId = glGenBuffers();
indicesBuffer = MemoryUtil.memAllocInt(indices.length);
indicesBuffer.put(indices).flip();
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, idxVboId);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indicesBuffer, GL_STATIC_DRAW);
memFree(indicesBuffer);

Since we are dealing with integers we need to create an IntBuffer instead of a FloatBuffer.

And that's it. The VAO will now contain two VBOs: one for positions and another one that holds the indices and will be used for rendering. The cleanup method in our Mesh class must take into consideration that there is another VBO to free.

public void cleanUp() {
    glDisableVertexAttribArray(0);

    // Delete the VBOs
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDeleteBuffers(posVboId);
    glDeleteBuffers(idxVboId);

    // Delete the VAO
    glBindVertexArray(0);
    glDeleteVertexArrays(vaoId);
}

Finally, we need to change our drawing call that used the glDrawArrays method:

glDrawArrays(GL_TRIANGLES, 0, mesh.getVertexCount());

to another call that uses the glDrawElements method:

glDrawElements(GL_TRIANGLES, mesh.getVertexCount(), GL_UNSIGNED_INT, 0);

The parameters of that method are:


- mode: Specifies the primitives for rendering, triangles in this case. No changes here.
- count: Specifies the number of elements to be rendered.
- type: Specifies the type of value in the indices data. In this case we are using integers.
- indices: Specifies the offset to apply to the indices data to start rendering.

And now we can use our newer and much more efficient method of drawing complex models by just specifying the indices:

public void init() throws Exception {
    renderer.init();
    float[] positions = new float[]{
        -0.5f,  0.5f, 0.0f,
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.5f,  0.5f, 0.0f,
    };
    int[] indices = new int[]{
        0, 1, 3, 3, 1, 2,
    };
    mesh = new Mesh(positions, indices);
}

Now let's add some colour to our example. We will pass another array of floats to our Mesh class which holds the colour for each coordinate in the quad:

public Mesh(float[] positions, float[] colours, int[] indices) {

With that array, we will create another VBO which will be associated to our VAO:

// Colour VBO
colourVboId = glGenBuffers();
FloatBuffer colourBuffer = memAllocFloat(colours.length);
colourBuffer.put(colours).flip();
glBindBuffer(GL_ARRAY_BUFFER, colourVboId);
glBufferData(GL_ARRAY_BUFFER, colourBuffer, GL_STATIC_DRAW);
memFree(colourBuffer);
glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, 0);

Please notice that in the glVertexAttribPointer call the first parameter is now a "1". This is the location where our shader will be expecting that data. (Of course, since we have another VBO, we need to free it in the cleanup method.)

The next step is to modify the shaders. The vertex shader now expects two parameters: the coordinates (in location 0) and the colour (in location 1). The vertex shader will just output the received colour so it can be processed by the fragment shader:


#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 inColour;

out vec3 exColour;

void main()
{
    gl_Position = vec4(position, 1.0);
    exColour = inColour;
}

And now our fragment shader receives as input the colour processed by the vertex shader and uses it to generate the fragment colour:

#version 330

in vec3 exColour;
out vec4 fragColor;

void main()
{
    fragColor = vec4(exColour, 1.0);
}

The last important thing to do is to modify our rendering code to use that second array of data:

public void render(Window window, Mesh mesh) {
    clear();

    if (window.isResized()) {
        glViewport(0, 0, window.getWidth(), window.getHeight());
        window.setResized(false);
    }

    shaderProgram.bind();

    // Draw the mesh
    glBindVertexArray(mesh.getVaoId());
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glDrawElements(GL_TRIANGLES, mesh.getVertexCount(), GL_UNSIGNED_INT, 0);

    // ...


You can see that we need to enable the VAO attribute at position 1 to be used during rendering. We can now pass an array of colours like this to our Mesh class in order to add some colour to our quad:

float[] colours = new float[]{
    0.5f, 0.0f, 0.0f,
    0.0f, 0.5f, 0.0f,
    0.0f, 0.0f, 0.5f,
    0.0f, 0.5f, 0.5f,
};

And we will get a fancy coloured quad like this.


Transformations

Projecting

Let's get back to our nice coloured quad we created in the previous chapter. If you look carefully at it, it resembles a rectangle. You can even change the width of the window from 600 pixels to 900 and the distortion will be more evident. What's happening here?

If you revisit our vertex shader code, we are just passing our coordinates directly. That is, when we say that a vertex has a value of 0.5 for the x coordinate, we are telling OpenGL to draw it at x position 0.5 on our screen. The following figure shows the OpenGL coordinates (just for the x and y axes).

Those coordinates are mapped, considering our window size, to window coordinates (which have the origin at the top-left corner of the previous figure). So, if our window has a size of 900x480, OpenGL coordinates (1, 0) will be mapped to coordinates (900, 0), creating a rectangle instead of a quad.
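Written out, the mapping from OpenGL's normalized coordinates $(x_{ndc}, y_{ndc}) \in [-1, 1]$ to window coordinates with a top-left origin is the standard viewport transform (stated here for reference, it is not part of the book's code):

$$x_w = \frac{x_{ndc} + 1}{2}\,\mathit{width}, \qquad y_w = \frac{1 - y_{ndc}}{2}\,\mathit{height}$$

so $x_{ndc} = 1$ lands at $x_w = 900$ in a 900-pixel-wide window, regardless of the window's height.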


But the problem is more serious than that. Modify the z coordinate of our quad from 0.0 to 1.0 and to -1.0. What do you see? The quad is drawn in exactly the same place, no matter whether it's displaced along the z axis. Why is this happening? Objects that are further away should be drawn smaller than objects that are closer. But we are drawing them with the same x and y coordinates.

But wait. Should this not be handled by the z coordinate? The answer is yes and no. The z coordinate tells OpenGL that an object is closer or farther away, but OpenGL does not know anything about the size of your object. You could have two objects of different sizes, one closer and smaller and one bigger and farther away, that could be correctly projected onto the screen with the same size (those would have the same x and y coordinates but a different z). OpenGL just uses the coordinates we are passing, so we must take care of this. We need to correctly project our coordinates.

Now that we have diagnosed the problem, how do we fix it? The answer is using a projection matrix or frustum. The projection matrix will take care of the aspect ratio (the relation between width and height) of our drawing area so objects won't be distorted. It will also handle distance, so objects far away from us will be drawn smaller. The projection matrix will also consider our field of view and the maximum distance that should be displayed.

For those not familiar with matrices, a matrix is a two-dimensional array of numbers arranged in columns and rows. Each number inside a matrix is called an element. The order of a matrix is its number of rows and columns. For instance, here you can see a 2x2 matrix (2 rows and 2 columns).
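The original figure is lost in this copy; a generic 2x2 matrix, with elements indexed by row and column, looks like this:

$$\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$$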

Matrices have a number of basic operations that can be applied to them (such as addition, multiplication, etc.) that you can consult in any maths book. The main characteristic of matrices, related to 3D graphics, is that they are very useful for transforming points in space.

You can think about the projection matrix as a camera, which has a field of view and a minimum and maximum distance. The vision area of that camera will be a truncated pyramid. The following picture shows a top view of that area.


A projection matrix will correctly map 3D coordinates so they can be correctly represented on a 2D screen. The mathematical representation of that matrix is as follows (don't be scared).
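The matrix image does not survive in this copy; the perspective projection matrix in its standard OpenGL form, which is what JOML's perspective method builds, is reconstructed here (with $a$ the aspect ratio, $\mathit{fov}$ the vertical field of view, and $z_n$, $z_f$ the near and far distances; check against the JOML documentation if in doubt):

$$
\begin{bmatrix}
\dfrac{1}{a\,\tan(\mathit{fov}/2)} & 0 & 0 & 0 \\
0 & \dfrac{1}{\tan(\mathit{fov}/2)} & 0 & 0 \\
0 & 0 & \dfrac{z_f + z_n}{z_n - z_f} & \dfrac{2\,z_f\,z_n}{z_n - z_f} \\
0 & 0 & -1 & 0
\end{bmatrix}
$$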

Where the aspect ratio is the relation between our screen width and our screen height (a = width/height). In order to obtain the projected coordinates of a given point we just need to multiply the projection matrix by the original coordinates. The result will be another vector that will contain the projected version.

So we need to handle a set of mathematical entities such as vectors and matrices, including the operations that can be performed on them. We could choose to write all that code by ourselves from scratch or use an existing library. We will choose the easy path and use a specific library for dealing with math operations in LWJGL, called JOML (Java OpenGL Math Library). In order to use that library we just need to add another dependency to our pom.xml file.

<dependency>
    <groupId>org.joml</groupId>
    <artifactId>joml</artifactId>
    <version>${joml.version}</version>
</dependency>

And define the version of the library to use.


<properties>
    [...]
    <joml.version>1.7.1</joml.version>
    [...]
</properties>

Now that everything has been set up, let's define our projection matrix. We will create an instance of the class Matrix4f (provided by the JOML library) in our Renderer class. The Matrix4f class provides a method to set up a projection matrix named perspective. This method needs the following parameters:

- Field of View: the field of view angle in radians. We will define a constant that holds that value.
- Aspect ratio.
- Distance to the near plane (z-near).
- Distance to the far plane (z-far).

We will instantiate that matrix in our init method, so we need to pass a reference to our Window instance to get its size (you can see it in the source code). The new constants and variables are:

/**
 * Field of View in Radians
 */
private static final float FOV = (float) Math.toRadians(60.0f);

private static final float Z_NEAR = 0.01f;

private static final float Z_FAR = 1000.f;

private Matrix4f projectionMatrix;

The projection matrix is created as follows:

float aspectRatio = (float) window.getWidth() / window.getHeight();
projectionMatrix = new Matrix4f().perspective(FOV, aspectRatio,
    Z_NEAR, Z_FAR);

At this moment we will ignore that the aspect ratio can change (by resizing our window). This could be checked in the render method to change our projection matrix accordingly.

Now that we have our matrix, how do we use it? We need to use it in our shader, and it should be applied to all the vertices. At first, you could think of bundling it in the vertex input (like the coordinates and the colours). In this case we would be wasting lots of space, since the projection matrix does not change even between several render calls. You may also think of multiplying the vertices by the matrix in the Java code. But then our VBOs would be useless and we would not be using the processing power available in the graphics card.

The answer is to use "uniforms". Uniforms are global GLSL variables that shaders can use and that we will employ to communicate with them.

So we need to modify our vertex shader code, declare a new uniform called projectionMatrix and use it to calculate the projected position.

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 inColour;

out vec3 exColour;

uniform mat4 projectionMatrix;

void main()
{
    gl_Position = projectionMatrix * vec4(position, 1.0);
    exColour = inColour;
}

As you can see, we define our projectionMatrix as a 4x4 matrix and the position is obtained by multiplying it by our original coordinates. Now we need to pass the values of the projection matrix to our shader. First, we need to get a reference to the place where the uniform will hold its values.

This is done with the method glGetUniformLocation, which receives two parameters:

- The shader program identifier.
- The name of the uniform (it should match the one defined in the shader code).

This method returns an identifier holding the uniform location. Since we may have more than one uniform, we will store those locations in a Map indexed by the uniform's name (we will need that location number later). So in the ShaderProgram class we create a new variable that holds those identifiers:

private final Map<String, Integer> uniforms;

This variable will be initialized in our constructor:

uniforms = new HashMap<>();


And finally we create a method to set up new uniforms and store the obtained location:

public void createUniform(String uniformName) throws Exception {
    int uniformLocation = glGetUniformLocation(programId, uniformName);
    if (uniformLocation < 0) {
        throw new Exception("Could not find uniform:" + uniformName);
    }
    uniforms.put(uniformName, uniformLocation);
}

Now, in our Renderer class, we can invoke the createUniform method once the shader program has been compiled (in this case, we will do it once the projection matrix has been instantiated).

shaderProgram.createUniform("projectionMatrix");

At this moment, we already have a holder ready to be set up with data to be used as our projection matrix. Since the projection matrix won't change between rendering calls, we could set up its values right after the creation of the uniform, but we will do it in our render method. You will see later that we may reuse that uniform to do additional operations that need to be done in each render call.

We will create another method in our ShaderProgram class to set up the data, named setUniform. Basically we dump our 4x4 matrix into a FloatBuffer by using the utility methods provided by the JOML library and send it to the location we stored in our locations map.

public void setUniform(String uniformName, Matrix4f value) {
    // Dump the matrix into a float buffer
    try (MemoryStack stack = MemoryStack.stackPush()) {
        FloatBuffer fb = stack.mallocFloat(16);
        value.get(fb);
        glUniformMatrix4fv(uniforms.get(uniformName), false, fb);
    }
}

As you can see, we are creating buffers in a different way here. We are using auto-managed buffers and allocating them on the stack. This is because the size of this buffer is small and it will not be used beyond this method. Thus, we use the MemoryStack class.

Now we can use that method in the Renderer class, in the render method, after the shader program has been bound:


shaderProgram.setUniform("projectionMatrix", projectionMatrix);

We are almost done. We can now show the quad correctly rendered. So you can now launch your program and will obtain a... black background without any coloured quad. What's happening? Did we break something? Well, actually no. Remember that we are now simulating the effect of a camera looking at our scene. And we provided two distances: one to the farthest plane (equal to 1000f) and one to the closest plane (equal to 0.01f). Our coordinates are:

float[] positions = new float[]{
    -0.5f,  0.5f, 0.0f,
    -0.5f, -0.5f, 0.0f,
     0.5f, -0.5f, 0.0f,
     0.5f,  0.5f, 0.0f,
};

That is, our z coordinates are outside the visible zone. Let's assign them a value of -0.05f. Now you will see a giant green square like this:

What is happening now is that we are drawing the quad too close to our camera. We are actually zooming into it. If we now assign a value of -1.05f to the z coordinate we can see our coloured quad.


If we continue pushing the quad backwards we will see it becoming smaller. Notice also that our quad no longer resembles a rectangle.

Applying Transformations

Let's recall what we've done so far. We have learned how to pass data in an efficient format to our graphics card, and how to project that data and assign it colours using vertex and fragment shaders. Now we should start drawing more complex models in our 3D space. But in order to do that we must be able to load an arbitrary model and represent it in our 3D space at a specific position, with the appropriate size and the required rotation.

So right now, in order to do that representation, we need to provide some basic operations to act upon any model:

- Translation: Move an object by some amount on any of the three axes.
- Rotation: Rotate an object by some amount of degrees around any of the three axes.
- Scale: Adjust the size of an object.

The operations described above are known as transformations. And you probably may be guessing that the way we are going to achieve that is by multiplying our coordinates by a set of matrices (one for translation, one for rotation and one for scaling). Those three matrices will be combined into a single matrix called the world matrix and passed as a uniform to our vertex shader.

The reason why it is called the world matrix is because we are transforming from model coordinates to world coordinates. When you learn about loading 3D models you will see that those models are defined in their own coordinate systems. They don't know the size of your 3D space and they need to be placed in it. So when we multiply our coordinates by our matrix what we are doing is transforming from one coordinate system (the model one) to another coordinate system (the one for our 3D world).

That world matrix will be calculated like this (the order is important since multiplication of matrices is not commutative):


World Matrix = [Translation Matrix] × [Rotation Matrix] × [Scale Matrix]

If we include our projection matrix in the transformation matrix the result would be:

Transf = [Proj Matrix] × [Translation Matrix] × [Rotation Matrix] × [Scale Matrix] = [Proj Matrix] × [World Matrix]

The translation matrix is defined like this:

⎡ 1  0  0  dx ⎤
⎢ 0  1  0  dy ⎥
⎢ 0  0  1  dz ⎥
⎣ 0  0  0  1  ⎦

Translation matrix parameters:

- dx: Displacement along the x axis.
- dy: Displacement along the y axis.
- dz: Displacement along the z axis.

The scale matrix is defined like this:

⎡ sx  0   0   0 ⎤
⎢ 0   sy  0   0 ⎥
⎢ 0   0   sz  0 ⎥
⎣ 0   0   0   1 ⎦

Scale matrix parameters:

- sx: Scaling along the x axis.
- sy: Scaling along the y axis.
- sz: Scaling along the z axis.

The rotation matrix is much more complex. But keep in mind that it can be constructed as the product of three rotation matrices, one per axis.

Now, in order to apply those concepts we need to refactor our code a little bit. In our game we will be loading a set of models which can be used to render many objects at different positions according to our game logic (imagine an FPS game which loads three models for different enemies; there are only three models, but using them we can draw as many enemies as we want). Do we need to create a VAO and a set of VBOs for each of those objects? The answer is no. We only need to load each model once. What we need to do is draw it independently according to its position, size and rotation. We need to transform those models when we are rendering them.

So we will create a new class named GameItem that will hold a reference to a model, to a Mesh instance. A GameItem instance will have variables for storing its position, its rotation state and its scale. This is the definition of that class.

package org.lwjglb.engine;

import org.joml.Vector3f;
import org.lwjglb.engine.graph.Mesh;

public class GameItem {

    private final Mesh mesh;

    private final Vector3f position;

    private float scale;

    private final Vector3f rotation;

    public GameItem(Mesh mesh) {
        this.mesh = mesh;
        position = new Vector3f(0, 0, 0);
        scale = 1;
        rotation = new Vector3f(0, 0, 0);
    }

    public Vector3f getPosition() {
        return position;
    }

    public void setPosition(float x, float y, float z) {
        this.position.x = x;
        this.position.y = y;
        this.position.z = z;
    }

    public float getScale() {
        return scale;
    }

    public void setScale(float scale) {
        this.scale = scale;
    }

    public Vector3f getRotation() {
        return rotation;
    }

    public void setRotation(float x, float y, float z) {
        this.rotation.x = x;
        this.rotation.y = y;
        this.rotation.z = z;
    }

    public Mesh getMesh() {
        return mesh;
    }
}

We will create another class which will deal with transformations, named Transformation.

package org.lwjglb.engine.graph;

import org.joml.Matrix4f;
import org.joml.Vector3f;

public class Transformation {

    private final Matrix4f projectionMatrix;

    private final Matrix4f worldMatrix;

    public Transformation() {
        worldMatrix = new Matrix4f();
        projectionMatrix = new Matrix4f();
    }

    public final Matrix4f getProjectionMatrix(float fov, float width, float height, float zNear, float zFar) {
        float aspectRatio = width / height;
        projectionMatrix.identity();
        projectionMatrix.perspective(fov, aspectRatio, zNear, zFar);
        return projectionMatrix;
    }

    public Matrix4f getWorldMatrix(Vector3f offset, Vector3f rotation, float scale) {
        worldMatrix.identity().translate(offset).
            rotateX((float) Math.toRadians(rotation.x)).
            rotateY((float) Math.toRadians(rotation.y)).
            rotateZ((float) Math.toRadians(rotation.z)).
            scale(scale);
        return worldMatrix;
    }
}

As you can see, this class groups the projection and world matrices. Given a set of vectors that model the displacement, rotation and scale, it returns the world matrix. The method getWorldMatrix returns the matrix that will be used to transform the coordinates for each GameItem instance. The class also provides a method that gets the projection matrix based on the field of view, the aspect ratio and the near and far distances.

An important thing to notice is that the mul method of the Matrix4f class modifies the matrix instance on which the method is applied. So if we directly multiplied the projection matrix by the transformation matrix we would modify the projection matrix itself. This is why we are always initializing each matrix to the identity matrix upon each call.
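The accumulation problem described above can be reproduced without JOML at all. The sketch below uses a plain float array as a row-major 4x4 matrix; the class and helper names are made up for illustration and are not part of the engine code:

```java
public class InPlaceMatrixDemo {

    // Multiply two row-major 4x4 matrices: returns a * b
    public static float[] mul(float[] a, float[] b) {
        float[] out = new float[16];
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++)
                for (int k = 0; k < 4; k++)
                    out[r * 4 + c] += a[r * 4 + k] * b[k * 4 + c];
        return out;
    }

    public static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1;
        return m;
    }

    public static float[] translation(float x, float y, float z) {
        float[] m = identity();
        m[3] = x; m[7] = y; m[11] = z; // row-major: translation sits in the last column
        return m;
    }

    public static void main(String[] args) {
        // Reusing the same matrix across two "frames" without resetting it accumulates:
        float[] world = identity();
        world = mul(world, translation(1, 0, 0)); // frame 1
        world = mul(world, translation(1, 0, 0)); // frame 2, no reset
        System.out.println("without reset, x offset = " + world[3]); // 2.0

        // Resetting to identity each call keeps the result correct per frame:
        world = identity();
        world = mul(world, translation(1, 0, 0));
        System.out.println("with reset, x offset = " + world[3]); // 1.0
    }
}
```

This is exactly why getProjectionMatrix and getWorldMatrix call identity() before composing anything.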

In the Renderer class, in the constructor, we just instantiate the Transformation with no arguments, and in the init method we just create the uniforms.

public Renderer() {
    transformation = new Transformation();
}

public void init(Window window) throws Exception {
    // .... Some code before ...

    // Create uniforms for world and projection matrices
    shaderProgram.createUniform("projectionMatrix");
    shaderProgram.createUniform("worldMatrix");

    window.setClearColor(0.0f, 0.0f, 0.0f, 0.0f);
}

In the render method of our Renderer class we now receive an array of GameItems:

public void render(Window window, GameItem[] gameItems) {
    clear();

    if (window.isResized()) {
        glViewport(0, 0, window.getWidth(), window.getHeight());
        window.setResized(false);
    }

    shaderProgram.bind();

    // Update projection matrix
    Matrix4f projectionMatrix = transformation.getProjectionMatrix(FOV, window.getWidth(),
        window.getHeight(), Z_NEAR, Z_FAR);
    shaderProgram.setUniform("projectionMatrix", projectionMatrix);

    // Render each gameItem
    for (GameItem gameItem : gameItems) {
        // Set world matrix for this item
        Matrix4f worldMatrix =
            transformation.getWorldMatrix(
                gameItem.getPosition(),
                gameItem.getRotation(),
                gameItem.getScale());
        shaderProgram.setUniform("worldMatrix", worldMatrix);
        // Render the mesh for this game item
        gameItem.getMesh().render();
    }

    shaderProgram.unbind();
}


We update the projection matrix once per render call. By doing it this way we can deal with window resize operations. Then we iterate over the GameItem array and create a transformation matrix according to the position, rotation and scale of each of them. This matrix is pushed to the shader and the Mesh is drawn. The projection matrix is the same for all the items to be rendered. This is the reason why it's a separate variable in our Transformation class.

We moved the rendering code that draws a Mesh to that class:

public void render() {
    // Draw the mesh
    glBindVertexArray(getVaoId());
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);

    glDrawElements(GL_TRIANGLES, getVertexCount(), GL_UNSIGNED_INT, 0);

    // Restore state
    glDisableVertexAttribArray(0);
    glDisableVertexAttribArray(1);
    glBindVertexArray(0);
}

Our vertex shader is modified by simply adding a new worldMatrix matrix, which is used along with the projectionMatrix to calculate the position:

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 inColour;

out vec3 exColour;

uniform mat4 worldMatrix;
uniform mat4 projectionMatrix;

void main()
{
    gl_Position = projectionMatrix * worldMatrix * vec4(position, 1.0);
    exColour = inColour;
}

As you can see the code is almost exactly the same. We are using the uniforms to correctly project our coordinates, taking into consideration our frustum, position, scale and rotation information.


Another important thing to think about is: why don't we pass the translation, rotation and scale matrices instead of combining them into a world matrix? The reason is that we should try to limit the matrices we use in our shaders. Also keep in mind that the matrix multiplication that we do in our shader is done once per vertex. The projection matrix does not change between render calls and the world matrix does not change per GameItem instance. If we passed the translation, rotation and scale matrices independently we would be doing many more matrix multiplications. Think about a model with tons of vertices. That's a lot of extra operations.
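A rough back-of-envelope count makes the point above concrete. The sketch below is only a cost model, not engine code: it counts the 16 multiplications a mat4-vec4 product needs per matrix applied in the shader, evaluated right to left:

```java
public class ShaderCostEstimate {

    // One 4x4 matrix times a 4-component vector needs 16 multiplications.
    public static final int MULS_PER_MAT_VEC = 16;

    // Multiplications per vertex given how many matrices the shader applies
    // one after another (evaluating the product right to left).
    public static long mulsPerVertex(int matrixCount) {
        return (long) matrixCount * MULS_PER_MAT_VEC;
    }

    public static void main(String[] args) {
        long vertices = 100_000; // e.g. a reasonably detailed model
        // projection * world (world pre-combined on the CPU): 2 products per vertex
        long combined = vertices * mulsPerVertex(2);
        // projection * translation * rotation * scale: 4 products per vertex
        long separate = vertices * mulsPerVertex(4);
        System.out.println("combined matrices: " + combined + " multiplications");
        System.out.println("separate matrices: " + separate + " multiplications");
    }
}
```

Combining translation, rotation and scale once on the CPU halves the per-vertex work in this simple model.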

But you may now think: if the world matrix does not change per GameItem instance, why didn't we do the matrix multiplication in our Java class? We would multiply the projection matrix and the world matrix just once per GameItem and send it as a single uniform. In this case we would be saving many more operations. The answer is that this is a valid point right now, but when we add more features to our game engine we will need to operate with world coordinates in the shaders anyway, so it's better to handle those two matrices in an independent way.

Finally we only need to change the DummyGame class to create an instance of GameItem with its associated Mesh, and add some logic to translate, rotate and scale our quad. Since it's only a test example and does not add too much, you can find it in the source code that accompanies this book.


Textures

Create a 3D cube

In this chapter we will learn how to load textures and use them in the rendering process. In order to show all the concepts related to textures we will transform the quad that we have been using in previous chapters into a 3D cube. With the code base we have created, in order to draw a cube we just need to correctly define its coordinates and it should be drawn correctly.

In order to draw a cube we just need to define eight vertices.

So the associated coordinates array will be like this:


float[] positions = new float[] {
    // V0
    -0.5f,  0.5f,  0.5f,
    // V1
    -0.5f, -0.5f,  0.5f,
    // V2
     0.5f, -0.5f,  0.5f,
    // V3
     0.5f,  0.5f,  0.5f,
    // V4
    -0.5f,  0.5f, -0.5f,
    // V5
     0.5f,  0.5f, -0.5f,
    // V6
    -0.5f, -0.5f, -0.5f,
    // V7
     0.5f, -0.5f, -0.5f,
};

Of course, since we have four more vertices we need to update the array of colours. Just repeat the first four items for now.

float[] colours = new float[]{
    0.5f, 0.0f, 0.0f,
    0.0f, 0.5f, 0.0f,
    0.0f, 0.0f, 0.5f,
    0.0f, 0.5f, 0.5f,
    0.5f, 0.0f, 0.0f,
    0.0f, 0.5f, 0.0f,
    0.0f, 0.0f, 0.5f,
    0.0f, 0.5f, 0.5f,
};

Finally, since a cube is made of six faces we need to draw twelve triangles (two per face), so we need to update the indices array. Remember that triangles must be defined in counter-clockwise order. If you do this by hand, it is easy to make mistakes. Always put the face that you want to define indices for in front of you. Then, identify the vertices and draw the triangles in counter-clockwise order.


int[] indices = new int[]{
    // Front face
    0, 1, 3, 3, 1, 2,
    // Top face
    4, 0, 3, 5, 4, 3,
    // Right face
    3, 2, 7, 5, 3, 7,
    // Left face
    6, 1, 0, 6, 0, 4,
    // Bottom face
    2, 1, 6, 2, 6, 7,
    // Back face
    7, 6, 4, 7, 4, 5,
};
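A quick sanity check helps catch hand-written index mistakes. The snippet below (the class name is just illustrative) verifies that the array above defines twelve triangles and references exactly the eight cube vertices:

```java
public class CubeIndexCheck {

    // The same index array as in the text: two triangles per cube face.
    public static int[] indices() {
        return new int[]{
            0, 1, 3, 3, 1, 2,   // front
            4, 0, 3, 5, 4, 3,   // top
            3, 2, 7, 5, 3, 7,   // right
            6, 1, 0, 6, 0, 4,   // left
            2, 1, 6, 2, 6, 7,   // bottom
            7, 6, 4, 7, 4, 5,   // back
        };
    }

    public static void main(String[] args) {
        int[] idx = indices();
        System.out.println("triangles: " + idx.length / 3); // 12
        long distinct = java.util.Arrays.stream(idx).distinct().count();
        System.out.println("distinct vertices referenced: " + distinct); // 8
    }
}
```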

In order to better view the cube we will change the code that rotates the model in the DummyGame class so it rotates along the three axes.

// Update rotation angle
float rotation = gameItem.getRotation().x + 1.5f;
if (rotation > 360) {
    rotation = 0;
}
gameItem.setRotation(rotation, rotation, rotation);

And that's all. We are now able to display a spinning 3D cube. You can now compile and run your example and you will obtain something like this.

There is something weird with this cube. Some faces are not being painted correctly. What is happening? The reason why the cube has this aspect is that the triangles that compose it are being drawn in a sort of random order. Pixels that are far away should be drawn before pixels that are closer. This is not happening right now, and in order to fix it we must enable depth testing.


This can be done in the Window class at the end of the init method:

glEnable(GL_DEPTH_TEST);

Now our cube is being rendered correctly!

If you look at the code for this part of the chapter you may see that we have done a minor reorganization in the Mesh class. The identifiers of the VBOs are now stored in a list so we can easily iterate over them.

Adding texture to the cube

Now we are going to apply a texture to our cube. A texture is an image which is used to draw the colour of the pixels of a certain model. You can think of a texture as a skin that is wrapped around your 3D model. What you do is assign points in the image texture to the vertices in your model. With that information OpenGL is able to calculate the colour to apply to the other pixels based on the texture image.

The texture image does not have to be the same size as the model. It can be larger or smaller. OpenGL will extrapolate the colour if the pixel to be processed cannot be mapped to a specific point in the texture. You can control how this process is done when a specific texture is created.

So basically what we must do, in order to apply a texture to a model, is assign texture coordinates to each of our vertices. The texture coordinate system is a bit different from the coordinate system of our model. First of all, we have a 2D texture, so our coordinates will only have two components, x and y. Besides that, the origin is set at the top left corner of the image and the maximum value of the x or y coordinate is 1.
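The mapping from a pixel in the image to this coordinate system is just a division by the image dimensions. A minimal sketch (the class name is illustrative only):

```java
public class TexCoordDemo {

    // Map a pixel position in an image (origin at the top-left corner) to
    // texture coordinates in the [0, 1] range used by OpenGL samplers.
    public static float[] toTexCoord(int px, int py, int width, int height) {
        return new float[]{ (float) px / width, (float) py / height };
    }

    public static void main(String[] args) {
        // The pixel at (128, 64) of a 256x256 image maps to (0.5, 0.25)
        float[] uv = toTexCoord(128, 64, 256, 256);
        System.out.println("u=" + uv[0] + ", v=" + uv[1]);
    }
}
```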

How do we relate texture coordinates to our position coordinates? Easy, the same way we passed the colour information. We set up a VBO which will have a texture coordinate for each vertex position.

So let's start modifying the code base to use textures in our 3D cube. The first step is to load the image that will be used as a texture. For this task, in previous versions of LWJGL, the Slick2D library was commonly used. At the moment of this writing it seems that this library is not compatible with LWJGL 3, so we will need to follow a more verbose approach. We will use a library called pngdecoder; thus, we need to declare that dependency in our pom.xml file.

<dependency>
    <groupId>org.l33tlabs.twl</groupId>
    <artifactId>pngdecoder</artifactId>
    <version>${pngdecoder.version}</version>
</dependency>

And define the version of the library to use.


<properties>
    [...]
    <pngdecoder.version>1.0</pngdecoder.version>
    [...]
</properties>

One thing that you may see on some web pages is that the first thing we must do is enable textures in our OpenGL context by calling glEnable(GL_TEXTURE_2D). This is true if you are using the fixed-function pipeline. Since we are using GLSL shaders it is not required anymore.

Now we will create a new Texture class that will perform all the necessary steps to load a texture. Our texture image will be located in the resources folder and can be accessed as a CLASSPATH resource and passed as an input stream to the PNGDecoder class.

PNGDecoder decoder = new PNGDecoder(
    Texture.class.getResourceAsStream(fileName));

Then we need to decode the PNG image and store its content into a buffer by using the decode method of the PNGDecoder class. The PNG image will be decoded in RGBA format (RGB for Red, Green, Blue and A for Alpha or transparency), which uses four bytes per pixel.

The decode method requires three parameters:

buffer: The ByteBuffer that will hold the decoded image (since each pixel uses four bytes its size will be 4 * width * height).
stride: Specifies the distance in bytes from the start of a line to the start of the next line. In this case it will be the number of bytes per line.
format: The target format into which the image should be decoded (RGBA).

ByteBuffer buf = ByteBuffer.allocateDirect(
    4 * decoder.getWidth() * decoder.getHeight());
decoder.decode(buf, decoder.getWidth() * 4, Format.RGBA);
buf.flip();
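The buffer size and stride arithmetic above is worth spelling out. A minimal sketch of the same calculation (standalone helper class, names illustrative only):

```java
public class PngBufferSize {

    // RGBA uses four bytes per pixel, so the decoded buffer must hold
    // 4 * width * height bytes in total.
    public static int bufferSize(int width, int height) {
        return 4 * width * height;
    }

    // One image line (the stride passed to decode) is 4 * width bytes.
    public static int stride(int width) {
        return 4 * width;
    }

    public static void main(String[] args) {
        System.out.println(bufferSize(256, 256)); // 262144
        System.out.println(stride(256));          // 1024
    }
}
```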

One important thing to remember is that OpenGL, for historical reasons, requires that texture images have a size (number of texels in each dimension) that is a power of two (2, 4, 8, 16, ...). Some drivers remove that constraint, but it's better to stick to it to avoid problems.

The next step is to upload the texture into the graphics card memory. First of all we need to create a new texture identifier. Each operation related to that texture will use that identifier, so we need to bind it.


// Create a new OpenGL texture
int textureId = glGenTextures();
// Bind the texture
glBindTexture(GL_TEXTURE_2D, textureId);

Then we need to tell OpenGL how to unpack our RGBA bytes. Since each component is one byte in size we need to add the following line:

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

And finally we can upload our texture data:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, decoder.getWidth(),
    decoder.getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);

The glTexImage2D method has the following parameters:

target: Specifies the target texture (its type). In this case: GL_TEXTURE_2D.
level: Specifies the level-of-detail number. Level 0 is the base image level. Level n is the nth mipmap reduction image. More on this later.
internalformat: Specifies the number of colour components in the texture.
width: Specifies the width of the texture image.
height: Specifies the height of the texture image.
border: This value must be zero.
format: Specifies the format of the pixel data: RGBA in this case.
type: Specifies the data type of the pixel data. We are using unsigned bytes for this.
data: The buffer that stores our data.

In some code snippets that you may find, you will probably see that filtering parameters are set up before calling the glTexImage2D method. Filtering refers to how the image will be drawn when scaling and how pixels will be interpolated.

If those parameters are not set the texture will not be displayed. So before the glTexImage2D call you could see something like this:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

These parameters basically say that when a pixel is drawn with no direct one-to-one association to a texture coordinate it will pick the nearest texture coordinate point.


For the moment we will not set up those parameters. Instead we will generate a mipmap. A mipmap is a decreasing resolution set of images generated from a high detailed texture. Those lower resolution images will be used automatically when our object is scaled.

In order to generate mipmaps we just need to add the following line (in this case after the glTexImage2D call):

glGenerateMipmap(GL_TEXTURE_2D);
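To get a feel for what that single call produces: each mipmap level halves the previous one down to 1x1, so a square power-of-two texture of side n ends up with log2(n) + 1 levels. A minimal sketch of that count (illustrative helper, not part of the Texture class):

```java
public class MipmapLevels {

    // Count the levels in a mipmap chain for a square power-of-two texture:
    // the size is halved until it reaches 1x1.
    public static int levels(int size) {
        int levels = 1;
        while (size > 1) {
            size /= 2;
            levels++;
        }
        return levels;
    }

    public static void main(String[] args) {
        // 256 -> 128 -> 64 -> 32 -> 16 -> 8 -> 4 -> 2 -> 1: nine levels in total
        System.out.println(levels(256)); // 9
    }
}
```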

And that's all, we have successfully loaded our texture. Now we need to use it. As we said before we need to pass the texture coordinates as another VBO. So we will modify our Mesh class to accept an array of floats that contains texture coordinates instead of the colours (we could have both colours and textures, but in order to simplify it we will strip the colours off). Our constructor will be like this:

public Mesh(float[] positions, float[] textCoords, int[] indices,
    Texture texture)

The texture coordinates VBO is created in the same way as the colour one. The only difference is that it has two elements instead of three:

vboId = glGenBuffers();
vboIdList.add(vboId);
textCoordsBuffer = MemoryUtil.memAllocFloat(textCoords.length);
textCoordsBuffer.put(textCoords).flip();
glBindBuffer(GL_ARRAY_BUFFER, vboId);
glBufferData(GL_ARRAY_BUFFER, textCoordsBuffer, GL_STATIC_DRAW);
glVertexAttribPointer(1, 2, GL_FLOAT, false, 0, 0);

Now we need to use those texture coordinates in our shaders. In the vertex shader we have changed the second input parameter because now it's a vec2 (we also changed its name, so remember to change it in the Renderer class). The vertex shader, as in the colour case, just passes the texture coordinates on to be used by the fragment shader.


#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;

out vec2 outTexCoord;

uniform mat4 worldMatrix;
uniform mat4 projectionMatrix;

void main()
{
    gl_Position = projectionMatrix * worldMatrix * vec4(position, 1.0);
    outTexCoord = texCoord;
}

In the fragment shader we must use those texture coordinates in order to set the pixel colours:

#version 330

in vec2 outTexCoord;
out vec4 fragColor;

uniform sampler2D texture_sampler;

void main()
{
    fragColor = texture(texture_sampler, outTexCoord);
}

Before analyzing the code let's clarify some concepts. A graphics card has several spaces or slots to store textures. Each of these spaces is called a texture unit. When we are working with textures we must set the texture unit that we want to work with. As you can see we have a new uniform named texture_sampler. That uniform has a sampler2D type and will hold the value of the texture unit that we want to work with.

In the main function we use the texture lookup function named texture. This function takes two arguments, a sampler and a texture coordinate, and returns the correct colour. The sampler uniform allows us to do multi-texturing. We will not cover that topic right now, but we will try to prepare the code so it is easy to add later on.

Thus, in our ShaderProgram class we will create a new method that allows us to set an integer value for a uniform:


public void setUniform(String uniformName, int value) {
    glUniform1i(uniforms.get(uniformName), value);
}

In the init method of the Renderer class we will create a new uniform:

shaderProgram.createUniform("texture_sampler");

Also, in the render method of our Renderer class we will set the uniform value to 0 (we are not using several textures right now, so we are just using unit 0).

shaderProgram.setUniform("texture_sampler", 0);

Finally we just need to change the render method of the Mesh class to use the texture. At the beginning of that method we put the following lines:

// Activate first texture unit
glActiveTexture(GL_TEXTURE0);
// Bind the texture
glBindTexture(GL_TEXTURE_2D, texture.getId());

We are basically binding the texture identified by texture.getId() to texture unit 0.

Right now, we have just modified our code base to support textures. Now we need to set up texture coordinates for our 3D cube. Our texture image file will be something like this:

In our 3D model we have eight vertices. Let's see how this can be done. Let's first define the front face texture coordinates for each vertex.


Vertex   Texture Coordinate
V0       (0.0, 0.0)
V1       (0.0, 0.5)
V2       (0.5, 0.5)
V3       (0.5, 0.0)

Now, let's define the texture mapping of the top face.

Vertex   Texture Coordinate
V4       (0.0, 0.5)
V5       (0.5, 0.5)
V0       (0.0, 1.0)
V3       (0.5, 1.0)

As you can see we have a problem: we need to set up different texture coordinates for the same vertices (V0 and V3). How can we solve this? The only way to solve it is to repeat some vertices and associate different texture coordinates with them. For the top face we need to repeat the four vertices and assign them the correct texture coordinates.


Since the front, back and lateral faces use the same texture we will not need to repeat all of these vertices. You have the complete definition in the source code, but we needed to move from 8 points to 20. The final result is like this.

In the next chapters we will learn how to load models generated by 3D modelling tools, so we won't need to define the positions and texture coordinates by hand (which, by the way, would be impractical for more complex models).


Camera

In this chapter we will learn how to move inside a rendered 3D scene. This capability is like having a camera that can travel inside the 3D world, and in fact that is the term used to refer to it.

But if you try to search for specific camera functions in OpenGL you will discover that there is no camera concept, or in other words the camera is always fixed, centered in the (0, 0, 0) position at the center of the screen.

So what we will do is a simulation that gives us the impression that we have a camera capable of moving inside the 3D scene. How do we achieve this? Well, if we cannot move the camera then we must move all the objects contained in our 3D space at once. In other words, if we cannot move a camera we will move the whole world.

So, suppose that we would like to move the camera position along the z axis from a starting position (Cx, Cy, Cz) to a position (Cx, Cy, Cz+dz) to get closer to the object which is placed at the coordinates (Ox, Oy, Oz).

What we will actually do is move the object (all the objects in our 3D space, in fact) in the opposite direction to that in which the camera should move. Think about it like the objects being placed on a treadmill.
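The equivalence of the two viewpoints is simple arithmetic: what the viewer sees depends only on the object's position relative to the camera. A minimal sketch along one axis (the class name is illustrative only):

```java
public class CameraTrickDemo {

    // What matters for rendering is the object position relative to the camera.
    public static float relativeZ(float objectZ, float cameraZ) {
        return objectZ - cameraZ;
    }

    public static void main(String[] args) {
        float objectZ = -5f;
        // Camera moves one unit towards the object (cameraZ becomes -1)...
        float movedCamera = relativeZ(objectZ, -1f);
        // ...which gives the same relative position as moving the object one
        // unit towards a camera that stays fixed at the origin.
        float movedWorld = relativeZ(objectZ + 1f, 0f);
        System.out.println(movedCamera + " == " + movedWorld); // -4.0 == -4.0
    }
}
```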

A camera can be displaced along the three axes (x, y and z) and can also rotate along them (roll, pitch and yaw).


So basically what we must do is be able to move and rotate all of the objects of our 3D world. How are we going to do this? The answer is to apply another transformation that will translate all of the vertices of all of the objects in the opposite direction of the movement of the camera, and that will rotate them according to the camera rotation. This will be done of course with another matrix, the so-called view matrix. This matrix will first perform the translation and then the rotation along the axes.

Let's see how we can construct that matrix. If you remember from the transformations chapter our transformation equation was like this:

Transf = [ProjMatrix] ⋅ [TranslationMatrix] ⋅ [RotationMatrix] ⋅ [ScaleMatrix] = [ProjMatrix] ⋅ [WorldMatrix]

The view matrix should be applied before multiplying by the projection matrix, so our equation should now be like this:

Transf = [ProjMatrix] ⋅ [ViewMatrix] ⋅ [TranslationMatrix] ⋅ [RotationMatrix] ⋅ [ScaleMatrix] = [ProjMatrix] ⋅ [ViewMatrix] ⋅ [WorldMatrix]

Now we have three matrices, so let's think a little bit about their lifecycles. The projection matrix should not change very much while our game is running; in the worst case it may change once per render call. The view matrix may change once per render call if the camera moves. The world matrix changes once per GameItem instance, so it will change several times per render call.

So, how many matrices should we push to our vertex shader? You may see some code that uses three uniforms, one for each of those matrices, but in principle the most efficient approach would be to combine the projection and view matrices, let's call it the pv matrix, and push the world and pv matrices to our shader. With this approach we would have the possibility to work with world coordinates and would avoid some extra multiplications.

Actually, the most convenient approach is to combine the view and the world matrix. Why? Because remember that the whole camera concept is a trick; what we are doing is pushing the whole world to simulate world displacement and to show only a small portion of the 3D world. So if we work directly with world coordinates we may be working with coordinates that are far away from the origin, and we may incur some precision problems.


If we work in what's called camera space we will be working with points that, although they are far away from the world origin, are closer to the camera. The matrix that results from the combination of the view and the world matrix is often called the model view matrix.

So let's start modifying our code to support a camera. First of all we will create a new class called Camera which will hold the position and rotation state of our camera. This class will provide methods to set a new position or rotation state (setPosition or setRotation), or to update those values with an offset upon the current state (movePosition and moveRotation).

package org.lwjglb.engine.graph;

import org.joml.Vector3f;

public class Camera {

    private final Vector3f position;

    private final Vector3f rotation;

    public Camera() {
        position = new Vector3f(0, 0, 0);
        rotation = new Vector3f(0, 0, 0);
    }

    public Camera(Vector3f position, Vector3f rotation) {
        this.position = position;
        this.rotation = rotation;
    }

    public Vector3f getPosition() {
        return position;
    }

    public void setPosition(float x, float y, float z) {
        position.x = x;
        position.y = y;
        position.z = z;
    }

    public void movePosition(float offsetX, float offsetY, float offsetZ) {
        if (offsetZ != 0) {
            position.x += (float) Math.sin(Math.toRadians(rotation.y)) * -1.0f * offsetZ;
            position.z += (float) Math.cos(Math.toRadians(rotation.y)) * offsetZ;
        }
        if (offsetX != 0) {
            position.x += (float) Math.sin(Math.toRadians(rotation.y - 90)) * -1.0f * offsetX;
            position.z += (float) Math.cos(Math.toRadians(rotation.y - 90)) * offsetX;
        }
        position.y += offsetY;
    }

    public Vector3f getRotation() {
        return rotation;
    }

    public void setRotation(float x, float y, float z) {
        rotation.x = x;
        rotation.y = y;
        rotation.z = z;
    }

    public void moveRotation(float offsetX, float offsetY, float offsetZ) {
        rotation.x += offsetX;
        rotation.y += offsetY;
        rotation.z += offsetZ;
    }
}

Next, in the Transformation class, we will add a new matrix to hold the values of the view matrix.

private final Matrix4f viewMatrix;

We will also provide a method to update its value. Like the projection matrix, this matrix will be the same for all the objects to be rendered in a render cycle.

public Matrix4f getViewMatrix(Camera camera) {
    Vector3f cameraPos = camera.getPosition();
    Vector3f rotation = camera.getRotation();

    viewMatrix.identity();
    // First do the rotation so the camera rotates over its position
    viewMatrix.rotate((float) Math.toRadians(rotation.x), new Vector3f(1, 0, 0))
        .rotate((float) Math.toRadians(rotation.y), new Vector3f(0, 1, 0));
    // Then do the translation
    viewMatrix.translate(-cameraPos.x, -cameraPos.y, -cameraPos.z);
    return viewMatrix;
}

As you can see we first need to do the rotation and then the translation. If we did the opposite we would not be rotating along the camera position but along the coordinates origin. Please also note that in the movePosition method of the Camera class we do not simply increase the camera position by an offset. We also take into consideration the rotation along the y axis, the yaw, in order to calculate the final position. If we just increased the camera position by the offset, the camera would not move in the direction it is facing.

Besides what is mentioned above, we do not yet have a full free-fly camera (for instance, if we rotate along the x axis the camera does not move up or down in space when we move it forward). This will be done in later chapters since it is a little bit more complex.
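The yaw projection inside movePosition can be checked in isolation. The sketch below replicates the same sin/cos lines in a standalone class (the class name is illustrative only) and shows that a "forward" offset follows the facing direction:

```java
public class YawMovementDemo {

    // The same maths as Camera.movePosition for the z offset: the offset along
    // the local z axis is projected onto world x/z using the yaw (rotation.y).
    public static float[] move(float x, float z, float yawDegrees, float offsetZ) {
        x += (float) Math.sin(Math.toRadians(yawDegrees)) * -1.0f * offsetZ;
        z += (float) Math.cos(Math.toRadians(yawDegrees)) * offsetZ;
        return new float[]{x, z};
    }

    public static void main(String[] args) {
        // yaw = 0: moving "forward" (offsetZ = -1) goes straight along -z
        float[] p0 = move(0, 0, 0, -1);
        // yaw = 90 degrees: the same forward offset now moves along +x
        float[] p90 = move(0, 0, 90, -1);
        System.out.printf("yaw 0:  x=%.1f z=%.1f%n", p0[0], p0[1]);
        System.out.printf("yaw 90: x=%.1f z=%.1f%n", p90[0], p90[1]);
    }
}
```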

Finally we will remove the previous getWorldMatrix method and add a new one called getModelViewMatrix.

public Matrix4f getModelViewMatrix(GameItem gameItem, Matrix4f viewMatrix) {
    Vector3f rotation = gameItem.getRotation();
    modelViewMatrix.identity().translate(gameItem.getPosition()).
        rotateX((float) Math.toRadians(-rotation.x)).
        rotateY((float) Math.toRadians(-rotation.y)).
        rotateZ((float) Math.toRadians(-rotation.z)).
        scale(gameItem.getScale());
    Matrix4f viewCurr = new Matrix4f(viewMatrix);
    return viewCurr.mul(modelViewMatrix);
}

The getModelViewMatrix method will be called for each GameItem instance, so we must work over a copy of the view matrix so transformations do not get accumulated in each call (remember that the Matrix4f class is not immutable).

In the render method of the Renderer class we just need to update the view matrix according to the camera values, just after the projection matrix is updated.

// Update projection matrix
Matrix4f projectionMatrix = transformation.getProjectionMatrix(FOV, window.getWidth(),
    window.getHeight(), Z_NEAR, Z_FAR);
shaderProgram.setUniform("projectionMatrix", projectionMatrix);

// Update view matrix
Matrix4f viewMatrix = transformation.getViewMatrix(camera);

shaderProgram.setUniform("texture_sampler", 0);
// Render each gameItem
for (GameItem gameItem : gameItems) {
    // Set model view matrix for this item
    Matrix4f modelViewMatrix = transformation.getModelViewMatrix(gameItem, viewMatrix);
    shaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
    // Render the mesh for this game item
    gameItem.getMesh().render();
}


And that's all, our base code supports the concept of a camera. Now we need to use it. We can change the way we handle the input and update the camera. We will set the following controls:

Keys "A" and "D" to move the camera to the left and right (x axis) respectively.
Keys "W" and "S" to move the camera forward and backwards (z axis) respectively.
Keys "Z" and "X" to move the camera up and down (y axis) respectively.

We will use the mouse position to rotate the camera along the x and y axes when the right button of the mouse is pressed. As you can see we will be using the mouse for the first time. We will create a new class named MouseInput that will encapsulate mouse access. Here's the code for that class.

package org.lwjglb.engine;

import org.joml.Vector2d;
import org.joml.Vector2f;
import static org.lwjgl.glfw.GLFW.*;

public class MouseInput {

    private final Vector2d previousPos;

    private final Vector2d currentPos;

    private final Vector2f displVec;

    private boolean inWindow = false;

    private boolean leftButtonPressed = false;

    private boolean rightButtonPressed = false;

    public MouseInput() {
        previousPos = new Vector2d(-1, -1);
        currentPos = new Vector2d(0, 0);
        displVec = new Vector2f();
    }

    public void init(Window window) {
        glfwSetCursorPosCallback(window.getWindowHandle(), (windowHandle, xpos, ypos) -> {
            currentPos.x = xpos;
            currentPos.y = ypos;
        });
        glfwSetCursorEnterCallback(window.getWindowHandle(), (windowHandle, entered) -> {
            inWindow = entered;
        });
        glfwSetMouseButtonCallback(window.getWindowHandle(), (windowHandle, button, action, mode) -> {
            leftButtonPressed = button == GLFW_MOUSE_BUTTON_1 && action == GLFW_PRESS;
            rightButtonPressed = button == GLFW_MOUSE_BUTTON_2 && action == GLFW_PRESS;
        });
    }

    public Vector2f getDisplVec() {
        return displVec;
    }

    public void input(Window window) {
        displVec.x = 0;
        displVec.y = 0;
        if (previousPos.x > 0 && previousPos.y > 0 && inWindow) {
            double deltax = currentPos.x - previousPos.x;
            double deltay = currentPos.y - previousPos.y;
            boolean rotateX = deltax != 0;
            boolean rotateY = deltay != 0;
            if (rotateX) {
                displVec.y = (float) deltax;
            }
            if (rotateY) {
                displVec.x = (float) deltay;
            }
        }
        previousPos.x = currentPos.x;
        previousPos.y = currentPos.y;
    }

    public boolean isLeftButtonPressed() {
        return leftButtonPressed;
    }

    public boolean isRightButtonPressed() {
        return rightButtonPressed;
    }
}

The MouseInput class provides an init method which should be called during the initialization phase; it registers a set of callbacks to process mouse events:

glfwSetCursorPosCallback: Registers a callback that will be invoked when the mouse is moved.
glfwSetCursorEnterCallback: Registers a callback that will be invoked when the mouse enters our window. We will be receiving mouse events even if the mouse is not in our window, so we use this callback to track when the mouse is inside our window.
glfwSetMouseButtonCallback: Registers a callback that will be invoked when a mouse button is pressed.


The MouseInput class also provides an input method which should be called when game input is processed. This method calculates the mouse displacement from the previous position and stores it into the Vector2f displVec variable so it can be used by our game.

The MouseInput class will be instantiated in our GameEngine class and will be passed as a parameter to the init and update methods of the game implementation (so we need to change the interface accordingly).

void input(Window window, MouseInput mouseInput);

void update(float interval, MouseInput mouseInput);

The mouse input will be processed in the input method of the GameEngine class before passing control to the game implementation.

protected void input() {
    mouseInput.input(window);
    gameLogic.input(window, mouseInput);
}

Now we are ready to update our DummyGame class to process the keyboard and mouse input. The input method of that class will look like this:

@Override
public void input(Window window, MouseInput mouseInput) {
    cameraInc.set(0, 0, 0);
    if (window.isKeyPressed(GLFW_KEY_W)) {
        cameraInc.z = -1;
    } else if (window.isKeyPressed(GLFW_KEY_S)) {
        cameraInc.z = 1;
    }
    if (window.isKeyPressed(GLFW_KEY_A)) {
        cameraInc.x = -1;
    } else if (window.isKeyPressed(GLFW_KEY_D)) {
        cameraInc.x = 1;
    }
    if (window.isKeyPressed(GLFW_KEY_Z)) {
        cameraInc.y = -1;
    } else if (window.isKeyPressed(GLFW_KEY_X)) {
        cameraInc.y = 1;
    }
}

It just updates a Vector3f variable named cameraInc which holds the camera displacement that should be applied. The update method of the DummyGame class modifies the camera position and rotation according to the processed key and mouse events.

@Override
public void update(float interval, MouseInput mouseInput) {
    // Update camera position
    camera.movePosition(cameraInc.x * CAMERA_POS_STEP,
        cameraInc.y * CAMERA_POS_STEP,
        cameraInc.z * CAMERA_POS_STEP);

    // Update camera based on mouse
    if (mouseInput.isRightButtonPressed()) {
        Vector2f rotVec = mouseInput.getDisplVec();
        camera.moveRotation(rotVec.x * MOUSE_SENSITIVITY, rotVec.y * MOUSE_SENSITIVITY, 0);
    }
}

Now we can add more cubes to our world, scale them, set them up in specific locations and play with our new camera. As you can see all the cubes share the same mesh.

GameItem gameItem1 = new GameItem(mesh);
gameItem1.setScale(0.5f);
gameItem1.setPosition(0, 0, -2);

GameItem gameItem2 = new GameItem(mesh);
gameItem2.setScale(0.5f);
gameItem2.setPosition(0.5f, 0.5f, -2);

GameItem gameItem3 = new GameItem(mesh);
gameItem3.setScale(0.5f);
gameItem3.setPosition(0, 0, -2.5f);

GameItem gameItem4 = new GameItem(mesh);
gameItem4.setScale(0.5f);
gameItem4.setPosition(0.5f, 0, -2.5f);

gameItems = new GameItem[]{gameItem1, gameItem2, gameItem3, gameItem4};

You will get something like this.


Loading more complex models

In this chapter we will learn to load more complex models defined in external files. Those models will be created by 3D modelling tools (such as Blender). Up to now we were creating our models by hand, directly coding the arrays that define their geometry; in this chapter we will learn how to load models defined in OBJ format.

OBJ (or .OBJ) is a geometry definition open file format developed by Wavefront Technologies which has been widely adopted. An OBJ file defines the vertices, texture coordinates and polygons that compose a 3D model. It's a relatively easy format to parse since it is text based and each line defines an element (a vertex, a texture coordinate, etc.).

In a .obj file each line starts with a token that identifies the type of element:

Comments are lines which start with #.
The token "v" defines a geometric vertex with coordinates (x, y, z, w). Example: v 0.155 0.211 0.32 1.0.
The token "vn" defines a vertex normal with coordinates (x, y, z). Example: vn 0.71 0.21 0.82. More on this later.
The token "vt" defines a texture coordinate. Example: vt 0.500 1.
The token "f" defines a face. With the information contained in these lines we will construct our indices array. We will handle only the case where faces are exported as triangles. It can have several variants:

It can define just vertex positions (f v1 v2 v3). Example: f 6 3 1. In this case the triangle is defined by the geometric vertices that occupy positions 6, 3 and 1. (Vertex indices always start at 1.)
It can define vertex positions, texture coordinates and normals (f v1/t1/n1 v2/t2/n2 v3/t3/n3). Example: f 6/4/1 3/5/3 7/6/5. The first block is "6/4/1" and holds the position, texture coordinate and normal indices of one vertex. So we are saying: pick geometric vertex number six, texture coordinate number four and vertex normal number one.

The OBJ format has many more entry types (like ones to group polygons, define materials, etc.). For now we will stick to this subset; our OBJ loader will ignore other entry types.
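The face token formats listed above are simple to split apart. A minimal sketch of a parser for one token (the class name is illustrative; a real loader would also resolve the 1-based indices against the v/vt/vn lists):

```java
public class ObjFaceToken {

    // Parse one face token such as "6/4/1", "6//1" or "6" into
    // {position, texCoord, normal} indices, using -1 when a field is absent.
    // The indices are kept 1-based, as written in the file.
    public static int[] parse(String token) {
        String[] parts = token.split("/", -1); // keep empty fields for "v//n"
        int[] idx = {-1, -1, -1};
        for (int i = 0; i < parts.length && i < 3; i++) {
            if (!parts[i].isEmpty()) {
                idx[i] = Integer.parseInt(parts[i]);
            }
        }
        return idx;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(parse("6/4/1"))); // [6, 4, 1]
        System.out.println(java.util.Arrays.toString(parse("6//1")));  // [6, -1, 1]
        System.out.println(java.util.Arrays.toString(parse("6")));     // [6, -1, -1]
    }
}
```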

But what is a normal? Let's define it first. When you have a plane, its normal is a vector perpendicular to that plane which has a length equal to one.


As you can see in the figure above a plane can have two normals. Which one should we use? Normals in 3D graphics are used for lighting, so we should choose the normal which is oriented towards the source of light. In other words, we should choose the normal that points out from the external face of our model.

A 3D model is composed of polygons, triangles in our case. Each triangle is composed of three vertices. The normal vector of a triangle is the vector perpendicular to the triangle surface which has a length equal to one.

A vertex normal is associated with a specific vertex and is the combination of the normals of the surrounding triangles (of course its length is equal to one). Here you can see the vertex normals of a 3D mesh (taken from Wikipedia).

Normals will be used for lighting.
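The face normal mentioned above comes from a cross product. A minimal sketch (standalone class, names illustrative only) that computes and normalises a triangle's normal:

```java
public class FaceNormalDemo {

    // The normal of a triangle is the normalised cross product of two of its
    // edges (v1 - v0 and v2 - v0); after normalisation its length is one.
    public static float[] faceNormal(float[] v0, float[] v1, float[] v2) {
        float ax = v1[0] - v0[0], ay = v1[1] - v0[1], az = v1[2] - v0[2];
        float bx = v2[0] - v0[0], by = v2[1] - v0[1], bz = v2[2] - v0[2];
        // Cross product a x b
        float nx = ay * bz - az * by;
        float ny = az * bx - ax * bz;
        float nz = ax * by - ay * bx;
        // Normalise to unit length
        float len = (float) Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new float[]{nx / len, ny / len, nz / len};
    }

    public static void main(String[] args) {
        // A counter-clockwise triangle lying on the xy plane: normal points along +z
        float[] n = faceNormal(new float[]{0, 0, 0}, new float[]{1, 0, 0}, new float[]{0, 1, 0});
        System.out.printf("normal = (%.1f, %.1f, %.1f)%n", n[0], n[1], n[2]); // (0.0, 0.0, 1.0)
    }
}
```

A vertex normal can then be obtained by averaging the face normals of the triangles that share the vertex and normalising the result again.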

So let's start creating our OBJ loader. First of all we will modify our Mesh class, since right now it's mandatory to use a texture. Some of the OBJ files that we may load may not define texture coordinates, and we must be able to render them using a colour instead of a texture. In this case the face definition will be of the form "f v//n".

Our Mesh class will have a new attribute named colour.

private Vector3f colour;


And the constructor will no longer require a Texture instance. Instead we will provide getters and setters for the texture and colour attributes.

public Mesh(float[] positions, float[] textCoords, float[] normals, int[] indices) {

Of course, in the render and cleanup methods we must check if the texture attribute is not null before using it. As you can see in the constructor we now pass a new array of floats named normals. How do we use normals for rendering? The answer is easy: it will be just another VBO inside our VAO, so we need to add this code.

// Vertex normals VBO
vboId = glGenBuffers();
vboIdList.add(vboId);
vecNormalsBuffer = MemoryUtil.memAllocFloat(normals.length);
vecNormalsBuffer.put(normals).flip();
glBindBuffer(GL_ARRAY_BUFFER, vboId);
glBufferData(GL_ARRAY_BUFFER, vecNormalsBuffer, GL_STATIC_DRAW);
glVertexAttribPointer(2, 3, GL_FLOAT, false, 0, 0);

In our render method we must enable this VBO before rendering and disable it when we have finished.

// Draw the mesh
glBindVertexArray(getVaoId());
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glEnableVertexAttribArray(2);

glDrawElements(GL_TRIANGLES, getVertexCount(), GL_UNSIGNED_INT, 0);

// Restore state
glDisableVertexAttribArray(0);
glDisableVertexAttribArray(1);
glDisableVertexAttribArray(2);
glBindVertexArray(0);
glBindTexture(GL_TEXTURE_2D, 0);

Now that we have finished the modifications in the Mesh class, we can change our code to use either texture coordinates or a fixed colour. Thus, we need to modify our fragment shader like this:


#version 330

in vec2 outTexCoord;
out vec4 fragColor;

uniform sampler2D texture_sampler;
uniform vec3 colour;
uniform int useColour;

void main()
{
    if (useColour == 1)
    {
        fragColor = vec4(colour, 1);
    }
    else
    {
        fragColor = texture(texture_sampler, outTexCoord);
    }
}

As you can see, we have created two new uniforms:

- colour: Will contain the base colour.
- useColour: A flag that we will set to 1 when we don't want to use textures.

In the Renderer class we need to create those two uniforms.

// Create uniform for default colour and the flag that controls it
shaderProgram.createUniform("colour");
shaderProgram.createUniform("useColour");

And like any other uniform, in the render method of the Renderer class we need to set the values of these uniforms for each GameItem.

for (GameItem gameItem : gameItems) {
    Mesh mesh = gameItem.getMesh();
    // Set model view matrix for this item
    Matrix4f modelViewMatrix = transformation.getModelViewMatrix(gameItem, viewMatrix);
    shaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
    // Render the mesh for this game item
    shaderProgram.setUniform("colour", mesh.getColour());
    shaderProgram.setUniform("useColour", mesh.isTextured() ? 0 : 1);
    mesh.render();
}


Now we can create a new class named OBJLoader which parses OBJ files and creates a Mesh instance with the data contained in them. You may find other implementations on the web that may be a bit more efficient than this one, but I think this version is simpler to understand. This will be a utility class with a static method like this:

public static Mesh loadMesh(String fileName) throws Exception {

The fileName parameter specifies the name of the file, which must be in the CLASSPATH, that contains the OBJ model.

The first thing that we will do in that method is to read the file contents and store all the lines in an array. Then we create several lists that will hold the vertices, the texture coordinates, the normals and the faces.

List<String> lines = Utils.readAllLines(fileName);

List<Vector3f> vertices = new ArrayList<>();
List<Vector2f> textures = new ArrayList<>();
List<Vector3f> normals = new ArrayList<>();
List<Face> faces = new ArrayList<>();

Then we parse each line and, depending on the starting token, we get a vertex position, a texture coordinate, a vertex normal or a face definition. At the end we will need to reorder that information.


for (String line : lines) {
    String[] tokens = line.split("\\s+");
    switch (tokens[0]) {
        case "v":
            // Geometric vertex
            Vector3f vec3f = new Vector3f(
                Float.parseFloat(tokens[1]),
                Float.parseFloat(tokens[2]),
                Float.parseFloat(tokens[3]));
            vertices.add(vec3f);
            break;
        case "vt":
            // Texture coordinate
            Vector2f vec2f = new Vector2f(
                Float.parseFloat(tokens[1]),
                Float.parseFloat(tokens[2]));
            textures.add(vec2f);
            break;
        case "vn":
            // Vertex normal
            Vector3f vec3fNorm = new Vector3f(
                Float.parseFloat(tokens[1]),
                Float.parseFloat(tokens[2]),
                Float.parseFloat(tokens[3]));
            normals.add(vec3fNorm);
            break;
        case "f":
            Face face = new Face(tokens[1], tokens[2], tokens[3]);
            faces.add(face);
            break;
        default:
            // Ignore other lines
            break;
    }
}
return reorderLists(vertices, textures, normals, faces);

Before talking about reordering, let's see how face definitions are parsed. We have created a class named Face which parses the definition of a face. A Face is composed of a list of index groups; in this case, since we are dealing with triangles, we will have three index groups.


We will create another inner class named IdxGroup that will hold the information for a group.

protected static class IdxGroup {

    public static final int NO_VALUE = -1;

    public int idxPos;
    public int idxTextCoord;
    public int idxVecNormal;

    public IdxGroup() {
        idxPos = NO_VALUE;
        idxTextCoord = NO_VALUE;
        idxVecNormal = NO_VALUE;
    }
}

Our Face class will look like this.


protected static class Face {

    /**
     * List of idxGroup groups for a face triangle (3 vertices per face).
     */
    private IdxGroup[] idxGroups = new IdxGroup[3];

    public Face(String v1, String v2, String v3) {
        idxGroups = new IdxGroup[3];
        // Parse the lines
        idxGroups[0] = parseLine(v1);
        idxGroups[1] = parseLine(v2);
        idxGroups[2] = parseLine(v3);
    }

    private IdxGroup parseLine(String line) {
        IdxGroup idxGroup = new IdxGroup();

        String[] lineTokens = line.split("/");
        int length = lineTokens.length;
        idxGroup.idxPos = Integer.parseInt(lineTokens[0]) - 1;
        if (length > 1) {
            // It can be empty if the obj does not define text coords
            String textCoord = lineTokens[1];
            idxGroup.idxTextCoord = textCoord.length() > 0 ? Integer.parseInt(textCoord) - 1 : IdxGroup.NO_VALUE;
            if (length > 2) {
                idxGroup.idxVecNormal = Integer.parseInt(lineTokens[2]) - 1;
            }
        }
        return idxGroup;
    }

    public IdxGroup[] getFaceVertexIndices() {
        return idxGroups;
    }
}

When parsing faces we may see objects with no textures but with vector normals. In this case, a face line could look like "f 11//1 17//1 13//1", so we need to detect those cases.
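This is why parseLine checks the token's length before parsing it: splitting a "v//n" index group with String.split leaves an empty string in the middle. A minimal demonstration (the class name is just illustrative):

```java
// Minimal sketch of why parseLine must check for empty tokens: splitting a
// "v//n" index group leaves an empty string where the texture index would be.
public class FaceTokenDemo {
    public static void main(String[] args) {
        String[] full = "6/4/1".split("/");     // position, texture, normal
        String[] noTex = "11//1".split("/");    // position, <empty>, normal
        System.out.println(full.length);        // 3
        System.out.println(noTex[1].isEmpty()); // true: no texture coordinate
    }
}
```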

Now we can talk about how to reorder the information we have. Our Mesh class expects four arrays: one for position coordinates, another for texture coordinates, another for vector normals and another one for the indices. The first three arrays shall have the same number of elements, since the indices array is unique (note that the same number of elements does not imply the same length: position elements, vertex coordinates, are 3D and composed of three floats, whereas texture elements, texture coordinates, are 2D and thus composed of two floats). OpenGL does not allow us to define different index arrays per type of element (if it did, we would not need to repeat vertices while applying textures).

When you open an OBJ file you will probably first see that the list that holds the vertex positions has more elements than the lists that hold the texture coordinates and the normals. That's something that we need to solve. Let's use a simple example which defines a quad with a texture one pixel in height (just for illustration purposes). The OBJ file may look like this (don't pay too much attention to the normal coordinates, since they are just for illustration purposes).

v 0 0 0
v 1 0 0
v 1 1 0
v 0 1 0

vt 0 1
vt 1 1

vn 0 0 1

f 1/2/1 2/1/1 3/2/1
f 1/2/1 3/2/1 4/1/1

When we have finished parsing the file we have the following lists (the number of each element is its position in the file in order of appearance).

Now we will use the face definitions to construct the final arrays, including the indices. A thing to take into consideration is that the order in which texture coordinates and vector normals are defined does not correspond to the order in which vertices are defined. If the lists were the same size and ordered, face definition lines would only need to include a single number per vertex.

So we need to order the data and set it up according to our needs. The first thing that we must do is create three arrays and one list: one array for the vertices, another for the texture coordinates, another for the normals, and the list for the indices. As we have said before, the three arrays will have the same number of elements (equal to the number of vertices). The vertices array will hold a copy of the list of vertices.


Now we start processing the faces. The first index group of the first face is 1/2/1. We use the first index in the index group, the one that defines the geometric vertex, to construct the index list. Let's name it posIndex. Our face is specifying that we should add the index of the element that occupies the first position to our indices list. So we put the value of posIndex minus one into the indicesList (we must subtract 1 since arrays start at 0, but the OBJ file format assumes that they start at 1).

Then we use the rest of the indices of the index group to set up the texturesArray and the normalsArray. The second index in the index group is 2, so we must put the second texture coordinate in the position that the vertex designated by posIndex (V1) occupies.


Then we pick the third index, which is 1, so we must put the first vector normal in the position that the vertex designated by posIndex (V1) occupies.

After we have processed the first face, the arrays and lists will look like this.

After we have processed the second face, the arrays and lists will look like this.

The second face defines vertices which have already been assigned, but they contain the same values, so there's no problem in reprocessing them. I hope the process has been clarified enough; it can be somewhat tricky until you get it. The methods that reorder the data are set out below. Keep in mind that what the Mesh class expects are float arrays, so we must transform the lists of vertices, textures and normals into arrays of floats. The length of these arrays will be the length of the vertices list multiplied by three in the case of vertices and normals, or multiplied by two in the case of texture coordinates.


private static Mesh reorderLists(List<Vector3f> posList, List<Vector2f> textCoordList,
        List<Vector3f> normList, List<Face> facesList) {

    List<Integer> indices = new ArrayList<>();
    // Create position array in the order it has been declared
    float[] posArr = new float[posList.size() * 3];
    int i = 0;
    for (Vector3f pos : posList) {
        posArr[i * 3] = pos.x;
        posArr[i * 3 + 1] = pos.y;
        posArr[i * 3 + 2] = pos.z;
        i++;
    }
    float[] textCoordArr = new float[posList.size() * 2];
    float[] normArr = new float[posList.size() * 3];

    for (Face face : facesList) {
        IdxGroup[] faceVertexIndices = face.getFaceVertexIndices();
        for (IdxGroup indValue : faceVertexIndices) {
            processFaceVertex(indValue, textCoordList, normList,
                indices, textCoordArr, normArr);
        }
    }
    int[] indicesArr = indices.stream().mapToInt((Integer v) -> v).toArray();
    Mesh mesh = new Mesh(posArr, textCoordArr, normArr, indicesArr);
    return mesh;
}

private static void processFaceVertex(IdxGroup indices, List<Vector2f> textCoordList,
        List<Vector3f> normList, List<Integer> indicesList,
        float[] texCoordArr, float[] normArr) {

    // Set index for vertex coordinates
    int posIndex = indices.idxPos;
    indicesList.add(posIndex);

    // Reorder texture coordinates
    if (indices.idxTextCoord >= 0) {
        Vector2f textCoord = textCoordList.get(indices.idxTextCoord);
        texCoordArr[posIndex * 2] = textCoord.x;
        texCoordArr[posIndex * 2 + 1] = 1 - textCoord.y;
    }
    if (indices.idxVecNormal >= 0) {
        // Reorder vector normals
        Vector3f vecNorm = normList.get(indices.idxVecNormal);
        normArr[posIndex * 3] = vecNorm.x;
        normArr[posIndex * 3 + 1] = vecNorm.y;
        normArr[posIndex * 3 + 2] = vecNorm.z;
    }
}
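As a sanity check, the reordering just described can be replayed on the quad example using plain arrays instead of the Mesh and Vector classes. This is a hedged, self-contained sketch (all names are illustrative) that only extracts the geometric vertex index of each index group, the part that builds the indices list:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative replay of the index reordering on the quad example above.
public class ReorderDemo {

    static int[] reorderQuadIndices() {
        String[] faces = {"1/2/1 2/1/1 3/2/1", "1/2/1 3/2/1 4/1/1"};
        List<Integer> indices = new ArrayList<>();
        for (String face : faces) {
            for (String group : face.split(" ")) {
                // The first index of each group is the geometric vertex (1-based)
                int posIndex = Integer.parseInt(group.split("/")[0]) - 1;
                indices.add(posIndex);
            }
        }
        return indices.stream().mapToInt(Integer::intValue).toArray();
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(reorderQuadIndices())); // [0, 1, 2, 0, 2, 3]
    }
}
```

The four geometric vertices are shared between the two triangles, which is exactly why the indices list ends up with six entries but the position array only holds four vertices.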


Another thing to notice is that texture coordinates are in UV format, so the y coordinate needs to be calculated as one minus the value contained in the file.

Now, at last, we can render OBJ models. I've included an OBJ file that contains the textured cube that we have been using in previous chapters. In order to use it, in the init method of our DummyGame class we just need to construct a GameItem instance like this.

Texture texture = new Texture("/textures/grassblock.png");
mesh.setTexture(texture);
GameItem gameItem = new GameItem(mesh);
gameItem.setScale(0.5f);
gameItem.setPosition(0, 0, -2);
gameItems = new GameItem[]{gameItem};

And we will get our familiar textured cube.

We can now try other models, such as the famous Stanford Bunny (it can be freely downloaded), which is included in the resources. This model is not textured, so we can use it this way:

Mesh mesh = OBJLoader.loadMesh("/models/bunny.obj");
GameItem gameItem = new GameItem(mesh);
gameItem.setScale(1.5f);
gameItem.setPosition(0, 0, -2);
gameItems = new GameItem[]{gameItem};


The model looks a little bit strange because we have no textures and there's no light, so we cannot appreciate the volumes, but you can check that the model is correctly loaded. In the Window class, where we set up the OpenGL parameters, add this line.

glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);

You should now see something like this when you zoom in.

Now you can see all the triangles that compose the model.

With this OBJLoader class you can now use Blender to create your models. Blender is a powerful tool, but it can be a bit overwhelming at first; there are lots of options and lots of key combinations, and you need to take your time to do even the most basic things for the first time. When you export the models using Blender, please make sure to include the normals and to export the faces as triangles.


Also remember to split edges when exporting, since we cannot assign several texture coordinates to the same vertex. In addition, we need the normals to be defined per triangle, not assigned to vertices. If you find lighting problems (next chapters) with some models, you should verify the normals. You can visualize them inside Blender.


Let there be light

In this chapter we will learn how to add light to our 3D game engine. We will not implement a physically perfect light model because, complexity aside, it would require a tremendous amount of computing resources; instead we will implement an approximation which provides decent results. We will use an algorithm named Phong shading (developed by Bui Tuong Phong). Another important thing to point out is that we will only model lights; we won't model the shadows that should be generated by those lights (this will be done in another chapter).

Before we start, let us define some light types:

- Point light: This type of light models a light source that is emitted uniformly from a point in space in all directions.
- Spot light: This type of light models a light source that is emitted from a point in space, but, instead of emitting in all directions, is restricted to a cone.
- Directional light: This type of light models the light that we receive from the sun; all the objects in the 3D space are hit by parallel light rays coming from a specific direction. No matter whether the object is close or far away, all the light rays hit the objects at the same angle.
- Ambient light: This type of light comes from everywhere in the space and illuminates all the objects in the same way.

Thus, to model light we need to take into consideration the type of light plus its position and some other parameters like its colour. Of course, we must also consider the way that objects, when hit by light rays, absorb and reflect light.

The Phong shading algorithm models the effects of light for each point in our model, that is, for every vertex. This is why it's called a local illumination simulation, and this is the reason why this algorithm does not calculate shadows: it just calculates the light to be applied to every vertex without taking into consideration whether the vertex is behind an object that blocks the light. We will overcome this in later chapters. Because of that, it is a very simple and fast algorithm that provides very good effects. We will use here a simplified version that does not take materials into account deeply.

The Phong algorithm considers three components for lighting:

- Ambient light: models light that comes from everywhere. It will serve to illuminate (with the required intensity) the areas that are not hit by any light; it's like a background light.
- Diffuse reflectance: takes into consideration that surfaces facing the light source are brighter.
- Specular reflectance: models how light reflects on polished or metallic surfaces.

In the end, what we want to obtain is a factor that, multiplied by the colour assigned to a fragment, will make that colour brighter or darker depending on the light it receives. Let's name our components A for ambient, D for diffuse and S for specular. That factor will be the sum of those components:

L = A + D + S

In fact, those components are indeed colours, that is, the colour components that each light component contributes. This is due to the fact that light components will not only provide a degree of intensity but can also modify the colour of the model. In our fragment shader we just need to multiply that light colour by the original fragment colour (obtained from a texture or a base colour).

We can also assign different colours, for the same material, to be used in the ambient, diffuse and specular components. Hence, these components will be modulated by the colours associated with the material. If the material has a texture, we will simply use a single texture for each of the components.

So the final colour for a non-textured material will be: L = A ∗ ambientColour + D ∗ diffuseColour + S ∗ specularColour.

And the final colour for a textured material will be: L = A ∗ textureColour + D ∗ textureColour + S ∗ textureColour.

Ambient light component

Let's view the first component, the ambient light component. It's just a constant factor that will make all of our objects brighter or darker. We can use it to simulate light for a specific period of time (dawn, dusk, etc.), and it can also be used as an easy way to add some light to points that are not hit directly by light rays but could be lit by indirect light (caused by reflections).

Ambient light is the easiest component to calculate: we just need to pass a colour which, since it will be multiplied by our base colour, simply modulates that base colour. Imagine that we have determined that the colour for a fragment is (1.0, 0.0, 0.0), that is, red. Without ambient light it will be displayed as a fully red fragment. If we set the ambient light to (0.5, 0.5, 0.5), the final colour will be (0.5, 0.0, 0.0), that is, a darker version of red. This light will darken all the fragments in the same way (it may seem a little strange to be talking about light that darkens objects, but in fact that is the effect we get). Besides that, it can add some colour if the RGB components are not the same, so we just need a vector to modulate the ambient light intensity and colour.

Diffuse reflectance

Let's talk now about diffuse reflectance. It models the fact that surfaces which face the light source perpendicularly look brighter than surfaces where light is received at a more indirect angle. Those objects receive more light; the light density (let me call it that) is higher.

But how do we calculate this? Do you remember that we introduced the normal concept in the previous chapter? The normal was the vector perpendicular to a surface with a length equal to one. So let's draw the normals for three points in the previous figure. As you can see, the normal for each point is the vector perpendicular to the tangent plane at that point. Instead of drawing rays coming from the source of light, we will draw vectors from each point to the light source (that is, in the opposite direction).


As you can see, the normal associated with P1, named N1, is parallel to the vector that points to the light source, which models the opposite of the light ray (N1 has been sketched displaced so you can see it, but it's mathematically equivalent). P1 has an angle equal to 0 with the vector that points to the light source. Its surface is perpendicular to the light ray, and P1 would be the brightest point.

The normal associated with P2, named N2, has an angle of around 30 degrees with the vector that points to the light source, so it should be darker than P1. Finally, the normal associated with P3, named N3, is also parallel to the vector that points to the light source, but the two vectors point in opposite directions. P3 has an angle of 180 degrees with the vector that points to the light source and should not get any light at all.

So it seems that we have a good approach to determine the light intensity that reaches a point, and it's related to the angle that the normal forms with the vector that points to the light source. How can we calculate this?

There's a mathematical operation that we can use: the dot product. This operation takes two vectors and produces a number (a scalar) that is positive if the angle between them is small and negative if the angle between them is wide. If both vectors are normalized, that is, both have a length equal to one, the dot product will be between −1 and 1. The dot product will be one if both vectors point in the same direction (angle 0), 0 if the vectors form a right angle, and −1 if the vectors point in opposite directions.

Let's define two vectors, v1 and v2, and let alpha be the angle between them. The dot product is defined by the following formula:

v1 · v2 = |v1| · |v2| · cos(alpha)


If both vectors are normalized, their lengths (their modules) are equal to one, so the dot product is equal to the cosine of the angle between them. We will use that operation to calculate the diffuse reflectance component.
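To make the relation concrete, here is a small sketch (illustrative names, plain arrays) that computes the dot product of normalized vectors; the three cases match the values listed above:

```java
// Dot product of two (already normalized) vectors: equals the cosine of the
// angle between them, so it ranges from 1 (same direction) to -1 (opposite).
public class DotDemo {

    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    public static void main(String[] args) {
        float[] up = {0, 1, 0};
        System.out.println(dot(up, new float[]{0, 1, 0}));  // 1.0  (angle 0)
        System.out.println(dot(up, new float[]{1, 0, 0}));  // 0.0  (right angle)
        System.out.println(dot(up, new float[]{0, -1, 0})); // -1.0 (opposite directions)
    }
}
```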

So we need to calculate the vector that points to the light source. How do we do this? We have the position of each point (the vertex position) and we have the position of the light source. First of all, both coordinates must be in the same coordinate space. To simplify, let's assume that they are both in world coordinate space; then those positions are the coordinates of the vectors that point to the vertex position (VP) and to the light source (VS), as shown in the next figure.

If we subtract VP from VS, we get the vector that we are looking for, which is called L.

Now we can compute the dot product between the vector that points to the light source and the normal. That product is called the Lambert term, after Johann Lambert, who was the first to propose that relation to model the brightness of a surface.

Let's summarize how we can calculate it. We define the following variables:

- vPos: position of our vertex in model view space coordinates.
- lPos: position of the light in view space coordinates.
- intensity: intensity of the light (from 0 to 1).
- lColour: colour of the light.
- normal: the vertex normal.

First we need to calculate the vector that points from the current position to the light source: toLightDirection = lPos − vPos. The result of that operation needs to be normalized.

Then we need to calculate the diffuse factor (a scalar): diffuseFactor = normal · toLightDirection. It's calculated as the dot product between two vectors; since we want it to be between −1 and 1, both vectors need to be normalized. Colours need to be between 0 and 1, so if the value is lower than 0 we set it to 0.

Finally we just need to modulate the diffuse colour by the light colour, the diffuse factor and the light intensity:

colour = diffuseColour ∗ lColour ∗ diffuseFactor ∗ intensity
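The steps above (subtract, normalize, dot product, clamp) can be sketched in plain Java; the names mirror the variables in the text, and the arrays stand in for the vector classes:

```java
// The diffuse calculation from the text:
//   toLightDirection = normalize(lPos - vPos)
//   diffuseFactor    = max(dot(normal, toLightDirection), 0)
public class DiffuseDemo {

    static float diffuseFactor(float[] vPos, float[] lPos, float[] normal) {
        float[] toLight = {lPos[0] - vPos[0], lPos[1] - vPos[1], lPos[2] - vPos[2]};
        float len = (float) Math.sqrt(
            toLight[0] * toLight[0] + toLight[1] * toLight[1] + toLight[2] * toLight[2]);
        toLight[0] /= len; toLight[1] /= len; toLight[2] /= len;
        float dot = normal[0] * toLight[0] + normal[1] * toLight[1] + normal[2] * toLight[2];
        return Math.max(dot, 0.0f); // clamp: a surface facing away gets no diffuse light
    }
}
```

A light directly above a surface whose normal points up gives a factor of 1 (the P1 case); a light directly below it gives 0 (the P3 case, clamped from −1).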

Specular component

Let's view now the specular component, but first we need to examine how light is reflected. When light hits a surface, some part of it is absorbed and the other part is reflected. If you remember from your physics class, reflection is when light bounces off an object.

Of course, surfaces are not totally polished, and if you look at a closer distance you will see a lot of imperfections. Besides that, you have many light rays (photons, in fact) that hit that surface and get reflected in a wide range of angles. Thus, what we see is like a beam of light being reflected from the surface. That is, light is diffused when hitting a surface, and that's the diffuse component that we have been talking about previously.


But when light hits a polished surface, for instance a metal, the light suffers lower diffusion and most of it gets reflected in the direction opposite to the one it hit that surface from.

This is what the specular component models, and it depends on the material characteristics. Regarding specular reflectance, it's important to note that the reflected light will only be visible if the camera is in a proper position, that is, if it's in the area where the reflected light is emitted.

Once the mechanism behind specular reflection has been explained, we are ready to calculate that component. First we need a vector that points from the light source to the vertex. When we were calculating the diffuse component we calculated just the opposite, a vector that points to the light source, toLightDirection, so let's calculate it as fromLightDirection = −(toLightDirection).

Then we need to calculate the reflected light that results from fromLightDirection hitting the surface, taking its normal into consideration. There's a GLSL function named reflect that does exactly that. So, reflectedLight = reflect(fromLightDirection, normal).

We also need a vector that points to the camera; let's name it cameraDirection. It is calculated as the difference between the camera position and the vertex position: cameraDirection = cameraPos − vPos. The camera position vector and the vertex position need to be in the same coordinate system, and the resulting vector needs to be normalized. The following figure sketches the main components we have calculated up to now.

Now we need to calculate the light intensity that we see, which we will call specularFactor. This component will be higher if the cameraDirection and reflectedLight vectors are parallel and point in the same direction, and will take its lowest value if they point in opposite directions. In order to calculate this, the dot product comes to the rescue again. So specularFactor = cameraDirection · reflectedLight. We only want this value to be between 0 and 1, so if it's lower than 0 it is set to 0.

We also need to take into consideration that this light must be more intense if the camera is pointing at the reflected light cone. This is achieved by raising the specularFactor to a parameter named specularPower:

specularFactor = specularFactor^specularPower

Finally we need to model the reflectivity of the material, which also modulates the intensity of the reflected light. This is done with another parameter named reflectance. So the colour contribution of the specular component will be: specularColour ∗ lColour ∗ reflectance ∗ specularFactor ∗ intensity.
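The same calculation can be sketched in plain Java, with reflect implemented the way GLSL defines it, reflect(I, N) = I − 2 · dot(N, I) · N (the class and method names are only illustrative):

```java
// Specular factor from the text, with reflect() implemented as in GLSL:
// reflect(I, N) = I - 2 * dot(N, I) * N.
public class SpecularDemo {

    static float[] reflect(float[] i, float[] n) {
        float d = n[0] * i[0] + n[1] * i[1] + n[2] * i[2];
        return new float[]{i[0] - 2 * d * n[0], i[1] - 2 * d * n[1], i[2] - 2 * d * n[2]};
    }

    static float specularFactor(float[] cameraDirection, float[] fromLightSource,
            float[] normal, float specularPower) {
        float[] r = reflect(fromLightSource, normal);
        float dot = cameraDirection[0] * r[0] + cameraDirection[1] * r[1] + cameraDirection[2] * r[2];
        return (float) Math.pow(Math.max(dot, 0.0f), specularPower);
    }

    public static void main(String[] args) {
        // Light coming straight down onto an upward-facing surface, with the
        // camera looking straight down at it: maximum specular highlight.
        float f = specularFactor(new float[]{0, 1, 0}, new float[]{0, -1, 0},
                new float[]{0, 1, 0}, 10.0f);
        System.out.println(f); // 1.0
    }
}
```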

Attenuation



We now know how to calculate the three components that will serve to model a point light with ambient light. But our light model is still not complete: the light that an object reflects is currently independent of the distance to the light source. We need to simulate light attenuation.

Attenuation is a function of the distance and the light. The intensity of light is inversely proportional to the square of the distance. That fact is easy to visualize: as light propagates, its energy is distributed over the surface of a sphere with a radius equal to the distance travelled by the light, and the surface of a sphere is proportional to the square of its radius. We can calculate the attenuation factor with this formula:

attenuationFactor = 1.0 / (atConstant + atLinear ∗ dist + atExponent ∗ dist²)

In order to simulate attenuation, we just need to multiply the final colour by that attenuation factor.
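The formula is short enough to check with concrete numbers. A sketch with the parameter names from the text (the class name is only illustrative); keeping atConstant at 1 means the factor is exactly 1 at distance 0:

```java
// Attenuation factor from the text:
//   1.0 / (atConstant + atLinear * dist + atExponent * dist * dist)
public class AttenuationDemo {

    static float attenuation(float atConstant, float atLinear, float atExponent, float dist) {
        return 1.0f / (atConstant + atLinear * dist + atExponent * dist * dist);
    }

    public static void main(String[] args) {
        // With atConstant = 1 and a purely quadratic term, light at distance 2
        // is attenuated to 1 / (1 + 4) = 0.2 of its original intensity.
        System.out.println(attenuation(1.0f, 0.0f, 1.0f, 2.0f)); // 0.2
    }
}
```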

Implementation

Now we can start coding all the concepts described above. We will start with our shaders. Most of the work will be done in the fragment shader, but we need to pass some data to it from the vertex shader. In the previous chapter the fragment shader just received the texture coordinates; now we are also going to pass two more parameters:

- The vertex normal (normalized) transformed to model view space coordinates.
- The vertex position transformed to model view space coordinates.

This is the code of the vertex shader.


#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;
out vec3 mvVertexNormal;
out vec3 mvVertexPos;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPos;
    outTexCoord = texCoord;
    mvVertexNormal = normalize(modelViewMatrix * vec4(vertexNormal, 0.0)).xyz;
    mvVertexPos = mvPos.xyz;
}

Before we continue with the fragment shader, there's a very important concept that must be highlighted. From the code above you can see that mvVertexNormal, the variable that contains the vertex normal, is transformed into model view space coordinates. This is done by multiplying vertexNormal by the modelViewMatrix, as with the vertex position. But there's a subtle difference: the w component of the vertex normal is set to 0 before multiplying it by the matrix: vec4(vertexNormal, 0.0). Why are we doing this? Because we do want the normal to be rotated and scaled, but we do not want it to be translated; we are only interested in its direction, not its position. This is achieved by setting its w component to 0, and it is one of the advantages of using homogeneous coordinates: by setting the w component we can control which transformations are applied. You can do the matrix multiplication by hand and see why this happens.
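Doing that multiplication by hand is straightforward. In a 4x4 transformation matrix the translation lives in the last column, and that column is multiplied by w, so with w = 0 it contributes nothing. A small sketch with plain arrays (column-vector convention, illustrative names):

```java
// Why w = 0 removes translation: the translation column of a 4x4 matrix is
// multiplied by w, so with w = 0 it contributes nothing to the result.
public class HomogeneousDemo {

    // Multiply a 4x4 matrix (row-major) by a 4-component column vector.
    static float[] mul(float[][] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            for (int col = 0; col < 4; col++) {
                r[row] += m[row][col] * v[col];
            }
        }
        return r;
    }

    public static void main(String[] args) {
        float[][] translate = {
            {1, 0, 0, 5},   // translation by (5, 7, 9)
            {0, 1, 0, 7},
            {0, 0, 1, 9},
            {0, 0, 0, 1}
        };
        float[] point = mul(translate, new float[]{1, 0, 0, 1});
        float[] dir = mul(translate, new float[]{1, 0, 0, 0});
        System.out.println(point[0]); // 6.0: positions (w = 1) are translated
        System.out.println(dir[0]);   // 1.0: directions (w = 0) are not
    }
}
```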

Now we can start to do the real work in our fragment shader. Besides declaring as input parameters the values that come from the vertex shader, we are going to define some useful structures to model light and material characteristics. First of all, we will define the structures that model the light.


struct Attenuation
{
    float constant;
    float linear;
    float exponent;
};

struct PointLight
{
    vec3 colour;
    // Light position is assumed to be in view coordinates
    vec3 position;
    float intensity;
    Attenuation att;
};

A point light is defined by a colour, a position, a number between 0 and 1 which models its intensity, and a set of parameters which model the attenuation equation.

The structure that models a material's characteristics is:

struct Material
{
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    int hasTexture;
    float reflectance;
};

A material is defined by a set of colours (used if we don't use a texture to colour the fragments):

- The colour used for the ambient component.
- The colour used for the diffuse component.
- The colour used for the specular component.

A material is also defined by a flag that controls whether it has an associated texture, and by a reflectance index. We will use the following uniforms in our fragment shader.

uniform sampler2D texture_sampler;
uniform vec3 ambientLight;
uniform float specularPower;
uniform Material material;
uniform PointLight pointLight;
uniform vec3 camera_pos;

We are creating new uniforms to set the following variables:

- The ambient light: a colour that will affect every fragment in the same way.
- The specular power (the exponent used in the equation presented when talking about the specular light).
- A point light.
- The material characteristics.
- The camera position in view space coordinates.

We will also define some global variables that will hold the material colour components to be used in the ambient, diffuse and specular calculations. We use these variables because, if the material has a texture, we will use the same colour for all the components, and we do not want to perform redundant texture lookups. The variables are defined like this:

vec4 ambientC;
vec4 diffuseC;
vec4 speculrC;

Now we can define a function that will set up these variables according to the material characteristics:

void setupColours(Material material, vec2 textCoord)
{
    if (material.hasTexture == 1)
    {
        ambientC = texture(texture_sampler, textCoord);
        diffuseC = ambientC;
        speculrC = ambientC;
    }
    else
    {
        ambientC = material.ambient;
        diffuseC = material.diffuse;
        speculrC = material.specular;
    }
}

Now we are going to define a function that, taking a point light, the vertex position and its normal as input, returns the colour contribution calculated for the diffuse and specular light components described previously.


vec4 calcPointLight(PointLight light, vec3 position, vec3 normal)
{
    vec4 diffuseColour = vec4(0, 0, 0, 0);
    vec4 specColour = vec4(0, 0, 0, 0);

    // Diffuse Light
    vec3 light_direction = light.position - position;
    vec3 to_light_source = normalize(light_direction);
    float diffuseFactor = max(dot(normal, to_light_source), 0.0);
    diffuseColour = diffuseC * vec4(light.colour, 1.0) * light.intensity * diffuseFactor;

    // Specular Light
    vec3 camera_direction = normalize(-position);
    vec3 from_light_source = -to_light_source;
    vec3 reflected_light = normalize(reflect(from_light_source, normal));
    float specularFactor = max(dot(camera_direction, reflected_light), 0.0);
    specularFactor = pow(specularFactor, specularPower);
    specColour = speculrC * specularFactor * material.reflectance * vec4(light.colour, 1.0);

    // Attenuation
    float distance = length(light_direction);
    float attenuationInv = light.att.constant + light.att.linear * distance +
        light.att.exponent * distance * distance;
    return (diffuseColour + specColour) / attenuationInv;
}

The previous code is relatively straightforward: it calculates a colour for the diffuse component, another one for the specular component, and modulates them by the attenuation suffered by the light on its travel to the vertex we are processing.

Please be aware that the vertex coordinates are in view space. When calculating the specular component, we must get the direction to the point of view, that is, the camera. This could be done like this:

vec3 camera_direction = normalize(camera_pos - position);

But, since position is in view space, the camera position is always at the origin, that is, (0, 0, 0), so we calculate it like this:

vec3 camera_direction = normalize(vec3(0, 0, 0) - position);

Which can be simplified like this:


vec3 camera_direction = normalize(-position);

With the previous function, the main function of the fragment shader is very simple.

void main()
{
    setupColours(material, outTexCoord);
    vec4 diffuseSpecularComp = calcPointLight(pointLight, mvVertexPos, mvVertexNormal);
    fragColor = ambientC * vec4(ambientLight, 1) + diffuseSpecularComp;
}

The call to the setupColours function will set up the ambientC, diffuseC and speculrC variables with the appropriate colours. Then we calculate the diffuse and specular components, taking the attenuation into consideration. We do this using a single function call for convenience, as explained above. The final colour is calculated by adding the ambient component (multiplying ambientC by the ambient light). As you can see, the ambient light is not affected by attenuation.

We have introduced some new concepts into our shader that require further explanation: we are defining structures and using them as uniforms. How do we pass those structures? First of all, we will define two new classes that model the properties of a point light and a material, named, oh surprise, PointLight and Material. They are just plain POJOs, so you can check them in the source code that accompanies this book. Then we need to create new methods in the ShaderProgram class, first to be able to create the uniforms for the point light and material structures.

public void createPointLightUniform(String uniformName) throws Exception {
    createUniform(uniformName + ".colour");
    createUniform(uniformName + ".position");
    createUniform(uniformName + ".intensity");
    createUniform(uniformName + ".att.constant");
    createUniform(uniformName + ".att.linear");
    createUniform(uniformName + ".att.exponent");
}

public void createMaterialUniform(String uniformName) throws Exception {
    createUniform(uniformName + ".ambient");
    createUniform(uniformName + ".diffuse");
    createUniform(uniformName + ".specular");
    createUniform(uniformName + ".hasTexture");
    createUniform(uniformName + ".reflectance");
}


As you can see, it’s very simple: we just create a separate uniform for each of the attributes that compose the structure. Now we need to create another two methods to set the values of those uniforms, which will take PointLight and Material instances as parameters.

public void setUniform(String uniformName, PointLight pointLight) {
    setUniform(uniformName + ".colour", pointLight.getColor());
    setUniform(uniformName + ".position", pointLight.getPosition());
    setUniform(uniformName + ".intensity", pointLight.getIntensity());
    PointLight.Attenuation att = pointLight.getAttenuation();
    setUniform(uniformName + ".att.constant", att.getConstant());
    setUniform(uniformName + ".att.linear", att.getLinear());
    setUniform(uniformName + ".att.exponent", att.getExponent());
}

public void setUniform(String uniformName, Material material) {
    setUniform(uniformName + ".ambient", material.getAmbientColour());
    setUniform(uniformName + ".diffuse", material.getDiffuseColour());
    setUniform(uniformName + ".specular", material.getSpecularColour());
    setUniform(uniformName + ".hasTexture", material.isTextured() ? 1 : 0);
    setUniform(uniformName + ".reflectance", material.getReflectance());
}

In this chapter’s source code you will also see that we have modified the Mesh class to hold a Material instance, and that we have created a simple example which creates a point light that can be moved with the “N” and “M” keys, in order to show how a point light focused over a mesh with a reflectance value higher than 0 looks.

Let's get back to our fragment shader. As we have said, we need another uniform which contains the camera position, camera_pos. These coordinates must be in view space. Usually we will set up light coordinates in world space, so we need to multiply them by the view matrix in order to be able to use them in our shader. Therefore, we need to create a new method in the Transformation class that returns the view matrix, so we can transform light coordinates.

// Get a copy of the light object and transform its position to view coordinates
PointLight currPointLight = new PointLight(pointLight);
Vector3f lightPos = currPointLight.getPosition();
Vector4f aux = new Vector4f(lightPos, 1);
aux.mul(viewMatrix);
lightPos.x = aux.x;
lightPos.y = aux.y;
lightPos.z = aux.z;
shaderProgram.setUniform("pointLight", currPointLight);


We will not include the whole source code, because this chapter would be too long and it would not contribute much to clarifying the concepts explained here. You can check it in the source code that accompanies this book.


Let there be even more light

In this chapter we are going to implement the other light types that we introduced in the previous chapter. We will start with directional lighting.

Directional Light

If you recall, directional lighting hits all the objects by parallel rays, all coming from the same direction. It models light sources that are far away but have a high intensity, such as the Sun.

Another characteristic of directional light is that it is not affected by attenuation. Think again about sunlight: all objects hit by light rays are illuminated with the same intensity, since the distance from the sun is so huge that the position of the objects is irrelevant. In fact, directional lights are modelled as light sources placed at infinity; if they were affected by attenuation they would have no effect on any object (their colour contribution would be equal to 0).

Besides that, directional light is also composed of diffuse and specular components. The only differences with point lights are that it does not have a position but a direction, and that it is not affected by attenuation. Let’s get back to the direction attribute of directional light, and imagine we are modelling the movement of the sun across our 3D world. If we assume that north is placed towards the increasing z-axis, the following picture shows the direction to the light source at dawn, midday and dusk.


Light directions for the above positions are:

Dawn: (-1, 0, 0)
Midday: (0, 1, 0)
Dusk: (1, 0, 0)

Side note: You may think that the coordinates above are equal to position coordinates, but they model a vector, a direction, not a position. From the mathematical point of view a vector and a position are not distinguishable, but they have a totally different meaning.

But how do we model the fact that this light is located at infinity? The answer is by using the w coordinate, that is, by using homogeneous coordinates and setting the w coordinate to 0:

Dawn: (-1, 0, 0, 0)
Midday: (0, 1, 0, 0)
Dusk: (1, 0, 0, 0)

This is the same case as when we pass the normals: for normals we set the w component to 0 to state that we are not interested in displacements, just in the direction. Also, when we deal with directional light we need to do the same; camera translations should not affect the direction of a directional light.
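The effect of the w component can be checked with plain Java (no JOML needed, the tiny matrix helpers below are just for illustration): multiplying a translation matrix by a vector with w = 1 moves it, while the same matrix leaves a w = 0 direction untouched.

```java
public class HomogeneousDemo {

    // Multiply a 4x4 row-major matrix by a 4-component vector
    static float[] transform(float[][] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            for (int col = 0; col < 4; col++) {
                r[row] += m[row][col] * v[col];
            }
        }
        return r;
    }

    // Translation by (tx, ty, tz), identity rotation
    static float[][] translation(float tx, float ty, float tz) {
        return new float[][] {
            {1, 0, 0, tx},
            {0, 1, 0, ty},
            {0, 0, 1, tz},
            {0, 0, 0, 1}
        };
    }

    public static void main(String[] args) {
        float[][] t = translation(5, 0, 0);
        float[] position  = transform(t, new float[]{-1, 0, 0, 1}); // w = 1: translated
        float[] direction = transform(t, new float[]{-1, 0, 0, 0}); // w = 0: unchanged
        System.out.println(position[0]);  // 4.0
        System.out.println(direction[0]); // -1.0
    }
}
```

This is exactly why the dawn direction stays (-1, 0, 0) no matter where the camera (and thus the view translation) is.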

So let’s start coding and model our directional light. The first thing we are going to do is create a class that models its attributes. It will be another POJO with a copy constructor which stores the direction, the colour and the intensity.


package org.lwjglb.engine.graph;

import org.joml.Vector3f;

public class DirectionalLight {

    private Vector3f color;

    private Vector3f direction;

    private float intensity;

    public DirectionalLight(Vector3f color, Vector3f direction, float intensity) {
        this.color = color;
        this.direction = direction;
        this.intensity = intensity;
    }

    public DirectionalLight(DirectionalLight light) {
        this(new Vector3f(light.getColor()), new Vector3f(light.getDirection()), light.getIntensity());
    }

    // Getters and setters beyond this point...

As you can see, we are still using a Vector3f to model the direction. Keep calm; we will deal with the w component when we transfer the directional light to the shader. By the way, the next thing we will do is update the ShaderProgram class to create and update the uniform that will hold the directional light.

In our fragment shader we will define a structure that models a directional light.

struct DirectionalLight
{
    vec3 colour;
    vec3 direction;
    float intensity;
};

With that definition, the new methods in the ShaderProgram class are straightforward.


// ...

public void createDirectionalLightUniform(String uniformName) throws Exception {
    createUniform(uniformName + ".colour");
    createUniform(uniformName + ".direction");
    createUniform(uniformName + ".intensity");
}

// ...

public void setUniform(String uniformName, DirectionalLight dirLight) {
    setUniform(uniformName + ".colour", dirLight.getColor());
    setUniform(uniformName + ".direction", dirLight.getDirection());
    setUniform(uniformName + ".intensity", dirLight.getIntensity());
}

Now we need to use that uniform. We will model how the sun appears to move across the sky by controlling its angle in our DummyGame class.

We need to update the light direction so that when the sun is at dawn (-90°) its direction is (-1, 0, 0), its x coordinate progressively increases from -1 to 0, and its y coordinate increases to 1 as it approaches midday. Then the x coordinate increases to 1 and the y coordinate decreases to 0 again. This can be done by setting the x coordinate to the sine of the angle and the y coordinate to the cosine of the angle.
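The sine/cosine mapping can be checked with plain Java; the helper below is made up for illustration and is not part of the book's code:

```java
public class SunDirection {

    // Returns the (x, y) light direction for a sun angle in degrees:
    // -90 (dawn) -> (-1, 0), 0 (midday) -> (0, 1), 90 (dusk) -> (1, 0)
    static float[] direction(float angleDegrees) {
        double angRad = Math.toRadians(angleDegrees);
        return new float[]{(float) Math.sin(angRad), (float) Math.cos(angRad)};
    }

    public static void main(String[] args) {
        System.out.println(direction(-90)[0]); // -1.0: at dawn the light points along -x
        System.out.println(direction(0)[1]);   // 1.0: at midday it points straight up
    }
}
```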


We will also modulate the light intensity: the intensity will increase as the sun moves away from dawn and decrease as it approaches dusk. We will simulate night by setting the intensity to 0. Besides that, we will also modulate the colour so the light gets redder at dawn and at dusk. This will be done in the update method of the DummyGame class.

// Update directional light direction, intensity and colour
lightAngle += 1.1f;
if (lightAngle > 90) {
    directionalLight.setIntensity(0);
    if (lightAngle >= 360) {
        lightAngle = -90;
    }
} else if (lightAngle <= -80 || lightAngle >= 80) {
    float factor = 1 - (float) (Math.abs(lightAngle) - 80) / 10.0f;
    directionalLight.setIntensity(factor);
    directionalLight.getColor().y = Math.max(factor, 0.9f);
    directionalLight.getColor().z = Math.max(factor, 0.5f);
} else {
    directionalLight.setIntensity(1);
    directionalLight.getColor().x = 1;
    directionalLight.getColor().y = 1;
    directionalLight.getColor().z = 1;
}
double angRad = Math.toRadians(lightAngle);
directionalLight.getDirection().x = (float) Math.sin(angRad);
directionalLight.getDirection().y = (float) Math.cos(angRad);

Then we need to pass the directional light to our shaders in the render method of the Renderer class.


// Get a copy of the directional light object and transform its position to view coordinates
DirectionalLight currDirLight = new DirectionalLight(directionalLight);
Vector4f dir = new Vector4f(currDirLight.getDirection(), 0);
dir.mul(viewMatrix);
currDirLight.setDirection(new Vector3f(dir.x, dir.y, dir.z));
shaderProgram.setUniform("directionalLight", currDirLight);

As you can see, we need to transform the light direction coordinates to view space, but we set the w component to 0 since we are not interested in applying translations.

Now we are ready to do the real work, which will take place in the fragment shader, since the vertex shader does not need to be modified. We have already stated above that we need to define a new struct, named DirectionalLight, to model a directional light, and we will need a new uniform for that.

uniform DirectionalLight directionalLight;

We need to refactor our code a little bit. In the previous chapter we had a function called calcPointLight that calculated the diffuse and specular components and also applied the attenuation. As we have explained, directional light also contributes to the diffuse and specular components but is not affected by attenuation, so we will create a new function named calcLightColour that just calculates those components.


vec4 calcLightColour(vec3 light_colour, float light_intensity, vec3 position, vec3 to_light_dir, vec3 normal)
{
    vec4 diffuseColour = vec4(0, 0, 0, 0);
    vec4 specColour = vec4(0, 0, 0, 0);

    // Diffuse Light
    float diffuseFactor = max(dot(normal, to_light_dir), 0.0);
    diffuseColour = diffuseC * vec4(light_colour, 1.0) * light_intensity * diffuseFactor;

    // Specular Light
    vec3 camera_direction = normalize(camera_pos - position);
    vec3 from_light_dir = -to_light_dir;
    vec3 reflected_light = normalize(reflect(from_light_dir, normal));
    float specularFactor = max(dot(camera_direction, reflected_light), 0.0);
    specularFactor = pow(specularFactor, specularPower);
    specColour = speculrC * light_intensity * specularFactor * material.reflectance * vec4(light_colour, 1.0);

    return (diffuseColour + specColour);
}

Then the calcPointLight function applies the attenuation factor to the light colour calculated in the previous function.

vec4 calcPointLight(PointLight light, vec3 position, vec3 normal)
{
    vec3 light_direction = light.position - position;
    vec3 to_light_dir = normalize(light_direction);
    vec4 light_colour = calcLightColour(light.colour, light.intensity, position, to_light_dir, normal);

    // Apply Attenuation
    float distance = length(light_direction);
    float attenuationInv = light.att.constant + light.att.linear * distance +
        light.att.exponent * distance * distance;
    return light_colour / attenuationInv;
}

We will also create a new function to calculate the effect of a directional light, which just invokes the calcLightColour function with the light direction.


vec4 calcDirectionalLight(DirectionalLight light, vec3 position, vec3 normal)
{
    return calcLightColour(light.colour, light.intensity, position, normalize(light.direction), normal);
}

Finally, our main function just aggregates the colour components of the ambient, point and directional lights to calculate the fragment colour.

void main()
{
    setupColours(material, outTexCoord);

    vec4 diffuseSpecularComp = calcDirectionalLight(directionalLight, mvVertexPos, mvVertexNormal);
    diffuseSpecularComp += calcPointLight(pointLight, mvVertexPos, mvVertexNormal);

    fragColor = ambientC * vec4(ambientLight, 1) + diffuseSpecularComp;
}

And that’s it. We can now simulate the movement of the (artificial) sun across the sky and get something like this (movement is accelerated so it can be viewed without waiting too long).

Spot Light

Now we will implement spot lights, which are very similar to point lights, except that the emitted light is restricted to a 3D cone. They model the light that comes out of spotlights or any other light source that does not emit in all directions. A spot light has the same attributes as a point light but adds two new parameters: the cone angle and the cone direction.


A spot light's contribution is calculated in the same way as a point light's, with some exceptions. Points for which the vector from the vertex position to the light source is not contained inside the light cone are not affected by the light.

How do we calculate whether it’s inside the light cone or not? We need to do a dot product again, this time between the vector that points from the light source and the cone direction vector (both of them normalized).


The dot product between the L and C vectors is equal to: L · C = |L| · |C| · cos(α). If, in our spot light definition, we store the cosine of the cutoff angle, then if the dot product is higher than that value we will know that the point is inside the light cone (recall the cosine graph: when the α angle is 0, the cosine is 1; the smaller the angle, the higher the cosine).

The second difference is that points that are far away from the cone vector will receive less light; that is, the attenuation will be higher. There are several ways of calculating this; we will choose a simple approach, multiplying the attenuation by the following factor:

1 − (1 − cos(α)) / (1 − cos(cutOffAngle))

(In our fragment shaders we won’t have the angle but the cosine of the cutoff angle. You can check that the formula above produces values from 0 to 1: 0 when the angle is equal to the cutoff angle, and 1 when the angle is 0.)
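The cone test and the attenuation factor above can be sketched in plain Java (the class and method names are made up for illustration; the book implements this in GLSL):

```java
public class SpotFactor {

    // Returns 0 when the point lies outside the cone; otherwise the factor
    // 1 - (1 - cosAlpha) / (1 - cosCutoff), which goes from 0 at the cone
    // edge up to 1 along the cone axis.
    static float spotFactor(float cosAlpha, float cosCutoff) {
        if (cosAlpha <= cosCutoff) {
            return 0f; // outside the light cone: no contribution
        }
        return 1f - (1f - cosAlpha) / (1f - cosCutoff);
    }

    public static void main(String[] args) {
        float cosCutoff = 0.5f; // a 60-degree cutoff angle
        System.out.println(spotFactor(1.0f, cosCutoff));      // 1.0: on the cone axis
        System.out.println(spotFactor(cosCutoff, cosCutoff)); // 0.0: exactly at the edge
        System.out.println(spotFactor(0.0f, cosCutoff));      // 0.0: 90 degrees away, outside
    }
}
```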

The implementation will be very similar to that of the other lights. We need to create a new class named SpotLight, set up the appropriate uniforms, pass it to the shader and modify the fragment shader to use it. You can check the source code for this chapter.

Another important thing when passing the uniforms is that translations should not be applied to the light cone direction, since we are only interested in directions. So, as in the case of the directional light, when transforming to view space coordinates we must set the w component to 0.



Multiple Lights

So at last we have implemented all four types of light, but currently we can only use one instance of each type. This is OK for ambient and directional light, but we definitely want to use several point and spot lights. We need to set up our fragment shader to receive a list of lights, so we will use arrays to store that information. Let’s see how this can be done.

Before we start, it’s important to note that in GLSL the length of an array must be set at compile time, so it must be big enough to accommodate all the objects we will need later, at runtime. The first thing we will do is define some constants to set the maximum number of point and spot lights that we are going to use.

const int MAX_POINT_LIGHTS = 5;
const int MAX_SPOT_LIGHTS = 5;

Then we need to modify the uniforms that previously stored just a single point and spot light to use arrays.

uniform PointLight pointLights[MAX_POINT_LIGHTS];
uniform SpotLight spotLights[MAX_SPOT_LIGHTS];

In the main function we just need to iterate over those arrays to calculate the colour contribution of each instance using the existing functions. We may not pass as many lights as the array length, so we need to control this. There are many possible ways to do it; one is to pass a uniform with the actual array length, but this may not work with older graphics cards. Instead, we will check the light intensity (empty positions in the array will have a light intensity equal to 0).


for (int i = 0; i < MAX_POINT_LIGHTS; i++)
{
    if (pointLights[i].intensity > 0)
    {
        diffuseSpecularComp += calcPointLight(pointLights[i], mvVertexPos, mvVertexNormal);
    }
}

for (int i = 0; i < MAX_SPOT_LIGHTS; i++)
{
    if (spotLights[i].pl.intensity > 0)
    {
        diffuseSpecularComp += calcSpotLight(spotLights[i], mvVertexPos, mvVertexNormal);
    }
}

Now we need to create those uniforms in the Renderer class. When we are using arrays we need to create a uniform for each element of the list. So, for instance, for the pointLights array we need to create uniforms named pointLights[0], pointLights[1], etc. And of course, this also translates to the structure attributes, so we will have pointLights[0].colour, pointLights[1].colour, etc. The methods to create those uniforms are as follows.

public void createPointLightListUniform(String uniformName, int size) throws Exception {
    for (int i = 0; i < size; i++) {
        createPointLightUniform(uniformName + "[" + i + "]");
    }
}

public void createSpotLightListUniform(String uniformName, int size) throws Exception {
    for (int i = 0; i < size; i++) {
        createSpotLightUniform(uniformName + "[" + i + "]");
    }
}

We also need methods to set the values of those uniforms.


public void setUniform(String uniformName, PointLight[] pointLights) {
    int numLights = pointLights != null ? pointLights.length : 0;
    for (int i = 0; i < numLights; i++) {
        setUniform(uniformName, pointLights[i], i);
    }
}

public void setUniform(String uniformName, PointLight pointLight, int pos) {
    setUniform(uniformName + "[" + pos + "]", pointLight);
}

public void setUniform(String uniformName, SpotLight[] spotLights) {
    int numLights = spotLights != null ? spotLights.length : 0;
    for (int i = 0; i < numLights; i++) {
        setUniform(uniformName, spotLights[i], i);
    }
}

public void setUniform(String uniformName, SpotLight spotLight, int pos) {
    setUniform(uniformName + "[" + pos + "]", spotLight);
}

Finally, we just need to update the Renderer class to receive a list of point and spot lights, and modify the DummyGame class accordingly to create those lists, to see something like this.


Game HUD

In this chapter we will create a HUD (Heads-Up Display) for our game, that is, a set of 2D shapes and text that is displayed at any time over the 3D scene to show relevant information. We will create a simple HUD that will serve to show some basic techniques for representing that information.

When you examine the source code for this chapter, you will also see that some small refactoring has been applied. The changes especially affect the Renderer class, in order to prepare it for HUD rendering.

Text rendering

The first thing we will do to create a HUD is render text. In order to do that, we are going to map a texture that contains alphabet characters onto a quad. That quad will be divided into a set of tiles, each of which will represent a single letter. Later on, we will use that texture to draw the text on the screen. So the first step is to create the texture that contains the alphabet. You can use many programs out there for this task, such as F2IBuilder; in this case, we will use Codehead’s Bitmap Font Generator (CBFG).

CBFG lets you configure many options, such as the texture size, the font type, the anti-aliasing to be applied, etc. The following figure depicts the configuration that we will use to generate a texture file. In this chapter we will assume that we will be rendering text encoded in ISO-8859-1 format; if you need to deal with different character sets you will need to tweak the code a little bit.


When you have finished configuring all the settings in CBFG you can export the result to several image formats. In this case we will export it as a BMP file and then transform it to PNG so it can be loaded as a texture. When transforming it to PNG we will also set the black background as transparent; that is, we will set the black colour to have an alpha value equal to 0 (you can use tools like GIMP to do that). At the end you will have something similar to the following picture.

As you can see, the image has all the characters laid out in rows and columns. In this case the image is composed of 15 columns and 17 rows. By using the character code of a specific letter we can calculate the row and the column where it is enclosed in the image. The column can be calculated as follows: column = code mod numberOfColumns, where mod is the modulo operator. The row can be calculated as follows: row = code / numberOfCols; in this case we do an integer division, so we can ignore the decimal part.
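These two formulas can be checked with a few lines of plain Java (the helper below is hypothetical, just to illustrate the arithmetic with the 15-column grid used in this chapter):

```java
public class CharTile {

    // Column and row of a character inside a font texture laid out as a grid:
    // column = code mod numberOfColumns, row = code / numberOfColumns (integer division)
    static int[] tile(char code, int numCols) {
        int col = code % numCols;
        int row = code / numCols;
        return new int[]{col, row};
    }

    public static void main(String[] args) {
        int[] t = tile('A', 15); // 'A' has code 65: 65 mod 15 = 5, 65 / 15 = 4
        System.out.println(t[0] + ", " + t[1]); // 5, 4
    }
}
```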

We will create a new class named TextItem that will construct all the graphical elements needed to render text. This is a simplified version that does not deal with multi-line texts, etc., but it will allow us to present textual information in the HUD. Here you can see the first lines and the constructor of this class.

package org.lwjglb.engine;

import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.List;
import org.lwjglb.engine.graph.Material;
import org.lwjglb.engine.graph.Mesh;
import org.lwjglb.engine.graph.Texture;

public class TextItem extends GameItem {

    private static final float ZPOS = 0.0f;

    private static final int VERTICES_PER_QUAD = 4;

    private String text;

    private final int numCols;

    private final int numRows;

    public TextItem(String text, String fontFileName, int numCols, int numRows) throws Exception {
        super();
        this.text = text;
        this.numCols = numCols;
        this.numRows = numRows;
        Texture texture = new Texture(fontFileName);
        this.setMesh(buildMesh(texture, numCols, numRows));
    }

As you can see, this class extends the GameItem class. This is because we will be interested in changing the text position on the screen and may also need to scale and rotate it. The constructor receives the text to be displayed and the relevant data of the texture file that will be used to render it (the file that contains the image and the number of columns and rows).

In the constructor we load the texture image file and invoke a method that will create a Mesh instance that models our text. Let’s examine the buildMesh method.


private Mesh buildMesh(Texture texture, int numCols, int numRows) {
    byte[] chars = text.getBytes(Charset.forName("ISO-8859-1"));
    int numChars = chars.length;

    List<Float> positions = new ArrayList();
    List<Float> textCoords = new ArrayList();
    float[] normals = new float[0];
    List<Integer> indices = new ArrayList();

    float tileWidth = (float) texture.getWidth() / (float) numCols;
    float tileHeight = (float) texture.getHeight() / (float) numRows;

The first lines of code create the data structures that will be used to store the positions, texture coordinates, normals and indices of the Mesh. In this case we will not apply lighting, so the normals array will be empty. What we are going to do is construct a quad composed of a set of tiles, each of them representing a single character. We also need to assign the appropriate texture coordinates depending on the character code associated with each tile. The following picture shows the different elements that compose the tiles and the quad.

So, for each character we need to create a tile which is formed by two triangles, which can be defined by using four vertices (V1, V2, V3 and V4). The indices will be (0, 1, 2) for the first triangle (the lower one) and (3, 0, 2) for the other triangle (the upper one). Texture coordinates are calculated based on the column and the row associated with each character in the texture image. Texture coordinates need to be in the range [0, 1], so we just need to divide the current row or the current column by the total number of rows or columns to get the coordinate associated with V1. For the rest of the vertices we just need to increase the current column or row by one in order to get the appropriate coordinate.

The following loop creates all the vertex positions, texture coordinates and indices associated with the quad that contains the text.


for (int i = 0; i < numChars; i++) {
    byte currChar = chars[i];
    int col = currChar % numCols;
    int row = currChar / numCols;

    // Build a character tile composed by two triangles

    // Left Top vertex
    positions.add((float) i * tileWidth); // x
    positions.add(0.0f); // y
    positions.add(ZPOS); // z
    textCoords.add((float) col / (float) numCols);
    textCoords.add((float) row / (float) numRows);
    indices.add(i * VERTICES_PER_QUAD);

    // Left Bottom vertex
    positions.add((float) i * tileWidth); // x
    positions.add(tileHeight); // y
    positions.add(ZPOS); // z
    textCoords.add((float) col / (float) numCols);
    textCoords.add((float) (row + 1) / (float) numRows);
    indices.add(i * VERTICES_PER_QUAD + 1);

    // Right Bottom vertex
    positions.add((float) i * tileWidth + tileWidth); // x
    positions.add(tileHeight); // y
    positions.add(ZPOS); // z
    textCoords.add((float) (col + 1) / (float) numCols);
    textCoords.add((float) (row + 1) / (float) numRows);
    indices.add(i * VERTICES_PER_QUAD + 2);

    // Right Top vertex
    positions.add((float) i * tileWidth + tileWidth); // x
    positions.add(0.0f); // y
    positions.add(ZPOS); // z
    textCoords.add((float) (col + 1) / (float) numCols);
    textCoords.add((float) row / (float) numRows);
    indices.add(i * VERTICES_PER_QUAD + 3);

    // Add indices for the left top and right bottom vertices
    indices.add(i * VERTICES_PER_QUAD);
    indices.add(i * VERTICES_PER_QUAD + 2);
}

There are some important things to notice in the previous fragment of code:

We will represent the vertices using screen coordinates (remember that the origin of the screen coordinates is located at the top left corner). The y coordinate of the vertices on top of the triangles is lower than the y coordinate of the vertices on the bottom of the triangles.


We don’t scale the shape, so each tile is at an x distance equal to a character width. The height of the triangles will be the height of each character. This is because we want to represent the text as similarly as possible to the original texture. (Anyway, we can later scale the result since the TextItem class inherits from GameItem.) We set a fixed value for the z coordinate, since it is irrelevant for drawing this object.

The next figure shows the coordinates of some vertices.

Why do we use screen coordinates? First of all, because we will be rendering 2D objects in our HUD and it is often handier to use them, and secondly because we will use an orthographic projection in order to draw them. We will explain what an orthographic projection is later on.

The TextItem class is completed with other methods to get the text and to change it at runtime. Whenever the text is changed, we need to clean up the previous VAOs (stored in the Mesh instance) and create new ones. We do not need to destroy the texture, so we have created a new method in the Mesh class to just remove that data.

public String getText() {
    return text;
}

public void setText(String text) {
    this.text = text;
    Texture texture = this.getMesh().getMaterial().getTexture();
    this.getMesh().deleteBuffers();
    this.setMesh(buildMesh(texture, numCols, numRows));
}


Now that we have set up the infrastructure needed to draw text, how do we do it? The basis is to first render the 3D scene, as in the previous chapters, and then render the 2D HUD over it. In order to render the HUD we will use an orthographic projection (also named orthogonal projection). An orthographic projection is a 2D representation of a 3D object. You may already have seen some samples in blueprints of 3D objects, which show the representation of those objects from the top or from some sides. The following picture shows the orthographic projection of a cylinder from the top and from the front.

This projection is very convenient for drawing 2D objects because it “ignores” the values of the z coordinate, that is, the distance to the viewer. With this projection, object sizes do not decrease with distance (as they do in a perspective projection). In order to project an object using an orthographic projection we will need to use another matrix, the orthographic matrix, whose formula is shown below.
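For bounds left (l), right (r), bottom (b) and top (t), the standard 2D orthographic projection matrix (this matches what JOML's setOrtho2D builds, with implicit near = -1 and far = 1) is:

```latex
\begin{bmatrix}
\frac{2}{r-l} & 0 & 0 & -\frac{r+l}{r-l} \\
0 & \frac{2}{t-b} & 0 & -\frac{t+b}{t-b} \\
0 & 0 & -1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
```

It simply rescales and shifts each axis so that the rectangle [l, r] × [b, t] maps onto the [-1, 1] range that OpenGL expects.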

This matrix also corrects the distortions that would otherwise be generated due to the fact that our window is not always a perfect square but a rectangle. The right and bottom parameters will be the screen size; the left and top ones will be the origin. The orthographic projection matrix is used to transform screen coordinates to 3D space coordinates. The following picture shows how this mapping is done.


The properties of this matrix will allow us to use screen coordinates.

We can now continue with the implementation of the HUD. The next thing we should do is create another set of shaders, a vertex and a fragment shader, in order to draw the objects of the HUD. The vertex shader is actually very simple.

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;

uniform mat4 projModelMatrix;

void main()
{
    gl_Position = projModelMatrix * vec4(position, 1.0);
    outTexCoord = texCoord;
}

It will just receive the vertex positions, the texture coordinates, the indices and the normals, and will transform them to 3D space coordinates using a matrix. That matrix is the multiplication of the orthographic projection matrix and the model matrix, projModelMatrix = orthographicMatrix · modelMatrix. Since we are not doing anything with the coordinates in model space, it’s much more efficient to multiply both matrices in Java code than in the shader. By doing so, we perform that multiplication once per item instead of once per vertex. Remember that our vertices should be expressed in screen coordinates.
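The efficiency argument rests on associativity: computing ortho · model once and applying the result to every vertex gives the same answer as applying the model and then the ortho matrix per vertex. A plain-Java check, with hand-rolled row-major matrices standing in for JOML (the matrix values are illustrative, not the real projection):

```java
public class PreMultiplyDemo {

    // 4x4 matrix product
    static float[][] mul(float[][] a, float[][] b) {
        float[][] r = new float[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }

    // 4x4 matrix times 4-component vector
    static float[] mulVec(float[][] m, float[] v) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                r[i] += m[i][j] * v[j];
        return r;
    }

    public static void main(String[] args) {
        float[][] ortho = {{0.5f, 0, 0, -1}, {0, -0.5f, 0, 1}, {0, 0, -1, 0}, {0, 0, 0, 1}};
        float[][] model = {{2, 0, 0, 10}, {0, 2, 0, 20}, {0, 0, 2, 0}, {0, 0, 0, 1}};
        float[] v = {3, 4, 0, 1};

        float[] oncePerItem   = mulVec(mul(ortho, model), v); // one matrix product per item
        float[] oncePerVertex = mulVec(ortho, mulVec(model, v)); // two products per vertex
        System.out.println(oncePerItem[0] == oncePerVertex[0]
            && oncePerItem[1] == oncePerVertex[1]); // true
    }
}
```

With thousands of vertices per item, doing the matrix-matrix product once in Java and a single matrix-vector product per vertex in the shader is the cheaper split.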

The fragment shader is also very simple.


#version 330

in vec2 outTexCoord;
in vec3 mvPos;
out vec4 fragColor;

uniform sampler2D texture_sampler;
uniform vec4 colour;

void main()
{
    fragColor = colour * texture(texture_sampler, outTexCoord);
}

It just samples the texture using the texture coordinates and multiplies that colour by a base colour. This can be used to change the colour of the rendered text without the need to create several texture files. Now that we have created the new pair of shaders, we can use them in the Renderer class. But before that, we will create a new interface named IHud that will contain all the elements to be displayed in the HUD. This interface will also provide a default cleanup method.

package org.lwjglb.engine;

public interface IHud {

    GameItem[] getGameItems();

    default void cleanup() {
        GameItem[] gameItems = getGameItems();
        for (GameItem gameItem : gameItems) {
            gameItem.getMesh().cleanUp();
        }
    }
}

By using that interface, our different games can define custom HUDs, but the rendering mechanism does not need to be changed. Now we can get back to the Renderer class, which, by the way, has been moved to the engine graphics package because it is now generic enough not to depend on the specific implementation of each game. In the Renderer class we have added a new method to create, link and set up a new ShaderProgram that uses the shaders described above.


private void setupHudShader() throws Exception {
    hudShaderProgram = new ShaderProgram();
    hudShaderProgram.createVertexShader(Utils.loadResource("/shaders/hud_vertex.vs"));
    hudShaderProgram.createFragmentShader(Utils.loadResource("/shaders/hud_fragment.fs"));
    hudShaderProgram.link();

    // Create uniforms for the orthographic-model projection matrix and base colour
    hudShaderProgram.createUniform("projModelMatrix");
    hudShaderProgram.createUniform("colour");
}

The render method first invokes the renderScene method, which contains the code from the previous chapter that rendered the 3D scene, and then a new method, named renderHud, to render the HUD.

public void render(Window window, Camera camera, GameItem[] gameItems,
    SceneLight sceneLight, IHud hud) {

    clear();

    if (window.isResized()) {
        glViewport(0, 0, window.getWidth(), window.getHeight());
        window.setResized(false);
    }

    renderScene(window, camera, gameItems, sceneLight);

    renderHud(window, hud);
}

The renderHud method is as follows:


private void renderHud(Window window, IHud hud) {
    hudShaderProgram.bind();

    Matrix4f ortho = transformation.getOrthoProjectionMatrix(0, window.getWidth(), window.getHeight(), 0);
    for (GameItem gameItem : hud.getGameItems()) {
        Mesh mesh = gameItem.getMesh();

        // Set the orthographic and model matrix for this HUD item
        Matrix4f projModelMatrix = transformation.getOrtoProjModelMatrix(gameItem, ortho);
        hudShaderProgram.setUniform("projModelMatrix", projModelMatrix);
        hudShaderProgram.setUniform("colour", gameItem.getMesh().getMaterial().getAmbientColour());

        // Render the mesh for this HUD item
        mesh.render();
    }

    hudShaderProgram.unbind();
}

The previous fragment of code iterates over the elements that compose the HUD and multiplies the orthographic projection matrix by the model matrix associated with each element. The orthographic projection matrix is updated in each render call (because the screen dimensions can change), and it is calculated in the following way:

public final Matrix4f getOrthoProjectionMatrix(float left, float right, float bottom, float top) {
    orthoMatrix.identity();
    orthoMatrix.setOrtho2D(left, right, bottom, top);
    return orthoMatrix;
}

In our game package we will create a Hud class which implements the IHud interface and receives a text in the constructor, internally creating a TextItem instance.


package org.lwjglb.game;

import org.joml.Vector4f;
import org.lwjglb.engine.GameItem;
import org.lwjglb.engine.IHud;
import org.lwjglb.engine.TextItem;

public class Hud implements IHud {

    private static final int FONT_COLS = 15;

    private static final int FONT_ROWS = 17;

    private static final String FONT_TEXTURE = "/textures/font_texture.png";

    private final GameItem[] gameItems;

    private final TextItem statusTextItem;

    public Hud(String statusText) throws Exception {
        this.statusTextItem = new TextItem(statusText, FONT_TEXTURE, FONT_COLS, FONT_ROWS);
        this.statusTextItem.getMesh().getMaterial().setColour(new Vector4f(1, 1, 1, 1));
        gameItems = new GameItem[]{statusTextItem};
    }

    public void setStatusText(String statusText) {
        this.statusTextItem.setText(statusText);
    }

    @Override
    public GameItem[] getGameItems() {
        return gameItems;
    }

    public void updateSize(Window window) {
        this.statusTextItem.setPosition(10f, window.getHeight() - 50f, 0);
    }
}

In the DummyGame class we create an instance of that class and initialize it with a default text, and we will get something like this.


In the Texture class we need to modify the way textures are interpolated to improve text readability (you will only notice this if you play with the text scaling).

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

But the sample is not finished yet. If you play with the zoom so the text overlaps the cube, you will see this effect.

The text is not drawn with a transparent background. In order to achieve that, we must explicitly enable support for blending so the alpha component can be used. We will do this in the Window class, where we set up the other initialization parameters, with the following fragment of code.

// Support for transparencies
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

Now you will see the text drawn with a transparent background.


Complete the HUD

Now that we have rendered text we can add more elements to the HUD. We will add a compass that rotates depending on the direction the camera is facing. In this case, we will add a new GameItem to the Hud class that will have a mesh which models a compass.

The compass will be modelled by an .obj file but will not have a texture associated; instead, it will have just a background colour. So we need to change the fragment shader for the HUD a little bit to detect whether we have a texture or not. We will do this by setting a new uniform named hasTexture.


#version 330

in vec2 outTexCoord;
in vec3 mvPos;
out vec4 fragColor;

uniform sampler2D texture_sampler;
uniform vec4 colour;
uniform int hasTexture;

void main()
{
    if (hasTexture == 1)
    {
        fragColor = colour * texture(texture_sampler, outTexCoord);
    }
    else
    {
        fragColor = colour;
    }
}

To add the compass to the HUD we just need to create a new GameItem instance, in the Hud class, that loads the compass model, and add it to the list of items. In this case we will need to scale up the compass. Remember that it needs to be expressed in screen coordinates, so usually you will need to increase its size.

// Create compass
Mesh mesh = OBJLoader.loadMesh("/models/compass.obj");
Material material = new Material();
material.setAmbientColour(new Vector4f(1, 0, 0, 1));
mesh.setMaterial(material);
compassItem = new GameItem(mesh);
compassItem.setScale(40.0f);
// Rotate to transform it to screen coordinates
compassItem.setRotation(0f, 0f, 180f);

// Create list that holds the items that compose the HUD
gameItems = new GameItem[]{statusTextItem, compassItem};

Notice also that, in order for the compass to point upwards, we need to rotate it 180 degrees, since the model will often tend to use OpenGL space coordinates. If we were expecting screen coordinates it would point downwards. The Hud class will also provide a method to update the angle of the compass that must take this into consideration.


public void rotateCompass(float angle) {
    this.compassItem.setRotation(0, 0, 180 + angle);
}
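Because of the 180 degree offset, the value passed to setRotation can drift outside [0, 360) as the camera yaw accumulates. A small stand-alone sketch of the same flip with wrapping (the class and method names are mine, not part of the book's code):

```java
public class CompassAngle {
    // Flip the camera yaw by 180 degrees (as rotateCompass does)
    // and wrap the result into the [0, 360) range.
    public static float compassRotation(float cameraYaw) {
        float angle = (180f + cameraYaw) % 360f;
        if (angle < 0) {
            angle += 360f;
        }
        return angle;
    }

    public static void main(String[] args) {
        System.out.println(compassRotation(0f));    // 180.0
        System.out.println(compassRotation(-200f)); // 340.0
    }
}
```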

In the DummyGame class we will update the angle whenever the camera is moved. We need to use the y angle rotation.

// Update camera based on mouse
if (mouseInput.isRightButtonPressed()) {
    Vector2f rotVec = mouseInput.getDisplVec();
    camera.moveRotation(rotVec.x * MOUSE_SENSITIVITY, rotVec.y * MOUSE_SENSITIVITY, 0);

    // Update HUD compass
    hud.rotateCompass(camera.getRotation().y);
}

We will get something like this (remember that it is only a sample; in a real game you may probably want to use some texture to give the compass a different look).

Text rendering revisited

Before reviewing other topics let's go back to the text rendering approach we have presented here. The solution is very simple and handy to introduce the concepts involved in rendering HUD elements, but it presents some problems:

It does not support non-latin character sets. If you want to use several fonts you need to create a separate texture file for each font. Also, the only way to change the text size is either to scale it, which may result in poor quality rendered text, or to generate another texture file.


The most important one: characters in most fonts do not occupy the same size, but we are dividing the font texture into equally sized elements. We have cleverly used the "Consolas" font, which is monospaced (that is, all the characters occupy the same amount of horizontal space), but if you use a non-monospaced font you will see annoying variable white spaces between the characters.

We need to change our approach and provide a more flexible way to render text. If you think about it, the overall mechanism is OK, that is, the way of rendering text by texturing quads for each character. The issue here is how we are generating the textures. We need to be able to generate those textures dynamically by using the fonts available in the system.

This is where java.awt.Font comes to the rescue: we will generate the textures by drawing each letter for a specified font family and size dynamically. That texture will be used in the same way as described previously, but it will solve perfectly all the issues mentioned above. We will create a new class named FontTexture that will receive a Font instance and a charset name and will dynamically create a texture that contains all the available characters. This is the constructor.

public FontTexture(Font font, String charSetName) throws Exception {
    this.font = font;
    this.charSetName = charSetName;
    charMap = new HashMap<>();
    buildTexture();
}

The first step is to handle the non-latin issue. Given a charset and a font, we will build a String that contains all the characters that can be rendered.

private String getAllAvailableChars(String charsetName) {
    CharsetEncoder ce = Charset.forName(charsetName).newEncoder();
    StringBuilder result = new StringBuilder();
    for (char c = 0; c < Character.MAX_VALUE; c++) {
        if (ce.canEncode(c)) {
            result.append(c);
        }
    }
    return result.toString();
}
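The method can be exercised on its own. As a quick check, the ISO-8859-1 encoder accepts exactly the 256 code points 0..255, so the resulting String for that charset has length 256 (the class name is mine; the loop mirrors the method above):

```java
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;

public class CharsetProbe {
    // Build the string of all characters a charset can encode,
    // mirroring the getAllAvailableChars logic from the chapter.
    public static String availableChars(String charsetName) {
        CharsetEncoder ce = Charset.forName(charsetName).newEncoder();
        StringBuilder result = new StringBuilder();
        for (char c = 0; c < Character.MAX_VALUE; c++) {
            if (ce.canEncode(c)) {
                result.append(c);
            }
        }
        return result.toString();
    }

    public static void main(String[] args) {
        // ISO-8859-1 encodes exactly the code points 0..255
        System.out.println(availableChars("ISO-8859-1").length()); // 256
    }
}
```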

Let’snowreviewthemethodthatactuallycreatesthetexture,namedbuildTexture.


private void buildTexture() throws Exception {
    // Get the font metrics for each character for the selected font by using an image
    BufferedImage img = new BufferedImage(1, 1, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g2D = img.createGraphics();
    g2D.setFont(font);
    FontMetrics fontMetrics = g2D.getFontMetrics();

    String allChars = getAllAvailableChars(charSetName);
    this.width = 0;
    this.height = 0;
    for (char c : allChars.toCharArray()) {
        // Get the size for each character and update global image size
        CharInfo charInfo = new CharInfo(width, fontMetrics.charWidth(c));
        charMap.put(c, charInfo);
        width += charInfo.getWidth();
        height = Math.max(height, fontMetrics.getHeight());
    }
    g2D.dispose();

We first obtain the font metrics by creating a temporary image. Then we iterate over the String that contains all the available characters and get the width, with the help of the font metrics, of each of them. We store that information in a map, charMap, which uses the character as its key. With that process we determine the size of the image that will hold the texture (with a height equal to the maximum height of all the characters and a width equal to the sum of each character width). CharInfo is an inner class that holds the information about a character (its width and where it starts, in the x coordinate, in the texture image).

public static class CharInfo {

    private final int startX;

    private final int width;

    public CharInfo(int startX, int width) {
        this.startX = startX;
        this.width = width;
    }

    public int getStartX() {
        return startX;
    }

    public int getWidth() {
        return width;
    }
}


Then we will create an image that will contain all the available characters. In order to do this, we just draw the string over a BufferedImage.

// Create the image associated to the charset
img = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
g2D = img.createGraphics();
g2D.setRenderingHint(RenderingHints.KEY_ANTIALIASING, RenderingHints.VALUE_ANTIALIAS_ON);
g2D.setFont(font);
fontMetrics = g2D.getFontMetrics();
g2D.setColor(Color.WHITE);
g2D.drawString(allChars, 0, fontMetrics.getAscent());
g2D.dispose();

We are generating an image which contains all the characters in a single row (we may not be fulfilling the premise that the texture should have a size that is a power of two, but it should work on most modern cards; in any case you could always achieve that by adding some extra empty space). You can even see the image that we are generating if, after that block of code, you put a line like this:

ImageIO.write(img, IMAGE_FORMAT, new java.io.File("Temp.png"));

The image will be written to a temporary file. That file will contain a long strip with all the available characters, drawn in white over a transparent background using antialiasing.
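Regarding the power-of-two note above, if you ever need to pad the strip for older hardware, rounding a dimension up to the next power of two is straightforward. A minimal sketch (the helper name is mine, not part of the book's code):

```java
public class Pow2 {
    // Round a texture dimension up to the next power of two,
    // as suggested for cards that require power-of-two textures.
    // Assumes n > 0.
    public static int nextPowerOfTwo(int n) {
        int result = 1;
        while (result < n) {
            result <<= 1;
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(nextPowerOfTwo(300)); // 512
    }
}
```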

Finally, we just need to create a Texture instance from that image. We just dump the image bytes using the PNG format (which is what the Texture class expects).

// Dump image to a byte buffer
InputStream is;
try (
    ByteArrayOutputStream out = new ByteArrayOutputStream()) {
    ImageIO.write(img, IMAGE_FORMAT, out);
    out.flush();
    is = new ByteArrayInputStream(out.toByteArray());
}

texture = new Texture(is);
}


You may notice that we have modified the Texture class a little bit to have another constructor that receives an InputStream. Now we just need to change the TextItem class to receive a FontTexture instance in its constructor.

public TextItem(String text, FontTexture fontTexture) throws Exception {
    super();
    this.text = text;
    this.fontTexture = fontTexture;
    setMesh(buildMesh());
}

The buildMesh method only needs to be changed a little bit when setting quad and texture coordinates. This is a sample for one of the vertices.

float startx = 0;
for (int i = 0; i < numChars; i++) {
    FontTexture.CharInfo charInfo = fontTexture.getCharInfo(characters[i]);

    // Build a character tile composed by two triangles

    // Left Top vertex
    positions.add(startx); // x
    positions.add(0.0f); // y
    positions.add(ZPOS); // z
    textCoords.add((float) charInfo.getStartX() / (float) fontTexture.getWidth());
    textCoords.add(0.0f);
    indices.add(i * VERTICES_PER_QUAD);

    // .. More code
    startx += charInfo.getWidth();
}

You can check the rest of the changes directly in the source code. The following picture shows what you will get for an Arial font with a size of 20:


As you can see the quality of the rendered text has been improved a lot. You can play with different fonts and sizes and check it by yourself. There is still plenty of room for improvement (like supporting multi-line texts, effects, etc.), but this will be left as an exercise for the reader.

You may also notice that we are still able to apply scaling to the text (we pass a model view matrix in the shader). This may not be needed now for text but it may be useful for other HUD elements.

We have set up all the infrastructure needed in order to create a HUD for our games. Now it is just a matter of creating all the elements that represent relevant information for the user and giving them a professional look and feel.

OSX

If you try to run the samples in this chapter, and the next ones that render text, you may find that the application blocks and nothing is shown on the screen. This is due to the fact that AWT and GLFW do not get along very well under OSX. But what does it have to do with AWT? We are using the Font class, which belongs to AWT, and just by instantiating it, AWT gets initialized also. In OSX, AWT tries to run under the main thread, which is also required by GLFW. This is what causes this mess.

In order to be able to use the Font class, GLFW must be initialized before AWT and the samples need to be run in headless mode. You need to set this property before anything gets initialized:

System.setProperty("java.awt.headless", "true");

You may get a warning, but the samples will run.

A much cleaner approach would be to use the stb library to render text.


SkyBox and some optimizations

Skybox

A skybox will allow us to set a background to give the illusion that our 3D world is bigger. That background is wrapped around the camera position and covers the whole space. The technique that we are going to use here is to construct a big cube that will be displayed around the 3D scene; that is, the centre of the camera position will be the centre of the cube. The sides of that cube will be wrapped with a texture with hills, a blue sky and clouds that will be mapped in a way that the image looks like a continuous landscape.

The following picture depicts the skybox concept.

The process of creating a skybox can be summarized in the following steps:

- Create a big cube.
- Apply a texture to it that provides the illusion that we are seeing a giant landscape with no edges.
- Render the cube so its sides are at a far distance and its origin is located at the centre of the camera.

Then,let’sstartwiththetexture.Youwillfindthattherearelotsofpre-generatedtexturesforyoutouseintheinternet.Theoneusedinthesampleforthischapterhasbeendownloadedfromhere:http://www.custommapmakers.org/skyboxes.php.Theconcretesamplethatwe


have used is this one: http://www.custommapmakers.org/skyboxes/zips/ely_hills.zip, and it has been created by Colin Lowndes.

The textures from that site are composed of separate TGA files, one for each side of the cube. The texture loader that we have created expects a single file in PNG format, so we need to compose a single PNG image with the images of each face. We could apply other techniques, such as cube mapping, in order to apply the textures automatically. But, in order to keep this chapter as simple as possible, you will have to manually arrange them into a single file. The resulting image will look like this.

After that, we need to create a .obj file which contains a cube with the correct texture coordinates for each face. The picture below shows the tiles associated to each face (you can find the .obj file used in this chapter in the book's source code).


Once the resources have been set up, we can start coding. We will start by creating a new class named SkyBox with a constructor that receives the path to the OBJ model that contains the skybox cube and the texture file. This class will inherit from GameItem, as the Hud class from the previous chapter did. Why should it inherit from GameItem? First of all, for convenience: we can reuse most of the code that deals with meshes and textures. Secondly, because, although the skybox will not move, we will be interested in applying rotations and scaling to it. If you think about it, a SkyBox is indeed a game item. The definition of the SkyBox class is as follows.

package org.lwjglb.engine;

import org.lwjglb.engine.graph.Material;
import org.lwjglb.engine.graph.Mesh;
import org.lwjglb.engine.graph.OBJLoader;
import org.lwjglb.engine.graph.Texture;

public class SkyBox extends GameItem {

    public SkyBox(String objModel, String textureFile) throws Exception {
        super();
        Mesh skyBoxMesh = OBJLoader.loadMesh(objModel);
        Texture skyBoxtexture = new Texture(textureFile);
        skyBoxMesh.setMaterial(new Material(skyBoxtexture, 0.0f));
        setMesh(skyBoxMesh);
        setPosition(0, 0, 0);
    }
}

If you check the source code for this chapter you will see that we have done some refactoring. We have created a class named Scene which groups all the information related to the 3D world. This is the definition and the attributes of the Scene class, which contains an instance of the SkyBox class.


package org.lwjglb.engine;

public class Scene {

    private GameItem[] gameItems;

    private SkyBox skyBox;

    private SceneLight sceneLight;

    public GameItem[] getGameItems() {
        return gameItems;
    }

    // More code here...

The next step is to create another set of vertex and fragment shaders for the skybox. But why not reuse the scene shaders that we already have? The answer is that, actually, the shaders that we need are a simplified version of those shaders; we will not be applying lights to the skybox (or, to be more precise, we won't need point, spot or directional lights). Below you can see the skybox vertex shader.

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    outTexCoord = texCoord;
}

You can see that we still use the model view matrix. As has been explained before, we will scale the skybox, so we need that transformation matrix. You may see some other implementations that increase the size of the cube that models the skybox at start time and do not need to multiply the model and the view matrices. We have chosen this approach because it's more flexible and it allows us to change the size of the skybox at runtime, but you can easily switch to the other approach if you want.

The fragment shader is also very simple.


#version 330

in vec2 outTexCoord;
in vec3 mvPos;
out vec4 fragColor;

uniform sampler2D texture_sampler;
uniform vec3 ambientLight;

void main()
{
    fragColor = vec4(ambientLight, 1) * texture(texture_sampler, outTexCoord);
}

As you can see, we added an ambient light uniform to the shader. The purpose of this uniform is to modify the colour of the skybox texture to simulate day and night (if not, the skybox would look like it was midday while the rest of the world is dark).

In the Renderer class we have just added a new method to use those shaders and set up the uniforms (nothing new here).

private void setupSkyBoxShader() throws Exception {
    skyBoxShaderProgram = new ShaderProgram();
    skyBoxShaderProgram.createVertexShader(Utils.loadResource("/shaders/sb_vertex.vs"));
    skyBoxShaderProgram.createFragmentShader(Utils.loadResource("/shaders/sb_fragment.fs"));
    skyBoxShaderProgram.link();

    skyBoxShaderProgram.createUniform("projectionMatrix");
    skyBoxShaderProgram.createUniform("modelViewMatrix");
    skyBoxShaderProgram.createUniform("texture_sampler");
    skyBoxShaderProgram.createUniform("ambientLight");
}

And of course, we need to create a new render method for the skybox that will be invoked in the global render method.


private void renderSkyBox(Window window, Camera camera, Scene scene) {
    skyBoxShaderProgram.bind();

    skyBoxShaderProgram.setUniform("texture_sampler", 0);

    // Update projection Matrix
    Matrix4f projectionMatrix = transformation.getProjectionMatrix(FOV, window.getWidth(), window.getHeight(), Z_NEAR, Z_FAR);
    skyBoxShaderProgram.setUniform("projectionMatrix", projectionMatrix);
    SkyBox skyBox = scene.getSkyBox();
    Matrix4f viewMatrix = transformation.getViewMatrix(camera);
    viewMatrix.m30(0);
    viewMatrix.m31(0);
    viewMatrix.m32(0);
    Matrix4f modelViewMatrix = transformation.getModelViewMatrix(skyBox, viewMatrix);
    skyBoxShaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
    skyBoxShaderProgram.setUniform("ambientLight", scene.getSceneLight().getAmbientLight());

    scene.getSkyBox().getMesh().render();

    skyBoxShaderProgram.unbind();
}

The method above is quite similar to the other render methods, but there's a difference that needs to be explained. As you can see, we pass the projection matrix and the model view matrix as usual. But when we get the view matrix, we set some of its components to 0. Why do we do this? The reason is that we do not want translation to be applied to the skybox.

Remember that when we move the camera, what we are actually doing is moving the whole world. So if we just multiplied by the view matrix as it is, the skybox would be displaced when the camera moves. But we do not want this; we want to stick it to the origin coordinates, at (0, 0, 0). This is achieved by setting to 0 the parts of the view matrix that contain the translation increments (the m30, m31 and m32 components).
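The same idea can be shown without JOML: in a column-major 4x4 matrix the translation occupies elements 12, 13 and 14 (JOML's m30, m31 and m32); zeroing them keeps the rotation while dropping the translation. A stand-alone sketch using a raw float array (class and method names are mine):

```java
public class StripTranslation {
    // Zero the translation column of a column-major 4x4 matrix
    // (the m30/m31/m32 components in JOML naming).
    public static float[] stripTranslation(float[] m) {
        float[] out = m.clone();
        out[12] = 0f;
        out[13] = 0f;
        out[14] = 0f;
        return out;
    }

    public static void main(String[] args) {
        float[] view = new float[16];
        view[0] = view[5] = view[10] = view[15] = 1f; // identity rotation part
        view[12] = 3f; view[13] = -2f; view[14] = 7f; // camera translation
        float[] sky = stripTranslation(view);
        System.out.println(sky[12] + " " + sky[13] + " " + sky[14]); // 0.0 0.0 0.0
    }
}
```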

You may think that you could avoid using the view matrix at all, since the skybox must be fixed at the origin. In that case, what you would see is that the skybox would not rotate with the camera, which is not what we want. We need it to rotate but not translate.

Andthat’sall,youcancheckinthesourcecodeforthischapterthatintheDummyGameclassthatwehavecreatedmoreblockinstancestosimulateagroundandtheskybox.Youcanalsocheckthatwenowchangetheambientlighttosimulatelightandday.Whatyouwillgetissomethinglikethis.


The skybox is a small one, so you can easily see the effect of moving through the world (in a real game it should be much bigger). You can also see that the world space objects, the blocks that form the terrain, are larger than the skybox, so as you move through it you will see blocks appearing through the mountains. This is more evident because of the relatively small size of the skybox we have set. But anyway we will need to alleviate that by adding an effect that hides or blurs distant objects (for instance, applying a fog effect).

Another reason for not creating a bigger skybox is that we need to apply several optimizations in order to be more efficient (they will be explained later on).

You can play with the render method and comment out the lines that prevent the skybox from translating. Then you will be able to get out of the box and see something like this.


Although it is not what a skybox should do, it can help you understand the concept behind this technique. Remember that this is a simple example; you could improve it by adding other effects such as the sun moving through the sky or moving clouds. Also, in order to create bigger worlds you will need to split the world into fragments and only load the ones that are contiguous to the fragment where the player is currently located.

Another point that is worth mentioning is when we should render the skybox: before or after the scene? Rendering it after the scene has been drawn is more optimal, since most of the fragments will be discarded due to depth testing. That is, non-visible skybox fragments, the ones that will be hidden by scene elements, will be discarded. When OpenGL tries to render them, and depth testing is enabled, it will discard the ones which are behind previously rendered fragments, which will have a lower depth value. So the answer might be obvious, right? Just render the skybox after the scene has been rendered. The problem with this approach is handling transparent textures. If we have, in the scene, objects with transparent textures, they will be drawn using the "background" colour, which is now black. If we render the skybox first, the transparency effect will be applied correctly. So, shall we render it before the scene then? Well, yes and no. If you render it before the scene is rendered you will solve transparency issues, but you will impact performance. In fact, you may still face transparency issues even without a skybox. For instance, let's say that you have a transparent item which overlaps with an object that is far away. If the transparent object is rendered first, you will also face transparency issues. So maybe another approach can be to


draw transparent items separately, after all the other items have been rendered. This is the approach used by some commercial games. For now, we will render the skybox after the scene has been rendered, trying to get better performance.

Some optimizations

From the previous example, the fact that the skybox is relatively small makes the effect a little bit weird (you can see objects appearing magically behind the hills). So let's increase the skybox size and the size of our world. Let's scale the size of the skybox by a factor of 50, so the world will be composed of 40,000 GameItem instances (cubes).

If you change the scale factor and rerun the example you will see that performance problems start to arise and the movement through the 3D world is not smooth. It's time to put an eye on performance (you may know the old saying that states that "premature optimization is the root of all evil", but since this is chapter 13, I hope nobody will say that this is premature).

Let’sstartwithaconceptthatwillreducetheamountofdatathatisbeingrendered,whichisnamedfaceculling.Inourexampleswearerenderingthousandsofcubes,andacubeismadeofsixfaces.Wearerenderingthesixfacesforeachcubeeveniftheyarenotvisible.Youcancheckthisifyouzoominsideacube,youwillseeitsinteriorlikethis.

Faces that cannot be seen should be discarded immediately, and this is what face culling does. In fact, for a cube you can only see 3 faces at the same time, so we can just discard half of the faces (40,000 × 3 × 2 triangles) just by applying face culling (this will only be valid if your game does not require you to dive into the inner side of a model; you can see why later on).
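The arithmetic behind that figure, as a tiny sanity check (40,000 cubes, 6 faces of 2 triangles each, half of them culled; the class name is mine):

```java
public class CullingMath {
    // 6 faces per cube, 2 triangles per face
    public static int totalTriangles(int cubes) {
        return cubes * 6 * 2;
    }

    // With back-face culling, roughly half of a cube's triangles
    // (3 faces x 2 triangles) are discarded at any time.
    public static int culledTriangles(int cubes) {
        return totalTriangles(cubes) / 2;
    }

    public static void main(String[] args) {
        System.out.println(totalTriangles(40_000));  // 480000
        System.out.println(culledTriangles(40_000)); // 240000
    }
}
```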

Face culling checks, for every triangle, if it is facing towards us, and discards the ones that are not facing in that direction. But how do we know if a triangle is facing towards us or not? Well, the way that OpenGL does this is by the winding order of the vertices that compose the triangle.


Remember from the first chapters that we may define the vertices of a triangle in clockwise or counter-clockwise order. In OpenGL, by default, triangles that are in counter-clockwise order are facing towards the viewer and triangles that are in clockwise order are facing backwards. The key thing here is that this order is checked while rendering, taking into consideration the point of view. So a triangle that has been defined in counter-clockwise order can be interpreted, at rendering time, as being defined clockwise because of the point of view.
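This winding test can be illustrated in isolation: for a triangle projected to screen space, the sign of its signed area tells the vertex order. A small sketch of the underlying math, not OpenGL code (class and method names are mine):

```java
public class Winding {
    // Signed area of a triangle in screen space: positive for
    // counter-clockwise vertex order, negative for clockwise.
    public static float signedArea(float ax, float ay, float bx, float by,
                                   float cx, float cy) {
        return 0.5f * ((bx - ax) * (cy - ay) - (cx - ax) * (by - ay));
    }

    public static void main(String[] args) {
        // Counter-clockwise triangle -> positive area (front-facing by default)
        System.out.println(signedArea(0, 0, 1, 0, 0, 1) > 0); // true
        // Same vertices visited in the opposite order -> clockwise
        System.out.println(signedArea(0, 0, 0, 1, 1, 0) > 0); // false
    }
}
```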

Let’sputitinpractice,intheinitmethodoftheWindowclassaddthefollowinglines:

glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);

The first line will enable face culling and the second line states that faces that are facing backwards should be culled (removed). With that in place, if you look upwards you will see something like this.

What’shappening?ifyoureviewtheverticesorderforthetopfaceyouwillseethatishasbeendefinedincounter-clockwiseorder.Well,itwas,butrememberthatthewindingreferstothepointofview.Infact,ifyouapplytranslationalsototheskyboxsoyouareabletoseeitformtheupsideyouwillseethatthetopfaceisrenderedagainonceyouareoutsideit.


Let’ssketchwhat’shappening.Thefollowingpictureshowsoneofthetrianglesofthetopfaceoftheskyboxcube,whichisdefinedbythreeverticesdefinedincounter-clockwiseorder.

But remember that we are inside the skybox. If we look at the cube from the interior, what we will see is that the vertices are defined in clockwise order.

This is because the skybox was defined to be looked at from the outside. So we need to flip the definition of some of the faces in order for them to be viewed correctly when face culling is enabled.


Butthere’sstillmoreroomforoptimization.Let’sreviewourrenderingprocess.IntherendermethodoftheRendererclasswhatwearedoingisiterateoveraGameItemarrayandrendertheassociatedMesh.ForeachGameItemwedothefollowing:

1. Set up the model view matrix (unique per GameItem).
2. Get the Mesh associated to the GameItem and activate the texture, bind the VAO and enable its attributes.
3. Perform a call to draw the triangles.
4. Disable the texture and the VAO elements.

But in our current game we reuse the same Mesh for the 40,000 GameItems, so we are repeating the operations from point 2 to point 4 again and again. This is not very efficient; keep in mind that each call to an OpenGL function is a native call that incurs some performance overhead. Besides that, we should always try to limit state changes in OpenGL (activating and deactivating textures and VAOs are state changes).

We need to change the way we do things and organize our structures around Meshes, since it will be very frequent to have many GameItems with the same Mesh. Now we have an array of GameItems, each of them pointing to the same Mesh. We have something like this.

Instead, we will create a Map of Meshes with a list of the GameItems that share that Mesh.

The rendering steps will be, for each Mesh:

1. Activate the Mesh texture, bind the VAO and enable its attributes.
2. For each GameItem associated:
   a. Set up the model view matrix (unique per GameItem).
   b. Perform a call to draw the triangles.
3. Disable the texture and the VAO elements.

In the Scene class, we will store the following Map.

private Map<Mesh, List<GameItem>> meshMap;

We still have the setGameItems method, but instead of just storing the array, we construct the mesh map.

public void setGameItems(GameItem[] gameItems) {
    int numGameItems = gameItems != null ? gameItems.length : 0;
    for (int i = 0; i < numGameItems; i++) {
        GameItem gameItem = gameItems[i];
        Mesh mesh = gameItem.getMesh();
        List<GameItem> list = meshMap.get(mesh);
        if (list == null) {
            list = new ArrayList<>();
            meshMap.put(mesh, list);
        }
        list.add(gameItem);
    }
}
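Since Java 8 the null-check-and-put pattern above can also be written with Map.computeIfAbsent. A stand-alone sketch using String keys in place of Mesh and GameItem, since the engine classes are not available here (class and method names are mine):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MeshGrouping {
    // Group items by a shared key, as Scene.setGameItems groups
    // GameItems by Mesh. item[0] plays the role of the Mesh,
    // item[1] the role of the GameItem.
    public static Map<String, List<String>> group(String[][] items) {
        Map<String, List<String>> map = new HashMap<>();
        for (String[] item : items) {
            map.computeIfAbsent(item[0], k -> new ArrayList<>()).add(item[1]);
        }
        return map;
    }

    public static void main(String[] args) {
        Map<String, List<String>> map = group(new String[][]{
            {"cube", "block1"}, {"cube", "block2"}, {"bunny", "pet"}
        });
        System.out.println(map.get("cube").size()); // 2
    }
}
```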

The Mesh class now has a method to render a list of the associated GameItems, and we have split the activating and deactivating code into separate methods.


private void initRender() {
    Texture texture = material.getTexture();
    if (texture != null) {
        // Activate first texture bank
        glActiveTexture(GL_TEXTURE0);
        // Bind the texture
        glBindTexture(GL_TEXTURE_2D, texture.getId());
    }

    // Draw the mesh
    glBindVertexArray(getVaoId());
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glEnableVertexAttribArray(2);
}

private void endRender() {
    // Restore state
    glDisableVertexAttribArray(0);
    glDisableVertexAttribArray(1);
    glDisableVertexAttribArray(2);
    glBindVertexArray(0);

    glBindTexture(GL_TEXTURE_2D, 0);
}

public void render() {
    initRender();

    glDrawElements(GL_TRIANGLES, getVertexCount(), GL_UNSIGNED_INT, 0);

    endRender();
}

public void renderList(List<GameItem> gameItems, Consumer<GameItem> consumer) {
    initRender();

    for (GameItem gameItem : gameItems) {
        // Set up data required by gameItem
        consumer.accept(gameItem);
        // Render this game item
        glDrawElements(GL_TRIANGLES, getVertexCount(), GL_UNSIGNED_INT, 0);
    }

    endRender();
}

As you can see we still have the old method that renders a single Mesh, which assumes that we have only one GameItem (this may be used in other cases, which is why it has not been removed). The new method renders a list of GameItems and receives as a


parameter a Consumer (a function that uses the functional programming paradigms introduced in Java 8), which will be used to set up what's specific for each GameItem before drawing the triangles. We will use this to set up the model view matrix, since we do not want the Mesh class to be coupled with the uniform names and the parameters involved when setting these things up.

In the renderScene method of the Renderer class you can see that we just iterate over the Mesh map and set up the model view matrix uniform via a lambda.

for (Mesh mesh : mapMeshes.keySet()) {
    sceneShaderProgram.setUniform("material", mesh.getMaterial());
    mesh.renderList(mapMeshes.get(mesh), (GameItem gameItem) -> {
        Matrix4f modelViewMatrix = transformation.buildModelViewMatrix(gameItem, viewMatrix);
        sceneShaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
    });
}

Another set of optimizations addresses the fact that we are creating tons of objects in the render cycle. In particular, we were creating too many Matrix4f instances, each holding a copy of the model view matrix for each GameItem instance. We will create specific matrices for that in the Transformation class and reuse the same instance. If you check the code you will also see that we have changed the names of the methods: the getXX methods just return the stored matrix instance, and any method that changes the value of a matrix is called buildXX to clarify its purpose.

We have also avoided the construction of new FloatBuffer instances each time we set a uniform for a Matrix, and removed some other useless instantiations. With all that in place you can see now that the rendering is smoother and more agile.

You can check all the details in the source code.


Height Maps

In this chapter we will learn how to create complex terrains using height maps. Before we start, you will notice that some refactoring has been done. We have created some new packages and moved some of the classes to better organize them. You can check the changes in the source code.

Sowhat’saheightmap?Aheightmapisanimagewhichisusedtogeneratea3Dterrainwhichusesthepixelcolourstogetsurfaceelevationdata.HeightmapsimagesuseusuallygrayscaleandcanbegeneratedbyprogramslikeTerragen.Aheightmapimagelookslikethis.

Theimageaboveit’slikeifyouwerelookingatafragmentoflandfromabove.Withthatimagewewillbuildameshcomposedbytrianglesformedbyvertices.Thealtitudeofeachvertexwillbecalculateddependingonthecolourofeachoftheimagepixels.Blackcolourwillrepresentthelowestvalueandwhitethehighestone.

We will be creating a grid of vertices, one for each pixel of the image. Those vertices will be used to form the triangles that will compose the mesh, as shown in the next figure.


That mesh will form a giant quad that will be rendered across the x and z axes, using the pixel colours to change the elevation in the y axis.

The process of creating a 3D terrain from a height map can be summarized as follows:

- Load the image that contains the height map (we will decode it into a ByteBuffer to get access to each pixel).
- For each image pixel create a vertex whose height is based on the pixel colour.
- Assign the correct texture coordinate to the vertex.
- Set up the indices to draw the triangles associated to the vertex.


We will create a class named HeightMapMesh that will create a Mesh based on a height map image, performing the steps described above. Let's first review the constants defined for that class:

private static final int MAX_COLOUR = 255 * 255 * 255;

As we have explained above, we will calculate the height of each vertex based on the colour of each pixel of the image used as a height map. Images are usually greyscale; for a PNG image that means that each RGB component for each pixel can vary from 0 to 255, so we have 256 discrete values to define different heights. This may be enough precision for you, but if it's not, we can combine the three RGB components to have more intermediate values; in this case the height can be calculated from a range that goes from 0 to 255³. We will choose the second approach so we are not limited to greyscale images.

The next constants are:

private static final float STARTX = -0.5f;

private static final float STARTZ = -0.5f;

The mesh will be formed by a set of vertices (one per pixel) whose x and z coordinates will be in the following range:

- [-0.5, 0.5], that is, [STARTX, -STARTX] for the x axis.
- [-0.5, 0.5], that is, [STARTZ, -STARTZ] for the z axis.

Don't worry too much about those values; later on the resulting mesh can be scaled to accommodate its size in the world. Regarding the y axis, we will set up two parameters, minY and maxY, for setting the lowest and highest values that the y coordinate can have. These parameters are not constants because we may want to change them at runtime, independently of the scaling applied. In the end, the terrain will be contained in a cube in the range [STARTX, -STARTX], [minY, maxY] and [STARTZ, -STARTZ].

The mesh will be created in the constructor of the HeightMapMesh class, which is defined like this.

public HeightMapMesh(float minY, float maxY, String heightMapFile, String textureFile, int textInc) throws Exception {


It receives the minimum and maximum values for the y axis, the name of the file that contains the image to be used as the height map, and the texture file to be used. It also receives an integer named textInc that we will discuss later on.

The first thing that we do in the constructor is to load the height map image into a ByteBuffer.

this.minY = minY;
this.maxY = maxY;

PNGDecoder decoder = new PNGDecoder(getClass().getResourceAsStream(heightMapFile));
int height = decoder.getHeight();
int width = decoder.getWidth();
ByteBuffer buf = ByteBuffer.allocateDirect(
        4 * decoder.getWidth() * decoder.getHeight());
decoder.decode(buf, decoder.getWidth() * 4, PNGDecoder.Format.RGBA);
buf.flip();

Then we load the texture file and set up the variables that we will need to construct the Mesh. The incx and incz variables hold the increments to be applied to each vertex in the x and z coordinates so the Mesh covers the range stated above.

Texture texture = new Texture(textureFile);

float incx = getWidth() / (width - 1);
float incz = Math.abs(STARTZ * 2) / (height - 1);

List<Float> positions = new ArrayList<>();
List<Float> textCoords = new ArrayList<>();
List<Integer> indices = new ArrayList<>();
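A quick check that these increments really make the grid span [STARTX, -STARTX]: with STARTX = -0.5 and an increment of 1 / (width - 1), the last column lands exactly on 0.5. A stand-alone sketch (class and method names are mine):

```java
public class GridSpan {
    // With STARTX = -0.5 and incx = |STARTX * 2| / (width - 1),
    // the x coordinate of the last column is exactly -STARTX = 0.5.
    public static float lastX(int width) {
        float startX = -0.5f;
        float incx = Math.abs(startX * 2) / (width - 1);
        return startX + (width - 1) * incx;
    }

    public static void main(String[] args) {
        System.out.println(lastX(3)); // 0.5
    }
}
```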

After that we are ready to iterate over the image, creating a vertex per pixel, setting up its texture coordinates and setting up the indices to define correctly the triangles that compose the Mesh.


for (int row = 0; row < height; row++) {
    for (int col = 0; col < width; col++) {
        // Create vertex for current position
        positions.add(STARTX + col * incx); // x
        positions.add(getHeight(col, row, width, buf)); // y
        positions.add(STARTZ + row * incz); // z

        // Set texture coordinates
        textCoords.add((float) textInc * (float) col / (float) width);
        textCoords.add((float) textInc * (float) row / (float) height);

        // Create indices
        if (col < width - 1 && row < height - 1) {
            int leftTop = row * width + col;
            int leftBottom = (row + 1) * width + col;
            int rightBottom = (row + 1) * width + col + 1;
            int rightTop = row * width + col + 1;

            indices.add(rightTop);
            indices.add(leftBottom);
            indices.add(leftTop);

            indices.add(rightBottom);
            indices.add(leftBottom);
            indices.add(rightTop);
        }
    }
}

The process of creating the vertex coordinates is self-explanatory. Let's ignore for the moment why we multiply the texture coordinates by a number and how the height is calculated. You can see that for each vertex we define the indices of two triangles (except if we are in the last row or column). Let's use a 3 × 3 image to visualize how they are constructed. A 3 × 3 image contains 9 vertices, and thus 4 quads formed by 2 × 4 triangles. The following picture shows that grid, naming each vertex in the form Vrc (r: row, c: column).
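The index generation can be sanity-checked in isolation: a width × height grid yields (width - 1) × (height - 1) quads with 6 indices each, so a 3 × 3 grid produces 24 indices. A stand-alone sketch of the same loop (the class name is mine):

```java
import java.util.ArrayList;
import java.util.List;

public class GridIndices {
    // Generate two triangles (6 indices) per grid cell, matching the
    // index layout used in the HeightMapMesh constructor.
    public static List<Integer> build(int width, int height) {
        List<Integer> indices = new ArrayList<>();
        for (int row = 0; row < height - 1; row++) {
            for (int col = 0; col < width - 1; col++) {
                int leftTop = row * width + col;
                int leftBottom = (row + 1) * width + col;
                int rightBottom = (row + 1) * width + col + 1;
                int rightTop = row * width + col + 1;

                indices.add(rightTop);
                indices.add(leftBottom);
                indices.add(leftTop);

                indices.add(rightBottom);
                indices.add(leftBottom);
                indices.add(rightTop);
            }
        }
        return indices;
    }

    public static void main(String[] args) {
        // 3x3 grid -> 4 quads -> 8 triangles -> 24 indices
        System.out.println(build(3, 3).size()); // 24
    }
}
```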


When we are processing the first vertex (V00), we define the indices of the two triangles shaded in red.

When we are processing the second vertex (V01), we define the indices of the two triangles shaded in red. But when we are processing the third vertex (V02) we do not need to define more indices; the triangles for that row have already been defined.


You can easily see how the process continues for the rest of the vertices. Now, once we have created all the vertex positions, the texture coordinates and the indices, we just need to create a Mesh and the associated Material with all that data.

float[] posArr = Utils.listToArray(positions);
int[] indicesArr = indices.stream().mapToInt(i -> i).toArray();
float[] textCoordsArr = Utils.listToArray(textCoords);
float[] normalsArr = calcNormals(posArr, width, height);
this.mesh = new Mesh(posArr, textCoordsArr, normalsArr, indicesArr);
Material material = new Material(texture, 0.0f);
mesh.setMaterial(material);

You can see that we calculate the normals taking the vertex positions as input. Before we see how normals can be calculated, let's see how heights are obtained. We have created a method named getHeight which calculates the height for a vertex.

private float getHeight(int x, int z, int width, ByteBuffer buffer) {
    byte r = buffer.get(x * 4 + 0 + z * 4 * width);
    byte g = buffer.get(x * 4 + 1 + z * 4 * width);
    byte b = buffer.get(x * 4 + 2 + z * 4 * width);
    byte a = buffer.get(x * 4 + 3 + z * 4 * width);
    int argb = ((0xFF & a) << 24) | ((0xFF & r) << 16)
            | ((0xFF & g) << 8) | (0xFF & b);
    return this.minY + Math.abs(this.maxY - this.minY) * ((float) argb / (float) MAX_COLOUR);
}

The method receives the x and z coordinates for a pixel, the width of the image and the ByteBuffer that contains it. It packs the individual R, G, B and A components into a single integer and maps that value to a height between minY and maxY (minY for a black pixel and maxY for a white one).
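The pixel-to-height mapping can be tried on its own. This is a simplified sketch (a hypothetical helper, not the engine's method) that maps a packed 24-bit RGB value to a height in [minY, maxY]; here MAX_COLOUR is assumed to be 0xFFFFFF, which may differ from the engine's own constant:

```java
public class PixelHeight {

    // Maximum value of a packed 24-bit RGB colour
    // (assumption for this sketch; the engine's MAX_COLOUR constant may differ).
    private static final int MAX_COLOUR = 0xFFFFFF;

    // Maps a packed RGB value to a height between minY and maxY:
    // black (0x000000) maps to minY, white (0xFFFFFF) maps to maxY.
    public static float height(int rgb, float minY, float maxY) {
        return minY + Math.abs(maxY - minY) * ((float) rgb / (float) MAX_COLOUR);
    }
}
```

Any grey level in between lands proportionally between the two bounds, which is exactly the behaviour we rely on when reading the height map.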

You may develop a simpler version using a BufferedImage, which contains handy methods for getting RGB values, but that would pull in AWT. Remember that AWT does not mix well with OSX, so try to avoid using its classes.

Let's view now how texture coordinates are calculated. The first option is to wrap the texture along the whole mesh: the top-left vertex would have (0, 0) texture coordinates and the bottom-right vertex would have (1, 1). The problem with this approach is that the texture would have to be huge in order to provide good results; if not, it would be stretched too much.

But we can still use a small texture with very good results by employing a very efficient technique. If we set texture coordinates that are beyond the [1, 1] range, we get back to the origin and start counting again from the start. The following picture shows this behaviour, tiling the same texture in several quads that extend beyond the [1, 1] range.

This is what we will do when setting the texture coordinates. We will be multiplying the texture coordinates (calculated as if the texture was just wrapped covering the whole mesh) by a factor, the textInc parameter, to increase the number of pixels of the texture to be used between adjacent vertices.

The only thing that's pending now is normal calculation. Remember that we need normals so light can be applied to the terrain correctly. Without normals our terrain would be rendered with the same colour no matter how light hits each point. The method that we will use here may not be the most efficient for height maps, but it will help you understand how normals can be auto-calculated. If you search for other solutions you may find more efficient approaches that only use the heights of adjacent points without performing cross product operations. Nevertheless, since this will only be done at startup, the method presented here will not hurt performance much.

Let's graphically explain how a normal can be calculated. Imagine that we have a vertex named P0. We first calculate, for each of the surrounding vertices (P1, P2, P3 and P4), the vectors that are tangent to the surface that connects these points. These vectors (V1, V2, V3 and V4) are calculated by subtracting P0 from each adjacent point (V1 = P1 − P0, etc.).

Then, we calculate the normal for each of the planes that connect the adjacent points. This is done by performing the cross product between the previously calculated vectors. For instance, the normal of the surface that connects P1 and P2 (shaded in blue) is calculated as the cross product between V1 and V2: V12 = V1 × V2.


If we calculate the rest of the normals for the rest of the surfaces (V23 = V2 × V3, V34 = V3 × V4 and V41 = V4 × V1), the normal for P0 will be the (normalized) sum of all the normals of the surrounding surfaces: N0 = V12 + V23 + V34 + V41.

The implementation of the method that calculates the normals is as follows.

private float[] calcNormals(float[] posArr, int width, int height) {
    Vector3f v0 = new Vector3f();
    Vector3f v1 = new Vector3f();
    Vector3f v2 = new Vector3f();
    Vector3f v3 = new Vector3f();
    Vector3f v4 = new Vector3f();
    Vector3f v12 = new Vector3f();
    Vector3f v23 = new Vector3f();
    Vector3f v34 = new Vector3f();
    Vector3f v41 = new Vector3f();
    List<Float> normals = new ArrayList<>();
    Vector3f normal = new Vector3f();
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            if (row > 0 && row < height - 1 && col > 0 && col < width - 1) {
                int i0 = row * width * 3 + col * 3;
                v0.x = posArr[i0];
                v0.y = posArr[i0 + 1];
                v0.z = posArr[i0 + 2];

                int i1 = row * width * 3 + (col - 1) * 3;
                v1.x = posArr[i1];
                v1.y = posArr[i1 + 1];
                v1.z = posArr[i1 + 2];
                v1 = v1.sub(v0);

                int i2 = (row + 1) * width * 3 + col * 3;
                v2.x = posArr[i2];
                v2.y = posArr[i2 + 1];
                v2.z = posArr[i2 + 2];
                v2 = v2.sub(v0);

                int i3 = (row) * width * 3 + (col + 1) * 3;
                v3.x = posArr[i3];
                v3.y = posArr[i3 + 1];
                v3.z = posArr[i3 + 2];
                v3 = v3.sub(v0);

                int i4 = (row - 1) * width * 3 + col * 3;
                v4.x = posArr[i4];
                v4.y = posArr[i4 + 1];
                v4.z = posArr[i4 + 2];
                v4 = v4.sub(v0);

                v1.cross(v2, v12);
                v12.normalize();

                v2.cross(v3, v23);
                v23.normalize();

                v3.cross(v4, v34);
                v34.normalize();

                v4.cross(v1, v41);
                v41.normalize();

                normal = v12.add(v23).add(v34).add(v41);
                normal.normalize();
            } else {
                normal.x = 0;
                normal.y = 1;
                normal.z = 0;
            }
            normal.normalize();
            normals.add(normal.x);
            normals.add(normal.y);
            normals.add(normal.z);
        }
    }
    return Utils.listToArray(normals);
}

Finally, in order to build larger terrains, we have two options:

Create a larger height map.
Reuse a height map and tile it through the 3D space. The height map will be like a terrain block that can be translated across the world like tiles. In order to do so, the pixels of the edges of the height map must match (the left edge must be equal to the right one and the top edge must be equal to the bottom one) to avoid gaps between the tiles.

We will use the second approach (and select an appropriate height map). To support this, we will create a class named Terrain that will create a square of height map tiles, defined like this.


package org.lwjglb.engine.items;

import org.lwjglb.engine.graph.HeightMapMesh;

public class Terrain {

    private final GameItem[] gameItems;

    public Terrain(int blocksPerRow, float scale, float minY, float maxY, String heightMap, String textureFile, int textInc) throws Exception {
        gameItems = new GameItem[blocksPerRow * blocksPerRow];
        HeightMapMesh heightMapMesh = new HeightMapMesh(minY, maxY, heightMap, textureFile, textInc);
        for (int row = 0; row < blocksPerRow; row++) {
            for (int col = 0; col < blocksPerRow; col++) {
                float xDisplacement = (col - ((float) blocksPerRow - 1) / (float) 2) * scale * HeightMapMesh.getXLength();
                float zDisplacement = (row - ((float) blocksPerRow - 1) / (float) 2) * scale * HeightMapMesh.getZLength();

                GameItem terrainBlock = new GameItem(heightMapMesh.getMesh());
                terrainBlock.setScale(scale);
                terrainBlock.setPosition(xDisplacement, 0, zDisplacement);
                gameItems[row * blocksPerRow + col] = terrainBlock;
            }
        }
    }

    public GameItem[] getGameItems() {
        return gameItems;
    }
}

Let's explain the overall process. We have blocks that have the following coordinates (for x and z, and with the constants defined above).


Let's suppose that we are creating a terrain formed by a 3×3 block grid. Let's assume also that we won't scale the terrain blocks (that is, the variable blocksPerRow will be 3 and the variable scale will be 1). We want the grid to be centred at the (0, 0) coordinates.

We need to translate the blocks so the vertices will have the following coordinates.

The translation is achieved by calling the setPosition method, but remember that what we set is a displacement, not a position. If you review the figure above you will see that the central block does not require any displacement; it is already positioned at the adequate coordinates.


The vertex drawn in green needs a displacement of −1 for the x coordinate, and the vertex drawn in blue needs a displacement of +1. The formula to calculate the x displacement, taking into consideration the scale and the block width, is this one:

xDisplacement = (col − (blocksPerRow − 1)/2) × scale × width

And the equivalent formula for the z displacement is:

zDisplacement = (row − (blocksPerRow − 1)/2) × scale × height
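The displacement formulas can be sketched as a small standalone helper (a hypothetical class, not part of the engine; the blockWidth parameter stands in for HeightMapMesh.getXLength()):

```java
public class BlockDisplacement {

    // x displacement of the block in column 'col', following the formula above:
    // (col - (blocksPerRow - 1) / 2) * scale * blockWidth.
    public static float xDisplacement(int col, int blocksPerRow, float scale, float blockWidth) {
        return (col - ((float) blocksPerRow - 1) / 2f) * scale * blockWidth;
    }
}
```

For a 3×3 grid with scale 1 and a block width of 2, the central column (col = 1) gets displacement 0, the left column −2 and the right column +2, which matches the centred layout described above.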

If we create a Terrain instance in the DummyGame class, we can get something like this.

You can move the camera around the terrain and see how it's rendered. Since we have not yet implemented collision detection, you can pass through it and look at it from above. Because we have face culling enabled, some parts of the terrain are not rendered when looking from below.


Terrain Collisions

Once we have created a terrain, the next step is to detect collisions to avoid traversing through it. If you recall from the previous chapter, a terrain is composed of blocks, and each of those blocks is constructed from a height map. The height map is used to set the height of the vertices that compose the triangles that form the terrain.

In order to detect a collision we must compare the current position's y value with the y value of the point of the terrain we are currently on. If we are above the terrain's y value there's no collision; if not, we need to get back. A simple concept, isn't it? Indeed it is, but we need to perform several calculations before we are able to do that comparison.

The first thing we need to define is what we understand by the term "current position". Since we do not yet have a player concept the answer is easy: the current position will be the camera position. So we already have one of the components of the comparison; the next thing to calculate is the terrain height at the current position.

As has been said before, the terrain is composed of a grid of terrain blocks, as shown in the next figure.

Each terrain block is constructed from the same height map mesh, but is scaled and displaced precisely to form a terrain grid that looks like a continuous landscape.

So, what we need to do first is determine which terrain block the current position (the camera) is in. In order to do that, we will calculate the bounding box of each terrain block, taking into consideration the displacement and the scaling. Since the terrain will not be displaced or scaled at runtime, we can perform those calculations in the Terrain class constructor. By doing it this way we can access them later at any time without repeating those operations in each game loop cycle.

We will create a new method that calculates the bounding box of a terrain block, named getBoundingBox.


private Box2D getBoundingBox(GameItem terrainBlock) {
    float scale = terrainBlock.getScale();
    Vector3f position = terrainBlock.getPosition();

    float topLeftX = HeightMapMesh.STARTX * scale + position.x;
    float topLeftZ = HeightMapMesh.STARTZ * scale + position.z;
    float width = Math.abs(HeightMapMesh.STARTX * 2) * scale;
    float height = Math.abs(HeightMapMesh.STARTZ * 2) * scale;
    Box2D boundingBox = new Box2D(topLeftX, topLeftZ, width, height);
    return boundingBox;
}

The Box2D class is a simplified version of the java.awt.Rectangle2D.Float class, created to avoid using AWT.

Now we need to calculate the world coordinates of the terrain blocks. In the previous chapter you saw that all of our terrain meshes were created inside a quad with its origin set to [STARTX, STARTZ]. Thus, we need to transform those coordinates to world coordinates, taking into consideration the scale and the displacement, as shown in the next figure.

As has been said above, this can be done in the Terrain class constructor since it won't change at runtime. So we need to add a new attribute which will hold the bounding boxes:

private final Box2D[][] boundingBoxes;

In the Terrain constructor, while we are creating the terrain blocks, we just need to invoke the method that calculates the bounding box.


public Terrain(int terrainSize, float scale, float minY, float maxY, String heightMapFile, String textureFile, int textInc) throws Exception {
    this.terrainSize = terrainSize;
    gameItems = new GameItem[terrainSize * terrainSize];

    PNGDecoder decoder = new PNGDecoder(getClass().getResourceAsStream(heightMapFile));
    int height = decoder.getHeight();
    int width = decoder.getWidth();
    ByteBuffer buf = ByteBuffer.allocateDirect(
            4 * decoder.getWidth() * decoder.getHeight());
    decoder.decode(buf, decoder.getWidth() * 4, PNGDecoder.Format.RGBA);
    buf.flip();

    // The number of vertices per column and row
    verticesPerCol = width;
    verticesPerRow = height;

    heightMapMesh = new HeightMapMesh(minY, maxY, buf, width, textureFile, textInc);
    boundingBoxes = new Box2D[terrainSize][terrainSize];
    for (int row = 0; row < terrainSize; row++) {
        for (int col = 0; col < terrainSize; col++) {
            float xDisplacement = (col - ((float) terrainSize - 1) / (float) 2) * scale * HeightMapMesh.getXLength();
            float zDisplacement = (row - ((float) terrainSize - 1) / (float) 2) * scale * HeightMapMesh.getZLength();

            GameItem terrainBlock = new GameItem(heightMapMesh.getMesh());
            terrainBlock.setScale(scale);
            terrainBlock.setPosition(xDisplacement, 0, zDisplacement);
            gameItems[row * terrainSize + col] = terrainBlock;

            boundingBoxes[row][col] = getBoundingBox(terrainBlock);
        }
    }
}

So, with all the bounding boxes pre-calculated, we are ready to create a new method that will return the height of the terrain, taking the current position as a parameter. This method will be named getHeight and it's defined like this.


public float getHeight(Vector3f position) {
    float result = Float.MIN_VALUE;
    // For each terrain block we get the bounding box, translate it to view coordinates
    // and check if the position is contained in that bounding box
    Box2D boundingBox = null;
    boolean found = false;
    GameItem terrainBlock = null;
    for (int row = 0; row < terrainSize && !found; row++) {
        for (int col = 0; col < terrainSize && !found; col++) {
            terrainBlock = gameItems[row * terrainSize + col];
            boundingBox = boundingBoxes[row][col];
            found = boundingBox.contains(position.x, position.z);
        }
    }

    // If we have found a terrain block that contains the position we need
    // to calculate the height of the terrain on that position
    if (found) {
        Vector3f[] triangle = getTriangle(position, boundingBox, terrainBlock);
        result = interpolateHeight(triangle[0], triangle[1], triangle[2], position.x, position.z);
    }

    return result;
}

The first thing that we do in that method is to determine the terrain block that we are in. Since we already have the bounding box for each terrain block, the algorithm is simple: we just need to iterate over the array of bounding boxes and check if the current position is inside (the Box2D class provides a method for this).

Once we have found the terrain block, we need to calculate the triangle which we are in. This is done in the getTriangle method that will be described later on. After that, we have the coordinates of the triangle that we are in, including its height. But we need the height of a point that is not located at any of those vertices but in a place in between. This is done in the interpolateHeight method. We will also explain how this is done later on.

Let's first start with the process of determining the triangle that we are in. The quad that forms a terrain block can be seen as a grid in which each cell is formed by two triangles. Let's define some variables first:

boundingBox.x is the x coordinate of the origin of the bounding box associated to the quad.
boundingBox.y is the z coordinate of the origin of the bounding box associated to the quad (although you see a "y", it models the z axis).
boundingBox.width is the width of the quad.
boundingBox.height is the height of the quad.


cellWidth is the width of a cell.
cellHeight is the height of a cell.

All of the variables defined above are expressed in world coordinates. To calculate the width of a cell we just need to divide the bounding box width by the number of vertices per column:

cellWidth = boundingBox.width / verticesPerCol

And the variable cellHeight is calculated analogously:

cellHeight = boundingBox.height / verticesPerRow

Once we have those variables we can calculate the row and the column of the cell we are currently in, which is quite straightforward:

col = (position.x − boundingBox.x) / cellWidth

row = (position.z − boundingBox.y) / cellHeight

The following picture shows all the variables described above for a sample terrain block.
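The cell-location step can be sketched standalone (a hypothetical helper class, not part of the engine; boxX and boxZ stand for the bounding box origin):

```java
public class CellLocator {

    // Returns {col, row} of the grid cell that contains (x, z),
    // following the formulas above: the offset from the bounding box
    // origin divided by the cell dimensions, truncated to an int.
    public static int[] locate(float x, float z, float boxX, float boxZ,
                               float cellWidth, float cellHeight) {
        int col = (int) ((x - boxX) / cellWidth);
        int row = (int) ((z - boxZ) / cellHeight);
        return new int[]{col, row};
    }
}
```

For instance, with a bounding box at the origin and 1×1 cells, the point (0.5, 1.5) falls in column 0, row 1.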


With all that information we are able to calculate the positions of the vertices of the triangles contained in the cell. How can we do this? Let's examine the triangles that form a single cell.

You can see that the cell is divided by a diagonal that separates the two triangles. The way to determine the triangle associated to the current position is by checking if the z coordinate is above or below that diagonal. In our case, if the current position's z value is less than the z value of the diagonal (setting the x value to the x value of the current position) we are in T1. If it's greater than that, we are in T2.

We can determine that by calculating the line equation that matches the diagonal.

If you remember your school maths classes, the equation of a line that passes through two points (in 2D) is:

y − y1 = m ⋅ (x − x1)

Where m is the line slope, that is, how much the height changes when moving through the x axis. Note that, in our case, the y coordinates are the z ones. Also note that we are using 2D coordinates because we are not calculating heights here; we just want to select the proper triangle, and for that the x and z coordinates are enough. So, in our case, the line equation should be rewritten like this:

z − z1 = m ⋅ (x − x1)

The slope can be calculated in the following way:

m = (z1 − z2) / (x1 − x2)

So the equation of the diagonal to get the z value given an x position is:

z = m ⋅ (xpos − x1) + z1 = ((z1 − z2) / (x1 − x2)) ⋅ (xpos − x1) + z1

Where x1, x2, z1 and z2 are the x and z coordinates of the vertices V1 and V2 respectively.

So the method to get the triangle that the current position is in, named getTriangle, applying all the calculations described above, can be implemented like this:


protected Vector3f[] getTriangle(Vector3f position, Box2D boundingBox, GameItem terrainBlock) {
    // Get the column and row of the height map associated to the current position
    float cellWidth = boundingBox.width / (float) verticesPerCol;
    float cellHeight = boundingBox.height / (float) verticesPerRow;
    int col = (int) ((position.x - boundingBox.x) / cellWidth);
    int row = (int) ((position.z - boundingBox.y) / cellHeight);

    Vector3f[] triangle = new Vector3f[3];
    triangle[1] = new Vector3f(
            boundingBox.x + col * cellWidth,
            getWorldHeight(row + 1, col, terrainBlock),
            boundingBox.y + (row + 1) * cellHeight);
    triangle[2] = new Vector3f(
            boundingBox.x + (col + 1) * cellWidth,
            getWorldHeight(row, col + 1, terrainBlock),
            boundingBox.y + row * cellHeight);
    if (position.z < getDiagonalZCoord(triangle[1].x, triangle[1].z, triangle[2].x, triangle[2].z, position.x)) {
        triangle[0] = new Vector3f(
                boundingBox.x + col * cellWidth,
                getWorldHeight(row, col, terrainBlock),
                boundingBox.y + row * cellHeight);
    } else {
        triangle[0] = new Vector3f(
                boundingBox.x + (col + 1) * cellWidth,
                getWorldHeight(row + 1, col + 1, terrainBlock),
                boundingBox.y + (row + 1) * cellHeight);
    }

    return triangle;
}

protected float getDiagonalZCoord(float x1, float z1, float x2, float z2, float x) {
    float z = ((z1 - z2) / (x1 - x2)) * (x - x1) + z1;
    return z;
}

protected float getWorldHeight(int row, int col, GameItem gameItem) {
    float y = heightMapMesh.getHeight(row, col);
    return y * gameItem.getScale() + gameItem.getPosition().y;
}

You can see that we have two additional methods. The first one, named getDiagonalZCoord, calculates the z coordinate of the diagonal given an x position and two vertices. The other one, named getWorldHeight, is used to retrieve the height (the y coordinate) of the triangle vertices. When the terrain mesh is constructed, the height of each vertex is pre-calculated and stored; we only need to translate it to world coordinates.
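The diagonal test can be checked in isolation. This sketch (a hypothetical helper class, not part of the engine) reproduces the getDiagonalZCoord computation and the "below the diagonal" comparison used to pick T1 or T2:

```java
public class DiagonalTest {

    // Same computation as getDiagonalZCoord above: the z value of the
    // diagonal through (x1, z1) and (x2, z2) at a given x.
    public static float diagonalZ(float x1, float z1, float x2, float z2, float x) {
        return ((z1 - z2) / (x1 - x2)) * (x - x1) + z1;
    }

    // True when (x, z) falls on the T1 side (z below the diagonal).
    public static boolean inFirstTriangle(float x, float z,
                                          float x1, float z1, float x2, float z2) {
        return z < diagonalZ(x1, z1, x2, z2, x);
    }
}
```

For a unit cell whose diagonal runs from (0, 1) to (1, 0), the point (0.25, 0.25) lies below the diagonal (T1), while (0.75, 0.9) lies above it (T2).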


OK, so we have the coordinates of the triangle that the current position is in. Finally, we are ready to calculate the terrain height at the current position. How can we do this? Well, our triangle is contained in a plane, and a plane can be defined by three points; in this case, the three vertices that define the triangle.

The plane equation is as follows:

a ⋅ x + b ⋅ y + c ⋅ z + d = 0

The values of the constants of the previous equation are:

a = (B.y − A.y) ⋅ (C.z − A.z) − (C.y − A.y) ⋅ (B.z − A.z)

b = (B.z − A.z) ⋅ (C.x − A.x) − (C.z − A.z) ⋅ (B.x − A.x)

c = (B.x − A.x) ⋅ (C.y − A.y) − (C.x − A.x) ⋅ (B.y − A.y)

Where A, B and C are the three vertices needed to define the plane.

Then, with the previous equations and the values of the x and z coordinates for the current position, we are able to calculate the y value, that is, the height of the terrain at the current position:

y = (−d − a ⋅ x − c ⋅ z)/b

The method that performs the previous calculations is the following:

protected float interpolateHeight(Vector3f pA, Vector3f pB, Vector3f pC, float x, float z) {
    // Plane equation ax + by + cz + d = 0
    float a = (pB.y - pA.y) * (pC.z - pA.z) - (pC.y - pA.y) * (pB.z - pA.z);
    float b = (pB.z - pA.z) * (pC.x - pA.x) - (pC.z - pA.z) * (pB.x - pA.x);
    float c = (pB.x - pA.x) * (pC.y - pA.y) - (pC.x - pA.x) * (pB.y - pA.y);
    float d = -(a * pA.x + b * pA.y + c * pA.z);
    // y = (-d - ax - cz) / b
    float y = (-d - a * x - c * z) / b;
    return y;
}
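A quick sanity check of the plane interpolation: three vertices that share the same y must interpolate to exactly that y, and a plane tilted along x must interpolate linearly. This standalone sketch repeats the computation using plain float triples instead of JOML vectors (a hypothetical helper, not the engine's method):

```java
public class PlaneInterpolation {

    // Standalone version of the interpolateHeight computation above,
    // taking the triangle vertices as {x, y, z} arrays.
    public static float interpolate(float[] pA, float[] pB, float[] pC, float x, float z) {
        // Plane equation ax + by + cz + d = 0
        float a = (pB[1] - pA[1]) * (pC[2] - pA[2]) - (pC[1] - pA[1]) * (pB[2] - pA[2]);
        float b = (pB[2] - pA[2]) * (pC[0] - pA[0]) - (pC[2] - pA[2]) * (pB[0] - pA[0]);
        float c = (pB[0] - pA[0]) * (pC[1] - pA[1]) - (pC[0] - pA[0]) * (pB[1] - pA[1]);
        float d = -(a * pA[0] + b * pA[1] + c * pA[2]);
        // Solve for y at the queried (x, z)
        return (-d - a * x - c * z) / b;
    }
}
```

With the flat triangle {0,5,0}, {1,5,0}, {0,5,1}, any point interpolates to 5; with the tilted plane y = x through {0,0,0}, {1,1,0}, {0,0,1}, the point x = 0.5 interpolates to 0.5.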

And that's all! We are now able to detect collisions, so in the DummyGame class we can change the following lines when we update the camera position:

// Update camera position
Vector3f prevPos = new Vector3f(camera.getPosition());
camera.movePosition(cameraInc.x * CAMERA_POS_STEP, cameraInc.y * CAMERA_POS_STEP, cameraInc.z * CAMERA_POS_STEP);
// Check if there has been a collision. If true, set the y position to
// the maximum height
float height = terrain.getHeight(camera.getPosition());
if (camera.getPosition().y <= height) {
    camera.setPosition(prevPos.x, prevPos.y, prevPos.z);
}

As you can see, the concept of detecting terrain collisions is easy to understand, but we need to carefully perform a set of calculations and be aware of the different coordinate systems we are dealing with.

Besides that, although the algorithm presented here is valid in most cases, there are still situations that need to be handled carefully. One effect that you may observe is called tunnelling. Imagine the following situation: we are travelling at a fast speed through our terrain and, because of that, the position increment gets a high value. This value can get so high that, since we are detecting collisions with the final position, we may have skipped obstacles that lie in between.

There are many possible solutions to avoid that effect; the simplest one is to split the calculation to be performed into smaller increments.
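The sub-stepping idea can be sketched like this (a hypothetical helper, not part of the book's code): instead of applying the whole displacement at once, we advance in sub-steps no longer than a chosen maximum and sample the terrain height at each intermediate position, stopping at the last position that stays above the terrain:

```java
import java.util.function.BiFunction;

public class SteppedMove {

    // Advances from (x, y, z) by (dx, dz) in sub-steps no longer than maxStep,
    // stopping at the last position whose y stays above the sampled terrain height.
    // heightAt is any terrain height function of (x, z), e.g. Terrain::getHeight.
    public static float[] move(float x, float y, float z, float dx, float dz,
                               float maxStep, BiFunction<Float, Float, Float> heightAt) {
        float len = (float) Math.sqrt(dx * dx + dz * dz);
        int steps = Math.max(1, (int) Math.ceil(len / maxStep));
        float px = x, pz = z;
        for (int i = 1; i <= steps; i++) {
            float nx = x + dx * i / steps;
            float nz = z + dz * i / steps;
            if (y <= heightAt.apply(nx, nz)) {
                break; // collision in between: stop at the previous sub-step
            }
            px = nx;
            pz = nz;
        }
        return new float[]{px, y, pz};
    }
}
```

With a "wall" of height 10 between x = 1 and x = 2, a move of 4 units from the origin at y = 1 stops at x = 1 instead of tunnelling through to x = 4.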


Fog

Before we deal with more complex topics we will review how to create a fog effect in our game engine. With that effect we will simulate how distant objects get dimmed and seem to vanish into a dense fog.

Let us first examine the attributes that define fog. The first one is the fog colour. In the real world fog has a grey colour, but we can use this effect to simulate wide areas invaded by fog with different colours. The second attribute is the fog's density.

Thus, in order to apply the fog effect we need to find a way to fade our 3D scene objects into the fog colour as they get far away from the camera. Objects that are close to the camera will not be affected by the fog, but objects that are far away will not be distinguishable. So we need to be able to calculate a factor that can be used to blend the fog colour and each fragment's colour in order to simulate that effect. That factor will need to be dependent on the distance to the camera.

Let's name that factor fogFactor and set its range from 0 to 1. When fogFactor takes the value 1, it means that the object will not be affected by fog, that is, it's a nearby object. When fogFactor takes the value 0, it means that the object will be completely hidden in the fog.

Then, the equation needed to calculate the fog colour will be:

finalColour = (1 − fogFactor) ⋅ fogColour + fogFactor ⋅ fragmentColour

finalColour is the colour that results from applying the fog effect.
fogFactor is the parameter that controls how the fog colour and the fragment colour are blended. It basically controls the object visibility.
fogColour is the colour of the fog.
fragmentColour is the colour of the fragment without applying any fog effect on it.

Now we need to find a way to calculate fogFactor depending on the distance. We can choose different models; the first one could be to use a linear model, that is, a model that, given a distance, changes the fogFactor value in a linear way.

The linear model can be defined by the following parameters:

fogStart: The distance at which the fog effect starts to be applied.
fogFinish: The distance at which the fog effect reaches its maximum value.
distance: The distance to the camera.

With those parameters, the equation to be applied would be:


fogFactor = (fogFinish − distance) / (fogFinish − fogStart)

For objects at a distance lower than fogStart we simply set the fogFactor to 1. The following graph shows how the fogFactor changes with the distance.

The linear model is easy to calculate, but it is not very realistic and it does not take into consideration the fog density. In reality fog tends to grow in a smoother way. So the next suitable model is an exponential one. The equation for that model is as follows:

fogFactor = e^(−(distance ⋅ fogDensity)^exponent) = 1 / e^((distance ⋅ fogDensity)^exponent)

The new variables that come into play are:

fogDensity, which models the thickness or density of the fog.
exponent, which is used to control how fast the fog increases with distance.

The following picture shows two graphs of the equation above for different values of the exponent (2 for the blue line and 4 for the red one).


In our code we will use a formula which sets a value of two for the exponent (you can easily modify the example to use different values).
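Both models can be tried on the CPU side before touching the shader. This standalone sketch (a hypothetical helper class, not part of the engine) implements the two formulas above, with the exponential model fixed at exponent 2:

```java
public class FogFactorModels {

    // Linear model: 1 before fogStart, 0 after fogFinish, linear in between.
    public static float linear(float distance, float fogStart, float fogFinish) {
        if (distance <= fogStart) {
            return 1f;
        }
        float f = (fogFinish - distance) / (fogFinish - fogStart);
        return Math.max(0f, Math.min(1f, f));
    }

    // Exponential model with exponent 2: 1 / e^((distance * density)^2).
    public static float exponential(float distance, float density) {
        float x = distance * density;
        return (float) Math.min(1.0, 1.0 / Math.exp(x * x));
    }
}
```

Note how, in the linear model, the midpoint between fogStart and fogFinish yields exactly 0.5, whereas the exponential model decays smoothly and never quite reaches zero.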

Now that the theory has been explained, we can put it into practice. We will implement the effect in the scene fragment shader since we have there all the variables we need. We will start by defining a struct that models the fog attributes.

struct Fog
{
    int activeFog;
    vec3 colour;
    float density;
};

The activeFog attribute will be used to activate or deactivate the fog effect. The fog will be passed to the shader through another uniform named fog.

uniform Fog fog;

We will also create a new class named Fog, which is another POJO (Plain Old Java Object) that contains the fog attributes.


package org.lwjglb.engine.graph.weather;

import org.joml.Vector3f;

public class Fog {

    private boolean active;

    private Vector3f colour;

    private float density;

    public static Fog NOFOG = new Fog();

    public Fog() {
        active = false;
        this.colour = new Vector3f(0, 0, 0);
        this.density = 0;
    }

    public Fog(boolean active, Vector3f colour, float density) {
        this.colour = colour;
        this.density = density;
        this.active = active;
    }

    // Getters and setters here…
}

We will add a Fog instance in the Scene class. As a default, the Scene class will initialize the Fog instance to the constant NOFOG, which models a deactivated instance.

Since we added a new uniform type, we need to modify the ShaderProgram class to create and initialize the fog uniform.

public void createFogUniform(String uniformName) throws Exception {
    createUniform(uniformName + ".activeFog");
    createUniform(uniformName + ".colour");
    createUniform(uniformName + ".density");
}

public void setUniform(String uniformName, Fog fog) {
    setUniform(uniformName + ".activeFog", fog.isActive() ? 1 : 0);
    setUniform(uniformName + ".colour", fog.getColour());
    setUniform(uniformName + ".density", fog.getDensity());
}

In the Renderer class we just need to create the uniform in the setupSceneShader method:


sceneShaderProgram.createFogUniform("fog");

And use it in the renderScene method:

sceneShaderProgram.setUniform("fog", scene.getFog());

We are now able to define fog characteristics in our game, but we need to get back to the fragment shader in order to apply the fog effect. We will create a function named calcFog which is defined like this.

vec4 calcFog(vec3 pos, vec4 colour, Fog fog)
{
    float distance = length(pos);
    float fogFactor = 1.0 / exp((distance * fog.density) * (distance * fog.density));
    fogFactor = clamp(fogFactor, 0.0, 1.0);

    vec3 resultColour = mix(fog.colour, colour.xyz, fogFactor);
    return vec4(resultColour.xyz, colour.w);
}

As you can see, we first calculate the distance to the vertex. The vertex coordinates are defined in the pos variable and we just need to calculate its length. Then we calculate the fog factor using the exponential model with an exponent of two (which is equivalent to multiplying the base by itself). We clamp the fogFactor to a range between 0 and 1 and use the mix function. In GLSL, the mix function is used to blend the fog colour and the fragment colour (defined by the variable colour). It's equivalent to applying this equation:

resultColour = (1 − fogFactor) ⋅ fog.colour + fogFactor ⋅ colour

We also preserve the w component, the transparency, of the original colour. We don't want this component to be affected; the fragment should maintain its transparency level.

At the end of the fragment shader, after applying all the light effects, we simply assign the returned value to the fragment colour if the fog is active.

if ( fog.activeFog == 1 )
{
    fragColor = calcFog(mvVertexPos, fragColor, fog);
}

With all that code completed, we can set up a Fog with the following data:

scene.setFog(new Fog(true, new Vector3f(0.5f, 0.5f, 0.5f), 0.15f));


And we will get an effect like this:

You will see that distant objects get faded in the distance and that fog starts to disappear when you approach them. There's a problem, though, with the skybox: it looks a little bit weird that the horizon is not affected by the fog. There are several ways to solve this:

Use a different skybox in which you only see a sky.
Remove the skybox; since you have a dense fog, you should not be able to see a background.

Maybe none of the two solutions fits you, and you can try to match the fog colour to the skybox background, but you will end up doing complex calculations and the result will not be much better.

If you let the example run you will see how directional light gets dimmed and the scene darkens, but there's a problem with the fog: it is not affected by light and you will get something like this.


Distant objects are set to the fog colour, which is a constant not affected by light. This fact produces a glowing-in-the-dark effect (which may or may not be OK for you). We need to change the function that calculates the fog to take the light into consideration. The function will receive the ambient light and the directional light to modulate the fog colour.

vec4 calcFog(vec3 pos, vec4 colour, Fog fog, vec3 ambientLight, DirectionalLight dirLight)
{
    vec3 fogColor = fog.colour * (ambientLight + dirLight.colour * dirLight.intensity);
    float distance = length(pos);
    float fogFactor = 1.0 / exp((distance * fog.density) * (distance * fog.density));
    fogFactor = clamp(fogFactor, 0.0, 1.0);

    vec3 resultColour = mix(fogColor, colour.xyz, fogFactor);
    return vec4(resultColour.xyz, 1);
}

As you can see, with the directional light we just use the colour and the intensity; we are not interested in the direction. With that modification we just need to slightly modify the call to the function like this:

if ( fog.activeFog == 1 )
{
    fragColor = calcFog(mvVertexPos, fragColor, fog, ambientLight, directionalLight);
}


And we will get something like this when the night falls.

One important thing to highlight is that we must choose the fog colour wisely. This is even more important when we have no skybox but a fixed-colour background: we should set the fog colour to be equal to the clear colour. If you uncomment the code that renders the skybox and rerun the example you will get something like this.

But if we modify the clear colour to be equal to (0.5, 0.5, 0.5) the result will be like this.


Normal Mapping

In this chapter we will explain a technique that will dramatically improve how our 3D models look. By now we are able to apply textures to complex 3D models, but we are still far away from what real objects look like. Surfaces in the real world are not perfectly plain; they have imperfections that our 3D models currently do not have.

In order to render more realistic scenes we are going to use normal maps. If you look at a flat surface in the real world you will see that those imperfections can be seen even at a distance by the way that the light reflects on it. In a 3D scene a flat surface will have no imperfections; we can apply a texture to it but we won't change the way that light reflects on it. That's the thing that makes the difference.

We may think of increasing the detail of our models by increasing the number of triangles and reflecting those imperfections, but performance will degrade. What we need is a way to change the way light reflects on surfaces to increase the realism. This is achieved with the normal mapping technique.

Let's go back to the plain surface example: a plane can be defined by two triangles which form a quad. If you remember from the lighting chapters, the element that models how light reflects is the surface normal. In this case, we have a single normal for the whole surface; each fragment of the surface uses the same normal when calculating how light affects it. This is shown in the next figure.

If we could change the normals for each fragment of the surface, we could model surface imperfections to render them in a more realistic way. This is shown in the next figure.


The way we are going to achieve this is by loading another texture that stores the normals for the surface. Each pixel of the normal texture will contain the values of the x, y and z coordinates of the normal stored as an RGB value.

Let's use the following texture to draw a quad.

An example of a normal map texture for the image above could be the following.

As you can see, it is as if we had applied a colour transformation to the original texture. Each pixel stores normal information using colour components. One thing that you will usually see when viewing normal maps is that the dominant colours tend to blue. This is due to the fact that normals point to the positive z axis. The z component will usually have a much higher value than the x and y ones for plain surfaces, as the normal points out of the surface. Since the x, y and z coordinates are mapped to RGB, the blue component will also have a higher value.

So, to render an object using normal maps we just need an extra texture and use it while rendering fragments to get the appropriate normal value.

Let's start changing our code in order to support normal maps. We will add a new texture instance to the Material class so we can attach a normal map texture to our game items. This instance will have its own getters and setters and a method to check if the material has a normal map or not.

public class Material {

    private static final Vector4f DEFAULT_COLOUR = new Vector4f(1.0f, 1.0f, 1.0f, 1.0f);

    private Vector3f ambientColour;

    private Vector3f diffuseColour;

    private Vector3f specularColour;

    private float reflectance;

    private Texture texture;

    private Texture normalMap;

    // … Previous code here

    public boolean hasNormalMap() {
        return this.normalMap != null;
    }

    public Texture getNormalMap() {
        return normalMap;
    }

    public void setNormalMap(Texture normalMap) {
        this.normalMap = normalMap;
    }
}

We will use the normal map texture in the scene fragment shader. But, since we are working in view coordinate space, we need to pass the model view matrix in order to do the proper transformation. Thus, we need to modify the scene vertex shader.

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;
out vec3 mvVertexNormal;
out vec3 mvVertexPos;
out mat4 outModelViewMatrix;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPos;
    outTexCoord = texCoord;
    mvVertexNormal = normalize(modelViewMatrix * vec4(vertexNormal, 0.0)).xyz;
    mvVertexPos = mvPos.xyz;
    outModelViewMatrix = modelViewMatrix;
}

In the scene fragment shader we need to add another input parameter.

in mat4 outModelViewMatrix;

In the fragment shader we also need a new uniform for the normal map texture sampler:

uniform sampler2D normalMap;

Also, in the fragment shader, we will create a new function that calculates the normal for the current fragment.

vec3 calcNormal(Material material, vec3 normal, vec2 text_coord, mat4 modelViewMatrix)
{
    vec3 newNormal = normal;
    if ( material.hasNormalMap == 1 )
    {
        newNormal = texture(normalMap, text_coord).rgb;
        newNormal = normalize(newNormal * 2 - 1);
        newNormal = normalize(modelViewMatrix * vec4(newNormal, 0.0)).xyz;
    }
    return newNormal;
}


The function takes the following parameters:

- The material instance.
- The vertex normal.
- The texture coordinates.
- The model view matrix.

The first thing we do in that function is check whether the material has a normal map associated. If not, we simply use the vertex normal as usual. If it does, we use the normal data stored in the normal map texture at the current texture coordinates.

Remember that the colour we get holds the normal coordinates, but since they are stored as RGB values they are contained in the range [0, 1]. We need to transform them to the range [-1, 1], so we just multiply by two and subtract 1. Then we normalize that value and transform it to view model coordinate space (as with the vertex normal).
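The remapping from an RGB value in [0, 1] to a normalized vector in [-1, 1] can be checked in plain Java, with no OpenGL involved (the class and method names here are illustrative, not part of the book's code):

```java
public class NormalDecode {

    // Decode an RGB-encoded normal (components in [0, 1]) into a
    // normalized vector with components in [-1, 1].
    public static float[] decode(float r, float g, float b) {
        // Map [0, 1] -> [-1, 1]: multiply by two and subtract one
        float x = r * 2 - 1;
        float y = g * 2 - 1;
        float z = b * 2 - 1;
        // Normalize the result
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[]{x / len, y / len, z / len};
    }

    public static void main(String[] args) {
        // The typical "bluish" pixel (128, 128, 255) decodes to a normal
        // pointing almost exactly along the positive z axis
        float[] n = decode(128 / 255f, 128 / 255f, 255 / 255f);
        System.out.printf("(%f, %f, %f)%n", n[0], n[1], n[2]);
    }
}
```

This also explains the dominant blue tint of normal maps: any fragment whose normal points mostly out of the surface encodes to a pixel with a high blue component.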

And that's all: we can use the returned value as the normal for that fragment in all the lighting calculations.

In the Renderer class we need to create the normal map uniform, and in the renderScene method we need to set it up like this:

...
sceneShaderProgram.setUniform("fog", scene.getFog());
sceneShaderProgram.setUniform("texture_sampler", 0);
sceneShaderProgram.setUniform("normalMap", 1);
...

You may notice something interesting in the code above. We are setting 0 for the material texture uniform (texture_sampler) and 1 for the normal map texture (normalMap). If you recall from the textures chapter, we are using more than one texture, so we must set up a texture unit for each separate texture.

We also need to take this into consideration when we are rendering a Mesh.


private void initRender() {
    Texture texture = material.getTexture();
    if (texture != null) {
        // Activate first texture bank
        glActiveTexture(GL_TEXTURE0);
        // Bind the texture
        glBindTexture(GL_TEXTURE_2D, texture.getId());
    }
    Texture normalMap = material.getNormalMap();
    if (normalMap != null) {
        // Activate second texture bank
        glActiveTexture(GL_TEXTURE1);
        // Bind the texture
        glBindTexture(GL_TEXTURE_2D, normalMap.getId());
    }

    // Draw the mesh
    glBindVertexArray(getVaoId());
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glEnableVertexAttribArray(2);
}

As you can see, we need to bind each of the available textures and activate the associated texture unit in order to be able to work with more than one texture. In the renderScene method of the Renderer class we do not need to explicitly set up the texture uniforms since they are already contained in the Material.

In order to show the improvement that normal maps provide, we have created an example that shows two quads side by side. The right quad has a normal map applied and the left one does not. We have also removed the terrain, the skybox and the HUD, and set up a directional light which can be changed with the left and right cursor keys so you can see the effect. We have modified the base source code a bit in order to support not having a skybox or a terrain. We have also clamped the light effect in the fragment shader to the range [0, 1] to avoid overexposing the image.

The result is shown in the next figure.


As you can see, the quad that has a normal texture applied gives the impression of having more volume. Although it is, in essence, a plain surface like the other quad, you can see how the light reflects. However, although the code we have set up works perfectly with this example, you need to be aware of its limitations. The code only works for normal map textures that are created using object space coordinates. If this is the case, we can apply the model view matrix transformations to translate the normal coordinates to view space.

But normal maps are usually not defined in that way. They are usually defined in the so-called tangent space. The tangent space is a coordinate system that is local to each triangle of the model. In that coordinate space the z axis always points out of the surface. This is the reason why, when you look at a normal map, it is usually bluish, even for complex models with opposing faces.

We will stick with this simple implementation for now, but keep in mind that you must always use normal maps defined in object space. If you use maps defined in tangent space you will get weird results. In order to be able to work with them we need to set up specific matrices to transform coordinates to tangent space.


Shadows

Shadow Mapping

Currently we are able to represent how light affects the objects in a 3D scene. Objects that receive more light are shown brighter than objects that do not. However, we are still not able to cast shadows. Shadows increase the degree of realism of a 3D scene, so we will add support for them in this chapter.

We will use a technique named shadow mapping, which is widely used in games and does not severely affect engine performance. Shadow mapping may seem simple to understand, but it is somewhat difficult to implement correctly. Or, to be more precise, it is very difficult to implement it in a generic way that covers all the potential cases and produces consistent results.

We will explain here an approach that will let you add shadows for most cases but, more importantly, will help you understand its limitations. The code presented here is far from perfect, but I think it will be easy to understand. It is also designed to support directional lights (which in my opinion is the more complex case), but you will learn how it can be extended to support other types of lights (such as point lights). If you want to achieve more advanced results you should use techniques such as Cascaded Shadow Maps. In any case, the concepts explained here will serve as a basis.

So let's start by thinking about how we could check whether a specific area (indeed, a fragment) is in shadow or not. While drawing that area, we could cast rays to the light source: if we can reach the light source without any collision then that pixel is in light. If not, the pixel is in shadow.

The following picture shows the case for a point light: the point PA can reach the light source, but points PB and PC can't, so they are in shadow.


So, how can we check whether we can cast that ray without collisions in an efficient manner? A light source can theoretically cast infinitely many rays, so how do we check whether a ray is blocked or not? What we can do, instead of casting rays, is to look at the 3D scene from the light's perspective and render the scene from that location. We can set the camera at the light position and render the scene so that we can store the depth of each fragment. This is equivalent to calculating the distance of each fragment to the light source. In the end, what we are doing is storing the minimum distance, as seen from the light source, as a shadow map.

The following picture shows a cube floating over a plane and a perpendicular light.

The scene as seen from the light perspective would be something like this (the darker the colour, the closer to the light source).


With that information we can render the 3D scene as usual, comparing the distance of each fragment to the light source with the minimum distance stored in the shadow map. If the distance is less than the value stored in the shadow map, then the object is in light; otherwise it is in shadow. Several objects could be hit by the same light ray, but we store the minimum distance.

Thus, shadow mapping is a two step process:

- First we render the scene from the light's point of view into a shadow map to get the minimum distances.
- Second we render the scene from the camera's point of view and use that depth map to calculate whether objects are in shadow or not.
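The two passes can be illustrated with a toy, one-dimensional "shadow map" in plain Java (the names and data here are made up for illustration; the real thing happens on the GPU):

```java
import java.util.Arrays;

public class ShadowMapSketch {

    // Pass 1: for each texel, keep the minimum depth seen from the light.
    public static float[] buildDepthMap(int size, int[] texels, float[] depths) {
        float[] map = new float[size];
        Arrays.fill(map, Float.MAX_VALUE);
        for (int i = 0; i < texels.length; i++) {
            map[texels[i]] = Math.min(map[texels[i]], depths[i]);
        }
        return map;
    }

    // Pass 2: a fragment is lit only if it is at least as close to the
    // light as the closest occluder recorded in the depth map.
    public static boolean isLit(float[] map, int texel, float depth) {
        return depth <= map[texel];
    }

    public static void main(String[] args) {
        // Two fragments project onto texel 0: one at depth 0.2 (a cube)
        // and one at depth 0.8 (the floor below it)
        float[] map = buildDepthMap(4, new int[]{0, 0, 1}, new float[]{0.2f, 0.8f, 0.5f});
        System.out.println(isLit(map, 0, 0.2f)); // cube top: lit
        System.out.println(isLit(map, 0, 0.8f)); // floor under the cube: in shadow
    }
}
```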

In order to render the depth map we need to talk about the depth buffer. When we render a scene, all the depth information is stored in a buffer named, obviously, the depth buffer (also called the z-buffer). That depth information is the z value of each fragment that is rendered. If you recall from the first chapters, what we are doing while rendering a scene is transforming from world coordinates to screen coordinates. We are drawing to a coordinate space which ranges from 0 to 1 for the x and y axes. If an object is more distant than another, we must calculate how this affects its x and y coordinates through the perspective projection matrix. This is not calculated automatically depending on the z value; it must be done by us. What is actually stored in the z coordinate is the depth of that fragment, nothing less and nothing more.

Besides that, in our source code we are enabling depth testing. In the Window class we have set the following line:

glEnable(GL_DEPTH_TEST);

By setting this line we prevent fragments that cannot be seen, because they are behind other objects, from being drawn. Before a fragment is drawn, its z value is compared with the z value in the z-buffer. If it has a higher z value (it is farther away) than the z value of the buffer, it is discarded. Remember that this is done in screen space, so we are comparing the z value of a fragment given a pair of x and y coordinates in screen space, which are in the range [0, 1]. Thus, the z value is also in that range.


The presence of the depth buffer is the reason why we need to clear the screen before performing any render operation. We need to clear not only the colour but also the depth information:

public void clear() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
}

In order to start building the depth map, we want to get that depth information as viewed from the light perspective. We need to set up a camera at the light position, render the scene and store the depth information into a texture so we can access it later.

Therefore, the first thing we need to do is add support for creating those textures. We will modify the Texture class to support the creation of empty textures by adding a new constructor. This constructor expects the dimensions of the texture and the format of the pixels it stores.

public Texture(int width, int height, int pixelFormat) throws Exception {
    this.id = glGenTextures();
    this.width = width;
    this.height = height;
    glBindTexture(GL_TEXTURE_2D, this.id);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, this.width, this.height, 0, pixelFormat, GL_FLOAT, (ByteBuffer) null);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
}

We set the texture wrapping mode to GL_CLAMP_TO_EDGE since we do not want the texture to repeat in case we exceed the [0, 1] range.

So now that we are able to create empty textures, we need to be able to render a scene into them. In order to do that we need to use Frame Buffer Objects (or FBOs). A Frame Buffer is a collection of buffers that can be used as a destination for rendering. When we have been rendering to the screen we have been using OpenGL's default buffer. OpenGL allows us to render to user defined buffers by using FBOs. We will isolate the rest of the code from the process of creating FBOs for shadow mapping by creating a new class named ShadowMap. This is the definition of that class.

package org.lwjglb.engine.graph;

import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL30.*;

public class ShadowMap {

    public static final int SHADOW_MAP_WIDTH = 1024;

    public static final int SHADOW_MAP_HEIGHT = 1024;

    private final int depthMapFBO;

    private final Texture depthMap;

    public ShadowMap() throws Exception {
        // Create a FBO to render the depth map
        depthMapFBO = glGenFramebuffers();

        // Create the depth map texture
        depthMap = new Texture(SHADOW_MAP_WIDTH, SHADOW_MAP_HEIGHT, GL_DEPTH_COMPONENT);

        // Attach the depth map texture to the FBO
        glBindFramebuffer(GL_FRAMEBUFFER, depthMapFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, depthMap.getId(), 0);
        // Set only depth
        glDrawBuffer(GL_NONE);
        glReadBuffer(GL_NONE);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            throw new Exception("Could not create FrameBuffer");
        }

        // Unbind
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

    public Texture getDepthMapTexture() {
        return depthMap;
    }

    public int getDepthMapFBO() {
        return depthMapFBO;
    }

    public void cleanup() {
        glDeleteFramebuffers(depthMapFBO);
        depthMap.cleanup();
    }
}


The ShadowMap class defines two constants that determine the size of the texture that will hold the depth map. It also defines two attributes, one for the FBO and one for the texture. In the constructor we create a new FBO and a new Texture. For the texture we will use the constant GL_DEPTH_COMPONENT as the pixel format, since we are only interested in storing depth values. Then we attach the depth map texture to the FBO.

The following lines explicitly set the FBO not to render any colour. An FBO needs a colour buffer, but we are not going to need it. This is why we set the colour buffers to be used to GL_NONE.

glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

Now we are ready to render the scene from the light perspective into the FBO in the Renderer class. In order to do that, we will create a specific set of vertex and fragment shaders.

The vertex shader, named depth_vertex.vs, is defined like this.

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

uniform mat4 modelLightViewMatrix;
uniform mat4 orthoProjectionMatrix;

void main()
{
    gl_Position = orthoProjectionMatrix * modelLightViewMatrix * vec4(position, 1.0f);
}

We expect to receive the same input data as in the scene shader. In fact, we only need the position, but to reuse as much code as possible we will pass it all anyway. We also need a pair of matrices. Remember that we must render the scene from the light's point of view, so we need to transform our models to the light's coordinate space. This is done through the modelLightViewMatrix matrix, which is analogous to the model view matrix used for a camera. The light is our camera now.

Then we need to transform those coordinates to screen space, that is, we need to project them. And this is one of the differences when calculating shadow maps for directional lights versus point lights. For point lights we would use a perspective projection matrix, as if we were rendering the scene normally. Directional lights, instead, affect all objects in the same way independently of their distance. Directional lights are located at an infinite point and do not have a position, but a direction. An orthographic projection does not render distant objects smaller, and because of this characteristic it is the most suitable for directional lights.

The fragment shader is even simpler. It just outputs the z coordinate as the depth value.

#version 330

void main()
{
    gl_FragDepth = gl_FragCoord.z;
}

In fact, you could remove that line, since we are only generating depth values and the depth value will be written automatically.

Once we have defined the new shaders for depth rendering we can use them in the Renderer class. We define a new method for setting up those shaders, named setupDepthShader, which will be invoked where the other shaders are initialized.

private void setupDepthShader() throws Exception {
    depthShaderProgram = new ShaderProgram();
    depthShaderProgram.createVertexShader(Utils.loadResource("/shaders/depth_vertex.vs"));
    depthShaderProgram.createFragmentShader(Utils.loadResource("/shaders/depth_fragment.fs"));
    depthShaderProgram.link();

    depthShaderProgram.createUniform("orthoProjectionMatrix");
    depthShaderProgram.createUniform("modelLightViewMatrix");
}

Now we need to create a new method that uses those shaders, named renderDepthMap. This method will be invoked in the principal render method.

public void render(Window window, Camera camera, Scene scene, IHud hud) {
    clear();

    // Render depth map before view ports has been set up
    renderDepthMap(window, camera, scene);

    glViewport(0, 0, window.getWidth(), window.getHeight());

    // Rest of the code here ....


If you look at the code above you will see that the new method is invoked at the very beginning, before the viewport has been set. This is due to the fact that this new method will change the viewport to match the dimensions of the texture that holds the depth map. Because of that, after renderDepthMap has finished, we always need to set the viewport to the screen dimensions (without checking whether the window has been resized).

Let's now define the renderDepthMap method. The first thing we will do is bind to the FBO we created in the ShadowMap class and set the viewport to match the texture dimensions.

glBindFramebuffer(GL_FRAMEBUFFER, shadowMap.getDepthMapFBO());
glViewport(0, 0, ShadowMap.SHADOW_MAP_WIDTH, ShadowMap.SHADOW_MAP_HEIGHT);

Then we clear the depth buffer contents and bind the depth shaders. Since we are only dealing with depth values we do not need to clear colour information.

glClear(GL_DEPTH_BUFFER_BIT);
depthShaderProgram.bind();

Now we need to set up the matrices, and here comes the tricky part. We use the light as a camera, so we need to create a view matrix which needs a position and three angles. As was said at the beginning of the chapter, we will support only directional lights, and that type of light does not define a position but a direction. If we were using point lights this would be easy: the position of the light would be the position of the view matrix. But we do not have that.

We will take a simple approach to calculate the light position. Directional lights are defined by a vector, usually normalized, which points towards where the light is. We will multiply that direction vector by a configurable factor so it defines a point at a reasonable distance for the scene we want to draw. We will use that direction to calculate the rotation angles for the view matrix.

This is the fragment that calculates the light position and the rotation angles:

float lightAngleX = (float) Math.toDegrees(Math.acos(lightDirection.z));
float lightAngleY = (float) Math.toDegrees(Math.asin(lightDirection.x));
float lightAngleZ = 0;
Matrix4f lightViewMatrix = transformation.updateLightViewMatrix(new Vector3f(lightDirection).mul(light.getShadowPosMult()), new Vector3f(lightAngleX, lightAngleY, lightAngleZ));
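The angle derivation above can be checked with plain math, with no JOML needed (a hypothetical helper, assuming a normalized light direction as input):

```java
public class LightAngles {

    // Derive the view-matrix rotation angles (in degrees) from a
    // normalized light direction, as in the fragment above.
    public static float[] anglesFor(float dx, float dy, float dz) {
        float angleX = (float) Math.toDegrees(Math.acos(dz));
        float angleY = (float) Math.toDegrees(Math.asin(dx));
        return new float[]{angleX, angleY, 0f};
    }

    public static void main(String[] args) {
        // A light shining straight along the z axis needs no rotation
        float[] a = anglesFor(0f, 0f, 1f);
        System.out.printf("X=%f Y=%f%n", a[0], a[1]);
    }
}
```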

Next we need to calculate the orthographic projection matrix.

Matrix4f orthoProjMatrix = transformation.updateOrthoProjectionMatrix(orthCoords.left, orthCoords.right, orthCoords.bottom, orthCoords.top, orthCoords.near, orthCoords.far);

We have modified the Transformation class to include the light view matrix and the orthographic projection matrix. Previously we had an orthographic 2D projection matrix, so we have renamed the previous methods and attributes. You can check the definition in the source code, which is straightforward.

Then we render the scene objects as in the renderScene method, but using the previous matrices to work in the light space coordinate system.


depthShaderProgram.setUniform("orthoProjectionMatrix", orthoProjMatrix);
Map<Mesh, List<GameItem>> mapMeshes = scene.getGameMeshes();
for (Mesh mesh : mapMeshes.keySet()) {
    mesh.renderList(mapMeshes.get(mesh), (GameItem gameItem) -> {
        Matrix4f modelLightViewMatrix = transformation.buildModelViewMatrix(gameItem, lightViewMatrix);
        depthShaderProgram.setUniform("modelLightViewMatrix", modelLightViewMatrix);
    }
    );
}

// Unbind
depthShaderProgram.unbind();
glBindFramebuffer(GL_FRAMEBUFFER, 0);

The parameterization of the orthographic projection matrix is defined in the directional light. Think of the orthographic projection matrix as a bounding box that contains all the objects that we want to render. When projecting, only the objects that fit into that bounding box will be visible. That bounding box is defined by 6 parameters: left, right, bottom, top, near, far. Since the light position is now the origin, these parameters define the distance from that origin to the left or right (x axis), up or down (y axis) and to the nearest or farthest plane (z axis).
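To see why those six parameters behave like a bounding box, here is the standard orthographic projection written out in plain Java (a sketch of the formula JOML applies internally, not the book's code). A point on the box boundary maps to plus or minus 1 in normalized device coordinates:

```java
public class OrthoSketch {

    // Map a point inside the box [l,r] x [b,t] and depth range [-f,-n]
    // (OpenGL looks down the negative z axis) into NDC in [-1, 1].
    public static float[] project(float x, float y, float z,
                                  float l, float r, float b, float t,
                                  float n, float f) {
        float nx = 2f / (r - l) * x - (r + l) / (r - l);
        float ny = 2f / (t - b) * y - (t + b) / (t - b);
        float nz = -2f / (f - n) * z - (f + n) / (f - n);
        return new float[]{nx, ny, nz};
    }

    public static void main(String[] args) {
        // The corner (right, top, -near) of the box lands on (1, 1, -1)
        float[] p = project(10f, 10f, -1f, -10f, 10f, -10f, 10f, 1f, 20f);
        System.out.printf("(%f, %f, %f)%n", p[0], p[1], p[2]);
    }
}
```

Anything outside the box projects outside [-1, 1] and is clipped away, which is exactly why objects that fall outside the configured volume never reach the shadow map.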

One of the trickiest points in getting shadow maps to work is determining the light position and the orthographic projection matrix parameters. This is why all these parameters are now defined in the DirectionalLight class, so they can be set properly according to each scene.

You can implement a more automatic approach by calculating the centre of the camera frustum, moving back along the light direction and building an orthographic projection that contains all the objects in the scene. The following figure shows a 3D scene as seen from above, the camera position and its frustum (in blue), and the optimal light position and bounding box (in red).


The problem with the approach above is that it is difficult to calculate, and if you have small objects and the bounding box is big you may get strange results. The approach presented here is simpler for small scenes, and you can tweak it to match your models (you can even choose to explicitly set the light's position to avoid strange effects if the camera moves far away from the origin). If you want a more generic solution that can be applied to any scene you should extend it to support cascaded shadow maps.

Let's continue. Before we use the depth maps to actually calculate shadows, you could render a quad with the generated texture to see what a real depth map looks like. You could get something like this for a scene composed of a rotating cube floating over a plane with a perpendicular directional light.


As has been said before, the darker the colour, the closer to the light position. What's the effect of the light position on the depth map? You can play with the multiplication factor of the directional light and you will see that the size of the objects rendered into the texture does not decrease. Remember that we are using an orthographic projection matrix and objects do not get smaller with distance. What you will see is that all colours get brighter, as seen in the next picture.

Does that mean that we can choose a high distance for the light position without consequences? The answer is no. If the light is too far away from the objects we want to render, those objects can fall outside the bounding box that defines the orthographic projection matrix. In that case you will get a nice white texture which would be useless for shadow mapping. OK, then shall we simply increase the bounding box size? The answer is again no. If you choose huge dimensions for the orthographic projection matrix your objects will be drawn very small in the texture, and the depth values can even overlap, leading to strange results. OK, so you may think of increasing the texture size, but, again, you are limited here, and textures cannot grow indefinitely to accommodate huge bounding boxes.

So, as you can see, selecting the light position and the orthographic projection parameters is a complex equilibrium, which makes it difficult to get right results with shadow mapping.

Let's go back to the rendering process. Once we have calculated the depth map we can use it while rendering the scene. First we need to modify the scene vertex shader. Up to now, the vertex shader projected the vertex coordinates from model view space to screen space using a perspective matrix. Now we also need to project the vertex coordinates from light space coordinates, using the orthographic projection matrix, so they can be used in the fragment shader to calculate the shadows.

The vertex shader is modified like this.


#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;
out vec3 mvVertexNormal;
out vec3 mvVertexPos;
out vec4 mlightviewVertexPos;
out mat4 outModelViewMatrix;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 modelLightViewMatrix;
uniform mat4 orthoProjectionMatrix;

void main()
{
    vec4 mvPos = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPos;
    outTexCoord = texCoord;
    mvVertexNormal = normalize(modelViewMatrix * vec4(vertexNormal, 0.0)).xyz;
    mvVertexPos = mvPos.xyz;
    mlightviewVertexPos = orthoProjectionMatrix * modelLightViewMatrix * vec4(position, 1.0);
    outModelViewMatrix = modelViewMatrix;
}

We use new uniforms for the light view matrix and the orthographic projection matrix.

In the fragment shader we will create a new function to calculate the shadows, defined like this.

float calcShadow(vec4 position)
{
    float shadowFactor = 1.0;
    vec3 projCoords = position.xyz;
    // Transform from screen coordinates to texture coordinates
    projCoords = projCoords * 0.5 + 0.5;
    if ( projCoords.z < texture(shadowMap, projCoords.xy).r )
    {
        // Current fragment is not in shade
        shadowFactor = 0;
    }

    return 1 - shadowFactor;
}


The function receives the position in light view space, projected using the orthographic projection matrix. It returns 0 if the position is in shadow and 1 if it is not. First, the coordinates are transformed to texture coordinates. Screen coordinates are in the range [-1, 1], but texture coordinates are in the range [0, 1]. With those coordinates we get the depth value from the texture and compare it with the z value of the fragment coordinates. If the z value of the fragment is lower than the one stored in the texture, it means that the fragment is not in shade.
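The coordinate remapping and the depth comparison can be checked in isolation with a plain Java sketch (illustrative helpers, not the shader itself; the second method mirrors the value calcShadow finally returns: 1 when lit, 0 when in shadow):

```java
public class ShadowLookup {

    // Map a coordinate from NDC [-1, 1] to texture space [0, 1].
    public static float toTexCoord(float ndc) {
        return ndc * 0.5f + 0.5f;
    }

    // Returns 1 when the fragment is lit, 0 when a closer occluder
    // was recorded in the depth map.
    public static float litFactor(float fragmentDepth, float storedDepth) {
        return fragmentDepth < storedDepth ? 1f : 0f;
    }

    public static void main(String[] args) {
        System.out.println(toTexCoord(-1f)); // left/bottom edge -> 0.0
        System.out.println(toTexCoord(1f));  // right/top edge -> 1.0
        System.out.println(litFactor(0.4f, 0.6f)); // closer than occluder -> lit
    }
}
```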

In the fragment shader, the return value from the calcShadow function is used to modulate the light colour contributions from point, spot and directional lights. The ambient light is not affected by the shadow.

float shadow = calcShadow(mlightviewVertexPos);
fragColor = clamp(ambientC * vec4(ambientLight, 1) + diffuseSpecularComp * shadow, 0, 1);

In the renderScene method of the Renderer class we just need to pass the uniforms for the orthographic projection and light view matrices (we also need to modify the method that initializes the shader to create the new uniforms). You can consult this in the book's source code.

If you run the DummyGame class, which has been modified to set up a floating cube over a plane with a directional light whose angle can be changed by using the up and down keys, you should see something like this.

Although shadows are working (you can check that by moving the light direction), the implementation presents some problems. First of all, there are strange lines on the objects that are lit up. This effect is called shadow acne, and it is produced by the limited resolution of the texture that stores the depth map. The second problem is that the borders of the shadow are not smooth and look blocky. The cause is the same again: the texture resolution. We will solve these problems in order to improve shadow quality.

Shadow Mapping improvements

Now that we have the shadow mapping mechanism working, let's solve the problems it has. Let's first start with the shadow acne problem. The depth map texture is limited in size, and because of that several fragments can be mapped to the same pixel of that depth texture. The depth texture stores the minimum depth, so in the end we have several fragments that share the same depth in that texture although they are at different distances.

We can solve this by loosening the depth comparison in the fragment shader a little bit: we add a bias.

float bias = 0.05;
if ( projCoords.z - bias < texture(shadowMap, projCoords.xy).r )
{
    // Current fragment is not in shade
    shadowFactor = 0;
}

Now the shadow acne has disappeared.
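A quick way to convince yourself of why the bias helps is to simulate the precision loss in plain Java (the quantization step stands in for the limited depth map resolution; the numbers are made up for illustration):

```java
public class AcneSketch {

    // Simulate storing a depth in a low-precision shadow map
    public static float quantize(float depth, float step) {
        return Math.round(depth / step) * step;
    }

    // The shader comparison, with an optional bias
    public static boolean lit(float fragDepth, float stored, float bias) {
        return fragDepth - bias < stored;
    }

    public static void main(String[] args) {
        float actual = 0.5003f;
        float stored = quantize(actual, 0.001f);
        // Without bias the fragment shadows itself (acne) ...
        System.out.println(lit(actual, stored, 0f));
        // ... with a small bias it is correctly lit
        System.out.println(lit(actual, stored, 0.05f));
    }
}
```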


Now we are going to solve the shadow edges problem, which is also caused by the texture resolution. For each fragment we are going to sample the depth texture with the fragment's position value and its surrounding values. Then we will calculate the average and assign that value as the shadow value. In this case the value won't be 0 or 1, but can take values in between, in order to get smoother edges.

The surrounding values must be at one pixel distance from the current fragment position in texture coordinates. So we need to calculate the increment of one pixel in texture coordinates, which is equal to 1 / textureSize.

In the fragment shader we just need to modify the shadow factor calculation to get an average value.

float shadowFactor = 0.0;
vec2 inc = 1.0 / textureSize(shadowMap, 0);
for(int row = -1; row <= 1; ++row)
{
    for(int col = -1; col <= 1; ++col)
    {
        float textDepth = texture(shadowMap, projCoords.xy + vec2(row, col) * inc).r;
        shadowFactor += projCoords.z - bias > textDepth ? 1.0 : 0.0;
    }
}
shadowFactor /= 9.0;
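The same 3x3 averaging can be reproduced in plain Java on a small depth grid (illustrative data; the shader's textureSize and texture lookups become plain array accesses):

```java
public class PcfSketch {

    // Average the shadow test over the 3x3 neighbourhood of (x, y),
    // exactly as the fragment shader loop does.
    public static float shadowFactor(float[][] depthMap, int x, int y,
                                     float fragDepth, float bias) {
        float factor = 0f;
        for (int row = -1; row <= 1; ++row) {
            for (int col = -1; col <= 1; ++col) {
                float stored = depthMap[y + row][x + col];
                factor += (fragDepth - bias > stored) ? 1f : 0f;
            }
        }
        return factor / 9f;
    }

    public static void main(String[] args) {
        // A shadow edge: the left column of the neighbourhood is occluded
        float[][] map = {
            {0.2f, 0.9f, 0.9f},
            {0.2f, 0.9f, 0.9f},
            {0.2f, 0.9f, 0.9f},
        };
        // Three of nine samples fail the depth test, so the factor
        // lands between 0 and 1, giving the smooth edge
        System.out.println(shadowFactor(map, 1, 1, 0.5f, 0.05f));
    }
}
```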

The result now looks smoother.


Now our sample looks much better. Nevertheless, the shadow mapping technique presented here can still be improved a lot. You can read about solving the peter panning effect (caused by the bias factor) and other techniques to improve the shadow edges. In any case, with the concepts explained here you have a good basis for modifying the sample.

In order to render multiple lights you just need to render a separate depth map for each light source. While rendering the scene you will need to sample all those depth maps to calculate the appropriate shadow factor.


Animations

Introduction

By now we have only loaded static 3D models; in this chapter we will learn how to animate them. When thinking about animations, the first approach that comes to mind is to create different meshes for each model position, load them up into the GPU and draw them sequentially to create the illusion of animation. Although this approach is perfect for some games, it is not very efficient (in terms of memory consumption).

This is where skeletal animation comes into play. In skeletal animation the way a model animates is defined by its underlying skeleton. A skeleton is defined by a hierarchy of special points called joints. Those joints are defined by their position and rotation. We have also said that it is a hierarchy; this means that the final position of each joint is affected by its parents. For instance, think of a wrist: the position of a wrist is modified if a character moves the elbow and also if it moves the shoulder.
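The "a joint is affected by all its parents" idea can be sketched with translations only (rotations are left out for brevity; the names and data are illustrative):

```java
public class JointHierarchy {

    // parent[i] is the index of joint i's parent, or -1 for the root;
    // local[i] is the joint position relative to that parent.
    public static float[] worldPosition(int joint, int[] parent, float[][] local) {
        float x = 0, y = 0, z = 0;
        // Walk up the hierarchy, accumulating each ancestor's offset
        for (int j = joint; j != -1; j = parent[j]) {
            x += local[j][0];
            y += local[j][1];
            z += local[j][2];
        }
        return new float[]{x, y, z};
    }

    public static void main(String[] args) {
        // shoulder(0) -> elbow(1) -> wrist(2)
        int[] parent = {-1, 0, 1};
        float[][] local = {{0, 2, 0}, {1, 0, 0}, {1, 0, 0}};
        float[] wrist = worldPosition(2, parent, local);
        // Moving the shoulder or elbow offsets would move the wrist too
        System.out.printf("(%f, %f, %f)%n", wrist[0], wrist[1], wrist[2]);
    }
}
```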

Joints do not need to represent a physical bone or articulation; they are artifacts that allow the creatives to model an animation. In addition to joints we still have vertices, the points that define the triangles that compose a 3D model. But in skeletal animation, vertices are drawn based on the positions of the joints they are related to.

In this chapter we will use the MD5 format to load animated models. The MD5 format was created by id Software, the creators of Doom, and it is basically a text based file format which is well understood. Another approach would be to use the Collada format, which is a public standard supported by many tools. Collada is an XML based format, but as a downside it is very complex (the specification for the 1.5 version has more than 500 pages). So we will stick to a much simpler format, MD5, that will allow us to focus on the concepts of skeletal animation and to create a working sample.

You can also export some models from Blender to MD5 format via specific addons that you can find on the Internet (http://www.katsbits.com/tools/#md5).

For this chapter I have consulted many different sources, but I have found two that provide a very good explanation about how to create an animated model using MD5 files. These sources can be consulted at:

http://www.3dgep.com/gpu-skinning-of-md5-models-in-opengl-and-cg/
http://ogldev.atspace.co.uk/www/tutorial38/tutorial38.html


So let's start by writing the code that parses MD5 files. The MD5 format defines two types of files:

- The mesh definition file, which defines the joints and the vertices and textures that compose the set of meshes that form the 3D model. This file usually has the extension ".md5mesh".
- The animation definition file, which defines the animations that can be applied to the model. This file usually has the extension ".md5anim".

An MD5 file is composed of a header and different sections contained between braces. Let's start examining the mesh definition file. In the resources folder you will find several models in MD5 format. If you open one of them you can see a structure similar to this.

The first structure that you can find in the mesh definition file is the header. You can see below the header's contents from one of the samples provided:


MD5Version 10
commandline ""

numJoints 33
numMeshes 6

The header defines the following attributes:

- The version of the MD5 specification that it complies with.
- The command used to generate this file (from a 3D modelling tool).
- The number of joints that are defined in the joints section.
- The number of meshes (the number of mesh sections expected).

The joints section defines, as its name states, the joints, with their positions and their relationships. A fragment of the joints section of one of the sample models is shown below.

joints {
    "origin" -1 ( -0.000000 0.016430 -0.006044 ) ( 0.707107 0.000000 0.707107 )  //
    "sheath" 0 ( 11.004813 -3.177138 31.702473 ) ( 0.307041 -0.578614 0.354181 )  // origin
    "sword"  1 ( 9.809593 -9.361549 40.753730 ) ( 0.305557 -0.578155 0.353505 )  // sheath
    "pubis"  0 ( 0.014076 2.064442 26.144581 ) ( -0.466932 -0.531013 -0.466932 )  // origin
    ...
}

A joint is defined by the following attributes:

- Joint name, a textual attribute between quotes.
- Joint parent, an index which points to the parent joint using its position in the joints list. The root joint has a parent equal to -1.
- Joint position, defined in the model space coordinate system.
- Joint orientation, also defined in the model space coordinate system. The orientation is in fact a quaternion whose w component is not included.

Before continuing with the rest of the file, let's talk about quaternions. Quaternions are four component elements that are used to represent rotation. Up to now we have been using Euler angles (yaw, pitch and roll) to define rotations, which basically define rotation around the x, y and z axes. But Euler angles present some problems when working with rotations: specifically, you must be aware of the correct order in which to apply the rotations, and some operations can get very complex.


This is where quaternions come to help solve this complexity. As has been said before, a quaternion is defined as a set of 4 numbers (x, y, z, w). Quaternions define a rotation axis and the rotation angle around that axis.

You can check the mathematical definition of each of the components on the web, but the good news is that JOML, the maths library we are using, provides support for them. We can construct rotation matrices based on quaternions and perform transformations on vectors with them.

Let's get back to the joints definition: the w component is missing, but it can be easily calculated with the help of the rest of the values. You can check the source code to see how it is done.
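For a unit quaternion x² + y² + z² + w² = 1, so w can be recovered from the other three components. A sketch of that computation (the MD5 convention takes the negative square root; the book's actual implementation lives in MD5Utils.calculateQuaternion):

```java
public class QuaternionW {

    // Recover the missing w component of a unit quaternion from x, y, z.
    public static float calculateW(float x, float y, float z) {
        float t = 1.0f - (x * x) - (y * y) - (z * z);
        // Guard against small negative values caused by rounding
        return t < 0.0f ? 0.0f : (float) -Math.sqrt(t);
    }

    public static void main(String[] args) {
        // The "origin" joint orientation from the sample file:
        // x and z are both 0.707107, so w comes out as (almost) zero
        float w = calculateW(0.707107f, 0.000000f, 0.707107f);
        System.out.println(w);
    }
}
```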

After the joints definition you can find the definitions of the different meshes that compose a model. Below you can find a fragment of a mesh definition from one of the samples.

mesh {
    shader "/textures/bob/guard1_body.png"

    numverts 494
    vert 0 ( 0.394531 0.513672 ) 0 1
    vert 1 ( 0.447266 0.449219 ) 1 2
    ...
    vert 493 ( 0.683594 0.455078 ) 864 3

    numtris 628
    tri 0 0 2 1
    tri 1 0 1 3
    ...
    tri 627 471 479 493

    numweights 867
    weight 0 5 1.000000 ( 6.175774 8.105262 -0.023020 )
    weight 1 5 0.500000 ( 4.880173 12.805251 4.196980 )
    ...
    weight 866 6 0.333333 ( 1.266308 -0.302701 8.949338 )
}


Let's review the structure presented above:

A Mesh starts by defining a texture file. Keep in mind that the path you will find here is the one used by the tool that created the model. That path may not match the one that is used to load those files. You have two approaches here: either you change the base path dynamically or you change that path by hand. I've chosen the latter, the simpler one. Next you can find the vertices definition. A vertex is defined by the following attributes:

- Vertex index.
- Texture coordinates.
- The index of the first weight definition that affects this vertex.
- The number of weights to consider.

After the vertices, the triangles that form this mesh are defined. The triangles define the way the vertices are organized using their indices. Finally, the weights are defined. A Weight definition is composed of:

- A Weight index.
- A Joint index, which points to the joint related to this weight.
- A bias factor, which is used to modulate the effect of this weight.
- A position of this weight.

The following picture depicts the relation between the components described above using sample data.

Ok, so now that we understand the mesh model file we can parse it. If you look at the source code you will see that a new package has been created to host parsers for model formats. There's one for OBJ files under org.lwjglb.engine.loaders.obj and the code for MD5 files is under org.lwjglb.engine.loaders.md5.

All the parsing code is based on regular expressions to extract the information from the MD5 text files. The parsers will create a hierarchy of objects that mimic the structure of the information components contained in the MD5 files. It may not be the most efficient parser in the world, but I think it will serve to better understand the process.

The starting class to parse a MD5 model file is the MD5Model class. This class receives as a parameter in its parse method the contents of a MD5 file and creates a hierarchy that contains the header, the list of joints and the list of meshes with all the sub-elements. The code is very straightforward, so I won't include it here.

A few comments about the parsing code:

- The sub-elements of a Mesh are defined as inner classes inside the MD5Mesh class.
- You can check how the fourth component of the joints orientation is calculated in the calculateQuaternion method from the MD5Utils class.

Now that we have parsed a file we must transform that object hierarchy into something that can be processed by the game engine; we must create a GameItem instance. In order to do that we will create a new class named MD5Loader that will take a MD5Model instance and will construct a GameItem.

Before we start, as you may have noticed, a MD5 model has several meshes, but our GameItem class only supports a single Mesh. We need to change this first; the GameItem class now looks like this.

package org.lwjglb.engine.items;

import org.joml.Vector3f;
import org.lwjglb.engine.graph.Mesh;

public class GameItem {

    private Mesh[] meshes;

    private final Vector3f position;

    private float scale;

    private final Vector3f rotation;

    public GameItem() {
        position = new Vector3f(0, 0, 0);
        scale = 1;
        rotation = new Vector3f(0, 0, 0);
    }

    public GameItem(Mesh mesh) {
        this();
        this.meshes = new Mesh[]{mesh};
    }

    public GameItem(Mesh[] meshes) {
        this();
        this.meshes = meshes;
    }

    public Vector3f getPosition() {
        return position;
    }

    public void setPosition(float x, float y, float z) {
        this.position.x = x;
        this.position.y = y;
        this.position.z = z;
    }

    public float getScale() {
        return scale;
    }

    public void setScale(float scale) {
        this.scale = scale;
    }

    public Vector3f getRotation() {
        return rotation;
    }

    public void setRotation(float x, float y, float z) {
        this.rotation.x = x;
        this.rotation.y = y;
        this.rotation.z = z;
    }

    public Mesh getMesh() {
        return meshes[0];
    }

    public Mesh[] getMeshes() {
        return meshes;
    }

    public void setMeshes(Mesh[] meshes) {
        this.meshes = meshes;
    }

    public void setMesh(Mesh mesh) {
        if (this.meshes != null) {
            for (Mesh currMesh : meshes) {
                currMesh.cleanUp();
            }
        }
        this.meshes = new Mesh[]{mesh};
    }
}

With the modification above we can now define the contents of the MD5Loader class. This class will have a method named process that will receive a MD5Model instance and a default colour (for the meshes that do not define a texture) and will return a GameItem instance. The body of that method is shown below.

public static GameItem process(MD5Model md5Model, Vector4f defaultColour) throws Exception {
    List<MD5Mesh> md5MeshList = md5Model.getMeshes();
    List<Mesh> list = new ArrayList<>();
    for (MD5Mesh md5Mesh : md5Model.getMeshes()) {
        Mesh mesh = generateMesh(md5Model, md5Mesh, defaultColour);
        handleTexture(mesh, md5Mesh, defaultColour);
        list.add(mesh);
    }
    Mesh[] meshes = new Mesh[list.size()];
    meshes = list.toArray(meshes);
    GameItem gameItem = new GameItem(meshes);
    return gameItem;
}

As you can see, we just iterate over the meshes defined in the MD5Model class and transform them into instances of the class org.lwjglb.engine.graph.Mesh by using the generateMesh method, which is the one that really does the work. Before we examine that method we will create an inner class that will help us build the positions and normals arrays.


private static class VertexInfo {

    public Vector3f position;

    public Vector3f normal;

    public VertexInfo(Vector3f position) {
        this.position = position;
        normal = new Vector3f(0, 0, 0);
    }

    public VertexInfo() {
        position = new Vector3f();
        normal = new Vector3f();
    }

    public static float[] toPositionsArr(List<VertexInfo> list) {
        int length = list != null ? list.size() * 3 : 0;
        float[] result = new float[length];
        int i = 0;
        for (VertexInfo v : list) {
            result[i] = v.position.x;
            result[i + 1] = v.position.y;
            result[i + 2] = v.position.z;
            i += 3;
        }
        return result;
    }

    public static float[] toNormalArr(List<VertexInfo> list) {
        int length = list != null ? list.size() * 3 : 0;
        float[] result = new float[length];
        int i = 0;
        for (VertexInfo v : list) {
            result[i] = v.normal.x;
            result[i + 1] = v.normal.y;
            result[i + 2] = v.normal.z;
            i += 3;
        }
        return result;
    }
}

Let's get back to the generateMesh method. The first thing we do is get the mesh vertices information, the weights and the structure of the joints.


private static Mesh generateMesh(MD5Model md5Model, MD5Mesh md5Mesh, Vector4f defaultColour) throws Exception {
    List<VertexInfo> vertexInfoList = new ArrayList<>();
    List<Float> textCoords = new ArrayList<>();
    List<Integer> indices = new ArrayList<>();

    List<MD5Mesh.MD5Vertex> vertices = md5Mesh.getVertices();
    List<MD5Mesh.MD5Weight> weights = md5Mesh.getWeights();
    List<MD5JointInfo.MD5JointData> joints = md5Model.getJointInfo().getJoints();

Then we need to calculate the vertices positions based on the information contained in the weights and joints. This is done in the following block:

    for (MD5Mesh.MD5Vertex vertex : vertices) {
        Vector3f vertexPos = new Vector3f();
        Vector2f vertexTextCoords = vertex.getTextCoords();
        textCoords.add(vertexTextCoords.x);
        textCoords.add(vertexTextCoords.y);

        int startWeight = vertex.getStartWeight();
        int numWeights = vertex.getWeightCount();

        for (int i = startWeight; i < startWeight + numWeights; i++) {
            MD5Mesh.MD5Weight weight = weights.get(i);
            MD5JointInfo.MD5JointData joint = joints.get(weight.getJointIndex());
            Vector3f rotatedPos = new Vector3f(weight.getPosition()).rotate(joint.getOrientation());
            Vector3f acumPos = new Vector3f(joint.getPosition()).add(rotatedPos);
            acumPos.mul(weight.getBias());
            vertexPos.add(acumPos);
        }

        vertexInfoList.add(new VertexInfo(vertexPos));
    }

Let's examine what we are doing here. We iterate over the vertices information and store the texture coordinates in a list; no need to apply any transformation here. Then we get the starting weight index and the total number of weights to consider when calculating the vertex position.

The vertex position is calculated by using all the weights it is related to. Each weight has a position and a bias. The sum of all the biases of the weights associated to each vertex must be equal to 1.0. Each weight position is defined in the joint's local space, so we need to transform it to model space coordinates using the orientation and position of the joint it refers to (as if they formed a transformation matrix).

To sum up, the vertex position can be expressed by this formula:


$$V_{pos} = \sum_{i=ws}^{ws+wc} (J_{t_i} \times W_{p_i}) \cdot W_{b_i}$$

Where:

- The summation iterates over the weights, from $ws$ (weight start) up to $ws + wc$ ($wc$ being the weight count).
- $J_{t_i}$ is the joint's transformation matrix associated to the weight $W_i$.
- $W_{p_i}$ is the weight position.
- $W_{b_i}$ is the weight bias.

This equation is what we implement in the body of the loop (we do not have the transformation matrix, since we store the joint position and rotation separately, but the result is the same).

With the code above we are able to construct the positions and texture coordinates data, but we still need to build up the indices and the normals. Indices can be calculated by using the triangles information, just by iterating through the list that holds those triangles.

Normals can also be calculated using the triangles information. Let $V_0$, $V_1$ and $V_2$ be the triangle vertices (in object's model space). The normal for the triangle can be calculated according to this formula:

$$N = (V_2 - V_0) \times (V_1 - V_0)$$

Where $N$ should be normalized afterwards. The following figure shows the geometric interpretation of the formula above.


For each vertex we compute its normal as the normalized sum of all the normals of the triangles it belongs to. The code that performs those calculations is shown below.

    for (MD5Mesh.MD5Triangle tri : md5Mesh.getTriangles()) {
        indices.add(tri.getVertex0());
        indices.add(tri.getVertex1());
        indices.add(tri.getVertex2());

        // Normals
        VertexInfo v0 = vertexInfoList.get(tri.getVertex0());
        VertexInfo v1 = vertexInfoList.get(tri.getVertex1());
        VertexInfo v2 = vertexInfoList.get(tri.getVertex2());
        Vector3f pos0 = v0.position;
        Vector3f pos1 = v1.position;
        Vector3f pos2 = v2.position;

        Vector3f normal = (new Vector3f(pos2).sub(pos0)).cross(new Vector3f(pos1).sub(pos0));

        v0.normal.add(normal);
        v1.normal.add(normal);
        v2.normal.add(normal);
    }

    // Once the contributions have been added, normalize the result
    for (VertexInfo v : vertexInfoList) {
        v.normal.normalize();
    }

Then we just need to transform the Lists to arrays and process the texture information.

    float[] positionsArr = VertexInfo.toPositionsArr(vertexInfoList);
    float[] textCoordsArr = Utils.listToArray(textCoords);
    float[] normalsArr = VertexInfo.toNormalArr(vertexInfoList);
    int[] indicesArr = indices.stream().mapToInt(i -> i).toArray();
    Mesh mesh = new Mesh(positionsArr, textCoordsArr, normalsArr, indicesArr);

    return mesh;
}

Going back to the process method, you can see that there's a method named handleTexture, which is responsible for loading textures. This is the definition of that method:


private static void handleTexture(Mesh mesh, MD5Mesh md5Mesh, Vector4f defaultColour) throws Exception {
    String texturePath = md5Mesh.getTexture();
    if (texturePath != null && texturePath.length() > 0) {
        Texture texture = new Texture(texturePath);
        Material material = new Material(texture);

        // Handle normal Maps
        int pos = texturePath.lastIndexOf(".");
        if (pos > 0) {
            String basePath = texturePath.substring(0, pos);
            String extension = texturePath.substring(pos, texturePath.length());
            String normalMapFileName = basePath + NORMAL_FILE_SUFFIX + extension;
            if (Utils.existsResourceFile(normalMapFileName)) {
                Texture normalMap = new Texture(normalMapFileName);
                material.setNormalMap(normalMap);
            }
        }
        mesh.setMaterial(material);
    } else {
        mesh.setMaterial(new Material(defaultColour, 1));
    }
}

The implementation is very straightforward. The only peculiarity is that if a mesh defines a texture named "texture.png", its normal texture map will be defined in a file named "texture_normal.png". We need to check if that file exists and load it accordingly.

We can now load a MD5 file and render it as we render other GameItems, but before doing that we need to disable face culling in order to render it properly, since not all the triangles will be drawn in the correct direction. We will add support to the Window class to set these parameters at runtime (you can check the changes in the source code).

If you load some of the sample models you will get something like this.


What you see here is the binding pose. It's the static representation of the MD5 model, used by animators to model it more easily. In order to get animation to work we must process the animation definition file.

Animate the model

A MD5 animation definition file, like the model definition one, is composed of a header and different sections contained between braces. If you open one of those files you can see a structure similar to this.


The first structure that you can find in the animation file, as in the case of the mesh definition file, is the header. You can see below the header's contents from one of the samples provided:


MD5Version 10
commandline ""

numFrames 140
numJoints 33
frameRate 24
numAnimatedComponents 198

The header defines the following attributes:

- The version of the MD5 specification that it complies to.
- The command used to generate this file (from a 3D modelling tool).
- The number of frames defined in the file.
- The number of joints defined in the hierarchy section.
- The frame rate (frames per second) that was used while creating this animation. This parameter can be used to calculate the time between frames.
- The number of components that each frame defines.
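For instance, the time between frames can be derived from the frameRate attribute. A tiny helper (not part of the book's code) applied to the sample header above:

```java
public class FrameTiming {

    // Milliseconds that each animation frame should last for a given frame rate
    static float framePeriodMillis(int frameRate) {
        return 1000.0f / frameRate;
    }

    public static void main(String[] args) {
        // For the sample header above: frameRate 24, numFrames 140
        float period = framePeriodMillis(24);
        System.out.println(period);              // ~41.67 ms between frames
        System.out.println(140 * period / 1000); // ~5.83 s for the whole animation
    }
}
```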

The hierarchy section is the one that comes first and defines the joints for this animation. You can see a fragment below:

hierarchy {
    "origin" -1 0 0 //
    "body" 0 63 0 // origin ( Tx Ty Tz Qx Qy Qz )
    "body2" 1 0 0 // body
    "SPINNER" 2 56 6 // body2 ( Qx Qy Qz )
    ....
}

A joint, in the hierarchy section, is defined by the following attributes:

- Joint name, a textual attribute between quotes.
- Joint parent, an index which points to the parent joint using its position in the joints list. The root joint has a parent equal to -1.
- Joint flags, which set how this joint's position and orientation will be changed according to the data defined in each animation frame.
- The start index, inside the animation data of each frame, that is used when applying the flags.

The next section is the bounds one. This section defines a bounding box which contains the model for each animation frame. It contains a line for each of the animation frames and looks like this:


bounds {
    ( -24.3102264404 -44.2608566284 -0.181215778 ) ( 31.0861988068 38.7131576538 117.7417449951 )
    ( -24.3102283478 -44.1887664795 -0.1794649214 ) ( 31.1800289154 38.7173080444 117.7729110718 )
    ( -24.3102359772 -44.1144447327 -0.1794776917 ) ( 31.2042789459 38.7091217041 117.8352737427 )
    ....
}

Each bounding box is defined by two 3-component vectors in model space coordinates. The first vector defines the minimum bound and the second one the maximum.

The next section is the base frame data. In this section, the position and orientation of each joint is set up before the deformations of each animation frame are applied. You can see a fragment below:

baseframe {
    ( 0 0 0 ) ( -0.5 -0.5 -0.5 )
    ( -0.8947336078 70.7142486572 -6.5027675629 ) ( -0.3258574307 -0.0083037354 0.0313780755 )
    ( 0.0000001462 0.0539700091 -0.0137935728 ) ( 0 0 0 )
    ....
}

Each line is associated to a joint and defines the following attributes:

- Position of the joint, as a three-component vector.
- Orientation of the joint, as the three components of a quaternion (as in the model file).

After that you will find several frame definitions, as many as the value assigned to the numFrames header attribute. Each frame section is like a huge array of floats that will be used by the joints when applying the transformations for each frame. You can see a fragment below.

frame 1 {
    -0.9279100895 70.682762146 -6.3709330559 -0.3259022534 -0.0100501738 0.0320306309
    0.3259022534 0.0100501738 -0.0320306309
    -0.1038384438 -0.1639953405 -0.0152553488 0.0299418624
    ....
}

The base class that parses a MD5 animation file is named MD5AnimModel. This class creates all the object hierarchy that maps the contents of that file; you can check the source code for the details. The structure is similar to the MD5 model definition file. Now that we are able to load that information we will use it to generate an animation.

We will generate the animation in the shader, so instead of pre-calculating all the positions for each frame we need to prepare the data so that the vertex shader can compute the final positions. Let's get back to the process method in the MD5Loader class; we need to modify it to take into consideration the animation information. The new definition for that method is shown below:

public static AnimGameItem process(MD5Model md5Model, MD5AnimModel animModel, Vector4f defaultColour) throws Exception {
    List<Matrix4f> invJointMatrices = calcInJointMatrices(md5Model);
    List<AnimatedFrame> animatedFrames = processAnimationFrames(md5Model, animModel, invJointMatrices);

    List<Mesh> list = new ArrayList<>();
    for (MD5Mesh md5Mesh : md5Model.getMeshes()) {
        Mesh mesh = generateMesh(md5Model, md5Mesh);
        handleTexture(mesh, md5Mesh, defaultColour);
        list.add(mesh);
    }
    Mesh[] meshes = new Mesh[list.size()];
    meshes = list.toArray(meshes);

    AnimGameItem result = new AnimGameItem(meshes, animatedFrames, invJointMatrices);
    return result;
}

There are some changes here. The most obvious is that the method now receives a MD5AnimModel instance. The next one is that we do not return a GameItem instance but an AnimGameItem one. This class inherits from the GameItem class but adds support for animations. We will see why this was done this way later.

If we continue with the process method, the first thing we do is call the calcInJointMatrices method, which is defined like this:


private static List<Matrix4f> calcInJointMatrices(MD5Model md5Model) {
    List<Matrix4f> result = new ArrayList<>();
    List<MD5JointInfo.MD5JointData> joints = md5Model.getJointInfo().getJoints();
    for (MD5JointInfo.MD5JointData joint : joints) {
        Matrix4f translateMat = new Matrix4f().translate(joint.getPosition());
        Matrix4f rotationMat = new Matrix4f().rotate(joint.getOrientation());
        Matrix4f mat = translateMat.mul(rotationMat);
        mat.invert();
        result.add(mat);
    }
    return result;
}

This method iterates over the joints contained in the MD5 model definition file, calculates the transformation matrix associated to each joint and then gets the inverse of those matrices. This information is used to construct the AnimGameItem instance.

Let's continue with the process method. The next thing we do is process the animation frames by calling the processAnimationFrames method:

private static List<AnimatedFrame> processAnimationFrames(MD5Model md5Model, MD5AnimModel animModel, List<Matrix4f> invJointMatrices) {
    List<AnimatedFrame> animatedFrames = new ArrayList<>();
    List<MD5Frame> frames = animModel.getFrames();
    for (MD5Frame frame : frames) {
        AnimatedFrame data = processAnimationFrame(md5Model, animModel, frame, invJointMatrices);
        animatedFrames.add(data);
    }
    return animatedFrames;
}

This method processes each animation frame, defined in the MD5 animation definition file, and returns a list of AnimatedFrame instances. The real work is done in the processAnimationFrame method. Let's explain what this method will do.

First, we iterate over the joints defined in the hierarchy section of the MD5 animation file.


private static AnimatedFrame processAnimationFrame(MD5Model md5Model, MD5AnimModel animModel, MD5Frame frame, List<Matrix4f> invJointMatrices) {
    AnimatedFrame result = new AnimatedFrame();

    MD5BaseFrame baseFrame = animModel.getBaseFrame();
    List<MD5Hierarchy.MD5HierarchyData> hierarchyList = animModel.getHierarchy().getHierarchyDataList();

    List<MD5JointInfo.MD5JointData> joints = md5Model.getJointInfo().getJoints();
    int numJoints = joints.size();
    float[] frameData = frame.getFrameData();
    for (int i = 0; i < numJoints; i++) {
        MD5JointInfo.MD5JointData joint = joints.get(i);

We get the position and orientation of the base frame element associated to each joint.

        MD5BaseFrame.MD5BaseFrameData baseFrameData = baseFrame.getFrameDataList().get(i);
        Vector3f position = baseFrameData.getPosition();
        Quaternionf orientation = baseFrameData.getOrientation();

In principle, that information should be assigned to the joint's position and orientation, but it needs to be transformed according to the joint's flags. If you recall, when the structure of the animation file was presented, each joint in the hierarchy section defined a flags attribute. That attribute models how the position and orientation information should be changed according to the information defined in each animation frame.

If the first bit of that flags field is equal to 1, we should change the x component of the base frame position with the data contained in the animation frame we are processing. The animation frame defines a big float array, so which elements should we take? The answer is also in the joints definition, which includes a startIndex attribute. If the second bit of the flags is equal to 1, we should change the y component of the base frame position with the value at startIndex + 1, and so on. The next bits are for the z position, and the x, y and z components of the orientation.


        int flags = hierarchyList.get(i).getFlags();
        int startIndex = hierarchyList.get(i).getStartIndex();
        if ((flags & 1) > 0) {
            position.x = frameData[startIndex++];
        }
        if ((flags & 2) > 0) {
            position.y = frameData[startIndex++];
        }
        if ((flags & 4) > 0) {
            position.z = frameData[startIndex++];
        }
        if ((flags & 8) > 0) {
            orientation.x = frameData[startIndex++];
        }
        if ((flags & 16) > 0) {
            orientation.y = frameData[startIndex++];
        }
        if ((flags & 32) > 0) {
            orientation.z = frameData[startIndex++];
        }
        // Update Quaternion's w component
        orientation = MD5Utils.calculateQuaternion(orientation.x, orientation.y, orientation.z);

Now we have all the information needed to calculate the transformation matrices that give the final position of each joint in the current animation frame. But there's another thing that we must consider: the position of each joint is relative to its parent's position, so we need to get the transformation matrix associated to each parent and use it in order to get a transformation matrix that is in model space coordinates.


        // Calculate translation and rotation matrices for this joint
        Matrix4f translateMat = new Matrix4f().translate(position);
        Matrix4f rotationMat = new Matrix4f().rotate(orientation);
        Matrix4f jointMat = translateMat.mul(rotationMat);

        // Joint position is relative to joint's parent index position. Use parent matrices
        // to transform it to model space
        if (joint.getParentIndex() > -1) {
            Matrix4f parentMatrix = result.getLocalJointMatrices()[joint.getParentIndex()];
            jointMat = new Matrix4f(parentMatrix).mul(jointMat);
        }

        result.setMatrix(i, jointMat, invJointMatrices.get(i));
    }

    return result;
}

You can see that we create an instance of the AnimatedFrame class that holds the information that will be used during the animation. This class also uses the inverse matrices; we will see later on why this is done this way. An important thing to note is that the setMatrix method of the AnimatedFrame class is defined like this.

public void setMatrix(int pos, Matrix4f localJointMatrix, Matrix4f invJointMatrix) {
    localJointMatrices[pos] = localJointMatrix;
    Matrix4f mat = new Matrix4f(localJointMatrix);
    mat.mul(invJointMatrix);
    jointMatrices[pos] = mat;
}

The variable localJointMatrix stores the transformation matrix for the joint that occupies position "i" for the current frame. The invJointMatrix holds the inverse transformation matrix for the joint that occupies position "i" for the binding pose. We store the result of multiplying the localJointMatrix by the invJointMatrix. This result will be used later to compute the final positions. We also store the original joint transformation matrix, the variable localJointMatrix, so we can use it to calculate the transformation matrices of this joint's children.
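The effect of multiplying the current joint matrix by the inverse bind matrix can be seen with a deliberately simplified 1D analogue, where each "joint transform" is just a translation (purely illustrative, not the engine's code): undoing the bind transform first puts the vertex into joint-local space, and the current frame's transform then places it where the joint is now.

```java
public class InverseBindDemo {

    // 1D "joint transform": just a translation by t
    static double apply(double t, double x) {
        return x + t;
    }

    // The inverse of a translation by t is a translation by -t
    static double applyInverse(double t, double x) {
        return x - t;
    }

    public static void main(String[] args) {
        double tBind = 5.0;   // joint position in the binding pose
        double tFrame = 7.5;  // joint position in the current frame

        // A vertex modelled 2 units away from the joint, stored in bind-pose coordinates
        double vertexBindPose = apply(tBind, 2.0); // 7.0

        // J_frame * J_bind^-1 : undo the bind pose, then apply the frame pose
        double local = applyInverse(tBind, vertexBindPose); // back to 2.0, joint-local
        double vertexFrame = apply(tFrame, local);          // 9.5 in model space

        System.out.println(vertexFrame); // prints 9.5
    }
}
```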

Let's get back to the MD5Loader class. The generateMesh method has also changed. We calculate the positions of the binding pose as explained before, but for each vertex we store two arrays:

- An array that holds the weight biases associated to this vertex.
- An array that holds the joint indices associated to this vertex (through the weights).


We limit the size of those arrays to a value of 4. The Mesh class has also been modified to receive those parameters and include them in the VAO information processed by the shaders. You can check the details in the source code. So let's recap what we have done:

- We are still loading the binding pose, with its final positions calculated as the sum of the joint positions and orientations through the weights information.
- That information is loaded in the shaders as VBOs, but it's complemented by the biases of the weights associated to each vertex and the indices of the joints that affect it. This information is common to all the animation frames, since it's defined in the MD5 definition file. This is the reason why we limit the size of the bias and joint indices arrays: they will be loaded as VBOs once, when the model is sent to the GPU.
- For each animation frame we store the transformation matrices to be applied to each joint according to the positions and orientations defined in the base frame.
- We also have calculated the inverse matrices of the transformation matrices associated to the joints that define the binding pose. That is, we know how to undo the transformations done in the binding pose. We will see how this is applied later.

Now that we have all the pieces to solve the puzzle, we just need to use them in the shader. We first need to modify the input data to receive the weights and the joint indices.

#version 330

const int MAX_WEIGHTS = 4;
const int MAX_JOINTS = 150;

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;
layout (location=3) in vec4 jointWeights;
layout (location=4) in ivec4 jointIndices;

We have defined two constants:

- MAX_WEIGHTS defines the maximum number of weights that come in the weights VBO (and also the joint indices).
- MAX_JOINTS defines the maximum number of joints we are going to support (more on this later).

Then we define the output data and the uniforms.

out vec2 outTexCoord;
out vec3 mvVertexNormal;
out vec3 mvVertexPos;
out vec4 mlightviewVertexPos;
out mat4 outModelViewMatrix;

uniform mat4 jointsMatrix[MAX_JOINTS];
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 modelLightViewMatrix;
uniform mat4 orthoProjectionMatrix;

You can see that we have a new uniform named jointsMatrix, which is an array of matrices (with a maximum length set by the MAX_JOINTS constant). That array of matrices holds the joint matrices calculated for all the joints in the present frame, and was calculated in the MD5Loader class when processing a frame. Thus, that array holds the transformations that need to be applied to a joint in the present animation frame and will serve as the basis for calculating the vertex final position.

With the new data in the VBOs and this uniform we will transform the binding pose position. This is done in the following block.

    vec4 initPos = vec4(0, 0, 0, 0);
    int count = 0;
    for(int i = 0; i < MAX_WEIGHTS; i++)
    {
        float weight = jointWeights[i];
        if(weight > 0) {
            count++;
            int jointIndex = jointIndices[i];
            vec4 tmpPos = jointsMatrix[jointIndex] * vec4(position, 1.0);
            initPos += weight * tmpPos;
        }
    }
    if (count == 0)
    {
        initPos = vec4(position, 1.0);
    }


First of all, we get the binding pose position. We iterate over the weights associated to this vertex and modify the position using the weights and the joint matrices for this frame (stored in the jointsMatrix uniform), by using the index that is stored in the input.

So, given a vertex position, we are calculating its frame position as:

$$V_{fp} = \sum_{i=0}^{MAX\_WEIGHTS} W_{b_i} (J_{fp_i} \times J_{t_i}^{-1}) \times V_{pos}$$

Where:

- $V_{fp}$ is the vertex final position.
- $W_{b_i}$ is the vertex weight bias.
- $J_{fp_i}$ is the joint transformation matrix for this frame.
- $J_{t_i}^{-1}$ is the inverse of the joint transformation matrix for the binding pose. The multiplication of this matrix and $J_{fp_i}$ is what's contained in the jointsMatrix uniform.
- $V_{pos}$ is the vertex position in the binding pose.

$V_{pos}$ was calculated by using the $J_t$ matrix, the joint transformation matrix for the binding pose. So, at the end, we are somehow undoing the modifications of the binding pose in order to apply the transformations for this frame. This is the reason why we need the inverse binding pose matrix.

The shader supports vertices with a variable number of weights, up to a maximum of 4, and also supports the rendering of non-animated items. In this case, the weights will all be equal to 0 and we will get the original position.
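The reasoning above can be made explicit by splitting the skinning of one weight into two steps, using the same symbols as the formula:

```latex
% Step 1: the inverse bind matrix moves the bind-pose vertex into joint-local space
V_{local} = J_t^{-1} \times V_{pos}
% Step 2: the current frame's joint matrix moves it back to model space
V_{frame} = J_{fp} \times V_{local} = (J_{fp} \times J_t^{-1}) \times V_{pos}
```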


The rest of the shader stays more or less the same; we just use the updated position and pass the correct values to be used by the fragment shader.

    vec4 mvPos = modelViewMatrix * initPos;
    gl_Position = projectionMatrix * mvPos;
    outTexCoord = texCoord;
    mvVertexNormal = normalize(modelViewMatrix * vec4(vertexNormal, 0.0)).xyz;
    mvVertexPos = mvPos.xyz;
    mlightviewVertexPos = orthoProjectionMatrix * modelLightViewMatrix * vec4(position, 1.0);
    outModelViewMatrix = modelViewMatrix;
}

So, in order to test the animation, we just need to pass the jointsMatrix to the shader. Since this information is stored only in instances of the AnimGameItem class, the code is very simple. In the loop that renders the Meshes, we add this fragment.

if (gameItem instanceof AnimGameItem) {
    AnimGameItem animGameItem = (AnimGameItem) gameItem;
    AnimatedFrame frame = animGameItem.getCurrentFrame();
    sceneShaderProgram.setUniform("jointsMatrix", frame.getJointMatrices());
}

Of course, you will need to create the uniform before using it; you can check the source code for that. If you run the example you will be able to see how the model animates by pressing the space bar (each time the key is pressed a new frame is set and the jointsMatrix uniform changes).

You will see something like this.


Although the animation is smooth, the sample presents some problems. First of all, the light is not correctly applied and the shadow represents the binding pose, not the current frame. We will solve all these problems now.

Correcting animation issues

The first issue that we will address is the lighting problem. You may have already noticed it: it's due to the fact that we are not transforming the normals. Thus, the normals that are used in the fragment shader are the ones that correspond to the binding pose. We need to transform them in the same way as the positions.

This issue is easy to solve: we just need to include the normals in the loop that iterates over the weights in the vertex shader.


    vec4 initPos = vec4(0, 0, 0, 0);
    vec4 initNormal = vec4(0, 0, 0, 0);
    int count = 0;
    for(int i = 0; i < MAX_WEIGHTS; i++)
    {
        float weight = jointWeights[i];
        if(weight > 0) {
            count++;
            int jointIndex = jointIndices[i];
            vec4 tmpPos = jointsMatrix[jointIndex] * vec4(position, 1.0);
            initPos += weight * tmpPos;

            vec4 tmpNormal = jointsMatrix[jointIndex] * vec4(vertexNormal, 0.0);
            initNormal += weight * tmpNormal;
        }
    }
    if (count == 0)
    {
        initPos = vec4(position, 1.0);
        initNormal = vec4(vertexNormal, 0.0);
    }

Then we just calculate the output vertex normal as always:

    mvVertexNormal = normalize(modelViewMatrix * initNormal).xyz;

The next issue is the shadow problem. If you recall from the shadows chapter, we are using shadow maps to draw shadows. We render the scene from the light's perspective in order to create a depth map that tells us if a point is in shadow or not. But, as in the case of the normals, we are just passing the binding pose coordinates and not changing them according to the current frame. This is the reason why the shadow does not correspond to the current position.

The solution is easy: we just need to modify the depth vertex shader to use the jointsMatrix and the weights and joint indices to transform the position. This is how the depth vertex shader looks:


#version 330

const int MAX_WEIGHTS = 4;
const int MAX_JOINTS = 150;

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;
layout (location=3) in vec4 jointWeights;
layout (location=4) in ivec4 jointIndices;

uniform mat4 jointsMatrix[MAX_JOINTS];
uniform mat4 modelLightViewMatrix;
uniform mat4 orthoProjectionMatrix;

void main()
{
    vec4 initPos = vec4(0, 0, 0, 0);
    int count = 0;
    for(int i = 0; i < MAX_WEIGHTS; i++)
    {
        float weight = jointWeights[i];
        if(weight > 0) {
            count++;
            int jointIndex = jointIndices[i];
            vec4 tmpPos = jointsMatrix[jointIndex] * vec4(position, 1.0);
            initPos += weight * tmpPos;
        }
    }
    if (count == 0)
    {
        initPos = vec4(position, 1.0);
    }
    gl_Position = orthoProjectionMatrix * modelLightViewMatrix * initPos;
}

You need to modify the Renderer class to set up the new uniforms for this shader, and the final result will be much better. The light will be applied correctly and the shadow will change for each animation frame, as shown in the next figure.


And that's all; you now have a working example that animates MD5 models. The source code can still be improved, and you can modify the matrices that are loaded in each render cycle to interpolate between frame positions. You can check the sources used for this chapter to see how this can be done.


Particles

The basics

In this chapter we will add particle effects to the game engine. With this effect we will be able to simulate rays, fire, dust and clouds. It's a simple effect to implement that will improve the graphical aspect of any game.

Before we start, it's worth mentioning that there are many ways to implement particle effects, with different results. In this case we will use billboard particles. This technique uses moving textured quads to represent a particle, with the peculiarity that they are always facing the observer, in our case, the camera. You can also use the billboarding technique to show information panels over game items, like mini HUDs.

Let's start by defining what a particle is. A particle can be defined by the following attributes:

1. A mesh that represents the quad vertices.
2. A texture.
3. A position at a given instant.
4. A scale factor.
5. Speed.
6. A movement direction.
7. A lifetime or time to live. Once this time has expired the particle ceases to exist.

The first four items are part of the GameItem class, but the last three are not. Thus, we will create a new class named Particle that extends the GameItem class and is defined like this.

package org.lwjglb.engine.graph.particles;

import org.joml.Vector3f;
import org.lwjglb.engine.graph.Mesh;
import org.lwjglb.engine.items.GameItem;

public class Particle extends GameItem {

    private Vector3f speed;

    /**
     * Time to live for particle in milliseconds.
     */
    private long ttl;

    public Particle(Mesh mesh, Vector3f speed, long ttl) {
        super(mesh);
        this.speed = new Vector3f(speed);
        this.ttl = ttl;
    }

    public Particle(Particle baseParticle) {
        super(baseParticle.getMesh());
        Vector3f aux = baseParticle.getPosition();
        setPosition(aux.x, aux.y, aux.z);
        aux = baseParticle.getRotation();
        setRotation(aux.x, aux.y, aux.z);
        setScale(baseParticle.getScale());
        this.speed = new Vector3f(baseParticle.speed);
        this.ttl = baseParticle.geTtl();
    }

    public Vector3f getSpeed() {
        return speed;
    }

    public void setSpeed(Vector3f speed) {
        this.speed = speed;
    }

    public long geTtl() {
        return ttl;
    }

    public void setTtl(long ttl) {
        this.ttl = ttl;
    }

    /**
     * Updates the Particle's TTL
     * @param elapsedTime Elapsed Time in milliseconds
     * @return The Particle's TTL
     */
    public long updateTtl(long elapsedTime) {
        this.ttl -= elapsedTime;
        return this.ttl;
    }
}

As you can see from the code above, a particle's speed and movement direction can be expressed as a single vector: the direction of that vector models the movement direction and its magnitude the speed. The particle's Time To Live (TTL) is modelled as a milliseconds counter that is decreased whenever the game state is updated. The class also has a copy constructor, that is, a constructor that takes another Particle instance to make a copy.
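The update rule described above can be sketched outside the engine with plain Java. The SimpleParticle class below is a hypothetical stand-in for Particle (no LWJGL or JOML involved); it only illustrates the TTL countdown and the speed-vector integration:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ParticleUpdateSketch {

    // Hypothetical stand-in for the Particle class, for illustration only
    static class SimpleParticle {
        float x, y, z;          // position
        final float sx, sy, sz; // speed vector: direction plus magnitude (units per second)
        long ttl;               // time to live in milliseconds

        SimpleParticle(float sx, float sy, float sz, long ttl) {
            this.sx = sx; this.sy = sy; this.sz = sz;
            this.ttl = ttl;
        }
    }

    // One game-loop step: decrease TTL, drop expired particles, integrate position
    static void update(List<SimpleParticle> particles, long elapsedMillis) {
        Iterator<SimpleParticle> it = particles.iterator();
        while (it.hasNext()) {
            SimpleParticle p = it.next();
            p.ttl -= elapsedMillis;
            if (p.ttl < 0) {
                it.remove();
            } else {
                float delta = elapsedMillis / 1000.0f;
                p.x += p.sx * delta;
                p.y += p.sy * delta;
                p.z += p.sz * delta;
            }
        }
    }

    public static void main(String[] args) {
        List<SimpleParticle> particles = new ArrayList<>();
        particles.add(new SimpleParticle(0, 1, 0, 1000));
        update(particles, 500);  // half a second: the particle moves 0.5 units up
        System.out.println(particles.get(0).y);  // 0.5
        update(particles, 600);  // TTL expires, the particle is removed
        System.out.println(particles.size());    // 0
    }
}
```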


Now we need to create a particle generator or particle emitter, that is, a class that generates particles dynamically, controls their life cycle and updates their positions according to a specific model. We can create many implementations that vary in how particles are created and how their positions are updated (for instance, taking gravity into consideration or not). So, in order to keep our game engine generic, we will create an interface that all particle emitters must implement. This interface, named IParticleEmitter, is defined like this:

package org.lwjglb.engine.graph.particles;

import java.util.List;

import org.lwjglb.engine.items.GameItem;

public interface IParticleEmitter {

    void cleanup();

    Particle getBaseParticle();

    List<GameItem> getParticles();
}

The IParticleEmitter interface has a method to clean up resources, named cleanup, and a method to get the list of particles, named getParticles. It also has a method named getBaseParticle. What is this method for? A particle emitter will create many particles dynamically; whenever a particle expires, new ones will be created. That particle renewal cycle uses a base particle, like a pattern, to create new instances. This is what the base particle is used for, and it is also the reason why the Particle class defines a copy constructor.

In the game engine code we will refer only to the IParticleEmitter interface, so the base code will not depend on specific implementations. Nevertheless, we can create an implementation that simulates a flow of particles that are not affected by gravity. This implementation can be used to simulate rays or fire and is named FlowParticleEmitter.

The behaviour of this class can be tuned with the following attributes:

- A maximum number of particles that can be alive at a time.
- A minimum period to create particles. Particles will be created one by one with a minimum period between them, to avoid creating particles in bursts.
- A set of ranges to randomize particle speed and starting position. New particles will use the base particle's position and speed, which can be randomized with values between those ranges to spread the beam.

The implementation of this class is as follows:


package org.lwjglb.engine.graph.particles;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.joml.Vector3f;

import org.lwjglb.engine.items.GameItem;

public class FlowParticleEmitter implements IParticleEmitter {

    private int maxParticles;

    private boolean active;

    private final List<GameItem> particles;

    private final Particle baseParticle;

    private long creationPeriodMillis;

    private long lastCreationTime;

    private float speedRndRange;

    private float positionRndRange;

    private float scaleRndRange;

    public FlowParticleEmitter(Particle baseParticle, int maxParticles, long creationPeriodMillis) {
        particles = new ArrayList<>();
        this.baseParticle = baseParticle;
        this.maxParticles = maxParticles;
        this.active = false;
        this.lastCreationTime = 0;
        this.creationPeriodMillis = creationPeriodMillis;
    }

    @Override
    public Particle getBaseParticle() {
        return baseParticle;
    }

    public long getCreationPeriodMillis() {
        return creationPeriodMillis;
    }

    public int getMaxParticles() {
        return maxParticles;
    }

    @Override
    public List<GameItem> getParticles() {
        return particles;
    }

    public float getPositionRndRange() {
        return positionRndRange;
    }

    public float getScaleRndRange() {
        return scaleRndRange;
    }

    public float getSpeedRndRange() {
        return speedRndRange;
    }

    public void setCreationPeriodMillis(long creationPeriodMillis) {
        this.creationPeriodMillis = creationPeriodMillis;
    }

    public void setMaxParticles(int maxParticles) {
        this.maxParticles = maxParticles;
    }

    public void setPositionRndRange(float positionRndRange) {
        this.positionRndRange = positionRndRange;
    }

    public void setScaleRndRange(float scaleRndRange) {
        this.scaleRndRange = scaleRndRange;
    }

    public boolean isActive() {
        return active;
    }

    public void setActive(boolean active) {
        this.active = active;
    }

    public void setSpeedRndRange(float speedRndRange) {
        this.speedRndRange = speedRndRange;
    }

    public void update(long elapsedTime) {
        long now = System.currentTimeMillis();
        if (lastCreationTime == 0) {
            lastCreationTime = now;
        }
        Iterator<? extends GameItem> it = particles.iterator();
        while (it.hasNext()) {
            Particle particle = (Particle) it.next();
            if (particle.updateTtl(elapsedTime) < 0) {
                it.remove();
            } else {
                updatePosition(particle, elapsedTime);
            }
        }

        int length = this.getParticles().size();
        if (now - lastCreationTime >= this.creationPeriodMillis && length < maxParticles) {
            createParticle();
            this.lastCreationTime = now;
        }
    }

    private void createParticle() {
        Particle particle = new Particle(this.getBaseParticle());
        // Add a little bit of randomness to the particle
        float sign = Math.random() > 0.5d ? -1.0f : 1.0f;
        float speedInc = sign * (float) Math.random() * this.speedRndRange;
        float posInc = sign * (float) Math.random() * this.positionRndRange;
        float scaleInc = sign * (float) Math.random() * this.scaleRndRange;
        particle.getPosition().add(posInc, posInc, posInc);
        particle.getSpeed().add(speedInc, speedInc, speedInc);
        particle.setScale(particle.getScale() + scaleInc);
        particles.add(particle);
    }

    /**
     * Updates a particle position
     * @param particle The particle to update
     * @param elapsedTime Elapsed time in milliseconds
     */
    public void updatePosition(Particle particle, long elapsedTime) {
        Vector3f speed = particle.getSpeed();
        float delta = elapsedTime / 1000.0f;
        float dx = speed.x * delta;
        float dy = speed.y * delta;
        float dz = speed.z * delta;
        Vector3f pos = particle.getPosition();
        particle.setPosition(pos.x + dx, pos.y + dy, pos.z + dz);
    }

    @Override
    public void cleanup() {
        for (GameItem particle : getParticles()) {
            particle.cleanup();
        }
    }
}

Now we can extend the information contained in the Scene class to include an array of IParticleEmitter instances.


package org.lwjglb.engine;

// Imports here

public class Scene {

    // More attributes here

    private IParticleEmitter[] particleEmitters;

At this stage we can start rendering the particles. Particles will not be affected by lights and will not cast any shadow. They will not have any skeletal animation, so it makes sense to have specific shaders to render them. The shaders will be very simple: they will just render the vertices using the projection and model view matrices and use a texture to set the colours.

The vertex shader is defined like this:

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

void main()
{
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    outTexCoord = texCoord;
}

The fragment shader is defined like this:


#version 330

in vec2 outTexCoord;
in vec3 mvPos;

out vec4 fragColor;

uniform sampler2D texture_sampler;

void main()
{
    fragColor = texture(texture_sampler, outTexCoord);
}

As you can see, they are very simple; they resemble the pair of shaders used in the first chapters. Now, as in other chapters, we need to set up and use those shaders in the Renderer class. The shader setup will be done in a method named setupParticlesShader, which is defined like this:

private void setupParticlesShader() throws Exception {
    particlesShaderProgram = new ShaderProgram();
    particlesShaderProgram.createVertexShader(Utils.loadResource("/shaders/particles_vertex.vs"));
    particlesShaderProgram.createFragmentShader(Utils.loadResource("/shaders/particles_fragment.fs"));
    particlesShaderProgram.link();

    particlesShaderProgram.createUniform("projectionMatrix");
    particlesShaderProgram.createUniform("modelViewMatrix");
    particlesShaderProgram.createUniform("texture_sampler");
}

And now we can create the render method, named renderParticles, in the Renderer class, which is defined like this:


private void renderParticles(Window window, Camera camera, Scene scene) {
    particlesShaderProgram.bind();

    particlesShaderProgram.setUniform("texture_sampler", 0);
    Matrix4f projectionMatrix = transformation.getProjectionMatrix();
    particlesShaderProgram.setUniform("projectionMatrix", projectionMatrix);

    Matrix4f viewMatrix = transformation.getViewMatrix();
    IParticleEmitter[] emitters = scene.getParticleEmitters();
    int numEmitters = emitters != null ? emitters.length : 0;
    for (int i = 0; i < numEmitters; i++) {
        IParticleEmitter emitter = emitters[i];
        Mesh mesh = emitter.getBaseParticle().getMesh();

        mesh.renderList((emitter.getParticles()), (GameItem gameItem) -> {
            Matrix4f modelViewMatrix = transformation.buildModelViewMatrix(gameItem, viewMatrix);
            particlesShaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
        });
    }

    particlesShaderProgram.unbind();
}

The fragment above should be self-explanatory if you have managed to get to this point: it just renders each particle, setting up the required uniforms. We have now created all the methods we need to test the implementation of the particle effect. We just need to modify the DummyGame class so we can set up a particle emitter and the characteristics of the base particle.

Vector3f particleSpeed = new Vector3f(0, 1, 0);
particleSpeed.mul(2.5f);
long ttl = 4000;
int maxParticles = 200;
long creationPeriodMillis = 300;
float range = 0.2f;
float scale = 0.5f;
Mesh partMesh = OBJLoader.loadMesh("/models/particle.obj");
Texture texture = new Texture("/textures/particle_tmp.png");
Material partMaterial = new Material(texture, reflectance);
partMesh.setMaterial(partMaterial);
Particle particle = new Particle(partMesh, particleSpeed, ttl);
particle.setScale(scale);
particleEmitter = new FlowParticleEmitter(particle, maxParticles, creationPeriodMillis);
particleEmitter.setActive(true);
particleEmitter.setPositionRndRange(range);
particleEmitter.setSpeedRndRange(range);
this.scene.setParticleEmitters(new FlowParticleEmitter[] {particleEmitter});


We are using a plain filled circle as the particle's texture for now, to better understand what's happening. If you execute the sample you will see something like this.

Why do some particles seem to be cut off? Why doesn't the transparent background solve this? The reason is depth testing. Some fragments of the particles get discarded because they have a depth value higher than the current value of the depth buffer for that zone. We can solve this by ordering the particle draw calls depending on their distance to the camera, or we can simply disable depth writing.
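For reference, the ordering alternative would mean sorting the particles back to front by distance to the camera before issuing their draw calls. A minimal standalone sketch of that idea (plain float[] positions and a hypothetical camera position stand in for the engine types):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class ParticleSortSketch {

    // Squared distance is enough for ordering and avoids the square root
    static float squaredDist(float[] a, float[] b) {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return dx * dx + dy * dy + dz * dz;
    }

    public static void main(String[] args) {
        float[] cameraPos = {0, 0, 0};  // hypothetical camera position
        List<float[]> positions = new ArrayList<>();
        positions.add(new float[]{0, 0, -1});
        positions.add(new float[]{0, 0, -5});
        positions.add(new float[]{0, 0, -3});

        // Farthest particles first, so nearer ones blend over them
        positions.sort(Comparator.comparingDouble((float[] p) -> squaredDist(p, cameraPos)).reversed());

        for (float[] p : positions) {
            System.out.println(p[2]);
        }
    }
}
```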

Before we draw the particles we just need to insert this line:

glDepthMask(false);

And when we are done with rendering we restore the previous value:

glDepthMask(true);

Then we will get something like this.


Ok, problem solved. Nevertheless, we still want another effect to be applied: we want the colours to be blended, so they add up to create better effects. This is achieved by adding this line before rendering, to set up additive blending:

glBlendFunc(GL_SRC_ALPHA,GL_ONE);

As in the depth case, after we have rendered all the particles we restore the blending function:

glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);

Now we get something like this.


But we have not finished yet. If you move the camera over the blue square looking down, you may get something like this.

The particles do not look very good; they should look round, but they resemble a sheet of paper. This is the point where we should apply the billboard technique. The quad that is used to render the particle should always face the camera, totally perpendicular to it, as if there were no rotation at all. The camera matrix applies translation and rotation to every object in the scene, and we want to skip the rotation.

Warning: maths ahead; you can skip this part if you don't feel comfortable with it. Let's review the view matrix once again. That matrix can be represented like this (without any scale applied to it):

$$V = \begin{bmatrix} r_{00} & r_{10} & r_{20} & d_x \\ r_{01} & r_{11} & r_{21} & d_y \\ r_{02} & r_{12} & r_{22} & d_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

The $r$ elements represent the camera rotation while the $d$ elements represent the translation. We need to cancel the effect of the upper left 3×3 block contained in the view matrix so it gets to something like this:

$$\begin{bmatrix} 1 & 0 & 0 & d_x \\ 0 & 1 & 0 & d_y \\ 0 & 0 & 1 & d_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

So, we have a 3×3 matrix, the upper left rotation block; let's name it $M_r$, and we want to transform it into the identity matrix $I$. Any matrix multiplied by its inverse gives the identity matrix: $M_r \times M_r^{-1} = I$. So we just need to get the upper left 3×3 matrix from the view matrix and multiply it by its inverse. But we can even optimize this: a rotation matrix has an interesting characteristic, its inverse coincides with its transpose. That is, $M_r \times M_r^{-1} = M_r \times M_r^{T} = I$, and a transpose is much easier to calculate than an inverse. Transposing a matrix just swaps its rows and columns:

$$\begin{bmatrix} r_{00} & r_{10} & r_{20} \\ r_{01} & r_{11} & r_{21} \\ r_{02} & r_{12} & r_{22} \end{bmatrix}^{T} = \begin{bmatrix} r_{00} & r_{01} & r_{02} \\ r_{10} & r_{11} & r_{12} \\ r_{20} & r_{21} & r_{22} \end{bmatrix}$$

Ok, let's summarize. We have the transformation $V \times M$, where $V$ is the view matrix and $M$ is the model matrix. Written out, that expression is:

$$\begin{bmatrix} v_{00} & v_{10} & v_{20} & v_{30} \\ v_{01} & v_{11} & v_{21} & v_{31} \\ v_{02} & v_{12} & v_{22} & v_{32} \\ v_{03} & v_{13} & v_{23} & v_{33} \end{bmatrix} \times \begin{bmatrix} m_{00} & m_{10} & m_{20} & m_{30} \\ m_{01} & m_{11} & m_{21} & m_{31} \\ m_{02} & m_{12} & m_{22} & m_{32} \\ m_{03} & m_{13} & m_{23} & m_{33} \end{bmatrix}$$

We want to cancel the rotation of the view matrix, to get something like this:

$$\begin{bmatrix} 1 & 0 & 0 & mv_{30} \\ 0 & 1 & 0 & mv_{31} \\ 0 & 0 & 1 & mv_{32} \\ mv_{03} & mv_{13} & mv_{23} & mv_{33} \end{bmatrix}$$

So we just need to set the upper left 3×3 part of the model matrix to the transpose of the upper left 3×3 part of the view matrix:

$$\begin{bmatrix} v_{00} & v_{10} & v_{20} & v_{30} \\ v_{01} & v_{11} & v_{21} & v_{31} \\ v_{02} & v_{12} & v_{22} & v_{32} \\ v_{03} & v_{13} & v_{23} & v_{33} \end{bmatrix} \times \begin{bmatrix} v_{00} & v_{01} & v_{02} & m_{30} \\ v_{10} & v_{11} & v_{12} & m_{31} \\ v_{20} & v_{21} & v_{22} & m_{32} \\ m_{03} & m_{13} & m_{23} & m_{33} \end{bmatrix}$$

But after doing this we have removed the scaling factor. Indeed, what we really want to achieve is something like this:

$$\begin{bmatrix} s_x & 0 & 0 & mv_{30} \\ 0 & s_y & 0 & mv_{31} \\ 0 & 0 & s_z & mv_{32} \\ mv_{03} & mv_{13} & mv_{23} & mv_{33} \end{bmatrix}$$

where $s_x$, $s_y$ and $s_z$ are the scaling factors. Thus, after setting the upper left 3×3 part of the model matrix to the transpose of the view matrix's rotation, we need to apply scaling again.

And that's all; we just need to change the renderParticles method like this:

for (int i = 0; i < numEmitters; i++) {
    IParticleEmitter emitter = emitters[i];
    Mesh mesh = emitter.getBaseParticle().getMesh();

    mesh.renderList((emitter.getParticles()), (GameItem gameItem) -> {
        Matrix4f modelMatrix = transformation.buildModelMatrix(gameItem);

        viewMatrix.transpose3x3(modelMatrix);

        Matrix4f modelViewMatrix = transformation.buildModelViewMatrix(modelMatrix, viewMatrix);
        modelViewMatrix.scale(gameItem.getScale());

        particlesShaderProgram.setUniform("modelViewMatrix", modelViewMatrix);
    });
}

We have also added another method to the Transformation class to construct a model view matrix using two matrices instead of a GameItem and the view matrix.

With that change, when we look at the particles from above we get something like this.
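The transpose-equals-inverse property of rotation matrices that the billboard trick relies on is easy to check numerically. This sketch builds a rotation about the Y axis with plain arrays (no JOML) and multiplies it by its transpose:

```java
public class RotationTransposeSketch {

    static float[][] multiply(float[][] a, float[][] b) {
        float[][] c = new float[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static float[][] transpose(float[][] m) {
        float[][] t = new float[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                t[i][j] = m[j][i];
        return t;
    }

    public static void main(String[] args) {
        double angle = Math.toRadians(30);
        float c = (float) Math.cos(angle), s = (float) Math.sin(angle);
        // A rotation about the Y axis
        float[][] rot = {{c, 0, s}, {0, 1, 0}, {-s, 0, c}};

        // Multiplying a rotation by its transpose yields the identity (up to float rounding)
        float[][] product = multiply(rot, transpose(rot));
        boolean isIdentity = true;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                isIdentity &= Math.abs(product[i][j] - (i == j ? 1 : 0)) < 1e-6;
        System.out.println(isIdentity);  // true
    }
}
```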


Now we have everything we need to create a more realistic particle effect, so let's change the texture to something more elaborate. We will use this image (it was created with GIMP using the lights and shadows filters).

With this texture, we will get something like this.


Much better! You may notice that we need to adjust the scale: since particles are now always facing the camera, the displayed area is always at its maximum.

One final remark: to get perfect results usable in any scene you will need to implement particle ordering and activate the depth buffer. In any case, you now have a sample showing how to include this effect in your games.

Texture Atlas

Now that we have set up the basic infrastructure for particle effects, we can add some animation to them. In order to achieve that, we are going to support texture atlases. A texture atlas is a large image that contains all the textures that will be used. With a texture atlas we only need to load a single large image, and while drawing the game items we select the portions of that image to be used as our texture. This technique can be applied, for instance, when we want to represent the same model many times with different textures (think about trees or rocks). Instead of having many texture instances and switching between them (remember that switching states is always slow), we can use the same texture atlas and just select the appropriate coordinates.

In this case, we are going to use texture coordinates to animate particles. We will iterate over different textures to model a particle animation. All those textures will be grouped into a texture atlas which looks like this.


The texture atlas can be divided into quad tiles. We will assign a tile position to a particle and will change it over time to represent the animation. So let's get on with it. The first thing we are going to do is modify the Texture class to specify the number of rows and columns that a texture atlas can have.

package org.lwjglb.engine.graph;

// .. Imports here

public class Texture {

    // More attributes here
    private int numRows = 1;

    private int numCols = 1;

    // More code here

    public Texture(String fileName, int numCols, int numRows) throws Exception {
        this(fileName);
        this.numCols = numCols;
        this.numRows = numRows;
    }


The default case is a texture with a number of columns and rows equal to 1, that is, the textures we have been dealing with so far. We also add another constructor to be able to specify the rows and columns.

Then we need to keep track of the position in the texture atlas for a GameItem, so we just add another attribute to that class with a default value equal to 0.

package org.lwjglb.engine.items;

import org.joml.Vector3f;

import org.lwjglb.engine.graph.Mesh;

public class GameItem {

    // More attributes here

    private int textPos;

Then we will modify the Particle class to be able to iterate automatically through a texture atlas.

package org.lwjglb.engine.graph.particles;

import org.joml.Vector3f;

import org.lwjglb.engine.graph.Mesh;
import org.lwjglb.engine.graph.Texture;
import org.lwjglb.engine.items.GameItem;

public class Particle extends GameItem {

    private long updateTextureMillis;

    private long currentAnimTimeMillis;

The updateTextureMillis attribute models the period of time (in milliseconds) after which we move to the next position in the texture atlas. The lower the value, the faster the particle will cycle through the textures. The currentAnimTimeMillis attribute just keeps track of the time the particle has kept the same texture position.

Thus, we need to modify the Particle class constructor to set up those values. We also calculate the number of tiles of the texture atlas, which is modelled by the attribute animFrames.


public Particle(Mesh mesh, Vector3f speed, long ttl, long updateTextureMillis) {
    super(mesh);
    this.speed = new Vector3f(speed);
    this.ttl = ttl;
    this.updateTextureMillis = updateTextureMillis;
    this.currentAnimTimeMillis = 0;
    Texture texture = this.getMesh().getMaterial().getTexture();
    this.animFrames = texture.getNumCols() * texture.getNumRows();
}

Now we just need to modify the method that checks if the particle has expired so that it also checks whether we need to update the texture position.

public long updateTtl(long elapsedTime) {
    this.ttl -= elapsedTime;
    this.currentAnimTimeMillis += elapsedTime;
    if (this.currentAnimTimeMillis >= this.getUpdateTextureMillis() && this.animFrames > 0) {
        this.currentAnimTimeMillis = 0;
        int pos = this.getTextPos();
        pos++;
        if (pos < this.animFrames) {
            this.setTextPos(pos);
        } else {
            this.setTextPos(0);
        }
    }
    return this.ttl;
}
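The frame-advance logic above boils down to a counter that wraps around after the last tile. A standalone sketch of the same rule:

```java
public class AtlasFrameSketch {

    // Advance the texture position, wrapping back to 0 after the last frame,
    // mirroring the texture-position logic of updateTtl
    static int nextFrame(int pos, int animFrames) {
        pos++;
        return pos < animFrames ? pos : 0;
    }

    public static void main(String[] args) {
        int animFrames = 4;  // a 2x2 atlas has four tiles
        int pos = 0;
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 6; i++) {
            pos = nextFrame(pos, animFrames);
            sb.append(pos).append(' ');
        }
        System.out.println(sb.toString().trim());  // 1 2 3 0 1 2
    }
}
```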

Besides that, we have also modified the FlowParticleEmitter class to add some randomness to the period of time after which a particle's texture position changes. You can check it in the source code.

Now we can use that information to set up the appropriate texture coordinates. We will do this in the vertex shader, since it outputs those values to be used in the fragment shader. The new version of that shader is defined like this.


#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;

out vec2 outTexCoord;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

uniform float texXOffset;
uniform float texYOffset;
uniform int numCols;
uniform int numRows;

void main()
{
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);

    // Support for texture atlas, update texture coordinates
    float x = (texCoord.x / numCols + texXOffset);
    float y = (texCoord.y / numRows + texYOffset);

    outTexCoord = vec2(x, y);
}

As you can see, we now have four new uniforms. The uniforms numCols and numRows contain the number of columns and rows of the texture atlas. In order to calculate the texture coordinates, we first must scale down the original coordinates: each tile has a width equal to 1 / numCols and a height equal to 1 / numRows, as shown in the next figure.


Then we just need to apply an offset depending on the row and column; this is what is modelled by the texXOffset and texYOffset uniforms.

We will calculate these offsets in the Renderer class, as shown in the next fragment. We calculate the row and column that each particle is in according to its texture position, and calculate the offset accordingly as a multiple of the tile's width and height.

mesh.renderList((emitter.getParticles()), (GameItem gameItem) -> {
    int col = gameItem.getTextPos() % text.getNumCols();
    int row = gameItem.getTextPos() / text.getNumCols();
    float textXOffset = (float) col / text.getNumCols();
    float textYOffset = (float) row / text.getNumRows();
    particlesShaderProgram.setUniform("texXOffset", textXOffset);
    particlesShaderProgram.setUniform("texYOffset", textYOffset);
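The offset computation can be checked in isolation. For a 4×4 atlas, texture position 5 lands on column 1, row 1, so both offsets are 0.25:

```java
public class AtlasOffsetSketch {
    public static void main(String[] args) {
        int numCols = 4;
        int numRows = 4;
        int textPos = 5;  // the tile index assigned to the particle

        // Same computation as in the Renderer fragment above
        int col = textPos % numCols;
        int row = textPos / numCols;
        float texXOffset = (float) col / numCols;
        float texYOffset = (float) row / numRows;

        System.out.println(col + " " + row);                // 1 1
        System.out.println(texXOffset + " " + texYOffset);  // 0.25 0.25
    }
}
```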

Note that if you only need to support perfectly square texture atlases, you will only need two uniforms. The final result looks like this.


Now we have animated particles working. In the next chapter we will learn how to optimize the rendering process. We are currently rendering multiple elements that share the same mesh and performing a draw call for each of them. In the next chapter we will learn how to do it in a single call. That technique is useful not only for particles but also for rendering scenes where multiple elements share the same model but are placed in different locations or have different textures.


Instanced Rendering

Lots of Instances

When drawing a 3D scene it is frequent to have many models represented by the same mesh but with different transformations. In this case, although they may be simple objects with just a few triangles, performance can suffer. The cause is the way we are rendering them.

We are basically iterating through all the game items inside a loop and performing a call to glDrawElements for each of them. As has been said in previous chapters, calls to the OpenGL library should be minimized. Each call to glDrawElements imposes an overhead that is repeated again and again for each GameItem instance.

When dealing with lots of similar objects it would be more efficient to render all of them using a single call. This technique is called instanced rendering. In order to accomplish that, OpenGL provides a set of functions named glDraw*Instanced to render a set of elements at once. In our case, since we are drawing elements, we will use glDrawElementsInstanced. This function receives the same arguments as glDrawElements plus one additional parameter which sets the number of instances to be drawn.

This is a sample of how glDrawElements is used:

glDrawElements(GL_TRIANGLES, numVertices, GL_UNSIGNED_INT, 0);

And this is how the instanced version can be used:

glDrawElementsInstanced(GL_TRIANGLES, numVertices, GL_UNSIGNED_INT, 0, numInstances);

But you may be wondering how you can set different transformations for each of those instances. Up to now, before drawing each instance we passed the transformations and instance-related data using uniforms; before each render call was made we set up the specific data for each item. How can we do this when rendering all of them at once?

When using instanced rendering, the vertex shader provides a built-in input variable that holds the index of the instance currently being drawn. With that built-in variable we could, for instance, pass an array of uniforms containing the transformations to be applied to each instance and use a single render call.


The problem with this approach is that it still imposes too much overhead. In addition, the number of uniforms we can pass is limited. Thus, we need to employ another approach: instead of using lists of uniforms we will use instanced arrays.

If you recall from the first chapters, the data for each Mesh is defined by a set of data arrays named VBOs. The data stored in those VBOs is unique per Mesh instance.

With standard VBOs, inside a shader we can access the data associated with each vertex (its position, colour, texture, etc.). Whenever the shader is run, the input variables are set to point to the specific data associated with each vertex. With instanced arrays we can set up data that changes per instance instead of per vertex. If we combine both types, we can use regular VBOs to store per-vertex information (position, texture coordinates) and VBOs that contain per-instance data such as model view matrices.

The next figure shows a Mesh composed of three per-vertex VBOs defining the positions, texture coordinates and normals. The first index of each of those elements is the instance that it belongs to; the second index represents the vertex position inside an instance.

The Mesh is also defined by two per-instance VBOs: one for the model view matrix and the other one for the light view matrix. When rendering the vertices of the first instance (the 1X ones), the first model view and light view matrices will be used. When the vertices of the second instance are rendered, the second model view and light view matrices will be used.


Thus, when rendering the first vertex of the first instance, V11, T11 and N11 would be used for position, texture and normal data, and MV1 would be used as the model view matrix. When rendering the second vertex of the same instance, V12, T12 and N12 would be used, and MV1 would still be the model view matrix. MV2 and LV2 would not be used until the second instance is rendered.

In order to define per-instance data we need to call the function glVertexAttribDivisor after defining the vertex attributes. This function receives two parameters:

- index: The index of the vertex attribute (as issued in the glVertexAttribPointer function).
- divisor: If this value contains zero, the data is changed for each vertex while rendering. If it is set to one, the data changes once per instance. If it's set to two, it changes every two instances, and so on.

So, in order to set per-instance data we need to perform this call after every attribute definition:

glVertexAttribDivisor(index, 1);

Let's start changing our code base to support instanced rendering. The first step is to create a new class named InstancedMesh that inherits from the Mesh class. The constructor of this class will be similar to the Mesh one but with an extra parameter: the number of instances.

In the constructor, besides relying on the superclass constructor, we will create two new VBOs, one for the model view matrix and another one for the light view matrix. The code for creating the model view matrix VBO is presented below.


modelViewVBO = glGenBuffers();
vboIdList.add(modelViewVBO);
this.modelViewBuffer = MemoryUtil.memAllocFloat(numInstances * MATRIX_SIZE_FLOATS);
glBindBuffer(GL_ARRAY_BUFFER, modelViewVBO);
int start = 5;
for (int i = 0; i < 4; i++) {
    glVertexAttribPointer(start, 4, GL_FLOAT, false, MATRIX_SIZE_BYTES, i * VECTOR4F_SIZE_BYTES);
    glVertexAttribDivisor(start, 1);
    start++;
}

The first thing we do is create a new VBO and a new FloatBuffer to store the data in it. The size of that buffer is measured in floats, so it will be equal to the number of instances multiplied by the size in floats of a 4×4 matrix, which is 16.

Once the VBO has been bound we start defining the attributes for it. You can see that this is done in a for loop that iterates four times; each turn of the loop defines one vector of the matrix. Why not simply define a single attribute for the whole matrix? The reason is that a vertex attribute cannot contain more than four floats, so we need to split the matrix definition into four pieces. Let's refresh the parameters of glVertexAttribPointer:

- index: The index of the element to be defined.
- size: The number of components for this attribute. In this case it's 4 floats, which is the maximum accepted value.
- type: The type of data (floats in our case).
- normalized: Whether fixed-point data should be normalized or not.
- stride: This is important to understand here; it sets the byte offset between consecutive attributes. In this case, we need to set it to the whole size of a matrix in bytes. This acts like a mark that packs the data so it can be changed between vertices or instances.
- pointer: The offset that this attribute definition applies to. In our case, we need to split the matrix definition into four calls; each vector of the matrix increments the offset.

After defining each vertex attribute, we need to call glVertexAttribDivisor using the same index.
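The sizes involved in those four glVertexAttribPointer calls can be worked out with simple arithmetic (the constant names mirror the ones used in the source code):

```java
public class Mat4AttributeLayoutSketch {
    public static void main(String[] args) {
        final int FLOAT_SIZE_BYTES = 4;
        final int VECTOR4F_SIZE_BYTES = 4 * FLOAT_SIZE_BYTES;   // one column (vec4) of the matrix
        final int MATRIX_SIZE_BYTES = 4 * VECTOR4F_SIZE_BYTES;  // the stride: a whole 4x4 matrix

        System.out.println(MATRIX_SIZE_BYTES);  // 64

        // Byte offsets of the four vec4 attributes that make up one mat4
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 4; i++) {
            sb.append(i * VECTOR4F_SIZE_BYTES).append(' ');
        }
        System.out.println(sb.toString().trim());  // 0 16 32 48
    }
}
```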

The definition of the light view matrix VBO is similar to the previous one; you can check it in the source code. Continuing with the InstancedMesh class definition, it's important to override the methods that enable the vertex attributes before rendering (and the one that disables them after).


@Override
protected void initRender() {
    super.initRender();
    int start = 5;
    int numElements = 4 * 2;
    for (int i = 0; i < numElements; i++) {
        glEnableVertexAttribArray(start + i);
    }
}

@Override
protected void endRender() {
    int start = 5;
    int numElements = 4 * 2;
    for (int i = 0; i < numElements; i++) {
        glDisableVertexAttribArray(start + i);
    }
    super.endRender();
}

The InstancedMesh class defines a public method named renderListInstanced that renders a list of game items. This method splits the list into chunks of size equal to the number of instances used to create the InstancedMesh. The real rendering method, named renderChunkInstanced, is defined like this:


private void renderChunkInstanced(List<GameItem> gameItems, boolean depthMap, Transformation transformation, Matrix4f viewMatrix, Matrix4f lightViewMatrix) {
    this.modelViewBuffer.clear();
    this.modelLightViewBuffer.clear();

    int i = 0;
    for (GameItem gameItem : gameItems) {
        Matrix4f modelMatrix = transformation.buildModelMatrix(gameItem);
        if (!depthMap) {
            Matrix4f modelViewMatrix = transformation.buildModelViewMatrix(modelMatrix, viewMatrix);
            modelViewMatrix.get(MATRIX_SIZE_FLOATS * i, modelViewBuffer);
        }
        Matrix4f modelLightViewMatrix = transformation.buildModelLightViewMatrix(modelMatrix, lightViewMatrix);
        modelLightViewMatrix.get(MATRIX_SIZE_FLOATS * i, this.modelLightViewBuffer);
        i++;
    }

    glBindBuffer(GL_ARRAY_BUFFER, modelViewVBO);
    glBufferData(GL_ARRAY_BUFFER, modelViewBuffer, GL_DYNAMIC_DRAW);

    glBindBuffer(GL_ARRAY_BUFFER, modelLightViewVBO);
    glBufferData(GL_ARRAY_BUFFER, modelLightViewBuffer, GL_DYNAMIC_DRAW);

    glDrawElementsInstanced(GL_TRIANGLES, getVertexCount(), GL_UNSIGNED_INT, 0, gameItems.size());

    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

The method is quite simple: we basically iterate over the game items and calculate the model view and light view matrices. These matrices are dumped into their respective buffers, the contents of those buffers are sent to the GPU, and finally we render all of them with a single call to glDrawElementsInstanced.

Going back to the shaders, we need to modify the vertex shader to support instanced rendering. We will first add new input parameters for the model view and light view matrices that will be passed when using instanced rendering.

layout (location=5) in mat4 modelViewInstancedMatrix;
layout (location=9) in mat4 modelLightViewInstancedMatrix;

As you can see, the model view matrix starts at location 5. Since a matrix is defined by a set of four attributes (each one containing a vector), the light view matrix starts at location 9. Since we want to use a single shader for both non-instanced and instanced rendering, we will keep the uniforms for the model view and light view matrices; we only need to change their names.


uniform int isInstanced;
uniform mat4 modelViewNonInstancedMatrix;
uniform mat4 modelLightViewNonInstancedMatrix;

We have created another uniform to specify whether we are using instanced rendering or not. If we are using instanced rendering, the code is very simple: we just use the matrices from the input parameters.

void main()
{
    vec4 initPos = vec4(0, 0, 0, 0);
    vec4 initNormal = vec4(0, 0, 0, 0);
    mat4 modelViewMatrix;
    mat4 lightViewMatrix;
    if (isInstanced > 0)
    {
        modelViewMatrix = modelViewInstancedMatrix;
        lightViewMatrix = modelLightViewInstancedMatrix;
        initPos = vec4(position, 1.0);
        initNormal = vec4(vertexNormal, 0.0);
    }

We don't support animations for instanced rendering, to simplify the example, but this technique could perfectly be used for them. Finally, the shader just sets up the appropriate output values as usual.

    vec4 mvPos = modelViewMatrix * initPos;
    gl_Position = projectionMatrix * mvPos;
    outTexCoord = texCoord;
    mvVertexNormal = normalize(modelViewMatrix * initNormal).xyz;
    mvVertexPos = mvPos.xyz;
    mlightviewVertexPos = orthoProjectionMatrix * lightViewMatrix * initPos;
    outModelViewMatrix = modelViewMatrix;
}

Of course, the Renderer has been modified to support the uniform changes and to separate the rendering of non-instanced meshes from the instanced ones. You can check the changes in the source code.

In addition, some optimizations have been added to the source code by the JOML author, Kai Burjack. These optimizations have been applied to the Transformation class and are summarized in the following list:

- Removed redundant calls to set up matrices with identity values.
- Use quaternions for rotations, which are more efficient.
- Use specific methods for rotating and translating matrices, which are optimized for those operations.

Particles revisited

With the support for instanced rendering we can also improve the performance of particle rendering. Particles are the best use case for this.

In order to apply instanced rendering to particles, we must provide support for texture atlases. This can be achieved by adding a new VBO with texture offsets for instanced rendering. The texture offsets can be modelled by a single vector of two floats, so there's no need to split the definition as in the matrices case.

// Texture offsets
glVertexAttribPointer(start, 2, GL_FLOAT, false, INSTANCE_SIZE_BYTES, strideStart);
glVertexAttribDivisor(start, 1);


But instead of adding a new VBO, we will set all the instance attributes inside a single VBO. The next figure shows the concept: we are packing up all the per-instance attributes inside a single VBO, with values that change per instance.

In order to use a single VBO we need to modify the stride for all the attributes inside an instance. As you can see from the code above, the definition of the texture offsets uses a constant named INSTANCE_SIZE_BYTES. This constant is equal to the size in bytes of two matrices (one for the model view and the other one for the light view) plus two floats (the texture offsets), which in total is 136. The stride also needs to be modified accordingly.
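The value of that constant follows directly from the sizes of the packed attributes:

```java
public class InstanceSizeSketch {
    public static void main(String[] args) {
        final int FLOAT_SIZE_BYTES = 4;
        final int MATRIX_SIZE_BYTES = 4 * 4 * FLOAT_SIZE_BYTES;  // a 4x4 float matrix: 64 bytes

        // Two matrices (model view and light view) plus two floats for the texture offsets
        int instanceSizeBytes = 2 * MATRIX_SIZE_BYTES + 2 * FLOAT_SIZE_BYTES;
        System.out.println(instanceSizeBytes);  // 136
    }
}
```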

You can check the modifications in the source code.

The Renderer class also needs to be modified to use instanced rendering for particles and to support texture atlases in scene rendering. In this case, there's no sense in supporting both types of rendering (non-instanced and instanced), so the modifications are simpler.

The vertex shader for particles is also straightforward:

#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec2 texCoord;
layout (location=2) in vec3 vertexNormal;
layout (location=5) in mat4 modelViewMatrix;
layout (location=13) in vec2 texOffset;

out vec2 outTexCoord;

uniform mat4 projectionMatrix;
uniform int numCols;
uniform int numRows;

void main()
{
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);

    // Support for texture atlas, update texture coordinates
    float x = (texCoord.x / numCols + texOffset.x);
    float y = (texCoord.y / numRows + texOffset.y);
    outTexCoord = vec2(x, y);
}
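On the CPU side, the texture offsets fed into that shader can be computed from the cell index of the atlas (a minimal sketch; atlasOffsets is a hypothetical helper, not a method from the book's sources):

```java
public class AtlasDemo {

    // Offsets into a texture atlas laid out in numCols x numRows cells,
    // matching the shader: finalCoord = texCoord / numCells + offset
    static float[] atlasOffsets(int frame, int numCols, int numRows) {
        int col = frame % numCols;
        int row = frame / numCols;
        return new float[] { (float) col / numCols, (float) row / numRows };
    }

    public static void main(String[] args) {
        float[] off = atlasOffsets(5, 4, 4); // second row, second column of a 4x4 atlas
        System.out.println(off[0] + ", " + off[1]); // 0.25, 0.25
    }
}
```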

The results of these changes look exactly the same as when rendering non-instanced particles, but the performance is much higher. An FPS counter has been added to the window title, as an option. You can play with instanced and non-instanced rendering to see the improvement by yourself.


Extra bonus

With all the infrastructure that we have right now, I've modified the rendering cubes code to use a height map as a base, also using a texture atlas to use different textures. It also combines particle rendering. It looks like this.


Please keep in mind that there's still much room for optimization, but the aim of the book is to guide you in learning LWJGL and OpenGL concepts and techniques. The goal is not to create a full blown game engine (and definitely not a voxel engine, which requires a different approach and more optimizations).


Audio

Until this moment we have been dealing with graphics, but another key aspect of every game is audio. This capability is going to be addressed in this chapter with the help of OpenAL (Open Audio Library). OpenAL is the OpenGL counterpart for audio; it allows us to play sounds through an abstraction layer. That layer isolates us from the underlying complexities of the audio subsystem. Besides that, it allows us to “render” sounds in a 3D scene, where sounds can be set up at specific locations, attenuated with the distance and modified according to their velocity (simulating the Doppler effect).

LWJGL supports OpenAL without requiring any additional download; it's just ready to use. But before we start coding we need to present the main elements involved when dealing with OpenAL, which are:

Buffers.
Sources.
Listener.

Buffers store audio data, that is, music or sound effects. They are similar to textures in the OpenGL domain. OpenAL expects audio data to be in PCM (Pulse Code Modulation) format (either in mono or in stereo), so we cannot just dump MP3 or OGG files without converting them first to PCM.
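To make the PCM requirement concrete, here is a minimal sketch that synthesizes one second of mono 16-bit PCM (a sine tone) into a ShortBuffer, the same kind of data that alBufferData consumes (the tone parameters are arbitrary; this is not code from the book's sources):

```java
import java.nio.ShortBuffer;

public class PcmToneDemo {

    // Generate 'seconds' of a mono 16-bit PCM sine tone at the given frequency
    static ShortBuffer sineTone(int sampleRate, double freqHz, double seconds) {
        int samples = (int) (sampleRate * seconds);
        ShortBuffer pcm = ShortBuffer.allocate(samples);
        for (int i = 0; i < samples; i++) {
            double t = (double) i / sampleRate;
            pcm.put((short) (Math.sin(2 * Math.PI * freqHz * t) * Short.MAX_VALUE));
        }
        pcm.flip();
        return pcm;
    }

    public static void main(String[] args) {
        ShortBuffer pcm = sineTone(44100, 440.0, 1.0);
        System.out.println(pcm.remaining()); // 44100 samples of raw PCM
    }
}
```

Decoding an OGG file with STBVorbis, as done later in this chapter, produces exactly this kind of ShortBuffer.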

The next element are sources, which represent a location in 3D space (a point) that emits sound. A source is associated with a buffer (only one at a time) and can be defined by the following attributes:

A position, the location of the source (x, y and z coordinates). By the way, OpenAL uses a right handed Cartesian coordinate system like OpenGL, so you can assume (to simplify things) that your world coordinates are equivalent to the ones in the sound space coordinate system.
A velocity, which specifies how fast the source is moving. This is used to simulate the Doppler effect.
A gain, which is used to modify the intensity of the sound (it's like an amplifier factor).

A source has additional attributes which will be shown later when describing the source code.

And last, but not least, a listener, which is where the generated sounds are supposed to be heard. The listener represents where the microphone is set in a 3D audio scene to receive the sounds. There is only one listener. Thus, it's often said that audio rendering is done from the listener's perspective. A listener shares some of the attributes of a source, but it has some additional ones, such as the orientation. The orientation represents where the listener is facing.

So an audio 3D scene is composed of a set of sound sources which emit sound and a listener that receives them. The final perceived sound will depend on the distance of the listener to the different sources, their relative speed and the selected propagation models. Sources can share buffers and play the same data. The following figure depicts a sample 3D scene with the different element types involved.

So, let's start coding. We will create a new package under the name org.lwjglb.engine.sound that will host all the classes responsible for handling audio. We will first start with a class, named SoundBuffer, that will represent an OpenAL buffer. A fragment of the definition of that class is shown below.


package org.lwjglb.engine.sound;

// ... Some imports here

public class SoundBuffer {

    private final int bufferId;

    public SoundBuffer(String file) throws Exception {
        this.bufferId = alGenBuffers();
        try (STBVorbisInfo info = STBVorbisInfo.malloc()) {
            ShortBuffer pcm = readVorbis(file, 32 * 1024, info);

            // Copy to buffer
            alBufferData(bufferId, info.channels() == 1 ? AL_FORMAT_MONO16 : AL_FORMAT_STEREO16, pcm, info.sample_rate());
        }
    }

    public int getBufferId() {
        return this.bufferId;
    }

    public void cleanup() {
        alDeleteBuffers(this.bufferId);
    }

    // ....
}

The constructor of the class expects a sound file (which may be in the classpath, as the rest of the resources) and creates a new buffer from it. The first thing that we do is create an OpenAL buffer with the call to alGenBuffers. At the end our sound buffer will be identified by an integer which is like a pointer to the data it holds. Once the buffer has been created, we dump the audio data into it. The constructor expects a file in OGG format, so we need to transform it to PCM format. You can check how that's done in the source code; anyway, that code has been extracted from the LWJGL OpenAL tests.

Previous versions of LWJGL had a helper class named WaveData which was used to load audio files in WAV format. This class is no longer present in LWJGL 3. Nevertheless, you may get the source code from that class and use it in your games (maybe without requiring any changes).

The SoundBuffer class also provides a cleanup method to free the resources when we are done with it.

Let's continue by modelling an OpenAL sound source, which will be implemented by a class named SoundSource. The class is defined below.


package org.lwjglb.engine.sound;

import org.joml.Vector3f;

import static org.lwjgl.openal.AL10.*;

public class SoundSource {

    private final int sourceId;

    public SoundSource(boolean loop, boolean relative) {
        this.sourceId = alGenSources();
        if (loop) {
            alSourcei(sourceId, AL_LOOPING, AL_TRUE);
        }
        if (relative) {
            alSourcei(sourceId, AL_SOURCE_RELATIVE, AL_TRUE);
        }
    }

    public void setBuffer(int bufferId) {
        stop();
        alSourcei(sourceId, AL_BUFFER, bufferId);
    }

    public void setPosition(Vector3f position) {
        alSource3f(sourceId, AL_POSITION, position.x, position.y, position.z);
    }

    public void setSpeed(Vector3f speed) {
        alSource3f(sourceId, AL_VELOCITY, speed.x, speed.y, speed.z);
    }

    public void setGain(float gain) {
        alSourcef(sourceId, AL_GAIN, gain);
    }

    public void setProperty(int param, float value) {
        alSourcef(sourceId, param, value);
    }

    public void play() {
        alSourcePlay(sourceId);
    }

    public boolean isPlaying() {
        return alGetSourcei(sourceId, AL_SOURCE_STATE) == AL_PLAYING;
    }

    public void pause() {
        alSourcePause(sourceId);
    }

    public void stop() {
        alSourceStop(sourceId);
    }

    public void cleanup() {
        stop();
        alDeleteSources(sourceId);
    }
}

The sound source class provides some methods to set up its position and gain, and control methods for playing, stopping and pausing it. Keep in mind that sound control actions are made over a source (not over the buffer); remember that several sources can share the same buffer. As in the SoundBuffer class, a SoundSource is identified by an identifier, which is used in each operation. This class also provides a cleanup method to free the reserved resources. But let's examine the constructor. The first thing that we do is to create the source with the alGenSources call. Then, we set up some interesting properties using the constructor parameters.

The first parameter, loop, indicates if the sound to be played should be in loop mode or not. By default, when a play action is invoked over a source the playing stops when the audio data is consumed. This is fine for some sounds, but some others, like background music, need to be played over and over again. Instead of manually controlling when the audio has stopped and relaunching the play process, we just simply set the looping property to true: “alSourcei(sourceId, AL_LOOPING, AL_TRUE);”.

The other parameter, relative, controls if the position of the source is relative to the listener or not. In this case, when we set the position for a source, we basically are defining the distance (with a vector) to the listener, not the position in the OpenAL 3D scene, not the world position. This is activated by the “alSourcei(sourceId, AL_SOURCE_RELATIVE, AL_TRUE);” call. But, what can we use this for? This property is interesting, for instance, for background sounds that shouldn't be affected (attenuated) by the distance to the listener. Think, for instance, of background music or sound effects related to player controls. If we set these sources as relative and set their position to (0, 0, 0) they will not be attenuated.

Nowit’sturnforthelistenerwhich,surprise,ismodelledbyaclassnamedSoundListener.Here’sthedefinitionforthatclass.


package org.lwjglb.engine.sound;

import org.joml.Vector3f;

import static org.lwjgl.openal.AL10.*;

public class SoundListener {

    public SoundListener() {
        this(new Vector3f(0, 0, 0));
    }

    public SoundListener(Vector3f position) {
        alListener3f(AL_POSITION, position.x, position.y, position.z);
        alListener3f(AL_VELOCITY, 0, 0, 0);
    }

    public void setSpeed(Vector3f speed) {
        alListener3f(AL_VELOCITY, speed.x, speed.y, speed.z);
    }

    public void setPosition(Vector3f position) {
        alListener3f(AL_POSITION, position.x, position.y, position.z);
    }

    public void setOrientation(Vector3f at, Vector3f up) {
        float[] data = new float[6];
        data[0] = at.x;
        data[1] = at.y;
        data[2] = at.z;
        data[3] = up.x;
        data[4] = up.y;
        data[5] = up.z;
        alListenerfv(AL_ORIENTATION, data);
    }
}

A difference you will notice from the previous classes is that there's no need to create a listener. There will always be one listener, so no need to create one; it's already there for us. Thus, in the constructor we just simply set its initial position. For the same reason there's no need for a cleanup method. The class also has methods for setting the listener position and velocity, as in the SoundSource class, but we have an extra method for changing the listener orientation. Let's review what orientation is all about. Listener orientation is defined by two vectors, the “at” vector and the “up” one, which are shown in the next figure.


The “at” vector basically points where the listener is facing; by default its coordinates are (0, 0, −1). The “up” vector determines which direction is up for the listener and, by default, it points to (0, 1, 0). So the three components of each of those two vectors are what are set in the alListenerfv method call. This method is used to transfer a set of floats (a variable number of floats) to a property, in this case, the orientation.

Before continuing, it's necessary to stress some concepts in relation to source and listener speeds. The relative speed between sources and the listener will cause OpenAL to simulate the Doppler effect. In case you don't know, the Doppler effect is what causes a moving object that is getting closer to you to seem to emit at a higher frequency than when it is moving away. The thing is that, simply by setting a source or listener velocity, OpenAL will not update their positions for you. It will use the relative velocity to calculate the Doppler effect, but the positions won't be modified. So, if you want to simulate a moving source or listener you must take care of updating their positions in the game loop.
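As a rough illustration of the effect OpenAL simulates, here is the textbook Doppler formula for a source moving directly towards a stationary listener (a simplification for intuition only, not OpenAL's exact velocity model; all names are ours):

```java
public class DopplerDemo {

    // Classic Doppler shift for a source approaching a stationary listener.
    // speedOfSound and vSource in the same units (e.g. m/s), frequency in Hz.
    static double observedFrequency(double emittedHz, double speedOfSound, double vSource) {
        return emittedHz * speedOfSound / (speedOfSound - vSource);
    }

    public static void main(String[] args) {
        // A 100 Hz tone from a source approaching at 34 m/s (speed of sound 340 m/s)
        System.out.println(observedFrequency(100.0, 340.0, 34.0)); // ~111.1 Hz
    }
}
```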

Now that we have modelled the key elements, we can set them up to work. We need to initialize the OpenAL library, so we will create a new class named SoundManager that will handle this. Here's a fragment of the definition of this class.


package org.lwjglb.engine.sound;

// Imports here

public class SoundManager {

    private long device;

    private long context;

    private SoundListener listener;

    private final List<SoundBuffer> soundBufferList;

    private final Map<String, SoundSource> soundSourceMap;

    private final Matrix4f cameraMatrix;

    public SoundManager() {
        soundBufferList = new ArrayList<>();
        soundSourceMap = new HashMap<>();
        cameraMatrix = new Matrix4f();
    }

    public void init() throws Exception {
        this.device = alcOpenDevice((ByteBuffer) null);
        if (device == NULL) {
            throw new IllegalStateException("Failed to open the default OpenAL device.");
        }
        ALCCapabilities deviceCaps = ALC.createCapabilities(device);
        this.context = alcCreateContext(device, (IntBuffer) null);
        if (context == NULL) {
            throw new IllegalStateException("Failed to create OpenAL context.");
        }
        alcMakeContextCurrent(context);
        AL.createCapabilities(deviceCaps);
    }

This class holds references to the SoundBuffer and SoundSource instances to track them and later clean them up properly. SoundBuffers are stored in a List but SoundSources are stored in a Map so they can be retrieved by a name. The init method initializes the OpenAL subsystem:

Opens the default device.
Creates the capabilities for that device.
Creates a sound context, like the OpenGL one, and sets it as the current one.


The SoundManager class also has a method to update the listener orientation given a camera position. In our case, the listener will be placed wherever the camera is. So, given the camera position and rotation information, how do we calculate the “at” and “up” vectors? The answer is by using the view matrix associated with the camera. We need to transform the “at” (0, 0, −1) and “up” (0, 1, 0) vectors taking into consideration the camera rotation. Let cameraMatrix be the view matrix associated with the camera. The code to accomplish that would be:

Matrix4f invCam = new Matrix4f(cameraMatrix).invert();

Vector3f at = new Vector3f(0, 0, -1);
invCam.transformDirection(at);

Vector3f up = new Vector3f(0, 1, 0);
invCam.transformDirection(up);

The first thing that we do is invert the camera view matrix. Why do we do this? Think about it this way: the view matrix transforms from world space coordinates to view space. What we want is just the opposite; we want to transform from view space coordinates to world space, which is where the listener is positioned. With matrices, the opposite usually means the inverse. Once we have that matrix, we just transform the “default” “at” and “up” vectors using that matrix to calculate the new directions.
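For the rotation part of a view matrix the inverse is simply the transpose, which is why transforming the default vectors is so cheap. A self-contained sketch of that idea (plain 3x3 arrays instead of JOML types; the matrix used is the rotation of 90 degrees about the Y axis):

```java
public class ListenerOrientationDemo {

    // Multiply a 3x3 matrix by a 3-component vector
    static float[] mul(float[][] m, float[] v) {
        float[] r = new float[3];
        for (int i = 0; i < 3; i++) {
            r[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
        }
        return r;
    }

    // For a pure-rotation view matrix, inverse == transpose; transforming the
    // default "at" vector (0, 0, -1) by it yields the forward direction in world space.
    static float[] cameraForward(float[][] viewRotation) {
        float[][] t = new float[3][3];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                t[i][j] = viewRotation[j][i];
            }
        }
        return mul(t, new float[] { 0, 0, -1 });
    }

    public static void main(String[] args) {
        // Rotation of 90 degrees about the Y axis
        float[][] view = { { 0, 0, 1 }, { 0, 1, 0 }, { -1, 0, 0 } };
        float[] at = cameraForward(view);
        System.out.println(at[0] + ", " + at[1] + ", " + at[2]); // 1.0, 0.0, 0.0
    }
}
```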

But, if you check the source code you will see that the implementation is slightly different. What we do is this:

Vector3f at = new Vector3f();
cameraMatrix.positiveZ(at).negate();
Vector3f up = new Vector3f();
cameraMatrix.positiveY(up);
listener.setOrientation(at, up);

The code above is equivalent to the first approach; it's just more efficient. It uses a faster method, available in the JOML library, that does not need to calculate the full inverse matrix but achieves the same results. This method was provided by the JOML author in an LWJGL forum, so you can check more details there. If you check the source code you will see that the SoundManager class calculates its own copy of the view matrix. This is already done in the Renderer class. In order to keep the code simple, and to avoid refactoring, I've preferred to keep it that way.

Andthat’sall.Wehavealltheinfrastructureweneedinordertoplaysounds.Youcancheckinthesourcecodehowallthepiecesareused.Youcanseehowmusicisplayedandthedifferenteffectssound(ThesefileswereobtainedfromFreesound,propercreditsareina

Audio

293

filenameCREDITS.txt).Ifyougetsomeotherfiles,youmaynoticethatsoundattenuationwithdistanceorlistenerorientationwillnotwork.Pleasecheckthatthefilesareinmono,notinstereo.OpenALcanonlyperformthosecomputationswithmonosounds.

A final note. OpenAL also allows you to change the attenuation model by using the alDistanceModel call and passing the model you want (AL11.AL_EXPONENT_DISTANCE, AL_EXPONENT_DISTANCE_CLAMP, etc.). You can play with them and check the results.
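As a reference for experimenting, the default inverse distance model attenuates gain according to the reference distance and rolloff factor of each source. A minimal sketch of that formula (following the OpenAL specification's inverse distance model; the method and parameter names are ours):

```java
public class AttenuationDemo {

    // OpenAL inverse distance model:
    // gain = refDist / (refDist + rolloff * (distance - refDist))
    static float inverseDistanceGain(float distance, float refDist, float rolloff) {
        return refDist / (refDist + rolloff * (distance - refDist));
    }

    public static void main(String[] args) {
        System.out.println(inverseDistanceGain(1.0f, 1.0f, 1.0f)); // 1.0 at the reference distance
        System.out.println(inverseDistanceGain(3.0f, 1.0f, 1.0f)); // gain drops as distance grows
    }
}
```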


3D Object Picking

Camera Selection

One of the key aspects of every game is the ability to interact with the environment. This capability requires us to be able to select objects in the 3D scene. In this chapter we will explore how this can be achieved.

But, before we start talking about the steps to be performed to select objects, we need a way to represent selected objects. Thus, the first thing that we must do is add another attribute to the GameItem class, which will allow us to tag selected objects:

private boolean selected;

Then, we need to be able to use that value in the scene shaders. Let's start with the fragment shader (scene_fragment.fs). In this case, we will assume that we will receive a flag, from the vertex shader, that will determine if the fragment to be rendered belongs to a selected object or not.

in float outSelected;

Then, at the end of the fragment shader, we will modify the final fragment colour, by setting the blue component to 1 if it's selected.

if ( outSelected > 0 ) {
    fragColor = vec4(fragColor.x, fragColor.y, 1, 1);
}

Then, we need to be able to set that value for each GameItem. If you recall from previous chapters, we have two scenarios:

Rendering of non-instanced meshes.
Rendering of instanced meshes.

In the first case, the data for each GameItem is passed through uniforms, so we just need to add a new uniform for that in the vertex shader. In the second case, we need to create a new instanced attribute. You can see below the additions that need to be integrated into the vertex shader for both cases.


layout (location=14) in float selectedInstanced;
...
uniform float selectedNonInstanced;
...
    if ( isInstanced > 0 )
    {
        outSelected = selectedInstanced;
        ...
    }
    else
    {
        outSelected = selectedNonInstanced;
        ...

Now that the infrastructure has been set up, we just need to define how objects will be selected. Before we continue, you may notice, if you look at the source code, that the view matrix is now stored in the Camera class. This is due to the fact that we were calculating the view matrix in several classes in the source code. Previously, it was stored in the Transformation and in the SoundManager classes. In order to calculate intersections we would need to create yet another replica. Instead of that, we centralize it in the Camera class. This change also requires that the view matrix is updated in our main game loop.

Let’scontinuewiththepickingdiscussion.Inthissample,wewillfollowasimpleapproach,selectionwillbedoneautomaticallyusingthecamera.Theclosestobjecttowherethecameraisfacingwillbeselected.Let’sdiscusshowthiscanbedone.

The following picture depicts the situation we need to solve.


We have the camera, placed at some coordinates in world space, facing a specific direction. Any object that intersects with a ray cast from the camera's position along the camera's forward direction will be a candidate. Among all the candidates we just need to choose the closest one.

In our sample, game items are cubes, so we need to calculate the intersection of the camera's forward vector with cubes. It may seem to be a very specific case, but indeed it is very frequent. In many games, the game items have associated what's called a bounding box. A bounding box is a rectangular box that contains all the vertices of that object. This bounding box is also used, for instance, for collision detection. In fact, in the animation chapter, you saw that each animation frame defined a bounding box, which helps to set the boundaries at any given time.
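For a cube-shaped game item, the axis aligned bounding box can be derived from its position and scale (a minimal standalone sketch; the helper name is ours):

```java
public class BoundingBoxDemo {

    // Min and max corners of an axis aligned box centered at 'pos'
    // with half-extent 'scale' along each axis
    static float[][] aabb(float[] pos, float scale) {
        float[] min = { pos[0] - scale, pos[1] - scale, pos[2] - scale };
        float[] max = { pos[0] + scale, pos[1] + scale, pos[2] + scale };
        return new float[][] { min, max };
    }

    public static void main(String[] args) {
        float[][] box = aabb(new float[] { 2, 0, -5 }, 0.5f);
        System.out.println(box[0][0] + " .. " + box[1][0]); // 1.5 .. 2.5
    }
}
```

The selection method shown next builds exactly these min and max corners from each GameItem before testing the ray against them.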

So,let’sstartcoding.Wewillcreateanewclassnamed CameraBoxSelectionDetector,whichwillhaveamethodnamedselectGameItem```whichwillreceivealistofgameitemsandareferencetothecamera.Themethodisdefinedlikethis.

public void selectGameItem(GameItem[] gameItems, Camera camera) {
    GameItem selectedGameItem = null;
    float closestDistance = Float.POSITIVE_INFINITY;

    dir = camera.getViewMatrix().positiveZ(dir).negate();
    for (GameItem gameItem : gameItems) {
        gameItem.setSelected(false);
        min.set(gameItem.getPosition());
        max.set(gameItem.getPosition());
        min.add(-gameItem.getScale(), -gameItem.getScale(), -gameItem.getScale());
        max.add(gameItem.getScale(), gameItem.getScale(), gameItem.getScale());
        if (Intersectionf.intersectRayAab(camera.getPosition(), dir, min, max, nearFar) && nearFar.x < closestDistance) {
            closestDistance = nearFar.x;
            selectedGameItem = gameItem;
        }
    }

    if (selectedGameItem != null) {
        selectedGameItem.setSelected(true);
    }
}

The method iterates over the game items trying to get the ones that intersect with the ray cast from the camera. It first defines a variable named closestDistance. This variable will hold the closest distance. For game items that intersect, the distance from the camera to the intersection point will be calculated. If it's lower than the value stored in closestDistance, then this item will be the new candidate.


Before entering the loop, we need to get the direction vector that points where the camera is facing. This is easy: just use the view matrix to get the z direction taking into consideration the camera's rotation. Remember that positive z points out of the screen, so we need the opposite direction vector; this is why we negate it.

In the game loop, intersection calculations are done for each GameItem. But, how do we do this? This is where the glorious JOML library comes to the rescue. We are using JOML's Intersectionf class, which provides several methods to calculate intersections in 2D and 3D. Specifically, we are using the intersectRayAab method.

This method implements the algorithm that tests intersection with Axis Aligned Boxes. You can check the details, as pointed out in the JOML documentation, here.

The method tests if a ray, defined by an origin and a direction, intersects a box, defined by its minimum and maximum corners. This algorithm is valid because our cubes are aligned with the axes; if they were rotated, this method would not work. Thus, the method receives the following parameters:

An origin: In our case, this will be our camera position.
A direction: This is where the camera is facing, the forward vector.
The minimum corner of the box. In our case, the cubes are centered around the GameItem position; the minimum corner will be those coordinates minus the scale. (In its original size, cubes have a length of 2 and a scale of 1.)
The maximum corner. Self explanatory.
A result vector. This will contain the near and far distances of the intersection points.

The method will return true if there is an intersection. If true, we check the closest distance and update it if needed, and store a reference to the candidate selectedGameItem. The next figure shows all the elements involved in this method.
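Intersectionf.intersectRayAab is based on the classic slab test. The following self-contained sketch (plain arrays instead of JOML types; names are ours, and edge cases like a ray origin lying exactly on a face are ignored) shows the idea:

```java
public class RayAabbDemo {

    // Slab test: returns {near, far} distances along the ray, or null if no hit.
    // Relies on IEEE float division producing +/-Infinity for axis-parallel rays.
    static float[] intersectRayAab(float[] origin, float[] dir, float[] min, float[] max) {
        float tNear = Float.NEGATIVE_INFINITY, tFar = Float.POSITIVE_INFINITY;
        for (int i = 0; i < 3; i++) {
            float t1 = (min[i] - origin[i]) / dir[i];
            float t2 = (max[i] - origin[i]) / dir[i];
            tNear = Math.max(tNear, Math.min(t1, t2));
            tFar = Math.min(tFar, Math.max(t1, t2));
        }
        return (tNear <= tFar && tFar >= 0) ? new float[] { tNear, tFar } : null;
    }

    public static void main(String[] args) {
        // Unit-half-extent cube centered at (0, 0, -5), ray from the origin looking down -Z
        float[] hit = intersectRayAab(new float[] { 0, 0, 0 }, new float[] { 0, 0, -1 },
                new float[] { -1, -1, -6 }, new float[] { 1, 1, -4 });
        System.out.println(hit[0] + " .. " + hit[1]); // 4.0 .. 6.0
    }
}
```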


Once the loop has finished, the candidate GameItem is marked as selected.

Andthat’s,all.TheselectGameItemwillbeinvokedintheupdatemethodoftheDummyGameclass,alongwiththeviewmatrixupdate.

// Update view matrix
camera.updateViewMatrix();

// Update sound listener position;
soundMgr.updateListenerPosition(camera);

this.selectDetector.selectGameItem(gameItems, camera);

Besides that, a cross-hair has been added to the rendering process to check that everything is working properly. The result is shown in the next figure.


Obviously, the method presented here is far from optimal, but it will give you the basics to develop more sophisticated methods on your own. Some parts of the scene could be easily discarded, like objects behind the camera, since they are not going to be intersected. Besides that, you may want to order your items according to the distance to the camera to speed up calculations. In addition to that, calculations only need to be done if the camera has moved or rotated since the previous update.

Mouse Selection

Object picking with the camera is great, but what if we want to be able to freely select objects with the mouse? In this case, we want that, whenever the user clicks on the screen, the closest object is automatically selected.

The way to achieve this is similar to the method described above. In the previous method we had the camera position and generated rays from it using the “forward” direction according to the camera's current orientation. In this case, we still need to cast rays, but the direction points to a point far away from the camera, where the click has been made. So, we need to calculate that direction vector using the click coordinates.

But, how do we pass from (x, y) coordinates in viewport space to world space? Let's review how we pass from model space coordinates to view space. The different coordinate transformations that are applied in order to achieve that are:

We pass from model coordinates to world coordinates using the model matrix.
We pass from world coordinates to view space coordinates using the view matrix (that provides the camera effect).
We pass from view coordinates to homogeneous clip space by applying the perspective projection matrix.
Final screen coordinates are calculated automatically by OpenGL for us. Before doing that, it passes to normalized device space (by dividing the x, y, z coordinates by the w component) and then to x, y screen coordinates.

So we just need to traverse the inverse path to get from screen coordinates (x, y) to world coordinates.

The first step is to transform from screen coordinates to normalized device space. The (x, y) coordinates in viewport space are in the range [0, screen width] and [0, screen height]. The upper left corner of the screen has a value of (0, 0). We need to transform that into coordinates in the range [−1, 1].

The maths are simple:

x = 2 · screen_x / screenwidth − 1

y = 1 − 2 · screen_y / screenheight
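Those two formulas translate directly into code (a minimal sketch; the method name is ours):

```java
public class NdcDemo {

    // Convert window coordinates (origin at the upper left corner) to
    // normalized device coordinates in the range [-1, 1]
    static float[] toNdc(double screenX, double screenY, int width, int height) {
        float x = (float) (2 * screenX / width - 1);
        float y = (float) (1 - 2 * screenY / height);
        return new float[] { x, y };
    }

    public static void main(String[] args) {
        float[] center = toNdc(400, 300, 800, 600); // center of an 800x600 window
        System.out.println(center[0] + ", " + center[1]); // 0.0, 0.0
    }
}
```

Note that the y axis is flipped: window coordinates grow downwards, while NDC coordinates grow upwards.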

But, how do we calculate the z coordinate? The answer is simple: we simply assign it the −1 value, so that the ray points to the farthest visible distance (remember that in OpenGL, −1 points into the screen). Now we have the coordinates in normalized device space.

In order to continue with the transformations we need to convert them to homogeneous clip space. We need to have the w component, that is, use homogeneous coordinates. Although this concept was presented in the previous chapters, let's get back to it. In order to represent a 3D point we just need the x, y and z coordinates, but we are continuously working with an additional coordinate, the w component. We need this extra component in order to use matrices to perform the different transformations. Some transformations do not need that extra component but others do. For instance, the translation matrix does not work if we only have x, y and z components. Thus, we have added the w component and assigned it a value of 1 so we can work with 4 by 4 matrices.

Besides that, most transformations, or to be more precise, most transformation matrices, do not alter the w component. An exception to this is the projection matrix. This matrix changes the w value to be proportional to the z component.

Transforming from homogeneous clip space to normalized device coordinates is achieved by dividing the x, y and z coordinates by w. As this component is proportional to the z component, this implies that distant objects are drawn smaller. In our case we need to do the reverse; we need to unproject. But since what we are calculating is a ray, we can just simply ignore that step: we just set the w component to 1 and leave the rest of the components with their original values.

Now we need to go back to view space. This is easy: we just need to calculate the inverse of the projection matrix and multiply it by our 4-component vector. Once we have done that, we need to transform it to world space. Again, we just need to use the view matrix, calculate its inverse and multiply it by our vector.

Remember that we are only interested in directions, so, in this case, we set the w component to 0. Also we can set the z component again to −1, since we want it to point towards the screen. Once we have done that and applied the inverse view matrix, we have our vector in world space. We have our ray calculated and can apply the same algorithm as in the case of the camera picking.

We have created a new class named MouseBoxSelectionDetector that implements the steps described above. Besides that, we have moved the projection matrix to the Window class so we can use it in several places of the source code, and refactored the CameraBoxSelectionDetector a little bit so the MouseBoxSelectionDetector can inherit from it and use the collision detection method. You can check the source code directly, since the implementation is very simple.

The result now looks like this.


You just need to click over the block with the left mouse button to perform the selection.

In any case, you can consult the details behind the steps explained here in an excellent article with very detailed sketches of the different steps involved.


HUD Revisited - NanoVG

In previous chapters we explained how a HUD can be created by rendering shapes and textures on top of the scene using an orthographic projection. In this chapter we will learn how to use the NanoVG library to render antialiased vector graphics to construct more complex HUDs in an easy way.

There are many other libraries out there that you can use to accomplish this task, such as Nifty GUI, Nuklear, etc. In this chapter we will focus on NanoVG since it's very simple to use, but if you're looking to develop complex GUI interactions with buttons, menus and windows you should probably look at Nifty GUI.

The first step in order to start using NanoVG is adding the dependencies in the pom.xml file (one for the dependencies required at compile time and the other one for the natives required at runtime).

...
<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl-nanovg</artifactId>
    <version>${lwjgl.version}</version>
</dependency>
...
<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl-nanovg</artifactId>
    <version>${lwjgl.version}</version>
    <classifier>${native.target}</classifier>
    <scope>runtime</scope>
</dependency>

Before we start using NanoVG we must set up some things on the OpenGL side so the samples can work correctly. We need to enable support for the stencil buffer test. Until now we have talked about the colour and depth buffers, but we have not mentioned the stencil buffer. This buffer stores a value (an integer) for every pixel which is used to control which pixels should be drawn. This buffer is used to mask or discard drawing areas according to the values it stores. It can be used, for instance, to cut out some parts of the scene in an easy way. We enable the stencil buffer test by adding this line to the Window class (after we enable depth testing):

glEnable(GL_STENCIL_TEST);


Since we are using another buffer, we must also take care of removing its values before each render call. Thus, we need to modify the clear method of the Renderer class:

public void clear() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
}

We will also add a new window option for activating antialiasing. Thus, in the Window class we will enable it this way:

if (opts.antialiasing) {
    glfwWindowHint(GLFW_SAMPLES, 4);
}

Now we are ready to use the NanoVG library. The first thing that we will do is get rid of the HUD artefacts we have created, that is, the shaders, the IHud interface, the HUD rendering methods in the Renderer class, etc. You can check this out in the source code.

In this case, the new Hud class will take care of its own rendering, so we do not need to delegate it to the Renderer class. Let's start by defining that class. It will have an init method that sets up the library and the resources needed to build the HUD. The method is defined like this:

public void init(Window window) throws Exception {
    this.vg = window.getOptions().antialiasing ? nvgCreate(NVG_ANTIALIAS | NVG_STENCIL_STROKES) : nvgCreate(NVG_STENCIL_STROKES);
    if (this.vg == NULL) {
        throw new Exception("Could not init nanovg");
    }

    fontBuffer = Utils.ioResourceToByteBuffer("/fonts/OpenSans-Bold.ttf", 150 * 1024);
    int font = nvgCreateFontMem(vg, FONT_NAME, fontBuffer, 0);
    if (font == -1) {
        throw new Exception("Could not add font");
    }
    colour = NVGColor.create();

    posx = MemoryUtil.memAllocDouble(1);
    posy = MemoryUtil.memAllocDouble(1);

    counter = 0;
}


The first thing we do is create a NanoVG context. In this case we are using an OpenGL 3.0 backend, since we are referring to the org.lwjgl.nanovg.NanoVGGL3 namespace. If antialiasing is activated we set up the NVG_ANTIALIAS flag.

Next, we create a font by using a True Type font previously loaded into a ByteBuffer. We assign it a name so we can later use it while rendering text. One important thing about this is that the ByteBuffer used to load the font must be kept in memory while the font is used. That is, it cannot be garbage collected; otherwise you will get a nice core dump. This is why it is stored as a class attribute.

Then, we create a colour instance and some helper variables that will be used while rendering. That method is called in the game init method, just before the renderer is initialized:

@Override
public void init(Window window) throws Exception {
    hud.init(window);
    renderer.init(window);
    ...

The Hud class also defines a render method, which should be called after the scene has been rendered so the HUD is drawn on top of it.

@Override
public void render(Window window) {
    renderer.render(window, camera, scene);
    hud.render(window);
}

The render method of the Hud class starts like this:

public void render(Window window) {
    nvgBeginFrame(vg, window.getWidth(), window.getHeight(), 1);

The first thing that we must do is call the nvgBeginFrame method. All the NanoVG rendering operations must be enclosed between nvgBeginFrame and nvgEndFrame calls. The nvgBeginFrame method accepts the following parameters:

The NanoVG context.
The size of the window to render (width and height).
The pixel ratio. If you need to support Hi-DPI you can change this value. For this sample we just set it to 1.


Then we create several ribbons that occupy the whole screen width. The first one is drawn like this:

// Upper ribbon
nvgBeginPath(vg);
nvgRect(vg, 0, window.getHeight() - 100, window.getWidth(), 50);
nvgFillColor(vg, rgba(0x23, 0xa1, 0xf1, 200, colour));
nvgFill(vg);

While rendering a shape, the first method that shall be invoked is nvgBeginPath, which instructs NanoVG to start drawing a new shape. Then we define what to draw (a rect) and the fill colour, and by invoking nvgFill we draw it.

You can check the rest of the source code to see how the rest of the shapes are drawn. When rendering text, it is not necessary to call nvgBeginPath before rendering it.

After we have finished drawing all the shapes, we just call nvgEndFrame to end rendering, but there's one important thing to be done before leaving the method. We must restore the OpenGL state. NanoVG modifies the OpenGL state in order to perform its operations; if the state is not correctly restored you may see that the scene is not correctly rendered or even that it's been wiped out. Thus, we need to restore the relevant OpenGL state that we need for our rendering. This is delegated to the Window class:

// Restore state
window.restoreState();

The method is defined like this:

public void restoreState() {
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_STENCIL_TEST);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    if (opts.cullFace) {
        glEnable(GL_CULL_FACE);
        glCullFace(GL_BACK);
    }
}

Andthat’sall(besidessomeadditionalmethodstoclearthingsup),thecodeiscompleted.Whenyouexecutethesampleyouwillgetsomethinglikethis:


Optimizations - Frustum Culling (I)

At this moment we are using many different graphic effects, such as lights, particles, etc. In addition to that, we have learned how to use instanced rendering to reduce the overhead of drawing many similar objects. However, we still have plenty of room for applying simple optimization techniques that will increase the Frames Per Second (FPS) that we can achieve.

You may have wondered why we are drawing the whole list of `GameItem`s every frame even if some of them will not be visible (because they are behind the camera or too far away). You may even think that this is handled automatically by OpenGL, and in some way that's true: OpenGL will discard the rendering of vertices that fall outside the visible area. This is called clipping. The issue with clipping is that it's done per vertex, after the vertex shader has been executed. Hence, even though this operation saves resources, we can be more efficient by not even trying to render objects that will not be visible. We would not be wasting resources by sending their data to the GPU and by performing transformations for every vertex that is part of those objects. We need to remove the objects that are not contained in the view frustum; that is, we need to perform frustum culling.

But first let's review what the view frustum is. The view frustum is a volume that contains every object that may be visible, taking into consideration the camera position and rotation and the projection that we are using. Typically, the view frustum is a rectangular pyramid, as shown in the next figure.


As you can see, the view frustum is defined by six planes; anything that lies outside the view frustum will not be rendered. So frustum culling is the process of removing objects that are outside the view frustum.

Thus, in order to perform frustum culling we need to:

- Calculate the frustum planes using the data contained in the view and projection matrices.
- For every `GameItem`, check if it is contained inside the view frustum, that is, contained between the six frustum planes, and eliminate the ones that lie outside from the rendering process.

So let's start by calculating the frustum planes. A plane is defined by a point contained in it and a vector orthogonal to that plane, as shown in the next figure:

The equation of a plane is defined like this:

Ax + By + Cz + D = 0


Hence, we need to calculate the plane equations for the six sides of our view frustum. In order to do that you basically have two options. You can perform the tedious calculations that will get you the six plane equations, that is, the four constants (A, B, C and D) of the previous equation for each plane. The other option is to let the JOML library calculate this for you. In this case, we will choose the latter option.

So let's start coding. We will create a new class named `FrustumCullingFilter` which will perform, as its name states, filtering operations according to the view frustum.

public class FrustumCullingFilter {

    private static final int NUM_PLANES = 6;

    private final Matrix4f prjViewMatrix;

    private final Vector4f[] frustumPlanes;

    public FrustumCullingFilter() {
        prjViewMatrix = new Matrix4f();
        frustumPlanes = new Vector4f[NUM_PLANES];
        for (int i = 0; i < NUM_PLANES; i++) {
            frustumPlanes[i] = new Vector4f();
        }
    }

The `FrustumCullingFilter` class will also have a method to calculate the plane equations, called `updateFrustum`, which will be called before rendering. The method is defined like this:

public void updateFrustum(Matrix4f projMatrix, Matrix4f viewMatrix) {
    // Calculate projection view matrix
    prjViewMatrix.set(projMatrix);
    prjViewMatrix.mul(viewMatrix);
    // Get frustum planes
    for (int i = 0; i < NUM_PLANES; i++) {
        prjViewMatrix.frustumPlane(i, frustumPlanes[i]);
    }
}


First, we store a copy of the projection matrix and multiply it by the view matrix to get the projection view matrix. Then, with that transformation matrix, we just need to invoke the `frustumPlane` method for each of the frustum planes. It's important to note that these plane equations are expressed in world coordinates, so all the calculations need to be done in that space.

Now that we have all the planes calculated we just need to check if the `GameItem` instances are inside the frustum or not. How can we do this? Let's first examine how we can check if a point is inside the frustum. We can achieve that by calculating the signed distance of the point to each of the planes. If the distance of the point to the plane is positive, this means that the point is in front of the plane (according to its normal). If it's negative, this means that the point is behind the plane.

Therefore, a point will be inside the view frustum if its distance to all the planes of the frustum is positive. The distance of a point to a plane is defined like this:

dist = Ax0 + By0 + Cz0 + D, where x0, y0 and z0 are the coordinates of the point.

So, a point is behind the plane if Ax0 + By0 + Cz0 + D <= 0.
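To make the signed distance test concrete, here is a minimal, self-contained sketch (plain Java, independent of JOML; the class and method names are just for illustration). It simply evaluates Ax0 + By0 + Cz0 + D for a plane and a point:

```java
// Minimal illustration of the signed distance of a point to a plane.
// The plane is stored as its four equation constants (A, B, C, D).
// If (A, B, C) is unit length, the result is the true distance;
// otherwise it is proportional to it, with the same sign.
public class PlaneDistance {

    // dist = A*x0 + B*y0 + C*z0 + D
    public static float signedDistance(float a, float b, float c, float d,
                                       float x0, float y0, float z0) {
        return a * x0 + b * y0 + c * z0 + d;
    }

    public static void main(String[] args) {
        // Plane z = 0 with normal pointing along +z: A = 0, B = 0, C = 1, D = 0
        System.out.println(signedDistance(0, 0, 1, 0, 0, 0, 5));  // 5.0 (in front)
        System.out.println(signedDistance(0, 0, 1, 0, 0, 0, -2)); // -2.0 (behind)
    }
}
```

A point is then inside the frustum when this value is positive for all six planes.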

But we do not have points, we have complex meshes; we cannot just use a single point to check if an object is inside a frustum or not. You may think of checking every vertex of the `GameItem` and seeing if it's inside the frustum or not; if any of the points is inside, the `GameItem` should be drawn. But this is in fact what OpenGL does when clipping, and this is what we are trying to avoid. Remember that the benefits of frustum culling are more noticeable the more complex the meshes to be rendered are.

We need to enclose every `GameItem` into a simple volume that is easy to check. Here we basically have two options:



- Bounding boxes.
- Bounding spheres.

In this case, we will use spheres, since it is the simplest approach. We will enclose every `GameItem` into a sphere and will check if the sphere is inside the view frustum or not. In order to do that, we just need the centre and the radius of the sphere. The checks are almost equal to the point case, except that we need to take the radius into consideration. A sphere will be outside the frustum if the following condition is met for any plane: dist = Ax0 + By0 + Cz0 + D <= -radius.

So, we will add a new method to the `FrustumCullingFilter` class to check if a sphere is inside the frustum or not. The method is defined like this:

public boolean insideFrustum(float x0, float y0, float z0, float boundingRadius) {
    boolean result = true;
    for (int i = 0; i < NUM_PLANES; i++) {
        Vector4f plane = frustumPlanes[i];
        if (plane.x * x0 + plane.y * y0 + plane.z * z0 + plane.w <= -boundingRadius) {
            result = false;
            return result;
        }
    }
    return result;
}

Then, we will add a method that filters the `GameItem`s that are outside the view frustum:


public void filter(List<GameItem> gameItems, float meshBoundingRadius) {
    float boundingRadius;
    Vector3f pos;
    for (GameItem gameItem : gameItems) {
        boundingRadius = gameItem.getScale() * meshBoundingRadius;
        pos = gameItem.getPosition();
        gameItem.setInsideFrustum(insideFrustum(pos.x, pos.y, pos.z, boundingRadius));
    }
}

We have added a new attribute, `insideFrustum`, to the `GameItem` class to track visibility. As you can see, the radius of the bounding sphere is passed as a parameter. This is due to the fact that the bounding sphere is associated to the `Mesh`; it's not a property of the `GameItem`. But remember that we must operate in world coordinates, and the radius of the bounding sphere will be in model space. We transform it to world space by applying the scale that has been set up for the `GameItem`. We are also assuming that the position of the `GameItem` is the centre of the sphere (in world space coordinates).

The last method is just a utility one that accepts the map of meshes and filters all the `GameItem` instances contained in it:

public void filter(Map<? extends Mesh, List<GameItem>> mapMesh) {
    for (Map.Entry<? extends Mesh, List<GameItem>> entry : mapMesh.entrySet()) {
        List<GameItem> gameItems = entry.getValue();
        filter(gameItems, entry.getKey().getBoundingRadius());
    }
}

And that's it. We can use that class inside the rendering process. We just need to update the frustum planes, calculate which `GameItem`s are visible and filter them out when drawing instanced and non-instanced meshes:

frustumFilter.updateFrustum(window.getProjectionMatrix(), camera.getViewMatrix());
frustumFilter.filter(scene.getGameMeshes());
frustumFilter.filter(scene.getGameInstancedMeshes());

You can play with activating and deactivating the filtering and check the increase and decrease in the FPS that you can achieve. Particles are not considered in the filtering, but it's trivial to add them. In any case, for particles, it may be better to just check the position of the emitter instead of checking every particle.

Optimizations - Frustum Culling (II)


Once the basics of frustum culling have been explained, we can take advantage of more refined methods that the JOML library provides. In particular, it provides a class named `FrustumIntersection` which extracts the planes of the view frustum in a more efficient way, as described in this paper. Besides that, this class also provides methods for testing bounding boxes, points and spheres.

So, let's change the `FrustumCullingFilter` class. The attributes and constructor are simplified like this:

public class FrustumCullingFilter {

    private final Matrix4f prjViewMatrix;

    private FrustumIntersection frustumInt;

    public FrustumCullingFilter() {
        prjViewMatrix = new Matrix4f();
        frustumInt = new FrustumIntersection();
    }

The `updateFrustum` method just delegates the plane extraction to the `FrustumIntersection` instance:

public void updateFrustum(Matrix4f projMatrix, Matrix4f viewMatrix) {
    // Calculate projection view matrix
    prjViewMatrix.set(projMatrix);
    prjViewMatrix.mul(viewMatrix);
    // Update frustum intersection class
    frustumInt.set(prjViewMatrix);
}

And the `insideFrustum` method is even simpler:

public boolean insideFrustum(float x0, float y0, float z0, float boundingRadius) {
    return frustumInt.testSphere(x0, y0, z0, boundingRadius);
}

With this approach you will be able to get even a few more FPS. Besides that, a global flag has been added to the `Window` class to enable/disable frustum culling. The `GameItem` class also has a flag for enabling/disabling the filtering, because there may be some items for which frustum culling filtering does not make sense.
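As a sketch of how those flags might interact (the `Item` class below is a hypothetical stand-in for the book's `GameItem`, not the actual class), an item would be considered visible if frustum culling is globally disabled, if the item opts out of filtering, or if its bounding sphere passed the test:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of combining a global frustum-culling flag with a
// per-item "do not filter" flag. Item stands in for the book's GameItem.
public class CullingFlags {

    static class Item {
        final String name;
        final boolean frustumCullingEnabled; // per-item flag
        final boolean insideFrustum;         // result of the sphere test

        Item(String name, boolean frustumCullingEnabled, boolean insideFrustum) {
            this.name = name;
            this.frustumCullingEnabled = frustumCullingEnabled;
            this.insideFrustum = insideFrustum;
        }
    }

    // An item is rendered unless culling applies to it and it failed the test
    public static List<String> visibleItems(List<Item> items, boolean globalCulling) {
        List<String> result = new ArrayList<>();
        for (Item item : items) {
            if (!globalCulling || !item.frustumCullingEnabled || item.insideFrustum) {
                result.add(item.name);
            }
        }
        return result;
    }
}
```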


Cascaded Shadow Maps

In the shadows chapter we presented the shadow map technique to be able to display shadows using directional lights when rendering a 3D scene. The solution presented there required you to manually tweak some of the parameters in order to improve the results. In this chapter we are going to change that technique to automate the whole process and to improve the results for open spaces. In order to achieve that goal we are going to use a technique called Cascaded Shadow Maps (CSM).

Let's first start by examining how we can automate the construction of the light view matrix and the orthographic projection matrix used to render the shadows. If you recall from the shadows chapter, we need to draw the scene from the light's perspective. This implies the creation of a light view matrix, which acts like a camera for the light, and a projection matrix. Since the light is directional, and is supposed to be located at infinity, we chose an orthographic projection.

We want all the visible objects to fit into the light view projection matrix. Hence, we need to fit the view frustum into the light frustum. The following picture depicts what we want to achieve.


How can we construct that? The first step is to calculate the frustum corners of the view projection matrix. We get the coordinates in world space. Then we calculate the centre of that frustum. This can be calculated by adding the coordinates of all the corners and dividing the result by the number of corners.

With that information we can set the position of the light. That position and its direction will be used to construct the light view matrix. In order to calculate the position, we start from the centre of the view frustum obtained before. We then go back in the direction of the light an amount equal to the distance between the near and far z planes of the view frustum.
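The two steps above (averaging the frustum corners, then stepping away from the centroid along the light direction) are simple vector arithmetic. A minimal sketch using plain arrays instead of JOML vectors (the class and method names are just illustrative):

```java
// Sketch of positioning the light for the light view matrix: average the
// frustum corners to get the centre, then move from the centre along the
// light direction by the given distance (near-to-far z range).
public class LightPlacement {

    // Returns {x, y, z} of: centroid(corners) + lightDir * distance
    public static float[] lightPosition(float[][] corners, float[] lightDir, float distance) {
        float cx = 0, cy = 0, cz = 0;
        for (float[] c : corners) {
            cx += c[0]; cy += c[1]; cz += c[2];
        }
        int n = corners.length;
        cx /= n; cy /= n; cz /= n; // centroid = average of the corners
        return new float[]{cx + lightDir[0] * distance,
                           cy + lightDir[1] * distance,
                           cz + lightDir[2] * distance};
    }
}
```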


Once we have constructed the light view matrix, we need to set up the orthographic projection matrix. In order to calculate it we transform the frustum corners to light view space, just by multiplying them by the light view matrix we have just constructed. The dimensions of that projection matrix will be the minimum and maximum x and y values. The near z plane can be set to the same value used by our standard projection matrices, and the far value will be the distance between the maximum and minimum z values of the frustum corners in light view space.

However, if you implement the algorithm described above over the shadows sample, you may be disappointed by the shadow quality.


The reason for that is that the shadow resolution is limited by the texture size. We are now covering a potentially huge area, and the textures we are using to store depth information do not have enough resolution to get good results. You may think that the solution is just to increase the texture resolution, but this is not sufficient to completely fix the problem: you would need huge textures for that.

There's a smarter solution. The key concept is that shadows of objects that are closer to the camera need to have a higher quality than shadows for distant objects. One approach could be to just render shadows for objects close to the camera, but this would cause shadows to appear and disappear as we move through the scene.

The approach that Cascaded Shadow Maps (CSMs) use is to divide the view frustum into several splits. Splits closer to the camera cover a smaller amount of space whilst distant splits cover a much wider region of space. The next figure shows a view frustum divided into three splits.


For each of these splits, a depth map is rendered, adjusting the light view and projection matrices to fit each split. Thus, the texture that stores the depth map covers a reduced area of the view frustum. And, since the split closest to the camera covers less space, the depth resolution is increased.

As can be deduced from the explanation above, we will need as many depth textures as splits, and we will also change the light view and projection matrices for each of them. Hence, the steps to be done in order to apply CSMs are:

- Divide the view frustum into n splits.
- While rendering the depth map, for each split:
  - Calculate the light view and projection matrices.
  - Render the scene from the light's perspective into a separate depth map.
- While rendering the scene:
  - Use the depth maps calculated above.
  - Determine the split that the fragment to be drawn belongs to.
  - Calculate the shadow factor as in shadow maps.


As you can see, the main drawback of CSMs is that we need to render the scene, from the light's perspective, once for each split. This is why it is often only used for open spaces. Anyway, we will see how we can easily reduce that overhead.

So let's start examining the code, but before we continue, a little warning: I will not include the full source code here since it would be very tedious to read. Instead, I will present the main classes, their responsibilities and the fragments that may require further explanation in order to get a good understanding. All the shadow-related classes have been moved to a new package called `org.lwjglb.engine.graph.shadow`.

The code that renders shadows, that is, the scene from the light's perspective, has been moved to the `ShadowRenderer` class. (That code was previously contained in the `Renderer` class.)

The class defines the following constants:

public static final int NUM_CASCADES = 3;

public static final float[] CASCADE_SPLITS = new float[]{Window.Z_FAR / 20.0f, Window.Z_FAR / 10.0f, Window.Z_FAR};

The first one is the number of cascades or splits. The second one defines where the far z plane is located for each of these splits. As you can see, they are not equally spaced: the split that is closer to the camera has the shortest distance in the z plane.
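For instance, assuming a Z_FAR value of 1000 (just an illustrative number, not necessarily the sample's actual value), the far planes of the three splits work out like this; note how the nearest split covers only a small slice of the total depth range:

```java
// Sketch of computing the cascade split far planes used by the sample:
// Z_FAR / 20, Z_FAR / 10 and Z_FAR itself. The splits are intentionally
// not equally spaced: the split nearest the camera is the shortest.
public class CascadeSplits {

    public static float[] splitFarPlanes(float zFar) {
        return new float[]{zFar / 20.0f, zFar / 10.0f, zFar};
    }

    public static void main(String[] args) {
        // With zFar = 1000 the splits end at 50, 100 and 1000 units
        System.out.println(java.util.Arrays.toString(splitFarPlanes(1000.0f)));
    }
}
```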

The class also stores the reference to the shader program used to render the depth map, a list with the information associated to each split, modelled by the `ShadowCascade` class, and a reference to the object that will host the depth map information (textures), modelled by the `ShadowBuffer` class.

The `ShadowRenderer` class has methods for setting up the shaders and the required attributes, and a render method. The render method is defined like this:


public void render(Window window, Scene scene, Camera camera, Transformation transformation, Renderer renderer) {
    update(window, camera.getViewMatrix(), scene);

    // Setup view port to match the texture size
    glBindFramebuffer(GL_FRAMEBUFFER, shadowBuffer.getDepthMapFBO());
    glViewport(0, 0, ShadowBuffer.SHADOW_MAP_WIDTH, ShadowBuffer.SHADOW_MAP_HEIGHT);
    glClear(GL_DEPTH_BUFFER_BIT);

    depthShaderProgram.bind();

    // Render scene for each cascade map
    for (int i = 0; i < NUM_CASCADES; i++) {
        ShadowCascade shadowCascade = shadowCascades.get(i);

        depthShaderProgram.setUniform("orthoProjectionMatrix", shadowCascade.getOrthoProjMatrix());
        depthShaderProgram.setUniform("lightViewMatrix", shadowCascade.getLightViewMatrix());

        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, shadowBuffer.getDepthMapTexture().getIds()[i], 0);
        glClear(GL_DEPTH_BUFFER_BIT);

        renderNonInstancedMeshes(scene, transformation);
        renderInstancedMeshes(scene, transformation);
    }

    // Unbind
    depthShaderProgram.unbind();
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

As you can see, it is similar to the previous render method for shadow maps, except that we are performing several rendering passes, one per split. In each pass we change the light view matrix and the orthographic projection matrix with the information contained in the associated `ShadowCascade` instance.

Also, in each pass we need to change the texture we are using. Each pass will render the depth information to a different texture. This information is stored in the `ShadowBuffer` class, and is set up to be used by the FBO with this line:

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, shadowBuffer.getDepthMapTexture().getIds()[i], 0);


As has just been mentioned, the `ShadowBuffer` class stores the information related to the textures used to store depth information. The code is very similar to the code used in the shadows chapter, except that we are using texture arrays. Thus, we have created a new class, `ArrTexture`, that creates an array of textures with the same attributes. This class also provides a `bindTextures` method that binds all the textures in the array for use in the scene shader. The method receives a parameter with the texture unit to start with:

public void bindTextures(int start) {
    for (int i = 0; i < ShadowRenderer.NUM_CASCADES; i++) {
        glActiveTexture(start + i);
        glBindTexture(GL_TEXTURE_2D, depthMap.getIds()[i]);
    }
}

The `ShadowCascade` class stores the light view and orthographic projection matrices associated to one split. Each split is defined by a near and a far z plane distance, and with that information the matrices are calculated accordingly.

The class provides an `update` method which takes as input the view matrix and the light direction. The method calculates the view frustum corners in world space and then calculates the light position. That position is calculated by going back, using the light direction, from the frustum centre to a distance equal to the distance between the far and near z planes.


public void update(Window window, Matrix4f viewMatrix, DirectionalLight light) {
    // Build projection view matrix for this cascade
    float aspectRatio = (float) window.getWidth() / (float) window.getHeight();
    projViewMatrix.setPerspective(Window.FOV, aspectRatio, zNear, zFar);
    projViewMatrix.mul(viewMatrix);

    // Calculate frustum corners in world space
    float maxZ = -Float.MAX_VALUE;
    float minZ = Float.MAX_VALUE;
    centroid.set(0, 0, 0);
    for (int i = 0; i < FRUSTUM_CORNERS; i++) {
        Vector3f corner = frustumCorners[i];
        corner.set(0, 0, 0);
        projViewMatrix.frustumCorner(i, corner);
        centroid.add(corner);
        minZ = Math.min(minZ, corner.z);
        maxZ = Math.max(maxZ, corner.z);
    }
    // The centroid is the average of the eight frustum corners
    centroid.div(8.0f);

    // Go back from the centroid up to max.z - min.z in the direction of light
    Vector3f lightDirection = light.getDirection();
    Vector3f lightPosInc = new Vector3f().set(lightDirection);
    float distance = maxZ - minZ;
    lightPosInc.mul(distance);
    Vector3f lightPosition = new Vector3f();
    lightPosition.set(centroid);
    lightPosition.add(lightPosInc);

    updateLightViewMatrix(lightDirection, lightPosition);
    updateLightProjectionMatrix();
}

With the light position and the light direction we can construct the light view matrix. This is done in the `updateLightViewMatrix` method:

private void updateLightViewMatrix(Vector3f lightDirection, Vector3f lightPosition) {
    float lightAngleX = (float) Math.toDegrees(Math.acos(lightDirection.z));
    float lightAngleY = (float) Math.toDegrees(Math.asin(lightDirection.x));
    float lightAngleZ = 0;
    Transformation.updateGenericViewMatrix(lightPosition, new Vector3f(lightAngleX, lightAngleY, lightAngleZ), lightViewMatrix);
}

Finally, we need to construct the orthographic projection matrix. This is done in the `updateLightProjectionMatrix` method. The method transforms the view frustum coordinates into light space. We then get the minimum and maximum values for the x and y coordinates to construct the bounding box that encloses the view frustum. The near z plane can be set to 0 and the far z plane to the distance between the maximum and minimum z values of the coordinates.

private void updateLightProjectionMatrix() {
    // Now calculate frustum dimensions in light space
    float minX = Float.MAX_VALUE;
    float maxX = -Float.MAX_VALUE;
    float minY = Float.MAX_VALUE;
    float maxY = -Float.MAX_VALUE;
    float minZ = Float.MAX_VALUE;
    float maxZ = -Float.MAX_VALUE;
    for (int i = 0; i < FRUSTUM_CORNERS; i++) {
        Vector3f corner = frustumCorners[i];
        tmpVec.set(corner, 1);
        tmpVec.mul(lightViewMatrix);
        minX = Math.min(tmpVec.x, minX);
        maxX = Math.max(tmpVec.x, maxX);
        minY = Math.min(tmpVec.y, minY);
        maxY = Math.max(tmpVec.y, maxY);
        minZ = Math.min(tmpVec.z, minZ);
        maxZ = Math.max(tmpVec.z, maxZ);
    }
    float distz = maxZ - minZ;
    orthoProjMatrix.setOrtho(minX, maxX, minY, maxY, 0, distz);
}

Remember that the orthographic projection is like a bounding box that should enclose all the objects that will be rendered. That bounding box is expressed in light view coordinate space. Thus, what we are doing is calculating the minimum bounding box, axis aligned with the light position, that encloses the view frustum.

The `Renderer` class has been modified to use the classes in the new package and also to modify the information that is passed to the renderers. In the renderer we need to deal with the model, the model view, and the model light matrices. In previous chapters we used the model-view and model-light matrices to reduce the number of operations. In this case, we opted to simplify the number of elements to be passed, and now we are passing just the model, view and light matrices to the shaders. Also, for particles, we need to preserve the scale; since we no longer pass the model view matrix, that information is lost now. We reuse the attribute used to mark selected items to pass that scale information. In the particles shader we will use that value to set the scaling again.

In the scene vertex shader, we calculate the model light view matrix for each split, and pass it as an output to the fragment shader:


mvVertexPos = mvPos.xyz;
for (int i = 0; i < NUM_CASCADES; i++) {
    mlightviewVertexPos[i] = orthoProjectionMatrix[i] * lightViewMatrix[i] * modelMatrix * vec4(position, 1.0);
}

In the fragment shader we use those values to query the appropriate depth map depending on the split that the fragment is in. This needs to be done in the fragment shader since, for a specific item, its fragments may reside in different splits.

Also, in the fragment shader we must decide which split we are in. In order to do that, we use the z value of the fragment and compare it with the maximum z value for each split, that is, the z far plane value. That information is passed as a new uniform:

uniform float cascadeFarPlanes[NUM_CASCADES];

We calculate the split like this; the variable idx will hold the split to be used:

int idx;
for (int i = 0; i < NUM_CASCADES; i++)
{
    if (abs(mvVertexPos.z) < cascadeFarPlanes[i])
    {
        idx = i;
        break;
    }
}

Also, in the scene shaders we need to pass an array of textures, an array of `sampler2D`s, in order to use the depth map (the texture) associated to the split we are in. The source code, instead of using an array, uses a list of uniforms that hold the texture units used to refer to the depth map associated to each split:

uniform sampler2D normalMap;
uniform sampler2D shadowMap_0;
uniform sampler2D shadowMap_1;
uniform sampler2D shadowMap_2;

Changing it to an array of uniforms caused problems with other textures that were difficult to track down for this sample. In any case, you can try changing it in your own code.

The rest of the changes in the source code, and in the shaders, are just adaptations required by the changes described above. You can check them directly in the source code.


Finally, when introducing these changes you may see that performance has dropped. This is due to the fact that we are rendering the depth map three times. We can mitigate this effect by avoiding rendering at all when the scene has not changed. If the camera has not been moved and the scene items have not changed, we do not need to render the depth map again and again. The depth maps are stored in textures, so they are not wiped out with each render call. Thus, we have added a new variable to the render method that indicates if the scene has changed, avoiding updating the depth maps if it remains the same. This increases the FPS dramatically. At the end, you will get something like this:


Assimp

Static Meshes

The capability of loading complex 3D models in different formats is crucial in order to write a game. The task of writing parsers for some of them would require lots of work; even just supporting a single format can be time consuming. For instance, the wavefront loader described in chapter 9 only parses a small subset of the specification (materials are not handled at all).

Fortunately, the Assimp library can be used to parse many common 3D formats. It's a C++ library which can load static and animated models in a variety of formats. LWJGL provides bindings to use it from Java code. In this chapter, we will explain how it can be used.

The first thing is adding the Assimp maven dependencies to the project pom.xml. We need to add compile time and runtime dependencies:

<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl-assimp</artifactId>
    <version>${lwjgl.version}</version>
</dependency>
<dependency>
    <groupId>org.lwjgl</groupId>
    <artifactId>lwjgl-assimp</artifactId>
    <version>${lwjgl.version}</version>
    <classifier>${native.target}</classifier>
    <scope>runtime</scope>
</dependency>

Once the dependencies have been set, we will create a new class named `StaticMeshesLoader` that will be used to load meshes with no animations. The class defines two static public methods:


public static Mesh[] load(String resourcePath, String texturesDir) throws Exception {
    return load(resourcePath, texturesDir, aiProcess_JoinIdenticalVertices | aiProcess_Triangulate | aiProcess_FixInfacingNormals);
}

public static Mesh[] load(String resourcePath, String texturesDir, int flags) throws Exception {
    // ....

Both methods have the following arguments:

- `resourcePath`: The path to the file where the model is located. This is an absolute path, because Assimp may need to load additional files using the same base path as the resource path (for instance, material files for wavefront, OBJ, files).
- `texturesDir`: The path to the directory that holds the textures for this model. This is a CLASSPATH relative path. For instance, a wavefront material file may define several texture files. The code expects these files to be located in the `texturesDir` directory. If you find texture loading errors you may need to manually tweak these paths in the model file.

The second method has an extra argument named `flags`. This parameter allows us to tune the loading process. The first method just invokes the second one and passes some values that are useful in most situations:

- `aiProcess_JoinIdenticalVertices`: This flag reduces the number of vertices that are used, identifying those that can be reused between faces.
- `aiProcess_Triangulate`: The model may use quads or other geometries to define its elements. Since we are only dealing with triangles, we must use this flag to split all the faces into triangles (if needed).
- `aiProcess_FixInfacingNormals`: This flag tries to reverse normals that may point inwards.

There are many other flags that can be used; you can check them in the LWJGL Javadoc documentation.

Let's go back to the second method. The first thing we do is invoke the `aiImportFile` method to load the model with the selected flags:

AIScene aiScene = aiImportFile(resourcePath, flags);
if (aiScene == null) {
    throw new Exception("Error loading model");
}

The rest of the code for the method is as follows:


int numMaterials = aiScene.mNumMaterials();
PointerBuffer aiMaterials = aiScene.mMaterials();
List<Material> materials = new ArrayList<>();
for (int i = 0; i < numMaterials; i++) {
    AIMaterial aiMaterial = AIMaterial.create(aiMaterials.get(i));
    processMaterial(aiMaterial, materials, texturesDir);
}

int numMeshes = aiScene.mNumMeshes();
PointerBuffer aiMeshes = aiScene.mMeshes();
Mesh[] meshes = new Mesh[numMeshes];
for (int i = 0; i < numMeshes; i++) {
    AIMesh aiMesh = AIMesh.create(aiMeshes.get(i));
    Mesh mesh = processMesh(aiMesh, materials);
    meshes[i] = mesh;
}

return meshes;

First we process the materials contained in the model. Materials define the colours and textures to be used by the meshes that compose the model. Then we process the different meshes. A model can define several meshes and each of them can use one of the materials defined for the model.

If you examine the code above you may see that many of the calls to the Assimp library return `PointerBuffer` instances. You can think of them like C pointers; they just point to a memory region which contains data. You need to know in advance the type of data that they hold in order to process them. In the case of materials, we iterate over that buffer creating instances of the `AIMaterial` class. In the second case, we iterate over the buffer that holds mesh data creating instances of the `AIMesh` class.

Let's examine the `processMaterial` method:


private static void processMaterial(AIMaterial aiMaterial, List<Material> materials, String texturesDir) throws Exception {
    AIColor4D colour = AIColor4D.create();

    AIString path = AIString.calloc();
    Assimp.aiGetMaterialTexture(aiMaterial, aiTextureType_DIFFUSE, 0, path, (IntBuffer) null, null, null, null, null, null);
    String textPath = path.dataString();
    Texture texture = null;
    if (textPath != null && textPath.length() > 0) {
        TextureCache textCache = TextureCache.getInstance();
        texture = textCache.getTexture(texturesDir + "/" + textPath);
    }

    Vector4f ambient = Material.DEFAULT_COLOUR;
    int result = aiGetMaterialColor(aiMaterial, AI_MATKEY_COLOR_AMBIENT, aiTextureType_NONE, 0, colour);
    if (result == 0) {
        ambient = new Vector4f(colour.r(), colour.g(), colour.b(), colour.a());
    }

    Vector4f diffuse = Material.DEFAULT_COLOUR;
    result = aiGetMaterialColor(aiMaterial, AI_MATKEY_COLOR_DIFFUSE, aiTextureType_NONE, 0, colour);
    if (result == 0) {
        diffuse = new Vector4f(colour.r(), colour.g(), colour.b(), colour.a());
    }

    Vector4f specular = Material.DEFAULT_COLOUR;
    result = aiGetMaterialColor(aiMaterial, AI_MATKEY_COLOR_SPECULAR, aiTextureType_NONE, 0, colour);
    if (result == 0) {
        specular = new Vector4f(colour.r(), colour.g(), colour.b(), colour.a());
    }

    Material material = new Material(ambient, diffuse, specular, 1.0f);
    material.setTexture(texture);
    materials.add(material);
}

We check if the material defines a texture or not. If so, we load the texture. We have created a new class named `TextureCache` which caches textures. This is due to the fact that several meshes may share the same texture and we do not want to waste space loading the same data again and again. Then we try to get the colours of the material for the ambient, diffuse and specular components. Fortunately, the definition that we had for a material already contained that information.


The `TextureCache` definition is very simple: it is just a map that indexes the different textures by the path to the texture file (you can check it directly in the source code). Due to the fact that textures may now use different image formats (PNG, JPEG, etc.), we have modified the way that textures are loaded. Instead of using the PNG library, we now use the STB library to be able to load more formats.

Let's get back to the `StaticMeshesLoader` class. The `processMesh` method is defined like this:

private static Mesh processMesh(AIMesh aiMesh, List<Material> materials) {
    List<Float> vertices = new ArrayList<>();
    List<Float> textures = new ArrayList<>();
    List<Float> normals = new ArrayList<>();
    List<Integer> indices = new ArrayList<>();

    processVertices(aiMesh, vertices);
    processNormals(aiMesh, normals);
    processTextCoords(aiMesh, textures);
    processIndices(aiMesh, indices);

    Mesh mesh = new Mesh(Utils.listToArray(vertices),
        Utils.listToArray(textures),
        Utils.listToArray(normals),
        Utils.listIntToArray(indices)
    );
    Material material;
    int materialIdx = aiMesh.mMaterialIndex();
    if (materialIdx >= 0 && materialIdx < materials.size()) {
        material = materials.get(materialIdx);
    } else {
        material = new Material();
    }
    mesh.setMaterial(material);

    return mesh;
}

A `Mesh` is defined by a set of vertex positions, normal directions, texture coordinates and indices. Each of these elements is processed in the `processVertices`, `processNormals`, `processTextCoords` and `processIndices` methods. A `Mesh` may also point to a material, using its index. If the index corresponds to one of the previously processed materials, we just associate it to the `Mesh`.

The `processXXX` methods are very simple: they just invoke the corresponding method over the `AIMesh` instance that returns the desired data. For instance, the `processVertices` method is defined like this:


private static void processVertices(AIMesh aiMesh, List<Float> vertices) {
    AIVector3D.Buffer aiVertices = aiMesh.mVertices();
    while (aiVertices.remaining() > 0) {
        AIVector3D aiVertex = aiVertices.get();
        vertices.add(aiVertex.x());
        vertices.add(aiVertex.y());
        vertices.add(aiVertex.z());
    }
}

You can see that we get a buffer to the vertices by invoking the `mVertices` method. We then simply process them to create a `List` of floats that contains the vertex positions. Since the method returns just a buffer, you could pass that information directly to the OpenGL methods that create vertices. We do not do it that way for two reasons. The first one is to reduce as much as possible the modifications over the code base. The second one is that by loading into an intermediate structure you may be able to perform some post-processing tasks and even debug the loading process.

If you want a sample of the much more efficient approach, that is, directly passing the buffers to OpenGL, you can check this sample.
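The intermediate lists are converted to the plain arrays that the `Mesh` constructor expects through small helpers. A minimal sketch of what helpers such as `Utils.listToArray` and `Utils.listIntToArray` may look like (the exact implementation in the book's `Utils` class may differ):

```java
import java.util.List;

// Sketch of helpers converting the intermediate List structures produced by
// the processXXX methods into the primitive arrays the Mesh constructor uses.
public class ListUtils {

    // boxed Float list -> primitive float[]
    public static float[] listToArray(List<Float> list) {
        float[] result = new float[list.size()];
        for (int i = 0; i < result.length; i++) {
            result[i] = list.get(i);
        }
        return result;
    }

    // boxed Integer list -> primitive int[]
    public static int[] listIntToArray(List<Integer> list) {
        return list.stream().mapToInt(Integer::intValue).toArray();
    }
}
```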

The `StaticMeshesLoader` class makes the `OBJLoader` class obsolete, so it has been removed from the base source code. A more complex OBJ file is provided as a sample; if you run it you will see something like this:

Animations


Now that we have used Assimp for loading static meshes we can proceed with animations. If you recall from the animations chapter, the VAO associated to a mesh contains the vertex positions, the texture coordinates, the indices and a list of weights that should be applied to joint positions to modulate the final vertex position.

Each vertex position has an associated list of four weights that change the final position, referring to the bone indices that will be combined to determine that final position. For each frame, a list of transformation matrices is loaded, as uniforms, for each joint. With that information the final position is calculated.
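The weighting scheme can be illustrated without any matrix library: each weight scales the position produced by its joint's transformation, and the scaled positions are summed. A minimal sketch, using plain translations in place of full joint matrices (the real shaders multiply by 4x4 matrices; the class and method names are just illustrative):

```java
// Sketch of blending a vertex position with a set of joint weights.
// Each joint here is represented by a simple translation (tx, ty, tz)
// instead of a full 4x4 matrix, which is enough to show the weighted sum:
// finalPos = sum_i( weight_i * (jointTransform_i applied to pos) )
public class VertexBlend {

    public static float[] blend(float[] pos, float[][] jointTranslations, float[] weights) {
        float[] result = new float[3];
        for (int i = 0; i < weights.length; i++) {
            // position transformed by joint i (a translation in this sketch)
            float x = pos[0] + jointTranslations[i][0];
            float y = pos[1] + jointTranslations[i][1];
            float z = pos[2] + jointTranslations[i][2];
            result[0] += weights[i] * x;
            result[1] += weights[i] * y;
            result[2] += weights[i] * z;
        }
        return result;
    }
}
```

With two joints weighted 0.5 each, the final position lands halfway between the two transformed positions, which is exactly the modulation effect the weights provide.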

Intheanimationchapter,wedevelopedaMD5parsertoloadanimatedmeshes.Inthischapterwewilluseassimplibrary.ThiswillallowustoloadmanymoreformatsbesidesMD5,suchasCOLLADA,FBX,etc.

Before we start coding, let's clarify some terminology. In this chapter we will use the terms bone and joint interchangeably. A joint/bone is just an element that affects vertices and that has a parent, forming a hierarchy. The MD5 format uses the term joint, whereas assimp uses the term bone.

Let's first review the structures handled by assimp that contain animation information. We will start with the bones and weights information. For each Mesh, we can access the vertex positions, texture coordinates and indices. Meshes also store a list of bones. Each bone is defined by the following attributes:

- A name.
- An offset matrix: this will be used later to compute the final transformations that should be used by each bone.


Bones also point to a list of weights. Each weight is defined by the following attributes:

- A weight factor, that is, the number that will be used to modulate the influence of the bone's transformation over the associated vertex.
- A vertex identifier, that is, the vertex associated to the current bone.

The following picture shows the relationships between all these elements.

Hence, the first thing that we must do is to construct the list of vertex positions, the bone/joint indices and the associated weights from the structure above. Once we have done that, we need to pre-calculate the transformation matrices for each bone/joint for all the animation frames defined in the model.

The assimp scene object defines a hierarchy of Nodes. Each Node is defined by a name and a list of children nodes. Animations use these nodes to define the transformations that should be applied. This hierarchy is indeed where the bones' hierarchy is defined: every bone is a node and has a parent, except the root node, and possibly a set of children. There are special nodes that are not bones; they are used to group transformations and should be handled when calculating the transformations. Another issue is that this node hierarchy is defined for the whole model; we do not have separate hierarchies for each mesh.

A scene also defines a set of animations. A single model can have more than one animation; you can have animations for a model to walk, run, etc. Each of these animations defines different transformations. An animation has the following attributes:

- A name.
- A duration, that is, the duration in time of the animation. This may seem confusing, since an animation is the list of transformations that should be applied to each node for each different frame.
- A list of animation channels. An animation channel contains, for a specific instant in time, the translation, rotation and scaling information that should be applied to each node. The class that models the data contained in the animation channels is AINodeAnim.
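To make the duration attribute concrete: it is commonly combined with a tick rate to map elapsed time to a key-frame index. The sketch below is a hypothetical helper (assimp exposes mDuration and mTicksPerSecond; this method, its name and its fallback value are assumptions, not part of the book's engine):

```java
// Hypothetical helper: map elapsed seconds to a looping key-frame index.
public class FrameClockSketch {

    public static int frameIndex(double elapsedSeconds, double ticksPerSecond,
            double durationTicks, int numFrames) {
        // assimp may report 0 for mTicksPerSecond; fall back to a default rate
        double tps = ticksPerSecond != 0 ? ticksPerSecond : 25.0;
        double timeInTicks = (elapsedSeconds * tps) % durationTicks; // loop
        // Assume key frames are evenly spread over the duration.
        return (int) (timeInTicks / durationTicks * numFrames);
    }
}
```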

The following figure shows the relationships between all the elements described above.

For a specific instant of time, for a frame, the transformation to be applied to a bone is the transformation defined in the animation channel for that instant, multiplied by the transformations of all the parent nodes up to the root node. Hence, we need to reorder the information stored in the scene; the process is as follows:

- Construct the node hierarchy.
- For each animation, iterate over each animation channel (for each animation node), constructing the transformation matrices for all the frames. The transformation matrix is the composition of the translation, rotation and scale matrices.
- Reorder that information for each frame: construct the final transformations to be applied to each bone in the Mesh. This is achieved by multiplying the transformation matrix of the bone (of the associated node) by the transformation matrices of all the parent nodes up to the root node.
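The parent-composition step can be sketched with a toy node class. This is a hypothetical simplification (the real code uses JOML Matrix4f instances; here a transformation is reduced to a translation so the composition up to the root stays visible without a matrix library):

```java
import java.util.ArrayList;
import java.util.List;

// Toy hierarchy sketch: transformations reduced to translations so the
// parent-composition idea is visible without a matrix library.
public class NodeSketch {
    final String name;
    final NodeSketch parent;
    final float[] localTranslation;
    final List<NodeSketch> children = new ArrayList<>();

    public NodeSketch(String name, NodeSketch parent, float[] localTranslation) {
        this.name = name;
        this.parent = parent;
        this.localTranslation = localTranslation;
        if (parent != null) {
            parent.children.add(this);
        }
    }

    // Compose this node's transformation with all its parents up to the root.
    public float[] globalTranslation() {
        float[] result = localTranslation.clone();
        for (NodeSketch n = parent; n != null; n = n.parent) {
            result[0] += n.localTranslation[0];
            result[1] += n.localTranslation[1];
            result[2] += n.localTranslation[2];
        }
        return result;
    }
}
```

With full matrices the composition is a matrix product instead of a vector sum, but the walk from node to root is identical.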

So let's start coding. We will first create a class named AnimMeshesLoader, which extends StaticMeshesLoader but, instead of returning an array of Meshes, returns an AnimGameItem instance. It defines two public methods for that:


public static AnimGameItem loadAnimGameItem(String resourcePath, String texturesDir)
        throws Exception {
    return loadAnimGameItem(resourcePath, texturesDir,
            aiProcess_GenSmoothNormals | aiProcess_JoinIdenticalVertices | aiProcess_Triangulate
            | aiProcess_FixInfacingNormals | aiProcess_LimitBoneWeights);
}

public static AnimGameItem loadAnimGameItem(String resourcePath, String texturesDir, int flags)
        throws Exception {
    AIScene aiScene = aiImportFile(resourcePath, flags);
    if (aiScene == null) {
        throw new Exception("Error loading model");
    }

    int numMaterials = aiScene.mNumMaterials();
    PointerBuffer aiMaterials = aiScene.mMaterials();
    List<Material> materials = new ArrayList<>();
    for (int i = 0; i < numMaterials; i++) {
        AIMaterial aiMaterial = AIMaterial.create(aiMaterials.get(i));
        processMaterial(aiMaterial, materials, texturesDir);
    }

    List<Bone> boneList = new ArrayList<>();
    int numMeshes = aiScene.mNumMeshes();
    PointerBuffer aiMeshes = aiScene.mMeshes();
    Mesh[] meshes = new Mesh[numMeshes];
    for (int i = 0; i < numMeshes; i++) {
        AIMesh aiMesh = AIMesh.create(aiMeshes.get(i));
        Mesh mesh = processMesh(aiMesh, materials, boneList);
        meshes[i] = mesh;
    }

    AINode aiRootNode = aiScene.mRootNode();
    Matrix4f rootTransformation = AnimMeshesLoader.toMatrix(aiRootNode.mTransformation());
    Node rootNode = processNodesHierarchy(aiRootNode, null);
    Map<String, Animation> animations = processAnimations(aiScene, boneList, rootNode,
            rootTransformation);
    AnimGameItem item = new AnimGameItem(meshes, animations);

    return item;
}

The methods are quite similar to the ones defined in the StaticMeshesLoader, with the following differences:

- The method that passes a default set of loading flags uses a new flag: aiProcess_LimitBoneWeights. This limits the maximum number of weights that affect a vertex to four (which is also the maximum value that we are currently supporting from the animations chapter).
- The method that actually loads the model not only loads the different meshes but also first calculates the node hierarchy and then calls processAnimations at the end to build an AnimGameItem instance.

The processMesh method is quite similar to the one in the StaticMeshesLoader, with the exception that it creates Meshes passing the joint indices and weights as parameters:

processBones(aiMesh, boneList, boneIds, weights);

Mesh mesh = new Mesh(Utils.listToArray(vertices), Utils.listToArray(textures),
    Utils.listToArray(normals), Utils.listIntToArray(indices),
    Utils.listIntToArray(boneIds), Utils.listToArray(weights));

The joint indices and weights are calculated in the processBones method:


private static void processBones(AIMesh aiMesh, List<Bone> boneList, List<Integer> boneIds,
        List<Float> weights) {
    Map<Integer, List<VertexWeight>> weightSet = new HashMap<>();
    int numBones = aiMesh.mNumBones();
    PointerBuffer aiBones = aiMesh.mBones();
    for (int i = 0; i < numBones; i++) {
        AIBone aiBone = AIBone.create(aiBones.get(i));
        int id = boneList.size();
        Bone bone = new Bone(id, aiBone.mName().dataString(), toMatrix(aiBone.mOffsetMatrix()));
        boneList.add(bone);
        int numWeights = aiBone.mNumWeights();
        AIVertexWeight.Buffer aiWeights = aiBone.mWeights();
        for (int j = 0; j < numWeights; j++) {
            AIVertexWeight aiWeight = aiWeights.get(j);
            VertexWeight vw = new VertexWeight(bone.getBoneId(), aiWeight.mVertexId(),
                    aiWeight.mWeight());
            List<VertexWeight> vertexWeightList = weightSet.get(vw.getVertexId());
            if (vertexWeightList == null) {
                vertexWeightList = new ArrayList<>();
                weightSet.put(vw.getVertexId(), vertexWeightList);
            }
            vertexWeightList.add(vw);
        }
    }

    int numVertices = aiMesh.mNumVertices();
    for (int i = 0; i < numVertices; i++) {
        List<VertexWeight> vertexWeightList = weightSet.get(i);
        int size = vertexWeightList != null ? vertexWeightList.size() : 0;
        for (int j = 0; j < Mesh.MAX_WEIGHTS; j++) {
            if (j < size) {
                VertexWeight vw = vertexWeightList.get(j);
                weights.add(vw.getWeight());
                boneIds.add(vw.getBoneId());
            } else {
                weights.add(0.0f);
                boneIds.add(0);
            }
        }
    }
}

This method traverses the bone definitions for a specific mesh, getting their weights and filling up three lists:

- boneList: contains a list of bones, with their offset matrices. It will be used later on to calculate node transformations. A new class named Bone has been created to hold that information. This list contains the bones for all the meshes.
- boneIds: contains just the identifiers of the bones for each vertex of the Mesh. Bones are identified by their position when rendering. This list only contains the bones for a specific Mesh.
- weights: contains the weights for each vertex of the Mesh to be applied to the associated bones.

The information contained in weights and boneIds is used to construct the Mesh data. The information contained in boneList will be used later when calculating the animation data.
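The padding rule applied in the second loop of processBones can be isolated and checked on its own. The helper below is a hypothetical standalone sketch (its name and the {boneId, weight} pair encoding are assumptions), but the logic mirrors the loop: every vertex always ends up with exactly MAX_WEIGHTS entries.

```java
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the padding rule used in processBones: every vertex
// ends up with exactly MAX_WEIGHTS (boneId, weight) pairs, zero-filled.
public class WeightPaddingSketch {

    public static final int MAX_WEIGHTS = 4;

    // vertexWeights holds hypothetical {boneId, weight} pairs for one vertex.
    public static void pad(List<float[]> vertexWeights,
            List<Integer> boneIds, List<Float> weights) {
        int size = vertexWeights != null ? vertexWeights.size() : 0;
        for (int j = 0; j < MAX_WEIGHTS; j++) {
            if (j < size) {
                float[] vw = vertexWeights.get(j);
                boneIds.add((int) vw[0]);
                weights.add(vw[1]);
            } else {
                boneIds.add(0);     // unused slot: bone 0 ...
                weights.add(0.0f);  // ... with zero weight, so it has no effect
            }
        }
    }
}
```

The fixed-size layout is what lets the VAO store the weights as plain attribute arrays, one slot per weight, without any per-vertex bookkeeping.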

Let's go back to the loadAnimGameItem method. Once we have created the Meshes, we also get the transformation applied to the root node, which will also be used to calculate the final transformations. After that, we need to process the hierarchy of nodes, which is done in the processNodesHierarchy method. This method is quite simple: it just traverses the node hierarchy starting from the root node, constructing a tree of nodes.

private static Node processNodesHierarchy(AINode aiNode, Node parentNode) {
    String nodeName = aiNode.mName().dataString();
    Node node = new Node(nodeName, parentNode);
    int numChildren = aiNode.mNumChildren();
    PointerBuffer aiChildren = aiNode.mChildren();
    for (int i = 0; i < numChildren; i++) {
        AINode aiChildNode = AINode.create(aiChildren.get(i));
        Node childNode = processNodesHierarchy(aiChildNode, node);
        node.addChild(childNode);
    }
    return node;
}

We have created a new Node class that contains the relevant information of the AINode instances and provides find methods to locate a node in the hierarchy by its name. Back in the loadAnimGameItem method, we just use that information to calculate the animations in the processAnimations method. This method returns a Map of Animation instances. Remember that a model can have more than one animation, so they are stored indexed by their names. With that information we can finally construct an AnimGameItem instance.
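A findByName lookup of the kind the new Node class provides can be sketched as a depth-first search. This is a hypothetical minimal version (the book's Node class also stores the parent and the per-frame transformations):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a named node tree with a depth-first findByName lookup.
public class NamedNode {
    private final String name;
    private final List<NamedNode> children = new ArrayList<>();

    public NamedNode(String name) {
        this.name = name;
    }

    public void addChild(NamedNode child) {
        children.add(child);
    }

    // Depth-first search for a node with the given name; null if absent.
    public NamedNode findByName(String targetName) {
        if (name.equals(targetName)) {
            return this;
        }
        for (NamedNode child : children) {
            NamedNode result = child.findByName(targetName);
            if (result != null) {
                return result;
            }
        }
        return null;
    }

    public String getName() {
        return name;
    }
}
```

Since bone names in the channels must match node names exactly, a null result here usually means the model's channel refers to a grouping node or a naming mismatch.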

The processAnimations method is defined like this:


private static Map<String, Animation> processAnimations(AIScene aiScene, List<Bone> boneList,
        Node rootNode, Matrix4f rootTransformation) {
    Map<String, Animation> animations = new HashMap<>();

    // Process all animations
    int numAnimations = aiScene.mNumAnimations();
    PointerBuffer aiAnimations = aiScene.mAnimations();
    for (int i = 0; i < numAnimations; i++) {
        AIAnimation aiAnimation = AIAnimation.create(aiAnimations.get(i));

        // Calculate transformation matrices for each node
        int numChanels = aiAnimation.mNumChannels();
        PointerBuffer aiChannels = aiAnimation.mChannels();
        for (int j = 0; j < numChanels; j++) {
            AINodeAnim aiNodeAnim = AINodeAnim.create(aiChannels.get(j));
            String nodeName = aiNodeAnim.mNodeName().dataString();
            Node node = rootNode.findByName(nodeName);
            buildTransFormationMatrices(aiNodeAnim, node);
        }

        List<AnimatedFrame> frames = buildAnimationFrames(boneList, rootNode,
                rootTransformation);
        Animation animation = new Animation(aiAnimation.mName().dataString(), frames,
                aiAnimation.mDuration());
        animations.put(animation.getName(), animation);
    }
    return animations;
}

For each animation, its animation channels are processed. Each channel defines the different transformations that should be applied over time to a node. The transformations defined for each node are built in the buildTransFormationMatrices method. These matrices are stored per node. Once the node hierarchy is filled up with that information, we can construct the animation frames.

Let's first review the buildTransFormationMatrices method:


private static void buildTransFormationMatrices(AINodeAnim aiNodeAnim, Node node) {
    int numFrames = aiNodeAnim.mNumPositionKeys();
    AIVectorKey.Buffer positionKeys = aiNodeAnim.mPositionKeys();
    AIVectorKey.Buffer scalingKeys = aiNodeAnim.mScalingKeys();
    AIQuatKey.Buffer rotationKeys = aiNodeAnim.mRotationKeys();

    for (int i = 0; i < numFrames; i++) {
        AIVectorKey aiVecKey = positionKeys.get(i);
        AIVector3D vec = aiVecKey.mValue();

        Matrix4f transfMat = new Matrix4f().translate(vec.x(), vec.y(), vec.z());

        AIQuatKey quatKey = rotationKeys.get(i);
        AIQuaternion aiQuat = quatKey.mValue();
        Quaternionf quat = new Quaternionf(aiQuat.x(), aiQuat.y(), aiQuat.z(), aiQuat.w());
        transfMat.rotate(quat);

        if (i < aiNodeAnim.mNumScalingKeys()) {
            aiVecKey = scalingKeys.get(i);
            vec = aiVecKey.mValue();
            transfMat.scale(vec.x(), vec.y(), vec.z());
        }

        node.addTransformation(transfMat);
    }
}

As you can see, an AINodeAnim instance defines a set of keys that contain translation, rotation and scaling information. These keys refer to specific instants of time. We assume that the information is ordered in time, and construct a list of matrices that contain the transformation to be applied for each frame. The final calculation is done in the buildAnimationFrames method:


private static List<AnimatedFrame> buildAnimationFrames(List<Bone> boneList, Node rootNode,
        Matrix4f rootTransformation) {
    int numFrames = rootNode.getAnimationFrames();
    List<AnimatedFrame> frameList = new ArrayList<>();
    for (int i = 0; i < numFrames; i++) {
        AnimatedFrame frame = new AnimatedFrame();
        frameList.add(frame);

        int numBones = boneList.size();
        for (int j = 0; j < numBones; j++) {
            Bone bone = boneList.get(j);
            Node node = rootNode.findByName(bone.getBoneName());
            Matrix4f boneMatrix = Node.getParentTransforms(node, i);
            boneMatrix.mul(bone.getOffsetMatrix());
            boneMatrix = new Matrix4f(rootTransformation).mul(boneMatrix);
            frame.setMatrix(j, boneMatrix);
        }
    }
    return frameList;
}

This method returns a list of AnimatedFrame instances. Each AnimatedFrame instance contains the list of transformations to be applied to each bone for a specific frame. The method just iterates over the list that contains all the bones. For each bone it:

- Gets the associated node.
- Builds a transformation matrix by multiplying the transformation of the associated Node with all the transformations of its parents up to the root node. This is done in the Node.getParentTransforms method.
- Multiplies that matrix by the bone's offset matrix.
- Calculates the final transformation by multiplying the root node's transformation by the matrix calculated in the step above.

The rest of the changes in the source code are minor changes to adapt some structures. At the end you will be able to load animations like this one (you need to press the space bar to change the frame).


The complexity of this sample resides more in the adaptation of the assimp structures to the engine used in the book and in pre-calculating the data for each frame. Beyond that, the concepts are similar to the ones presented in the animations chapter. You may also try to modify the source code to interpolate between frames to get smoother animations.
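As a starting point for that exercise, inter-frame interpolation can be sketched as a plain linear blend of per-bone translations. This is a hypothetical simplification: rotations should really be interpolated as quaternions (slerp), not blended component-wise, and the engine stores full matrices per bone rather than translation vectors.

```java
// Hypothetical inter-frame interpolation sketch (not engine code): blends
// two per-bone translation sets; t = 0 gives frame A, t = 1 gives frame B.
public class FrameLerpSketch {

    public static float[][] lerpFrames(float[][] frameA, float[][] frameB, float t) {
        float[][] result = new float[frameA.length][3];
        for (int bone = 0; bone < frameA.length; bone++) {
            for (int c = 0; c < 3; c++) {
                result[bone][c] = frameA[bone][c] * (1 - t) + frameB[bone][c] * t;
            }
        }
        return result;
    }
}
```

In a real implementation you would interpolate the position and scaling keys this way, slerp the rotation quaternions, and rebuild the per-bone matrices before uploading the uniforms.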

