The PixelTransit Image Generator: A Next-Generation Visual Simulation Engine

Robert P. Kuehne*, Sean Carmody**

Blue Newt Software, LLC, Ann Arbor, Michigan, USA*([email protected]), **([email protected])

Abstract. In 2007 Blue Newt Software began designing and building a new visual rendering system for the visual simulation market. The image generator, PixelTransit, is built on our engine, Blue Sprocket. Our rendering technology has been instrumental in demonstrating to new customers how graphics hardware can be used not just to create better images, but to gain better insight into their simulated environments. Blue Sprocket was designed to address four core goals from a rendering perspective: improved performance, higher quality, scalability, and improved lighting. The engine was additionally designed to use standards wherever possible and to bring a degree of modularity and scalability not previously available in this domain. This paper describes the design and implementation of this system and discusses current problems and future work to be done in this space.

1 Overview

Blue Newt Software is a software company formed in 2001 as a consultancy performing graphics work around the world. This work has focused primarily on cross-platform OpenGL applications, and the company has gained considerable experience and exposure to the visual simulation market during that time. In late 2007 it became apparent that there was an opportunity within the visual simulation market, and ground-based simulation in particular, to improve visual quality and performance and to take advantage of modern graphics processing units (GPUs). Blue Newt began work on an image generator (IG) product known as PixelTransit, based on a simulation engine called Blue Sprocket.

This engine was designed to be modular, scalable, and fully cross-platform. Beyond the system architecture, the graphics technologies and techniques chosen for integration were to reflect the best practices of the present, with an eye towards the future of graphics hardware. At that time, a new graphics architecture was becoming available approximately every two years, and our goal was to design our graphics infrastructure so that simulators could be more easily tuned to the hardware of the day, without a major rewrite.

In 2008 our decisions about our direction were validated when Daimler AG chose Blue Newt to replace their visual system with our new engine and IG. In this paper we’ll describe the overall simulation engine, the PixelTransit IG, and the design of the core rendering architecture. We’ll focus on the graphics, detail the architecture we’ve implemented, and discuss its scalability and feature set.

2 System Design

The Blue Sprocket engine and PixelTransit Image Generator (Figure 2) were designed to address the limitations of simulators on the market in the mid-2000s. There were three key areas to which

we were paying close attention: first, customer needs; second, the state of the image-generator business; and third, the evolution of graphics processors.

When we began looking at what we might build, even state-of-the-art simulators had relatively simple models, lighting, and shading. Many IGs of that period had no shading, especially for ground simulation, and some didn’t even have lighting. As we worked with clients over the years in a variety of industries, we heard again and again how important lighting was becoming to simulation experiments. In conjunction with these trends, GPUs were becoming increasingly powerful and designed more specifically for fully shaded rendering.

We began researching the state of the art in OpenGL rendering with an eye towards the needs of the simulation market. As we expected, the most advanced rendering techniques were emerging in games, and much of the Blue Sprocket engine is designed around extant game-engine ideas, patterns, and techniques. Our approach became to take the best ideas from the research and development communities of gaming and simulation and make those techniques available in service of the needs of the simulator market.

We built Blue Sprocket based on this research to be fully cross-platform (running on Windows, Linux, and Mac OS X), 64-bit capable, and graphics-vendor independent. We’ve built a fully floating-point deferred lighting renderer, capable of rendering hundreds of lights in-scene at interactive framerates, along with a full post-processing pipeline. We support multiple windows, channels, and inset channels, as well as hardware genlock and framelock. In the next section, we’ll begin by exploring the overall framework architecture and the PixelTransit IG, and then explain the rendering architecture.

Figure 1: The PixelTransit Image Generator rendering a typical night scene with >80 lights active.

3 Architecture

Blue Sprocket is designed to be modular at its core, with key modules supporting the various operations simulations might require. Among these are audio, code, scripting, networking, and, of course, rendering. The base engine is coarsely threaded, with individual threads roughly handling tasks per module. The overall architecture is shown in Figure 2.

Figure 2: Blue Sprocket architecture and modules.

The core of the Blue Sprocket Engine is a suite of pluggable components for graphics, audio, code, networking, and specific features, such as SpeedTree for real-time vegetation and DIGuy for human kinematic and crowd modeling and rendering. An application built on top of these (in our case, the PixelTransit IG) simply defines how and when these will be run and how data flows among them. Applications are currently written in C++, though we produce a scene and object editor that uses our Python bindings interactively. We deliver our own networking protocol, XVIS, as well as the industry-standard CIGI. We also support a variety of audio engines and pluggable code components. Additionally, we have an architecture that supports arbitrary custom plugins such as physics, audio communication, and so on. Of particular relevance to this paper, we have implemented several renderers for the engine in recent years, including a traditional low-dynamic-range forward renderer, a high-dynamic-range forward renderer, and, most recently and the focus of this paper, a deferred lighting renderer.

Application data flows through a threaded architecture in which several phases occur in order. First, network data is unpacked and used to update objects in the scene, the running IG itself, and any other manipulable state of the application. Next, any update handlers attached to objects are executed. For example, car objects in our system may turn their lights on dynamically; in the update phase, the car enables these lights and informs the lighting subsystem about the new lights in the scene.

Next, we enter the rendering phases, beginning with culling. Blue Sprocket contains a variety of culling structures suited to different circumstances. We have written specialized cullers for typical ground-simulation application data, such as binary-tree cullers and adaptive kd-tree cullers, as well as more traditional view-frustum cullers, object-size cullers, and others. The culling architecture of Blue Sprocket is itself a major undertaking; for the purposes of this paper, we’ll focus on its result, which is simply a list of objects to submit to the rendering subsystem. The rendering system then sorts these objects into efficient batches, and they are ready for use by the graphics subsystem described below.
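To make the culling step concrete, the following is a minimal sketch (our illustration, not Blue Sprocket's actual API) of two of the tests mentioned above: a sphere-versus-frustum visibility test and an object-size cull that rejects objects whose projected size falls below a pixel threshold. All names and constants here are assumptions for illustration.

```cpp
#include <array>
#include <cmath>

// A plane in Hessian normal form (n·p + d = 0, n unit length) and a bounding sphere.
struct Plane  { float nx, ny, nz, d; };
struct Sphere { float x, y, z, radius; };

// Hypothetical helper: six inward-facing planes bounding an axis-aligned cube
// of the given half-extent, standing in for a real view frustum.
std::array<Plane, 6> makeBoxFrustum(float h) {
    return {{ { 1, 0, 0, h}, {-1, 0, 0, h},
              { 0, 1, 0, h}, { 0,-1, 0, h},
              { 0, 0, 1, h}, { 0, 0,-1, h} }};
}

// A sphere is culled only when it lies entirely on the negative side of any plane.
bool sphereVisible(const std::array<Plane, 6>& frustum, const Sphere& s) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.radius) return false;   // fully outside this plane
    }
    return true;                              // inside or intersecting
}

// Object-size culling: approximate the projected height in pixels of a bounding
// sphere at `distance`, assuming a symmetric perspective projection.
bool largeEnough(float radius, float distance, float fovY,
                 float viewportHeightPx, float minPx) {
    float projected = (radius / (distance * std::tan(fovY * 0.5f)))
                      * viewportHeightPx;
    return projected >= minPx;
}
```

An object passing both tests would be appended to the frame's render list; batching by material and state then happens afterward, as described above.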

4 Graphics

The goal of the Blue Sprocket Engine was to allow fully shaded, high-dynamic-range rendering with the capability of running more than the standard eight OpenGL lights. We required shading to be present on all objects in-scene so that we could achieve fully modern appearances, including realistic paint shading, bump-mapping, gloss-mapping, or most any other surface appearance that could be written in code.

OpenGL has, since its inception, supported up to eight hardware lights; however, these lights are traditionally linearly expensive: lighting with two lights costs twice as much as lighting with one, and this performance fall-off proceeds linearly as more lights are added. This is a core problem with traditional fixed-function rendering architectures. The complementary problem with the multiple-light scenario in a shader-based world is shader code complexity. The properties of each light (type, position, attenuation, etc.) must be available to any shader wishing to add that light's contribution to the final surface color of an object. Thus, even for a single sphere in the world, its shader must have access to all lights that could possibly influence it, and must be able to compute the influence of those lights, be they point lights, spot lights, or some other type of light.

Traditional forward rendering therefore requires a great deal of complexity in each surface shader, which must both light and shade its objects. This might prompt developers to write a shader composition system; however, that becomes very complex very quickly. Because of this problem, and the variety of undesirable solutions to it, our research indicated that a deferred approach, specifically deferred lighting, would work best for us, because it neatly separates these problems into two kinds of shaders: one for surface appearance and one for light influence. This decoupling makes our system very flexible in its ability to rapidly introduce new forms of both lighting and surface appearance. An additional benefit is that surface shaders easily fit into an artist’s conceptual and technical workflows.

Our rendering pipeline precomputes a variety of required resources for later stages, where they are combined with surface appearance. This architecture can be seen in Figure 3.

Figure 3: Blue Sprocket rendering stages and paths.

Rendering is performed in five major stages. In the first stage, all scene geometry is rendered to an offscreen buffer. Each shader is responsible for writing depth, normal, and specular-power values for the current fragment. This stage and its results are visible in Figure 4.

Figure 4: Blue Sprocket rendering Stage 1: depth, world-space normals, and specular power.

Using this offscreen buffer, lighting and shading may be performed in the second stage independent of the geometry itself. For each light, proxy geometry that tightly encloses the light's area of influence is rendered to another offscreen buffer, referred to as the light buffer. For each pixel location covered by proxy geometry, lighting and shading is computed. In this way, a light's contribution is only ever computed for those pixels that are close enough to be influenced by it. Furthermore, lighting is never computed for pixels that end up being occluded.
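One way a point light's proxy-sphere radius could be derived is sketched below. The attenuation model and the 1/256 cutoff are assumptions for illustration (1/256 being one step of an 8-bit channel), not engine internals.

```cpp
#include <cmath>

// With attenuation att(d) = 1 / (c + l*d + q*d^2), find the distance at which
// the light's strongest channel falls below `cutoff`; use that distance as the
// radius of the proxy sphere rendered in stage two.
float proxyRadius(float intensity,        // max of the light's RGB intensity
                  float c, float l, float q,
                  float cutoff /* e.g. 1/256 */) {
    // Solve intensity / (c + l*d + q*d^2) = cutoff for d:
    //   q*d^2 + l*d + (c - intensity/cutoff) = 0
    float k = c - intensity / cutoff;
    if (q == 0.0f) {                      // purely linear falloff
        return l > 0.0f ? -k / l : 0.0f;
    }
    float disc = l * l - 4.0f * q * k;
    if (disc <= 0.0f) return 0.0f;        // light never exceeds the cutoff
    return (-l + std::sqrt(disc)) / (2.0f * q);
}
```

Pixels outside this sphere would receive a contribution smaller than one display step, so skipping them is visually lossless while bounding the cost of each light to the screen area its proxy covers.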

To compute the lighting and shading for a given pixel, the depth value stored in stage one is first read and used to reconstruct that pixel's world-space position. Using this position, along with the pixel normal and specular power, the Phong lighting model may be evaluated. Shadows are computed using stable, cascaded shadow maps. After proxy geometry has been rendered for each light, the light buffer contains the contributions of every light affecting the current view. Figure 5 shows these results.

Figure 5: Blue Sprocket rendering stage 2: lit scene with shadows.
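The per-pixel Phong evaluation described above can be sketched as follows. The names and structure are ours for illustration; `worldPos` stands in for the position reconstructed from the stored depth, and the caller would scale the two terms by light color and attenuation before accumulating into the light buffer.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Diffuse and specular terms of the Phong model for one light at one pixel.
void phongTerms(Vec3 worldPos, Vec3 normal, Vec3 lightPos, Vec3 eyePos,
                float specPower, float& diffuse, float& specular) {
    Vec3  L     = normalize(sub(lightPos, worldPos));
    Vec3  V     = normalize(sub(eyePos, worldPos));
    float nDotL = std::max(0.0f, dot(normal, L));
    // Reflect L about the normal: R = 2(N·L)N - L
    Vec3 R = { 2 * nDotL * normal.x - L.x,
               2 * nDotL * normal.y - L.y,
               2 * nDotL * normal.z - L.z };
    diffuse  = nDotL;
    specular = nDotL > 0.0f
             ? std::pow(std::max(0.0f, dot(R, V)), specPower)
             : 0.0f;
}
```

Because this runs only on pixels covered by a light's proxy geometry, its cost scales with the lights' screen footprints rather than with scene geometry, which is the central economy of the deferred approach.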

In stage three, all scene geometry is rendered a second time. Shaders in this stage are responsible for generating color values for each fragment and combining that color with the lighting computed in stage two. Though this stage requires rendering the scene geometry twice, we do so with a depth buffer that is already filled, so this second pass can use it to quickly reject occluded pixels. This means that while we render twice, we compute shading only on visible pixels, effectively once per window pixel.

Figure 6: Blue Sprocket rendering stage 3: light and albedo combination.

After computing the fully lit image, post-processing is performed. In this fourth stage, any number of effects may be applied to adjust the image, including tonemapping (to reduce the image's high dynamic range to one displayable on LDR monitors), fog, color correction, contrast enhancement, depth of field, bloom, and others. Tonemapping in our system can be performed adaptively, taking place over a number of frames to simulate the irising of the eye, or set overall, to allow a uniform balance of detail in shadow and light. The difference between Figures 6 and 7 is subtle in print but quite striking when displayed on a monitor; in particular, notice the additional detail in shadow in the tall building on the left in Figure 7, and the overall reduction in contrast.

Figure 7: Blue Sprocket rendering stage 4: post-processing.
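The two tonemapping modes described above might be sketched as follows. The operator choice (a simple Reinhard-style curve) and the exponential adaptation model are our assumptions for illustration, not the engine's exact formulation.

```cpp
#include <cmath>

// "Set overall" mode: a Reinhard-style curve mapping HDR luminance
// in [0, inf) down to displayable [0, 1).
float reinhard(float hdrLuminance) {
    return hdrLuminance / (1.0f + hdrLuminance);
}

// Adaptive ("irising") mode: the adapted luminance drifts toward the current
// frame's average luminance with time constant tau (seconds), once per frame.
float adaptLuminance(float adapted, float frameAverage,
                     float dtSeconds, float tau) {
    float alpha = 1.0f - std::exp(-dtSeconds / tau);
    return adapted + (frameAverage - adapted) * alpha;
}
```

In use, each frame would update the adapted luminance, derive an exposure scale from it, and then apply the curve per pixel; stepping from a dark scene into a bright one thus brightens gradually over several frames, much as the eye does.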

Finally, the post-processed image is copied to any window wishing to display it. We output to multiple card outputs to accommodate simultaneous projector and monitor outputs.

5 Future Work

Our next research and development efforts for our engine fall into three key areas: anti-aliasing, transparent surface lighting, and object blurring.

Anti-aliasing is very important in simulation, to eliminate visual artifacts which may influence subjects during experiments. However, anti-aliasing poses unique challenges for deferred pipelines. Modern GPUs support hardware features that make a variety of new anti-aliasing techniques possible, though little research into these has been done yet. Anti-aliasing is also a popular research topic, and papers continue to emerge at a rapid rate [11,12]. We are on track to implement a modified combination of several of these techniques in late 2009.

Transparency is, of course, a requirement in a modern image generator. Transparent surfaces have always been difficult to handle in rendering, due to their strict sorting requirements for proper blending. In deferred lighting this is even trickier, as each surface must be lit correctly by the nearby lights of influence and then also sorted properly for blending. Recent games research offers a few techniques which we’ll be adapting to our pipeline to handle not just lighting but also shadowing for transparent surfaces.

Blurring is important specifically to the driving-simulator domain, where objects suffer from strobing effects: at certain rotational velocities, wheels can appear to slow down, stand completely still, or rotate backwards. This is due to the framerate of the IG and the rotational change of key features in a wheel misaligning by some small amount. Our solution to these and other motion artifacts is to apply blur. Preliminary research indicates that object-specific blur techniques can eliminate these artifacts directly, though our research is at an early stage. Future versions of our engine will explore and productize these results.
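The strobing effect above is ordinary temporal aliasing, and a back-of-envelope model (our illustration, not engine code) shows why a wheel can appear still or reversed: a wheel with N identical spokes sampled at a fixed framerate is perceived with its per-frame rotation folded into the spoke-symmetry interval.

```cpp
#include <cmath>

// Apparent spoke-pattern rate (revolutions/second) for a wheel spinning at
// `wheelHz` with `spokes` identical spokes, sampled at `fps` frames per second.
// Zero means the wheel appears frozen; negative means it appears to reverse.
float apparentSpokeRateHz(float wheelHz, int spokes, float fps) {
    float period   = 1.0f / spokes;          // revolutions per spoke interval
    float perFrame = wheelHz / fps;          // revolutions per frame
    // Fold the per-frame rotation into (-period/2, period/2]:
    float folded = std::fmod(perFrame, period);
    if (folded >  period * 0.5f) folded -= period;
    if (folded <= -period * 0.5f) folded += period;
    return folded * fps;                     // apparent revolutions per second
}
```

For a five-spoke wheel at 60 Hz, rotation rates near 12 rev/s fold to nearly zero apparent motion, with slightly slower rates reading as backward rotation; per-object motion blur smears the spokes across the ambiguous interval, removing the false percept.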

6 Acknowledgments

Many thanks to the rest of our talented staff at Blue Newt, specifically Jason Elliott, who has been instrumental in creating many of the scenes and models shown here. Thanks to our colleagues at Daimler for keeping us challenged to push the boundaries of what’s possible. Thanks to our reviewers too, including special thanks to our colleague Volkhard Schill at VSimulation GmbH for his encouragement and improvements. Finally, thanks to our families, specifically Bob's wife Kim and Sean's wife Sarah, for their support and patience during the long process of bringing this product to market.

References

[1] Wolfgang Engel: ShaderX7, Charles River Media, 2009

[2] Wolfgang Engel: ShaderX6, Charles River Media, 2008

[3] Randima Fernando: GPU Gems, Addison-Wesley Professional, 2004

[4] Matt Pharr, Randima Fernando: GPU Gems 2, Addison-Wesley Professional, 2005

[5] Hubert Nguyen: GPU Gems 3, Addison-Wesley Professional, 2007

[6] T. Forsyth: Scene Graphs - Just Say No, 2006, http://home.comcast.net/~tom_forsyth/blog.wiki.html#[[Scene%20Graphs%20-%20just%20say%20no]]

[7] T. Forsyth: Linear-Speed Vertex Cache Optimisation, http://home.comcast.net/~tom_forsyth/papers/fast_vert_cache_opt.html, 2006

[8] T. Forsyth: blog, http://home.comcast.net/~tom_forsyth/blog.wiki.html

[9] W. Engel: blog, http://diaryofagraphicsprogrammer.blogspot.com/

[10] Andersson, Tatarchuk: Frostbite Rendering Architecture, Game Developers Conference, 2007

[11] Alexander Reshetov: Morphological Antialiasing, High Performance Graphics, 2009

[12] Konstantine Iourcha, Jason Yang, Andrew Pomianowski: A Directionally Adaptive Edge Anti-Aliasing Filter, High Performance Graphics, 2009
