Sandstorm: A Dynamic Multi-contextual GPU-based Particle System that Uses Vector Fields for Particle Propagation

By: Michael Smith


Page 1: By: Michael Smith

Sandstorm: A Dynamic Multi-contextual GPU-based Particle System that Uses Vector Fields for Particle Propagation

By: Michael Smith

Page 2: By: Michael Smith

Acknowledgments

Page 3: By: Michael Smith

Overview

Introduction

Background

Idea

Software Engineering

Prototype

Results

Conclusions and Future Work

Page 4: By: Michael Smith

Introduction

The use of Virtual Reality (VR) to visualize scientific phenomena is quite common.

VR can allow scientists to immerse themselves in the phenomena that they are studying.

Page 5: By: Michael Smith

Introduction

Phenomena such as dust clouds or smoke need a particle system to visualize such fuzzy systems.

Vector fields can be used to 'guide' particles according to real scientific data.

This is not a new idea; see the vector field work of Hilton and Egbert, c. 1994.

Page 6: By: Michael Smith

Introduction

VR applications and simulations require a multi-context environment.

A main context controls and updates multiple rendering contexts.

This multi-contextual environment can cause problems with particle systems.

Page 7: By: Michael Smith

Introduction

GPU offloading techniques have been proven

to allow applications and simulations to offload

work to the graphics hardware.

This can allow for acceleration of non-traditional

graphics calculations.

GPU offloading can be used to accelerate

particle calculations.

Page 8: By: Michael Smith

Introduction

Sandstorm

Dynamic

Multi-contextual

GPU-based

Particle System

Using Vector Fields for Particle Propagation

Page 9: By: Michael Smith

Background

The Helicopter and Dust Simulation (Heli-Dust) is a scientific simulation of the effect of a helicopter's downdraft on the surrounding desert terrain.

It is written using the Dust Framework, a framework that allows the developer to set up a scene using an XML file.

Page 10: By: Michael Smith

Background

Page 11: By: Michael Smith

Background

Early prototypes for Heli-Dust implemented a very simple particle system.

This particle system did not have a way to guide particles according to observed scientific data.

Page 12: By: Michael Smith

Background

Virtual Reality is a technology which allows a user to interact with a computer-simulated environment, be it a real or imagined one.

Immerses the user in an environment.

Page 13: By: Michael Smith

Background

A depth cue is an indicator from which a human can perceive information regarding depth.

They come in many shapes and sizes:

Monoscopic

Stereoscopic

Motion

Page 14: By: Michael Smith

Background

Monoscopic depth cue

Information from a single eye or a single image.

Information can include:

Position

Size

Brightness

Page 15: By: Michael Smith

Background

Stereoscopic depth cue:

Information from two eyes.

This information is derived from the parallax

between the different images received by each eye.

Parallax is the apparent displacement of objects viewed from different locations.

Page 16: By: Michael Smith

Background

Motion depth cue

Motion parallax

The changing relative position between the head

and the object being observed.

Objects in the distance move less than objects

closer to the viewer.

Page 17: By: Michael Smith

Background

Stereoscopic displays 'trick' the user's eyes into thinking there is depth where no depth exists.

They come in all shapes and sizes.

Page 18: By: Michael Smith

Background

Page 19: By: Michael Smith

Background

Page 20: By: Michael Smith

Background

Multiple Contexts

A main context which controls multiple rendering

contexts.

Because of these multiple contexts, a Virtual Reality application developer needs to make sure that all context-sensitive information and algorithms are multiple-context safe.

Page 21: By: Michael Smith

Background

There are many Virtual Reality toolkits and

libraries.

Such toolkits and libraries handle things such

as:

Generating Stereoscopic Images

Setting up the VR environment

And some handle distribution methods.

Page 22: By: Michael Smith

Background

Virtual Reality User Interface, or VRUI, is a

virtual reality development toolkit.

Developed by Oliver Kreylos at UC Davis.

VRUI's main mission statement is to shield the

developer from a particular configuration of a

VR system.

Page 23: By: Michael Smith

Background

VRUI tries to accomplish this mission by abstracting three main areas:

Display abstraction

Distribution abstraction

Input abstraction

Another feature of VRUI is its built-in menu systems.

Page 24: By: Michael Smith

Background

Page 25: By: Michael Smith

Background

FreeVR is developed and maintained by William Sherman.

It is an open-source virtual reality interface/integration library.

FreeVR was designed to work on a diverse range of input and output hardware.

FreeVR is currently designed to work on shared memory systems.

Page 26: By: Michael Smith

Background

In 1983, William T. Reeves wrote "Particle Systems – A Technique for Modeling a Class of Fuzzy Objects."

This paper introduces the particle system, a modeling method that models an object as a cloud of primitive particles that define its volume.

Page 27: By: Michael Smith

Background

Reeves categorizes particle systems as "fuzzy" objects: they do not have smooth, well-defined, shiny surfaces.

Instead their surfaces are irregular, complex, and ill-defined.

This particle system was used to create the

Genesis Effect, for the movie Star Trek II: The

Wrath of Khan.

Page 28: By: Michael Smith

Background

Page 29: By: Michael Smith

Background

Page 30: By: Michael Smith

Background

Reeves described, in his paper, a particle

system that had five steps.

Particle Generation

Particle Attributes Assignment

Particle Dynamics

Particle Extinction

Particle Rendering
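The five steps above can be sketched as a minimal CPU-side loop. This is a hypothetical Python illustration of Reeves' pipeline, not his actual code; the function name, the attribute set, and the simple Euler motion are assumptions of the sketch:

```python
import random

def step_particle_system(particles, dt, rate, lifetime, emit_pos=(0.0, 0.0, 0.0)):
    # 1. Particle Generation: how many particles to create this time interval
    for _ in range(int(rate * dt)):
        # 2. Particle Attributes Assignment: initial position, velocity, age
        particles.append({"pos": list(emit_pos),
                          "vel": [random.uniform(-1.0, 1.0) for _ in range(3)],
                          "age": 0.0})
    for p in particles:
        # 3. Particle Dynamics: update positions from velocities
        p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]
        p["age"] += dt
    # 4. Particle Extinction: remove particles past their lifetime
    particles[:] = [p for p in particles if p["age"] <= lifetime]
    # 5. Particle Rendering would happen here (omitted in this sketch)
    return particles
```

With a rate of 10 particles per second and a 2-second lifetime, the population grows for two steps and then reaches a steady state as extinction balances generation.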

Page 31: By: Michael Smith

Background

Particle Generation

First the number of particles to be generated per

time interval is calculated.

Then the particles are generated.

Page 32: By: Michael Smith

Background

Particle Attribute Assignment: whenever a particle is created, the particle system must determine values for the following attributes:

Initial position and velocity

Initial size, color, and transparency

Initial shape and lifetime

The initial position of the particles is determined by a generation shape.

Page 33: By: Michael Smith

Background

Page 34: By: Michael Smith

Background

Particle Dynamics: once all the particles have been created and assigned initial attributes, the positions and/or velocities are updated.

Particle Extinction: once a particle has lived past its predetermined lifetime, measured in frames, the particle dies.

Page 35: By: Michael Smith

Background

Particle Rendering: once the position and appearance of the particles were determined, the particles are rendered.

Two assumptions were made:

Particles do not intersect with other surface-based objects.

Particles were considered point light sources.

Page 36: By: Michael Smith

Background

In recent years, graphics vendors have

replaced areas of fixed functionality with areas

of programmability.

Two such areas are the Vertex and Fragment

Processors.

Page 37: By: Michael Smith

Background

The Vertex Processor is a programmable unit that operates on incoming vertex values.

Some duties of the vertex processor are:

Vertex transformation

Normal transformation and normalization

Texture coordinate generation and transformation.

Page 38: By: Michael Smith

Background

The Fragment Processor is a programmable unit that operates on incoming fragment values.

Some duties of the fragment processor are:

Operations on interpolated values.

Texture access.

Texture application.

Fog

Page 39: By: Michael Smith

Background

While a program, called a shader, is running on one of these processors, the fixed functionality is disabled.

Several programming languages were created to aid in the development of shaders; one such language is the OpenGL Shading Language (GLSL).

Page 40: By: Michael Smith

Background

Page 41: By: Michael Smith

Background

Vertex and Fragment shaders can't create vertices; they only work on data passed to them.

Geometry shaders can create any number of vertices.

This can allow shaders to create geometry without having to be told to by the CPU.

Page 42: By: Michael Smith

Background

Transform Feedback, allows a shader to specify

the output buffer.

The target output buffer can be the input buffer

of another shader.

Allows developers to create multi-pass shaders

that do not relay information back to the CPU

for the other passes.

Page 43: By: Michael Smith

Background

ParticleGS is a Geometry Shader based particle system that does the following:

Stores particle information in Vertex Buffer Objects.

Uses a Geometry shader to create particles, and

store them as vertex information in VBOs.

Uses Transform Feedback, to send particle data in

between shaders.

Uses a Geometry shader to create billboards and

point sprites to render particles.

Page 44: By: Michael Smith

Background

In the days before shaders, the GPU was used just for rendering.

But with the advent of shaders, GPUs can now be used to aid scientific computation.

One can 'trick' the GPU into thinking that it is working on rendering information.

Page 45: By: Michael Smith

Background

Uber Flow, is a system for real-time animation

and rendering of large particle sets using GPU

computation.

Million Particle System, a GPU-based particle

system that can render a large set of particles

Page 46: By: Michael Smith

Background

Both particle systems do the following:

Store particle information in textures.

Use a series of vertex and fragment shaders to update the particle information.

Use the CPU to create and send rendering information.

And use a series of vertex and fragment shaders to render the information from the CPU.

Page 47: By: Michael Smith

Background

Page 48: By: Michael Smith

Idea

Sandstorm

Dynamic

Multi-contextual

GPU-based

Particle System

That uses Vector Fields for Particle Propagation.

Page 49: By: Michael Smith

Idea

Dynamic: Sandstorm should have the ability to change certain attributes on the fly:

Rate of emission

Size of particles

Lifetime of particles

Page 50: By: Michael Smith

Idea

Multi-contextual: as previously stated, 3D VR environments use multiple contexts.

Thus Sandstorm must be designed to handle

these multiple contexts.

Random number generation

Between screen consistency

Page 51: By: Michael Smith

Idea

GPU-based: Sandstorm will be designed to leverage today's most powerful and advanced GPUs.

Use Geometry shaders to create, update, and

render particles.

Use Transform Feedback to direct data

between shaders.

Page 52: By: Michael Smith

Idea

Vector Fields: in order to 'guide' particles according to observed scientific data, vector fields will be used in Sandstorm.

But Sandstorm should not be a vector field simulator; it should only take vector fields as input.

Page 53: By: Michael Smith

Software Engineering

Page 54: By: Michael Smith

Software Engineering

Page 55: By: Michael Smith

Software Engineering

Page 56: By: Michael Smith

Software Engineering

Page 57: By: Michael Smith

Prototype

GPU-Based Particle System: like most particle systems, Sandstorm has three main phases:

Creation and Destruction

Update

Rendering

Page 58: By: Michael Smith

Prototype

Creating and Destroying Particles: traditional particle systems would create a particle and store it in a dynamically growing data structure.

GPU-based particle systems, however, store the particles in a texture.

Page 59: By: Michael Smith

Prototype

Textures need to describe a rectangular area that encompasses the entire area of the data.

For example, if we had 19 members, the texture would need to cover an area of 20.

Not so with VBOs:

VBOs can fit the exact amount of data that is to be used.

Page 60: By: Michael Smith

Prototype

Like the Million Particle System and Uber Flow, Sandstorm stores its particle information in a double-buffered approach.

Each frame, one of the buffers is used as the read buffer and the other as the write buffer.

Each buffer holds two VBOs: one for the particle positions, the other for the velocities.
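The ping-pong pattern can be illustrated with a hypothetical Python sketch (the class and its names are assumptions; in Sandstorm the buffers are OpenGL VBOs, not Python lists):

```python
class DoubleBuffer:
    """Ping-pong storage: each side holds positions and velocities.

    Each frame one side is read, the other written, then the roles swap.
    """

    def __init__(self, positions, velocities):
        self.buffers = [
            {"pos": list(positions), "vel": list(velocities)},
            {"pos": [None] * len(positions), "vel": [None] * len(velocities)},
        ]
        self.read_index = 0

    @property
    def read(self):
        # the buffer the update pass samples from this frame
        return self.buffers[self.read_index]

    @property
    def write(self):
        # the buffer the update pass emits into this frame
        return self.buffers[1 - self.read_index]

    def swap(self):
        # after a pass, last frame's write buffer becomes the read buffer
        self.read_index = 1 - self.read_index
```

The swap is just an index flip, so no particle data is ever copied between the two sides.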

Page 61: By: Michael Smith

Prototype

Geometry shaders can emit one or more

vertices.

At the beginning of the Creation/Destruction

phase, the read buffer is passed to the shader.

The shader then determines if it is dealing with an emitter.

Page 62: By: Michael Smith

Prototype

If the shader is dealing with an emitter, the following happens:

How many particles are to be generated is

determined.

The initial information for the particles is

determined.

Page 63: By: Michael Smith

Prototype

Determining the number of particles to be emitted:

Let a be how many particles have already been emitted this cycle.

Subtract a from the number of particles that are to be emitted per second, p.

Divide the result by the amount of time left in that cycle; each cycle is a second.
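The calculation above can be sketched in Python. This is a hypothetical CPU-side version of logic that in Sandstorm runs in a GLSL geometry shader; the function name, the frame-delta scaling, and the integer truncation are assumptions of the sketch:

```python
def particles_to_emit(already_emitted, per_second, time_left_in_cycle, frame_dt):
    # remaining particles owed for this one-second cycle
    remaining = per_second - already_emitted
    if time_left_in_cycle <= 0 or remaining <= 0:
        return 0
    # spread the remainder evenly over the time left in the cycle,
    # then take this frame's share of that rate
    rate = remaining / time_left_in_cycle
    return int(rate * frame_dt)
```

Emitting at `remaining / time_left` rather than a fixed `per_second` lets the emitter catch up if earlier frames under-emitted, so each cycle still totals p particles.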

Page 64: By: Michael Smith

Prototype

Determining the initial information of particles:

A random-information texture is used.

The texture is translated, scaled, and rotated by a random amount, then sent to the shader.

Emitters have random numbers in their velocity information, which are used to do texture lookups.

The texture is used to make sure the particles are consistent between contexts.

Page 65: By: Michael Smith

Prototype

If the Creation/Destruction shader is passed a particle, something different happens.

First, the particle is determined to be alive or dead.

If alive, the particle is re-emitted into the buffer.

If dead, a blank particle is emitted into the buffer.

Page 66: By: Michael Smith

Prototype

Updating Particles: once new particles are created and old ones destroyed, the particles are updated:

The delta time between frames is passed to the update shader.

A lookup in the vector field (a 3D texture) is done according to the particle's position.

The vector field velocity is added to the particle's velocity, and then the particle's position is updated.
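The update step above can be sketched per particle. This is a hypothetical Python version of the GLSL update shader; whether Sandstorm scales the field contribution by the frame delta is an assumption of the sketch:

```python
def update_particle(pos, vel, sample_field, dt):
    # sample the vector field (the 3D texture) at the particle's position
    field_vel = sample_field(pos)
    # add the field velocity to the particle's velocity
    # (scaled by dt here; an assumption of this sketch)
    vel = tuple(v + f * dt for v, f in zip(vel, field_vel))
    # advance the position by the updated velocity
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel
```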

Page 67: By: Michael Smith

Prototype

Once the particles have been updated, the

particles are then rendered.

The particle positions are passed to the render shader, using Transform Feedback.

Particles are rendered as either:

Textured, deferred-shaded billboards

Or points.

Page 68: By: Michael Smith

Prototype

The particle position represents the center of the particle.

So, four points have to be determined to create a billboard.

Information that is already known: the center of the particle and the vector pointing to the eye.

Page 69: By: Michael Smith

Prototype

Page 70: By: Michael Smith

Prototype

Once the vectors are found, they can be added to the particle's position to get the four points.
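The corner construction can be sketched as follows. This is a hypothetical Python version of what the billboard geometry shader would do; the world-up vector and the corner ordering are assumptions of the sketch:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def billboard_corners(center, to_eye, half_size, world_up=(0.0, 1.0, 0.0)):
    # build a camera-facing basis from the eye vector and a world up vector
    right = normalize(cross(world_up, to_eye))
    up = normalize(cross(to_eye, right))
    # offset the center along +/- right and +/- up to get the four corners
    return [tuple(c + (sr * r + su * u) * half_size
                  for c, r, u in zip(center, right, up))
            for sr, su in ((-1, -1), (1, -1), (1, 1), (-1, 1))]
```

Because `right` and `up` are both perpendicular to the eye vector, the resulting quad always faces the viewer.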

Page 71: By: Michael Smith

Prototype

Once the points are found, they can be emitted

to create the billboard.

Once the billboard has been emitted, a texture

is applied.

Once the result of the billboard shader is determined, a deferred shading method can be applied.

Page 72: By: Michael Smith

Prototype

Currently Sandstorm uses a deferred shading method to blend the particles together.

The first step is to accumulate the particles per pixel, so that the more particles that are behind a particular pixel, the denser it looks.

Once that is done, the result can be rendered to a full-screen quad with the result textured onto it.
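The per-pixel accumulation can be illustrated with a hypothetical Python sketch (on the GPU this would be additive blending into an off-screen buffer; the grid representation, the fixed per-particle step, and the clamp are assumptions of the sketch):

```python
def accumulate_particles(width, height, particle_pixels, step=0.25):
    # additive accumulation: every particle covering a pixel adds a fixed
    # step to that pixel's density, clamped to 1.0, so pixels with more
    # particles behind them look denser
    density = [[0.0] * width for _ in range(height)]
    for x, y in particle_pixels:
        density[y][x] = min(1.0, density[y][x] + step)
    return density
```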

Page 73: By: Michael Smith

Prototype

There is a catch though.

Using this method requires the user of

Sandstorm to:

Render the scene into an off-screen buffer, with a

depth buffer attached.

Give the deferred shader the depth buffer, so that it

can blend with the scene, obscuring any solid

object and also being obscured by solid objects.

Page 74: By: Michael Smith

Prototype

Vector Fields, in Sandstorm are represented as

3D textures.

A texture was used instead of a VBO, because

of existing internal methods for dealing with 3D

textures, such as wrapping, indexing, and

interpolating.

Page 75: By: Michael Smith

Prototype

When a lookup is done on the vector field to get information for a particle, the following happens:

The position of the particle is interpolated into texture coordinates

by dividing its components by the width, height, and depth of the vector field.
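The lookup can be sketched in Python. This is a hypothetical nearest-texel version; the real 3D texture hardware also wraps and interpolates between texels, and the dictionary representation of the field is an assumption of the sketch:

```python
def sample_field(field, pos, dims):
    # field: mapping from integer (x, y, z) texel coords to velocity vectors
    # normalize the position into [0, 1) texture space, wrapping
    # out-of-range positions GL_REPEAT-style
    u, v, w = ((p / d) % 1.0 for p, d in zip(pos, dims))
    texel = (int(u * dims[0]), int(v * dims[1]), int(w * dims[2]))
    return field[texel]
```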

Page 76: By: Michael Smith

Prototype

As previously stated, when dealing with a multi-contextual environment, one must be careful to make context-sensitive data and algorithms conform to the multi-contextual environment.

This is handled in Sandstorm by having a controlling class, which is stored on the main context of the VR environment, control multiple update/render classes.

Page 77: By: Michael Smith

Prototype

Page 78: By: Michael Smith

Dynamic

Sandstorm has the ability to change some of its attributes, both at run-time and compile-time.

Page 79: By: Michael Smith

Results

A sample application was created to test out

Sandstorm.

The Heli-Dust application was used as a test

bed.

A basic vector field was used in the sample application.

Page 80: By: Michael Smith

Results

Considering that Sandstorm is not a vector field simulator, a simple helicopter interaction model was made. As the helicopter throttle increased:

The rate of emission was increased.

The lifetime of the particles was increased.

And the maximum number of particles was increased.

Page 81: By: Michael Smith

Results

The sample application was run on the following system, which powered a four-sided CAVE environment:

A multi-core shared memory machine, with four quad-core chips

48 GB of RAM

An Nvidia Quadroplex

Running Ubuntu 7.10 Linux

Page 82: By: Michael Smith

Results

Rendered 300,000 deferred-shaded particles at:

15-20 FPS while standing in the particle system

~65 FPS while standing a good distance back from

the particle system.

Page 83: By: Michael Smith

Results

Show movies.

Page 84: By: Michael Smith

Conclusion and Future Work

Vector fields can be used to 'guide' particles.

Sandstorm can run in a multi-contextual

environment.

Sandstorm utilizes the latest in GPU off-loading

techniques

Sandstorm can render more than 100,000

particles at above 15 FPS.

Page 85: By: Michael Smith

Conclusion and Future Work

Optimizations:

Currently both emitters and particles reside in the

same buffers, separating them can limit branching

in shaders.

Currently buffer sizes are static, allowing them to

grow and shrink can increase speed of updating

and rendering.

Page 86: By: Michael Smith

Conclusion and Future Work

Other improvements

Collisions between the particles and objects in the

scene.

Soft Particles, Motion Blur, and Light Scattering could be used to give the particles more realism.

A shader-based physics model could be implemented to allow users to change the behavior of the particles.

Page 87: By: Michael Smith

Conclusion and Future Work

Other work

A vector field simulator could be created to feed Sandstorm dynamically changing vector fields, so that particle motion acts more naturally.

A vector field creator/editor could be created to help scientists visualize vector fields before they are used in Sandstorm.

Page 88: By: Michael Smith

Questions/Comments?