
Enhancing Interactive Non-Planar Projections of 3D Geovirtual Environments with Stereoscopic Imaging

Matthias Trapp, Haik Lorenz, Markus Jobst, Jürgen Döllner
Hasso-Plattner-Institute at the University of Potsdam

True-3D in Cartography
1st International Conference on 3D Maps
August 24 - 28, 2009, Dresden, Germany

motivation

geo-media technology:
- provides interactivity and immersion
- facilitates the communication of 3D geospatial data

applications to cartography:
- increase immersion into 3D geovirtual environments
- support for depth cues

planar stereoscopy:
- well understood; rendering is straightforward
- supported by graphics hardware / driver

non-planar stereoscopy:
- provides high field-of-view and image resolution

problem: non-planar projection surfaces

rendering of digital 3D city and landscape models:
- high amount of geometry and texture data
- real-time constraints (> 20 frames per second)

current generation of graphics hardware (GPU):
- no native support for non-planar projection surfaces
- requires specific rendering techniques, classified into image-based, geometry-based, and ray-based approaches

hardware-accelerated stereoscopic imaging:
- available stereo hardware modifies the vertex pipeline stage
- cannot be used for rendering non-planar stereoscopy

framework - conceptual overview

[Figure: conceptual overview of the stereo rendering framework. The scene (geometry + textures) is rendered by 1..N virtual cameras using a planar projection, an image-based non-planar projection, or a geometry-based non-planar projection. A stereo rendering component performs stereo mate generation, producing 1..N stereo mates for active, passive, or chroma-depth stereo on an output device (screen, projector, printer).]

review: image-based approach (IBA)

basic concept: dynamic cube map + screen-aligned quad, image warping based on normal vectors

3-phase rendering process (see the sketch below):
1. create/update the dynamic cube map
2. set up the projection shader
3. render a screen-aligned quad

[Figure: viewport parameterization. A viewport position F_st = (s, t) is mapped to a lookup direction, with the horizontal and vertical fields-of-view α and β and an angle φ annotated.]
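A minimal sketch of the warping step, assuming a simple panoramic parameterization: the hypothetical lookupDirection maps a normalized viewport position (s, t) to the view-space direction used to sample the dynamic cube map. In the approach above the direction is derived from normal vectors of the projection surface; here it is computed analytically for brevity.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Map a normalized viewport position (s, t) in [0,1]^2 to a view-space
// lookup direction for the dynamic cube map, assuming a panoramic
// projection with horizontal FOV alpha and vertical FOV beta (radians).
// A fragment shader would sample the cube map with this direction.
Vec3 lookupDirection(float s, float t, float alpha, float beta)
{
    // Horizontal angle in [-alpha/2, alpha/2], vertical angle in [-beta/2, beta/2].
    float phi   = (s - 0.5f) * alpha;
    float theta = (t - 0.5f) * beta;

    // Unit direction: rotate around the up axis by phi, tilt by theta
    // (forward is -z by convention).
    Vec3 d;
    d.x =  std::sin(phi) * std::cos(theta);
    d.y =  std::sin(theta);
    d.z = -std::cos(phi) * std::cos(theta);
    return d;
}

int main()
{
    // Example: the center of the viewport looks straight ahead.
    Vec3 d = lookupDirection(0.5f, 0.5f, 3.14159f, 1.5708f); // 180 x 90 degrees
    std::printf("center direction: (%.2f, %.2f, %.2f)\n", d.x, d.y, d.z);
    return 0;
}
```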

adapting IBA for stereoscopy

basic idea for image-based non-planar projections:
- create a cube map for each virtual camera
- derive the non-planar projection for each cube map

exemplary workflow for two stereo mates (a skeleton of the per-eye loop follows below):
[Figure: the polygonal scene is rendered via layered rendering into texture layers (one cube map per eye); the projection function δP plus layer sampling then produces the left and right non-planar projection stereo pairs.]
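A skeleton of that per-eye workflow under the assumptions above; renderCubeMap and projectNonPlanar are hypothetical stand-ins for the actual render passes, not the authors' implementation.

```cpp
#include <cstdio>

// Hypothetical handles standing in for GPU resources.
struct CubeMap    { int id; };
struct StereoMate { int id; };

// Phase 1: render the scene once per virtual camera into its own cube map
// (the slides fill all faces in one pass via layered rendering).
CubeMap renderCubeMap(int eye)
{
    std::printf("render scene into cube map for eye %d\n", eye);
    return CubeMap{eye};
}

// Phases 2+3: apply the non-planar projection function and layer sampling
// to turn one cube map into one warped 2D stereo mate.
StereoMate projectNonPlanar(const CubeMap& cm)
{
    std::printf("warp cube map %d through the projection shader\n", cm.id);
    return StereoMate{cm.id};
}

int main()
{
    // Two virtual cameras (left/right eye) yield two stereo mates.
    for (int eye = 0; eye < 2; ++eye) {
        CubeMap cm      = renderCubeMap(eye);
        StereoMate mate = projectNonPlanar(cm);
        std::printf("stereo mate %d ready for compositing\n", mate.id);
    }
    return 0;
}
```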

review: geometry-based approach (GBA)

- projection computed on a per-vertex basis
- to ensure sufficient on-screen vertex density, dynamic mesh refinement is required

[Figure: the non-planar projection surface and its coarse approximation, each seen from the center of projection (COP).]

three-step rendering process (a CPU-side sketch of step 2 follows below):
1. replication determination: calculate and write one replication count per input scene triangle
2. primitive replication: create a primitive index (e.g. 0,0,1,2,2,2,…) containing the respective number of replications per input triangle
3. indexed rendering: render each indexed primitive into its projection piece, using the projection & clip matrices, into the framebuffer
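A CPU-side sketch of step 2 that reproduces the slide's primitive index example from per-triangle replication counts; buildPrimitiveIndex is an illustrative helper, not the GPU implementation.

```cpp
#include <cstdio>
#include <vector>

// Step 2 of the GBA pipeline as a CPU-side illustration: expand the
// per-triangle replication counts (written in step 1) into the primitive
// index consumed by the indexed rendering of step 3.
std::vector<int> buildPrimitiveIndex(const std::vector<int>& replicationCounts)
{
    std::vector<int> index;
    for (int tri = 0; tri < static_cast<int>(replicationCounts.size()); ++tri)
        for (int r = 0; r < replicationCounts[tri]; ++r)
            index.push_back(tri);  // triangle tri is rendered into this many projection pieces
    return index;
}

int main()
{
    // Counts {2, 1, 3} reproduce the slide's example index 0,0,1,2,2,2.
    std::vector<int> counts = {2, 1, 3};
    for (int i : buildPrimitiveIndex(counts))
        std::printf("%d ", i);
    std::printf("\n");
    return 0;
}
```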

adapting GBA for stereoscopy

straightforward approach:
- set up the piece-wise projection for each virtual camera
- render into different color buffers
- additional post-processing step: layer compositing

example for stereo image pairs (a sketch of the stereo primitive index follows below):
1. replication determination: calculate and write one replication count per input scene triangle
2. primitive replication: create a primitive index (e.g. 0,0,1,2,2,2,…) containing the respective number of replications per input triangle
3. indexed rendering: render each indexed primitive into two projection pieces (for the left and right eye), using the projection & clip matrices, into the left-eye and right-eye framebuffers
4. stereo compositing: join both framebuffers into a single output image
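Extending the previous sketch to stereo: each projection piece is replicated once per eye, so the index carries the eye alongside the triangle. The Replica struct and buildStereoIndex are hypothetical; on the GPU this is realized through indexed rendering into two framebuffers.

```cpp
#include <cstdio>
#include <vector>

// Stereo variant of the primitive index (step 2): every projection piece is
// rendered once per eye, so each replica records which eye it belongs to.
struct Replica { int triangle; int eye; };  // eye 0 = left, 1 = right

std::vector<Replica> buildStereoIndex(const std::vector<int>& piecesPerTriangle)
{
    std::vector<Replica> index;
    for (int tri = 0; tri < static_cast<int>(piecesPerTriangle.size()); ++tri)
        for (int piece = 0; piece < piecesPerTriangle[tri]; ++piece)
            for (int eye = 0; eye < 2; ++eye)
                index.push_back({tri, eye});
    return index;
}

int main()
{
    std::vector<int> pieces = {2, 1};  // projection pieces per triangle
    for (const Replica& r : buildStereoIndex(pieces))
        std::printf("triangle %d -> eye %d\n", r.triangle, r.eye);
    return 0;
}
```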

rendering active & passive stereo

active stereo:
- uses quad-buffering, usually encapsulated by the graphics driver

passive stereo:
- anaglyph: color-buffer compositing (see the sketch below)
- polarized: render to framebuffer
- chroma-depth stereo: apply directly during rendering
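A minimal CPU sketch of anaglyph color-buffer compositing, assuming a simple red/cyan scheme (red channel from the left mate, green and blue from the right); in practice this runs as a post-processing shader over the two framebuffers.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

struct RGB { std::uint8_t r, g, b; };

// Step 4 (stereo compositing) for the anaglyph case: combine the left-eye
// and right-eye color buffers into one red/cyan output image.
std::vector<RGB> compositeAnaglyph(const std::vector<RGB>& left,
                                   const std::vector<RGB>& right)
{
    std::vector<RGB> out(left.size());
    for (std::size_t i = 0; i < left.size(); ++i)
        out[i] = { left[i].r, right[i].g, right[i].b };
    return out;
}

int main()
{
    std::vector<RGB> left  = {{200, 10, 10}};
    std::vector<RGB> right = {{10, 180, 220}};
    RGB px = compositeAnaglyph(left, right)[0];
    std::printf("anaglyph pixel: %d %d %d\n", px.r, px.g, px.b);
    return 0;
}
```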

rendering passive anaglyph - results


rendering chroma-stereoscopy

- no need for generating a stereo image pair
- color is computed as a function of depth

depth-to-color mapping as a function chain between the near and far plane, sampling a color ramp from c_s to c_e (a minimal sketch follows below):

    v_dt    = f_dt(V)
    v_clamp = f_clamp(v_dt)
    v_rm    = f_rm(v_clamp)
    C_out   = f_sample(v_rm)
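A sketch of that depth-to-color chain; the concrete ramp (red near, green middle, blue far) is an assumption in the spirit of ChromaDepth, since the slide does not spell out f_rm and f_sample.

```cpp
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

// Chroma-depth mapping following the chain on the slide: the depth value
// (f_dt) arrives as the parameter, is clamped to [near, far] (f_clamp),
// remapped to [0, 1] (f_rm), and used to sample a color ramp (f_sample).
Color chromaDepth(float depth, float nearPlane, float farPlane)
{
    // f_clamp: restrict the depth value to the [near, far] range.
    float clamped = std::min(std::max(depth, nearPlane), farPlane);
    // f_rm: remap to [0, 1].
    float t = (clamped - nearPlane) / (farPlane - nearPlane);
    // f_sample: piecewise-linear ramp red (near) -> green -> blue (far).
    if (t < 0.5f) {
        float k = t * 2.0f;
        return {1.0f - k, k, 0.0f};
    }
    float k = (t - 0.5f) * 2.0f;
    return {0.0f, 1.0f - k, k};
}

int main()
{
    const float depths[] = {0.0f, 5.0f, 10.0f};
    for (float d : depths) {
        Color c = chromaDepth(d, 0.0f, 10.0f);
        std::printf("depth %.1f -> color (%.2f, %.2f, %.2f)\n", d, c.r, c.g, c.b);
    }
    return 0;
}
```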

rendering chroma-stereoscopy - results

applying chroma-stereoscopy

- GBA: straightforward application to the fragment's depth
- IBA: needs depth correction

rendering chroma-stereoscopy - results

chroma-stereoscopy issues

common problems for IBA and GBA:
- the distribution of color can decrease the stereo effect
- perception: facade information (texture) is altered
- interaction: the focal plane must be adapted

[Figure: color distribution for a near focal plane, a far focal plane, and an equally distributed mapping.]

binary comparison: GBA vs. IBA

[Table: GBA vs. IBA rated by stereo functionality, image quality, rendering performance, memory footprint, implementation complexity, and overall rating.]

conclusions & future work

conclusions:
- interactive stereoscopic rendering for non-planar projections increases immersion and thus supports psychological depth cues
- performance is limited by the geometric complexity of the scene
- GBA outperforms IBA, but IBA is much easier to implement and use
- open problem: omni-directional stereo without image artifacts

future work:
- auto-stereoscopy for non-planar projection surfaces
- eye tracking to adjust the user's focal plane

Thank you for your attention! Questions?

Contact:
Matthias Trapp, [email protected]
Haik Lorenz, [email protected]
Markus Jobst, [email protected]
Jürgen Döllner, [email protected]

Workgroup 3D Geoinformation: www.3dgi.de/
Computer Graphics System Group: www.hpi.uni-potsdam.de/doellner/