
Haptic Rendering
Aaron Hall

CS 525

Eh?

Haptics refers generally to a technological incorporation of human touch, covering areas from psychological test apparatus to computer I/O. Touch being a group of diverse senses, haptics can involve pressure, texture, heat, weight, pain, and so forth. Current research in the field often focuses on parallel visual/force-feedback modelling of a geometric environment.

History

Often cited as the earliest haptic technology is the “stick-shaker”. An airplane with direct controls (no hydraulics) provides tactile feedback in the form of stick forces. With the development of hydraulic controls, the loss of feedback became most problematic as the plane neared stall conditions. Pilots, expecting mushiness and buffeting at the controls, felt no warning and tended to overextend the plane. Planes thus began to be equipped with imbalanced rotors to shake the stick as a stall warning. In this same group are the “rumble packs” for game controllers popular some years later. These technologies are somewhat unusual as pure-output haptic devices.

Conversely, haptics are common on the input side of computer hardware. Mouse movement, keyboard spring rate and travel, TrackPoints, touchpads: all rely to some extent on the user's touch sense.

However, the most complete use of haptics in computer interfaces, and the area of most current attention, is feedback coupling both input and output. The earliest intentionally haptic I/O device is thought to be the mechanical telemanipulators developed during the Manhattan Project to work with highly active radioisotopes separated from the operator by a meter-thick pane of quartz.


Human Perception of Touch

A considerable amount of haptic research lies in measuring and modeling human touch perception itself. With the complexity and variety of touch senses, and their relative weakness compared with vision and hearing, there are many unknowns here. However, measurements ranging from simple displacement/volume changes (pictured here) to detailed videomicroscopy of finger/object contact areas and MRI scans, in addition to psychological experiments on reported perceptions, have elucidated some patterns.

In exploring an object by touch, humans tend to grasp it with a standard force-vs-time curve, then feel the feedback force-vs-space to sense its overall properties. Variations in static pressure at contact areas, especially in the fingerpads, provide coarse texture information. And so-called microtexture, with features as small as 50 nm, can be sensed by feeling vibration in fingerprint ridges dragged along the surface.


Often, for machine I/O, direct hand/scene interactions are too complicated to render in realtime. An intermediary, a rigid probe, is used both as a hardware device and as a simplified object for scene modeling. Humans perceive the roughness of a texture, through a probe, as the frequency and amplitude of vibrations as it's dragged along the surface. Probe size, normal force, and dragging speed all affect this sense in known ways, which can be used to model the texture more simply through a probelike device.

Point/object interactions, or I/O devices without torque representation, are described as three-degree-of-freedom (3-DoF). Generalized object/object interactions, with torque, are known as 6-DoF.

Basics of Haptic Rendering

Most current 3-DoF haptic rendering techniques use some form of proxy contact model. A haptic endpoint, representing the probe in object space, is connected by some force network (here an idealized spring) to a proxy object that interacts with the object model. The proxy is most often a point or sphere, while the object space is typically modelled as for visual or audio rendering: as some form of polygonal mesh or, less frequently, NURBS surfaces.
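The spring-based proxy model above can be sketched in a few lines. This is a minimal illustration, assuming a point proxy constrained by a single planar surface and an idealized Hooke's-law spring; the function names and stiffness value are invented for the example, not taken from any haptics toolkit.

```python
STIFFNESS = 0.8  # assumed spring constant (force units per unit separation)

def project_onto_halfspace(point, plane_point, plane_normal):
    """Constrain the proxy: if the endpoint penetrates the surface plane,
    the proxy is the closest point on that plane; otherwise it follows."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    if d >= 0:  # endpoint still outside the surface
        return point
    return tuple(p - d * n for p, n in zip(point, plane_normal))

def spring_force(endpoint, proxy, k=STIFFNESS):
    """Force displayed on the device: spring pulling endpoint toward proxy."""
    return tuple(k * (pr - e) for pr, e in zip(proxy, endpoint))

# Endpoint pushed 2 units below a floor plane z = 0:
endpoint = (1.0, 1.0, -2.0)
proxy = project_onto_halfspace(endpoint, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
force = spring_force(endpoint, proxy)  # pushes back up, out of the surface
```

A real rendering loop would rerun this at roughly 1 kHz, with the proxy tracking the object mesh rather than one fixed plane.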


Texture Rendering

In visual rendering, textures are often used, combining detailed 2-D information with a lower-resolution 3-D model to achieve image quality with substantially better efficiency than a pure 3-D model. Haptic textures can be used similarly; in fact, since human tactile perception is less acute than vision, yet the higher update frequencies required make haptic rendering more computationally demanding, there is potentially greater benefit from them. Simple haptic textures, useful for sliding-friction texture, map 2-D height fields to model surfaces, applying gradient interpolation to add texture detail to the object collision forces from the model itself. A more realistic approach, depicted in the “force shading” diagram, uses normal maps for textures. By this technique highly polygonal models can be made to feel smoothly varying.
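The height-field scheme can be sketched as: sample the 2-D height field under the proxy, estimate its gradient, and tilt the contact force against the slope. The bilinear sampler, the central-difference gradient, and all names here are illustrative assumptions, not a specific published implementation.

```python
def height(field, x, y):
    """Bilinearly interpolate a height field stored as a list of rows."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    h00, h10 = field[y0][x0], field[y0][x0 + 1]
    h01, h11 = field[y0 + 1][x0], field[y0 + 1][x0 + 1]
    return (h00 * (1 - fx) * (1 - fy) + h10 * fx * (1 - fy)
            + h01 * (1 - fx) * fy + h11 * fx * fy)

def texture_force(field, x, y, normal_force, eps=1e-3):
    """Perturb the normal force by the local height gradient (central diff)."""
    gx = (height(field, x + eps, y) - height(field, x - eps, y)) / (2 * eps)
    gy = (height(field, x, y + eps) - height(field, x, y - eps)) / (2 * eps)
    return (-normal_force * gx, -normal_force * gy, normal_force)

# A ramp rising one unit per column resists sliding in +x:
f = texture_force([[0, 1, 2], [0, 1, 2], [0, 1, 2]], 1.0, 1.0, 2.0)
```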

A perceptually-inspired force model computes the interaction between the proxy object and the texture surface. This provides a better representation of the effect of proxy geometry on the qualitative perceptions of torques and the effects of probe force and speed. However, the more complicated calculations limit update frequencies to 100-200 Hz in complicated situations like the file scrubbing shown here.


Proxy Graphs in MarsView

MarsView is a notable application combining haptic and graphic interfaces for exploring large (indeed, planet-sized) topographic maps. Rendering height maps obtained from NASA's Mars Orbiter Laser Altimeter or the USGS Digital Elevation Models (of Earth), containing on the order of 100 million triangles, challenges the necessary ~1 kHz update frequency using standard haptic algorithms.

In typical 3-DoF proxy-contact rendering, each triangle encountered during one update cycle adds a constraint surface. Tension minimization within those constraints produces the final proxy contact, spring tension, and thus the force suitable for “display” on the haptic device. But over a high-resolution height map the proxy may encounter dozens of triangles per cycle. Retaining all constraints leads to a perceptual “viscosity” on the surface, while removing constraints as the proxy passes triangles is computationally infeasible at 1 kHz for these triangle densities. To achieve sufficient rendering efficiency, MarsView uses the proxy graph algorithm: as the proxy traverses the height map, its position is restricted to triangle edges. Gradient descent along these edges yields a tension minimum sufficiently near the surface-based minimum to reproduce haptic forces indistinguishable to the user, while computable in realtime.
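The edge-restricted descent at the heart of the proxy graph idea can be caricatured on a grid: the proxy occupies height-map vertices and slides along 4-connected grid edges toward the spring-tension minimum. This toy version (vertex-to-vertex moves, squared-distance tension) is a hypothetical simplification, not the published algorithm.

```python
def descend(heightmap, start, endpoint):
    """Walk grid edges downhill in squared spring tension |proxy - endpoint|^2."""
    rows, cols = len(heightmap), len(heightmap[0])
    ex, ey, ez = endpoint  # x ~ column, y ~ row, z ~ height

    def tension(i, j):
        return (j - ex) ** 2 + (i - ey) ** 2 + (heightmap[i][j] - ez) ** 2

    i, j = start
    while True:
        best = (tension(i, j), i, j)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # grid-edge moves
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                best = min(best, (tension(ni, nj), ni, nj))
        if (best[1], best[2]) == (i, j):  # local tension minimum: done
            return i, j
        i, j = best[1], best[2]

# On a flat 5x5 map the proxy settles under the endpoint at (row 2, col 3):
rest = descend([[0.0] * 5 for _ in range(5)], (0, 0), (3.0, 2.0, -1.0))
```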

This approximation relies on heightfield-specific assumptions: the triangular mesh is seamless and nonintersecting; the proxy moves through free space until contacting the mesh surface; and the heightfield is a regular grid (allowing triangles not along the proxy's surface “shadow” to be culled from intersection testing).


Voxel Sampling

When modeling the manipulation of tools and parts through an assembly, 6-DoF haptics are necessary to handle the interactions of a complex object (instead of a point or spherical proxy). One approach to efficiently model these object-object forces, being explored by researchers at The Boeing Company, represents the static scene with voxels and the proxy object with a shell of points. Although force discontinuities have been problematic for 3-DoF voxel models, stochastic mixing blurs them in the 6-DoF case.

The static model tiles space with voxels taking one of four values: free space, object interior, surface, and proximity. These are grouped into an “octree” limited to three levels for rendering speed, and expanded to 2^(3N) fanout (N = 1 being a true octree; N = 3 found to be typically most efficient) to cover sufficient space. The dynamic model first samples the dynamic object into a voxel representation, then uses the surface voxels' center points and local normals as a point shell for the proxy.
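A toy version of the voxel classification and point-shell sampling might look like this. The four labels follow the text; the 6-neighbor classifier and the inside() predicate standing in for real geometry are assumed simplifications of the actual Boeing pipeline.

```python
FREE, PROXIMITY, SURFACE, INTERIOR = 0, 1, 2, 3

def classify(inside, n):
    """Label an n^3 voxel grid from an inside(i, j, k) predicate."""
    def neighbors(i, j, k):
        for d in (-1, 1):
            yield i + d, j, k
            yield i, j + d, k
            yield i, j, k + d

    grid = {}
    for i in range(n):
        for j in range(n):
            for k in range(n):
                if inside(i, j, k):
                    boundary = any(not inside(*nb) for nb in neighbors(i, j, k))
                    grid[i, j, k] = SURFACE if boundary else INTERIOR
                else:
                    grid[i, j, k] = FREE
    # Mark free voxels touching the surface as PROXIMITY (braking region).
    for (i, j, k), v in list(grid.items()):
        if v == FREE and any(grid.get(nb) == SURFACE for nb in neighbors(i, j, k)):
            grid[i, j, k] = PROXIMITY
    return grid

def point_shell(grid):
    """Surface voxel centers double as the dynamic object's point shell."""
    return [(i + 0.5, j + 0.5, k + 0.5)
            for (i, j, k), v in grid.items() if v == SURFACE]

# A 3x3x3 solid cube inside a 5^3 grid: one interior voxel, 26 shell points.
g = classify(lambda i, j, k: 1 <= i <= 3 and 1 <= j <= 3 and 1 <= k <= 3, 5)
```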


Forces computed at the point-shell contacts are summed around the object's center point, thus producing both net force and net torque vectors. That center is coupled to a haptic handle, shadowing the handle on the haptic I/O device, with a damped rotatable spring. To avoid rapid movements tunneling through the static surfaces to object interiors, or completely through fine objects, a pre-contact braking force is computed when a point approaches a surface, using a field of proximity voxels surrounding the surface.
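The force/torque summation and the damped spring coupling can be sketched as below; the stiffness and damping constants, and the signatures, are placeholders rather than the paper's values.

```python
def net_force_torque(center, contacts):
    """contacts: list of (contact_point, force) pairs from the voxel tests."""
    fx = fy = fz = tx = ty = tz = 0.0
    for (px, py, pz), (cfx, cfy, cfz) in contacts:
        rx, ry, rz = px - center[0], py - center[1], pz - center[2]
        fx += cfx; fy += cfy; fz += cfz
        tx += ry * cfz - rz * cfy   # torque += r x f
        ty += rz * cfx - rx * cfz
        tz += rx * cfy - ry * cfx
    return (fx, fy, fz), (tx, ty, tz)

def coupling_force(handle, center, handle_vel, center_vel, k=1.0, b=0.2):
    """Damped spring tying the object center to the haptic handle."""
    return tuple(k * (h - c) + b * (hv - cv)
                 for h, c, hv, cv in zip(handle, center, handle_vel, center_vel))

# Equal and opposite contact forces at offset points: pure torque, no net force.
contacts = [((1.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
            ((-1.0, 0.0, 0.0), (0.0, -1.0, 0.0))]
f, t = net_force_torque((0.0, 0.0, 0.0), contacts)
```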

Although generally successful in tests, the dynamic object's size is limited by computational speed, and the static model's by memory. Parallelizing the computations to improve voxel resolution and/or world size is an approach being examined. Additionally, more voxel types are being investigated to support extended force fields around surfaces that could model compliant material types.


Multiresolution Hierarchy

Retaining the more common polygonal mesh models while feasibly computing 6-DoF interactions, Otaduy and Lin investigate multiresolution hierarchies that attempt to simplify object meshes in a way that preserves haptic features. A given mesh is repeatedly simplified, yielding a Bounding Volume Test Tree (BVTT) containing a hierarchy of meshes at different Levels of Detail (LoD), computed offline. Once complete, the online haptic rendering loop performs object collision queries on the BVTT, starting at the lowest LoD and using perceptual quality criteria to selectively refine nodes to higher LoD as necessary, or until the time constraint of the rendering frequency intervenes.
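The online query loop, coarse-to-fine under a time budget, can be schematized as follows. The Node layout, the scalar error field, and the visit-count budget are stand-ins for Otaduy and Lin's actual data structures and perceptual criteria, not their implementation.

```python
from collections import deque

class Node:
    def __init__(self, error, children=()):
        self.error = error            # perceptual error if we stop here
        self.children = list(children)

def query(root, threshold, budget):
    """Select LoD nodes for contact handling, refining within the budget."""
    selected, frontier, visited = [], deque([root]), 0
    while frontier:
        node = frontier.popleft()
        visited += 1
        if node.children and node.error > threshold and visited < budget:
            frontier.extend(node.children)   # refine: descend one LoD
        else:
            selected.append(node)            # coarse enough, or out of time
    return selected

# A coarse root with two acceptable children:
root = Node(1.0, [Node(0.05), Node(0.05)])
```

With a generous budget the query descends to the two fine nodes; with a budget of one visit it must stop at the coarse root, mimicking the rendering-frequency constraint intervening.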

Building the BVTT to best preserve haptic qualities is achieved with filtered edge collapse. The mesh is first decomposed into the convex surface patches used in the force model; then mesh complexity is reduced by collapsing edges into vertices and reducing the connectivity of the thus-merged vertices. Edges to collapse are selected considering both local and global convexity constraints, such that lower-LoD nodes remain convex while faithfully representing the original mesh. Once the BVTT is constructed, each node is labeled with its resolution, computed from three features: the deviation between each convex patch and its corresponding mesh, the support area of each vertex (roughly, the relevant area of adjacent triangles), and the Hausdorff distance between each patch and its descendant patches.

During online rendering, contact distance between convex patches is computed for penalty-based force modeling: the force at each contact point is aligned with the contact normal and proportional to interpenetration depth. For each contact, the resolution features described above are used to determine the perceptibility of missing detail, in turn used to select BVTT nodes for refinement. The jaw image demonstrates different LoDs selected for the same perceptual resolution.

Implemented on a modern desktop computer with a 6-DoF haptic Phantom, applied to test scenarios such as the jaw model shown, this approach gives a two-order-of-magnitude speed improvement (over full-resolution convex patch modeling) with perceptually identical force profiles.
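The per-contact penalty force described above is simple enough to state directly; the stiffness value here is an arbitrary stand-in.

```python
def penalty_force(normal, depth, k=2.0):
    """Penalty contact force: along the unit contact normal, proportional
    to interpenetration depth; zero when the patches are separated."""
    if depth <= 0.0:
        return (0.0, 0.0, 0.0)
    return tuple(k * depth * n for n in normal)
```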


Meshless Modelling

Modeling deformable objects, such as human tissue, challenges standard approaches such as voxel or mesh models, which can be reconfigured to track object deformation only with considerable difficulty. An alternative approach represents objects with a volumetric cloud of interior points. Haptic forces are computed using a spherical shell field around each point, separately interacting with the proxy object to produce a net force (and possibly torque as well). Model deformation (and the associated point-point forces) are computed using standard differential equations of elasticity, restricted to point interactions for tractability. Visual rendering relies on splatting, in which each point is associated with a surrounding density function; that function is warped and filtered into a screen-space representation (the resampling kernel), which is then point-sampled into an accumulation buffer of pixels tiling the screen space.
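The haptic side of this scheme can be sketched as follows, assuming a point proxy and a linear repulsive kernel inside each cloud point's spherical shell; the kernel shape and constants are assumptions, not from the source.

```python
import math

def meshless_force(points, proxy, radius, k=1.0):
    """Sum per-point shell forces on a point proxy: each cloud point
    repels the proxy while it is inside that point's spherical shell."""
    fx = fy = fz = 0.0
    for px, py, pz in points:
        dx, dy, dz = proxy[0] - px, proxy[1] - py, proxy[2] - pz
        d = math.sqrt(dx * dx + dy * dy + dz * dz)
        if 0.0 < d < radius:
            s = k * (radius - d) / d  # outward push, growing with overlap
            fx += s * dx; fy += s * dy; fz += s * dz
    return (fx, fy, fz)

# One cloud point at the origin, proxy halfway inside its unit shell:
f = meshless_force([(0.0, 0.0, 0.0)], (0.5, 0.0, 0.0), 1.0)
```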


Bibliography

The Apple Motion Sensor As A Human Interface Device
Amit Singh
http://www.kernelthread.com/software/ams2hid

Haptic Chameleon: A New Concept of Shape-Changing User Interface Controls with Force Feedback
G. Michelitsch, J. Williams, M. Osen, B. Jiminez, S. Rapp
2004, April 24-29, ACM

Haptic Interaction With Fluid Media
William Baxter, Ming C. Lin
2004, May 17-19, Proceedings of Graphics Interface

Laboratory for Human and Machine Haptics
http://touchlab.mit.edu

Large Haptic Topographic Maps: MarsView and the Proxy Graph Algorithm
Sean P. Walker, J. Kenneth Salisbury
2003, ACM

A Perceptually-Inspired Force Model for Haptic Texture Rendering
Miguel A. Otaduy, Ming C. Lin
2004, ACM

SensAble Technologies
http://www.sensable.com/

Sensation Preserving Simplification for Haptic Rendering
Miguel A. Otaduy, Ming C. Lin
2003, ACM

Six Degree-of-Freedom Haptic Rendering Using Voxel Sampling
William A. McNeely, Kevin D. Puterbaugh, James J. Troy
1999, ACM SIGGRAPH 99

TorqueBAR: An Ungrounded Haptic Feedback Device
Colin Swindells, Alex Unden, Tao Sang
2003, November 5-7, ICMI