Int. J. Human-Computer Studies 69 (2011) 839–853
www.elsevier.com/locate/ijhcs
Investigating the affective quality of interactivity by motion feedback in mobile touchscreen user interfaces
Doyun Park, Ji-Hyun Lee, Sangtae Kim
Graduate School of Culture Technology, KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701, Republic of Korea
Received 20 November 2010; received in revised form 27 June 2011; accepted 29 June 2011
Communicated by S. Bødker
Available online 14 July 2011
Abstract
Emotion is a key aspect of user experience. To design a user interface for positive emotional experience, the affective quality of the
user interface needs to be carefully considered. A major factor of affective quality in today’s user interface for digital media is
interactivity, in which motion feedback plays a significant role as an element. This role of motion feedback is particularly evident in
touchscreen user interfaces that have been adopted rapidly in mobile devices. This paper presents two empirical studies performed to
increase our understanding of motion feedback in terms of affective quality in mobile touchscreen user interfaces. In the first study, the
relationships between three general motion properties and a selected set of affective qualities are examined. The results of this study
provide a guideline for the design of motion feedback in existing mobile touchscreen user interfaces. The second study explores a new
dimension of interactivity: the Weight factor of Laban's Effort system. To experiment with the Weight factor in a mobile touchscreen
user interface, a pressure sensitive prototype was developed to recognize the amount of force applied by the user’s finger action. With
this prototype, the effects of implementing pressure requirements on four different types of user interfaces were examined. Results show
that implementing the Weight factor can significantly influence the affective quality and complement the physical feel of a user interface.
The issues to consider for effective implementation are also discussed.
© 2011 Elsevier Ltd. All rights reserved.
Keywords: User experience; Affective quality; Interactivity; Motion feedback; Mobile touchscreen user interface
1. Introduction
User experience (UX) is defined as "a person's perceptions and responses that result from the use or anticipated use of a product, system or service" (ISO, 2009). In this domain of perception and response, emotion is a primary issue, as we can see from many UX frameworks (Vermeeren et al., 2008; Desmet and Hekkert, 2007). A way for a designer to approach emotion is by handling the affective quality of the artifact. Affective quality refers to the feel and impression of an artifact and is commonly described with adjectives such as simple, light, or elegant. Russell (2003) defines affective quality as the features of an artifact that influence a person's emotion. A model of product emotions by Desmet et al. (2001) shows that these
1071-5819/$ - see front matter © 2011 Elsevier Ltd. All rights reserved.
doi:10.1016/j.ijhcs.2011.06.006
Corresponding author. Tel.: +82 42 350 2919; fax: +82 42 350 2910.
E-mail address: [email protected] (J.-H. Lee).
features of a product evoke emotional responses through appraisal against the user's goals, standards, and attitude. Positive emotions result when the features are perceived to support these concerns, but negative emotions result when the features are perceived to harm them. Therefore, to achieve positive emotional experiences through an artifact, the affective quality needs to be designed with respect to the user group and usage context.

In user interfaces, interactivity is an important factor of affective quality as it relates to the feel dimension of interactive media (Svanæs, 2000). Also, a study by Lim et al. (2008) shows that interactivity has a strong influence on the user's emotional experience with an interactive product. Thus, an understanding of interactivity design is an essential part of the design for emotion. In today's user interfaces, a commonly used element of interactivity is motion feedback. Until the usability paradigm, the use of motion was generally reserved for serving instrumental
Fig. 1. Different types of information that a product can offer to guide the user's action towards the intended function (Wensveen, 2005).
purposes such as providing functional feedback or requesting attention (Stone et al., 2005). Nevertheless, with the development of post-WIMP generation user interfaces powered by high data processing speed and elaborate sensory hardware, motion feedback is playing a significant role in interactivity. This role extends to the influence on affective quality, as our previous work shows by verifying the significance of motion on the affective quality of user interfaces (Park and Lee, 2010). In particular, this trend is prevalent in touchscreen user interfaces, which are being used increasingly in many interactive systems (Voorhees, 2008) including mobile phones (Robinson, 2006). Thus, in the UX paradigm with new generation user interfaces, motion feedback is an element that needs to be considered seriously for affective quality as an element of interactivity. However, studies on how to design interactivity in terms of affective quality are at an early stage and understanding is limited. Hence, we are motivated to gain a better understanding of motion feedback design in touchscreen user interfaces in terms of affective quality.
Our study to achieve this goal is composed of two stages. In the first stage, we examine motion properties relevant to motion feedback in mobile touchscreen user interfaces to identify their relationship with affective qualities. With the result of this examination, we can gain a more practical and detailed understanding of how to design motion feedback for affective quality. In the second stage of our study, we explore a new factor of interactivity in mobile touchscreen user interfaces to investigate a new way to design affective qualities more effectively. This new factor is the Weight factor of Laban's Effort system, which refers to the force or pressure applied to the touchscreen by the user. The Weight factor of input is made to interact with motion feedback in our prototype interface, and its effect on the affective quality of the user interface is examined. Overall, our investigations are performed empirically by developing prototypes and conducting user studies to extract practical design guidelines.
2. Related work
2.1. Interactivity for affective quality
There are many empirical studies on affective quality which we can reference for user interfaces. Zhang and Li (2005) showed that the perceived affective quality of an interface has a significant influence on its perceived usefulness and perceived ease of use. In another area, Schenkman and Jonsson (2000), Kim et al. (2003), and van der Heijden (2003) studied the affective qualities of websites. The design elements under investigation in these studies included shape, texture, color, and layout, which have been the main factors of affective quality addressed since the era of traditional media. Despite the acknowledged importance of affective quality, studies on elements unique to digital media, such as interactivity, have only recently gained attention. Another concern is that user interfaces
which we use to interact with digital media have a critical limitation concerning affective quality. As mentioned by Norman (2005), user interfaces cannot convey physical feel because they are virtual in nature. Physical feel refers to the touch and feel of a product, such as the feel of turning a knob or pressing a switch, which has a significant influence on human perception in terms of affective quality. In user interfaces, interactivity can be a way to complement this limitation.

Interaction can be defined as "a cyclic process in which
two actors alternately listen, think and speak" (Crawford, 2003). Interaction concerns this communicative aspect between a system and a user. Accordingly, interactivity refers to the interactive behavior of a system (Wikipedia, 2010). Svanæs (2000) uses the notion of "look and feel" to describe interactivity by pointing out that the "look" of a user interface is made from its visual elements while the "feel" comes from its interactivity. In a similar manner, Lim et al. (2009) distinguished interactivity as the dynamic aspect of interaction, which has an invisible quality. Here, the influence of interactivity attributes on various emotional qualities was identified. These previous studies show the significance of interactivity concerning the affective quality of user interfaces. Thus, we need to raise our current understanding of this "feel" dimension in relation to interactivity as much as the "look" dimension by the visual elements.

As a cyclic process of listening, thinking, and speaking,
interactivity requires a system to provide feedback (speaking) from the user's input (listening). Three forms of feedback are distinguished by Wensveen (2005): functional, augmented, and inherent feedback (Fig. 1). Information generated by the system when performing the function is functional feedback, while information from an indirect and additional source is augmented feedback. Inherent feedback, on the other hand, was defined by Laurillard as "information provided as a natural consequence of making an action. It is feedback arising from the movement itself." Hence, this type of feedback is related to the physical feel of an interface and thus is the target of our investigation.

Within our scope of mobile touchscreen user interfaces, the user performs input by finger action and the user interface is designed to provide feedback by output, such as sound and vibration. Along with these types of output,
Fig. 3. Six basic input types by finger action on a touchscreen user interface (Choi, 2008).
motion feedback from the user interface performs the role of inherent feedback because it is provided in direct and immediate response to the user's input movement. This role of motion feedback has become increasingly common nowadays with the wide usage of touchscreens on mobile devices.
2.2. Mobile touchscreen user interfaces
Post-WIMP interfaces are characterized by being based on the user's pre-existing knowledge of the non-digital world (Jacob et al., 2008). One of the most popular types of post-WIMP user interfaces used today is the touchscreen user interface, which detects the location of input provided by a finger or a stylus on the display area. To date, we can see touchscreens applied to various devices such as appliances, game consoles, GPS systems, kiosks, ATMs, and especially mobile phones (Fig. 2).
Mobile touchscreen user interfaces employ a unique interaction style in relation to the input technique applied. In general, interaction through a touchscreen occurs by computer recognition of the location, and change in location, of input within the display area. Hence, interactivity in touchscreen user interfaces occurs in response to the two-dimensional position, path, and speed of the input action. This allows six basic finger actions for input: tap, double tap, long tap (hold), drag, flick, and multi-touch (free or rotate), as depicted in Fig. 3.
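The recognition of position, path, and speed described above can be sketched as a small classifier over touch samples. The thresholds and function names below are illustrative assumptions, not values from the paper or any real touch stack:

```python
import math

# Illustrative thresholds (assumed values; real systems tune these per device).
HOLD_MIN_MS = 500      # below this, a stationary touch is a tap
MOVE_MIN_PX = 10       # below this, the finger is considered stationary
FLICK_MIN_SPEED = 0.5  # px/ms near lift-off separates flick from drag

def classify(samples):
    """Classify one single-finger stroke from (x, y, t_ms) samples.
    Double tap and multi-touch need state across strokes, omitted here."""
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance < MOVE_MIN_PX:
        return "hold" if (t1 - t0) >= HOLD_MIN_MS else "tap"
    # The speed over the last sampled segment approximates the speed at lift-off.
    xp, yp, tp = samples[-2]
    lift_speed = math.hypot(x1 - xp, y1 - yp) / max(t1 - tp, 1)
    return "flick" if lift_speed >= FLICK_MIN_SPEED else "drag"
```

A short stationary stroke classifies as a tap, a long one as a hold; a moving stroke is a flick or a drag depending on its speed when the finger leaves the screen.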
Nowadays a rising issue in touchscreen technology is pressure sensitivity for mobile user interfaces. Until now, most touchscreen user interfaces could only recognize two dimensions of the user's input, the x-axis and the y-axis on the display area. Some types of touchscreens, such as the ones made by resistive overlay technology, already use pressure to recognize user input. However, the sensitivity is not high enough to detect the different levels of pressure applied.
Fig. 2. Expectation graph of touchscreen adoption to mobile phones (Robinson, 2006).

With the advancement of technology and cost reduction, new touchscreen technology with high pressure sensitivity for an additional z-axis interaction has become more feasible and is expected to appear in the near future. Researchers at Peratech (2010) are using a new type of electrically conductive material called Quantum Tunneling Composite (QTC) to develop extremely thin and sensitive pressure sensing solutions for various applications (Fig. 4). Meanwhile, Sony Computer Science Lab. (Sony CSL) has developed and demonstrated a prototype mobile device that uses a pressure sensitive touch panel for navigation (Nezu, 2010). This new addition of a dimension to the touchscreen has high potential, as it can significantly expand the boundaries of interactivity and create a new design space of user interfaces (Graham-Rowe, 2010).

There are also several studies on the use of pressure sensitive interfaces. Stewart et al. (2010) mapped functions for pressure input and investigated the performance of applying pressure according to hand poses on mobile touchscreen devices. In a more application centered investigation, Brewster and Hughes (2009) studied the effectiveness
Fig. 4. A type of pressure sensitive technology under development for various applications (Peratech, 2010).
Table 1
Description of the Effort system (Chi, 1999).

Effort factor   Element     Description
Space           Indirect    Flexible, meandering, wandering, multi-focus
                Direct      Single focus, channeled, undeviating
Weight          Light       Buoyant, easily overcoming gravity, decreasing pressure
                Strong      Powerful, having an impact, increasing pressure
Time            Sustained   Lingering, leisurely, indulging in time
                Sudden      Hurried, urgent
Flow            Free        Uncontrolled, abandoned
                Bound       Controlled, restrained
Table 2
Finger actions in relation to the Effort combinations (Yook, 2009).

Finger action   Effort combination   Space      Time        Weight
–               Punch                Direct     Sudden      Strong
Tap             Dab                  Direct     Sudden      Light
Hold            Press                Direct     Sustained   Strong
Drag            Glide                Direct     Sustained   Light
–               Slash                Indirect   Sudden      Strong
Flick           Flick                Indirect   Sudden      Light
–               Wring                Indirect   Sustained   Strong
–               Float                Indirect   Sustained   Light
of using pressure based input techniques for text entry in mobile devices, while Wilson et al. (2010) examined the use of pressure based input for menu selection. As a new interaction technique, Miyaki and Rekimoto (2009) developed a single-handed interaction model based on pressure sensing called "GraspZoom." This model used pressure with gestures to enable new ways to zoom and scroll. In contexts other than the mobile touchscreen user interface, Forlines et al. (2005) proposed an interaction technique using pressure sensitive input devices to allow easy previewing during editing tasks, while Blasko and Feiner (2004) used pressure in their interaction techniques with pads to access different function modes. Like the studies mentioned above, most of the previous studies on pressure sensitive interaction are focused on applications for usability and functionality. Nevertheless, pressure sensitive interaction also presents new possibilities towards the affective aspect. This is because it is a basic sensory dimension which users rely on to perceive the physical feel of a system. This additional dimension can create a richer interactive experience and significantly expand the range of affective quality that can be perceived from touchscreen user interfaces.
2.3. Framework for the affective quality of motion feedback
To examine the affective quality of motion feedback in interactivity, the Effort system was adopted for a systematic approach. The Effort system is a framework within the Laban Movement Analysis (LMA) system, which was developed by Rudolf Laban (Laban and Lawrence, 1974). It is used to interpret, describe, visualize, and notate human movement and is normally used to analyze the movement of dancers and athletes. LMA comprises four main categories: Body, Effort, Shape, and Space. The Effort system is concerned with the expressive aspect of movement and is composed of four factors (Space, Time, Weight, and Flow), each of which has two opposing elements (Table 1). Most movements show a combination of these factors.
At the other side of motion feedback (output) is the user's finger action (input), which completes interactivity. The Effort system was applied to distinguish the finger actions in mobile touchscreen user interfaces by Yook (2009). Four of the main finger actions (tap, hold, drag, and flick) were analyzed according to the Effort factors (Table 2).

It is important to note that the general touchscreen user interfaces used today are capable of discerning the Space and Time factors but not the Weight factor of a finger action. Thus, all finger actions performed on a touchscreen user interface are addressed as having the Light element in terms of the Weight factor. Although the hold action was analyzed as having the Strong element in this previous study, this is only from the user's perspective, since the Weight factor is neglected by the user interface, which cannot recognize it. Nevertheless, we implemented the Weight factor in our second study to examine its effect and potential on the affective quality of touchscreen user interfaces.
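Table 2's mapping can be written as a simple lookup from Effort elements to finger actions. This is a minimal sketch of the table as data; the dict layout and helper name are our own:

```python
# Effort combinations and finger actions after Yook (2009), Table 2.
# Keyed by (Space, Time, Weight); the value is (Effort action, finger action),
# with None where no touchscreen finger action corresponds.
EFFORT_ACTIONS = {
    ("Direct",   "Sudden",    "Strong"): ("Punch", None),
    ("Direct",   "Sudden",    "Light"):  ("Dab",   "tap"),
    ("Direct",   "Sustained", "Strong"): ("Press", "hold"),
    ("Direct",   "Sustained", "Light"):  ("Glide", "drag"),
    ("Indirect", "Sudden",    "Strong"): ("Slash", None),
    ("Indirect", "Sudden",    "Light"):  ("Flick", "flick"),
    ("Indirect", "Sustained", "Strong"): ("Wring", None),
    ("Indirect", "Sustained", "Light"):  ("Float", None),
}

def finger_action(space, time, weight):
    """Return the touchscreen finger action for an Effort combination."""
    return EFFORT_ACTIONS[(space, time, weight)][1]
```

As the text notes, a pressure-insensitive interface cannot observe the Weight column at all; the hold action's Strong element exists only from the user's perspective.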
3. Methodology
3.1. Research framework
To achieve our research goal, a research framework was designed as shown in Fig. 5. The research framework consists of two stages: (1) investigate the relationship between motion properties and affective quality, and (2) investigate the effect of motion feedback in response to the Weight factor along with the Space and Time factors of the finger action. Both stages were carried out in mobile touchscreen user interfaces.

Fig. 5. Research framework.
The objective of the first stage is to understand how motion properties influence affective qualities so that this understanding can be used to design motion feedback for the intended affective qualities. Three motion properties that can be generally applied to motion feedback occurring from the flicking finger action were studied: acceleration, responding duration, and overshoot. These properties were investigated in an existing mobile touchscreen interface which recognizes the Space and Time factors of the user's finger action. In the second stage of our study, we attempt to expand the boundaries of affective qualities expressible through mobile touchscreen user interfaces. To do this, our scope of interactivity expands to the recognition of, and reaction to, the Weight factor.
Overall, the framework of our investigation towards emotion takes the approach offered by Norman (2005). This approach describes how users go beyond cognition and relate to artifacts emotionally through three levels of emotion (visceral, behavioral, and reflective). This perspective often neglects the social factors or temporal processes which can influence the user's emotional experience (Palen and Bødker, 2008). Nevertheless, Norman's approach allows for a basic and systematic understanding of how the design of HCI relates to the user's emotion, which is appropriate at this early stage of investigation on affective quality and interactivity.
3.2. Research techniques
3.2.1. Measuring affective qualities
There are several ways to measure the user's perception of affective qualities. These methods can largely be divided into two categories: physiological and psychological methods. Physiological methods provide an objective measure of emotion. Bodily reactions, such as heart rate, electrodermal activity, and the electromyogram, are used as indexes of emotional responses. However, they create an unnatural setting for the user through sensor attachments and restricted body movement. Moreover, the most critical limitation lies in the resolution of the data which can be measured. The data collected by physiological methods are insufficient to precisely discern the users' perceived affective quality. Thus, the psychological method of self-report was used in our study. Psychological methods ask users to report how they feel on a set of adjective interval scales or on a non-verbal, image based measurement tool. Although this type of method is subjective in nature, it provides a quick measure of affective quality and a clear identification of the affective qualities perceived.
3.2.2. Types of affective qualities to measure
To measure the influence on the affective quality of user interfaces, we need to select the types of affective qualities that are relevant to our study. This is because the perceived types of affective qualities vary according to the domain and target of experience. Investigations to identify these types of affective qualities have been performed in many previous studies. To retrieve the affective qualities relevant to our study, we referred to previous studies on the affective qualities of websites, interactive systems, and physical controls.

In the study of the emotional experience of interactive systems, Lim et al. (2008) measured affective qualities based on Norman's three levels of emotional response. As for the affective qualities of physical controls, Wellings et al. (2010) identified the affective qualities that explain the perceived characteristics of switch haptics in automotive interfaces. Jun et al. (2010), on the other hand, identified the affective qualities related to dials in their study on torque profile measure and sensibility evaluation. Based on these studies, we selected eleven bipolar affective quality pairs appropriate for mobile touchscreen interfaces, as in Table 3. Moreover, these affective quality pairs were composed to examine the three levels of emotional response: the visceral, behavioral, and reflective level (Norman, 2005). Affective qualities at the visceral level are formed from our physical senses. They are the most primitive and immediate responses, which are formed before the other two levels. Affective qualities at the behavioral level involve cognition and are related to the use and behavioral aspects of a system. The reflective level is the highest layer of emotional response. This level involves interpretation and reasoning, which can be influenced by
Table 3
Affective qualities selected for mobile touchscreen user interface.

Level of emotion   Bipolar affective quality pairs
Visceral           Heavy–Light
                   Soft–Hard
                   Tight–Loose
                   Clicky–Smooth
                   Precise–Imprecise
Behavioral         Simple–Complicated
                   Clear–Ambiguous
                   Deep–Shallow
Reflective         Natural–Artificial
                   Refined–Unrefined
                   Interesting–Dull
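The selection in Table 3 can be captured directly as data. The structure below is an illustrative sketch (the dict layout and helper are our own, not from the paper), using the 7-point semantic differential scale applied in Experiment I:

```python
# The eleven bipolar affective quality pairs of Table 3, grouped by
# Norman's (2005) three levels of emotional response.
QUALITY_PAIRS = {
    "visceral":   [("heavy", "light"), ("soft", "hard"), ("tight", "loose"),
                   ("clicky", "smooth"), ("precise", "imprecise")],
    "behavioral": [("simple", "complicated"), ("clear", "ambiguous"),
                   ("deep", "shallow")],
    "reflective": [("natural", "artificial"), ("refined", "unrefined"),
                   ("interesting", "dull")],
}

def validate_rating(score):
    """Check one 7-point semantic differential rating
    (1 = left pole, 7 = right pole, 4 = neutral)."""
    if not 1 <= score <= 7:
        raise ValueError("rating must lie on the 7-point scale")
    return score

# Sanity check: eleven pairs in total across the three levels.
assert sum(len(pairs) for pairs in QUALITY_PAIRS.values()) == 11
```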
Fig. 6. Prototype user interface designed to flick through a set of images on a mobile touchscreen device.
previous experiences, individual differences, and cultural background.
4. Experiment I: investigation of motion properties
The objective of this experiment is to understand how motion feedback can be controlled in mobile touchscreen user interfaces. We investigated the relationship between motion properties and affective qualities in a mobile touchscreen user interface.
4.1. Experiment design
The types of motion properties investigated were related to motion feedbacks which occur from the finger actions on mobile touchscreen user interfaces. As an initial stage of investigation, the flicking finger action was adopted because it makes the design of affective quality by motion feedback more feasible than other finger actions do. For instance, motion feedback from flicking is not restrained to the finger action as it is with dragging, and also not detached from it as it is with tapping.
The examined motion properties were related to the push in/out motion feedback which is generally used in response to the flicking finger action: responding duration, acceleration, and overshoot. Varying parameters of these three properties were combined to implement 18 types of motion feedback in the prototype. For each motion feedback, the perceived affective qualities were measured using a seven point semantic differential scale on each set of bipolar affective quality pairs. A within-subject design was used to reduce the influence of the participants' individual differences.
4.2. Experiment method
4.2.1. Setting and procedure
The experiment was performed in a room with a MacBook Pro computer and an iPod touch device. The iPod touch was used as the mobile touchscreen user interface and the MacBook was used to control the experiment procedure. The iPod touch display is 3.5 inches diagonal and uses capacitive touch technology to detect input. 30 graduate students, 15 male and 15 female, aged from 23 to 39, participated in the experiment. 18 user interfaces with different motion parameter settings were prepared for the experiment. After a short tryout period with the prototype, the experiment began with one of the user interfaces presented to the participant. When the participants felt they had sufficiently interacted with the user interface to perceive its affective quality, they were asked to rate their perception in terms of the eleven bipolar affective quality pairs on a questionnaire. This procedure was repeated for each of the 18 prototype interfaces, which were presented in random order.
4.2.2. Prototypes
The prototype user interface developed for this user study was an image viewing application on an iPod touch device. This application was designed for the participant to browse through a set of images horizontally by flicking through the user interface (Fig. 6). A set of images was implemented in each of the 18 user interfaces with varying motion parameters for the experiment. Every set included 6 images of household objects, such as a cup, pot, clock, and lamp. Pictures of household objects were selected to minimize the influence of content because they are known to have a neutral effect on emotion (Stins and Beek, 2007). The prototype was implemented in iOS 3.1.3. The UIKit framework was used to load and render images, but the motion animation method was redefined for the experiment.

Each motion property was varied into three parameter settings. The types of acceleration applied were ease-in, linear, and ease-out (Fig. 7).
Fig. 7. Three types of acceleration applied to the prototype motion feedbacks. The graphs show acceleration with time on the x-axis and position of the moving object on the y-axis. Both axes range from 0 (start point) to 1 (end point). The easing function is shown below each graph.
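The three curves of Fig. 7 can be sketched as easing functions over normalized time. The quadratic forms below are an assumption for illustration; the prototype's exact functions are the ones shown under each graph in Fig. 7:

```python
# Easing functions mapping normalized time t in [0, 1] to normalized
# position in [0, 1]. Quadratic ease curves are assumed for illustration.
def ease_in(t):
    # Accelerating: slow start, fast finish.
    return t * t

def linear(t):
    # Constant speed throughout.
    return t

def ease_out(t):
    # Decelerating: fast start, slow finish.
    return 1.0 - (1.0 - t) ** 2
```

All three curves agree at the endpoints (0 maps to 0, 1 maps to 1); they differ only in how position accumulates in between, which is what the participants perceived as accelerating or decelerating motion.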
Fig. 8. Mean ratings of affective qualities in relation to acceleration. Error bars represent standard error of the mean.
Fig. 9. Mean ratings of affective qualities in relation to responding duration.
To adapt to the mobile touchscreen user interface, the parameters for responding duration were implemented by values relative to the finger action speed. Finger speed here refers to the speed of the finger moving from the last standstill point to the point when the finger leaves contact with the screen. The three values applied were 25%, 100%, and 175% of the flicking finger speed. These values ensured that the three parameters of responding duration were perceived differently by the participants without effort. To reflect the flicking finger speed in the motion feedback, the finger speed was directly applied to the initial tenth (Ratio = 0.1) of the motion feedback duration and the remaining duration was calculated in proportion to the corresponding acceleration of the motion. The following formula was used to calculate the duration of each motion feedback according to the responding duration parameter:
Motion feedback duration = (Motion feedback distance × Easing function(Ratio)) / (Finger speed × Parameter × Ratio)
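A sketch of this calculation, under our reading of the formula as duration = distance × easing(Ratio) / (finger speed × parameter × Ratio):

```python
def motion_feedback_duration(distance, finger_speed, parameter,
                             easing=lambda t: t, ratio=0.1):
    """Duration of one motion feedback.

    `finger_speed` is the flicking finger speed; `parameter` is the
    responding duration setting (0.25, 1.0, or 1.75); the finger speed
    drives the initial `ratio` (a tenth) of the motion, and the easing
    function determines how much distance that initial tenth covers.
    """
    return (distance * easing(ratio)) / (finger_speed * parameter * ratio)
```

With linear easing this reduces to distance / (finger speed × parameter), so the 25% parameter setting yields a motion four times longer than the 100% setting for the same flick.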
Lastly, the three types of overshoot were significant at 10% of the screen width (5.05 mm), slight at 5% of the screen width (2.54 mm), and none (0 mm). Out of the 27 combinations that can be made with these parameter settings of each motion property, 18 were used in the experiment to maintain reliable response quality by avoiding fatigue effects. The combinations selected for the experiment reflected all the parameter settings of the three properties evenly. Hence, every parameter setting was evaluated an equal number of times by the participants.
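The balance property (18 of 27 combinations, with each parameter setting appearing equally often) can be demonstrated with a small sketch. The mod-3 selection rule here is our own illustration; the paper does not state which 18 combinations it used:

```python
from itertools import product

# Each property has three levels, indexed 0-2 (e.g. ease-in/linear/ease-out;
# 25%/100%/175%; 0/2.54/5.05 mm overshoot). One balanced 18-of-27 subset
# keeps the triples whose level indices do not sum to 2 modulo 3.
def balanced_subset():
    return [c for c in product(range(3), repeat=3) if sum(c) % 3 != 2]

combos = balanced_subset()
assert len(combos) == 18
# Every level of every property appears exactly 6 times.
for prop in range(3):
    for level in range(3):
        assert sum(1 for c in combos if c[prop] == level) == 6
```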
Fig. 10. Mean ratings of affective qualities in relation to overshoot.
4.3. Results
The participants' ratings of each motion property on the eleven affective qualities were statistically analyzed. The three graphs show the mean ratings (y-axis) on each affective quality (x-axis) for each parameter setting of the corresponding motion property (Figs. 8–10).
The different levels of influence that each motion property has on the affective qualities can be observed from the graphs. Overall, responding duration showed the strongest influence while overshoot showed the weakest influence. A correlation analysis and an analysis of variance (ANOVA) were performed for a more detailed inspection. Table 4 shows the correlations between the
Table 4
Correlation matrix of the motion properties and the affective qualities.

Affective quality pairs   Acc.     Resp. duration   Overshoot
Visceral
Heavy–light               0.202    0.641
Soft–hard                 0.320    0.259
Tight–loose               -0.315   -0.298
Clicky–smooth             -0.270   -0.643
Precise–imprecise                                   0.108
Behavioral
Simple–complicated        -0.115   -0.387           0.191
Clear–ambiguous           -0.190   -0.444           0.166
Deep–shallow              0.190    0.511
Reflective
Natural–artificial        0.167    -0.194           0.129
Refined–unrefined         0.295    0.468
Interesting–dull          -0.114   0.193            -0.157

Correlation is significant at the 0.01 level (2-tailed).
Table 5
Standardized coefficients of the motion properties in the regression model.

Affective quality pairs   R²      Standardized coefficient, β
                                  Acc.     Resp. duration   Overshoot
Visceral
Heavy–light               0.451   0.811    1.866
Soft–hard                 0.170   1.142    0.670
Tight–loose               0.188   -1.150   -0.791
Clicky–smooth             0.487   -1.091   -1.884
Precise–imprecise
Behavioral
Simple–complicated        0.176   -0.862   0.113
Clear–ambiguous           0.251   -0.659   -1.093           0.101
Deep–shallow              0.297   0.107    0.077
Reflective
Natural–artificial        0.078   0.605    -0.510
Refined–unrefined         0.306   0.111    0.080
Interesting–dull          0.057   0.395    -0.095
motion properties and the affective qualities that are significant at the p < 0.01 level.
The acceleration property had a linear relationship with three of the affective quality pairs, while for the other seven affective quality pairs, either decelerating or accelerating had a significant influence compared to no acceleration. As a result, decelerating motion feedback significantly increased perception of the heavy, soft, loose, smooth, complicated, and ambiguous affective qualities, while accelerating motion feedback significantly increased perception of the hard, tight, clicky, imprecise, clear, shallow, artificial, unrefined, and interesting affective qualities. Secondly, the responding duration property showed a clear linear relationship with almost all affective quality pairs. Low responding duration (25%) influenced the affective quality to be perceived as heavy, soft, loose, smooth, complicated, ambiguous, deep, artificial, refined, and interesting, while high responding duration (175%) influenced the affective quality to be perceived as light, hard, tight, clicky, simple, clear, shallow, and unrefined. The influence of responding duration was especially strong on the heavy–light and smooth–clicky affective quality pairs. Yet, it did not show any influence on the precise–imprecise affective quality pair. Lastly, the overshoot property showed a significant influence on five affective quality pairs. Both the slight (2.54 mm) and significant (5.05 mm) amounts of overshoot increased the complicated and ambiguous affective qualities, while only the significant amount significantly increased the imprecise, artificial, and interesting affective qualities. In particular, the overshoot property showed the strongest relation to the precise–imprecise affective quality in comparison to the other two motion properties.
A regression analysis followed to investigate the strength with which the motion properties contribute to the perception of the affective qualities. Table 5 presents the standardized coefficients of the properties that showed a significant contribution (p < 0.01). The coefficient of determination (R²) shows that the three motion properties have a relatively high influence on the heavy–light, clicky–smooth, and refined–unrefined affective quality pairs, explaining over 30% of their variation. In terms of the three levels of emotional response, the three generally applicable motion properties had a relatively high influence on the visceral and reflective level affective qualities. However, no evidence of strong influence on the behavioral level affective qualities was found.
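For illustration, the kind of analysis reported here (standardized coefficients plus R²) can be reproduced by z-scoring both the coded property levels and the ratings before an ordinary least squares fit. The data below are synthetic stand-ins, not the experiment's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: coded levels of the three motion properties
# (acceleration, responding duration, overshoot) and one affective
# rating per trial; effect sizes here are invented for illustration.
n = 90
X = rng.uniform(-1, 1, size=(n, 3))
y = 2.0 * X[:, 1] - 0.8 * X[:, 0] + rng.normal(0, 0.5, n)

# Standardize predictors and outcome so the slopes are beta weights.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
yz = (y - y.mean()) / y.std()

design = np.column_stack([np.ones(n), Xz])
beta, *_ = np.linalg.lstsq(design, yz, rcond=None)

# Coefficient of determination from the standardized fit.
pred = design @ beta
r2 = 1 - np.sum((yz - pred) ** 2) / np.sum((yz - yz.mean()) ** 2)

print("standardized coefficients:", beta[1:])
print("R^2:", r2)
```

In this sketch the second property (standing in for responding duration) receives the largest beta weight, mirroring the paper's observation that responding duration dominated the regression.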
5. Experiment II: investigation of the Weight factor
The previous experiment shows that manipulating only the motion properties limits the controllable range of several affective qualities. The objective of this experiment is to gain more control and facilitate the design for affective quality by investigating the Weight factor of user input on mobile touchscreen user interfaces. Typical mobile touchscreen user interfaces of today can only respond to the Space and Time factors of finger motion. However, the Weight factor is also an important aspect through which humans interact with and perceive artifacts. For instance, two dials with the same shape and size will be perceived differently if one requires significantly more strength to turn than the other. Thus, this user study seeks to expand the perceivable affective qualities by enabling the user interface to recognize and interact with the Weight factor from the user. This new dimension is hypothesized to offer a more elaborate design of affective qualities.

For the experiment, we implemented a prototype interface with a pressure sensor to enable motion feedback in the interface to react to the amount of force applied by the user's finger action. The effect of the Weight factor on the perception of affective quality was examined in two different types of interface: a switch interface type and a typical application interface type. The switch interface was implemented as a way to directly observe how the Weight factor influences the physical feel of touchscreen user interfaces. Within these interfaces, the finger actions prevalent in a mobile touchscreen user interface (tap, drag, and flick) were examined with the Weight factor.

D. Park et al. / Int. J. Human-Computer Studies 69 (2011) 839–853 847
5.1. Experiment design
A total of four interfaces (three switch interfaces and one application interface) were developed to test the Weight factor in different types of interface and finger action. Among the three switch interfaces, the button interface operated by tapping, the slider knob interface by dragging, and the rotary switch interface by flicking. As for the image viewing application interface, dragging and flicking were the relevant finger actions.
To adopt the Weight factor in this experiment, the prototype user interfaces were designed to operate only when a predefined amount of pressure was applied by the participant. For instance, the button interface switched on/off only when the participant applied more than a certain amount of pressure to the button. Three different levels of pressure requirement were implemented on each prototype interface: level 0, which required no pressure; level 1, which required slight pressure; and level 2, which required significant pressure. The two different levels of pressure sensitivity (levels 1 and 2) were implemented to investigate how varying levels of the Weight factor can influence affective quality. The perceived affective qualities were measured for each pressure requirement in every interface on a seven-point semantic differential scale.
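The gating behavior described above can be sketched as follows. The class and method names are ours, not the authors' implementation; the raw threshold values anticipate the device-specific readings (0–1024 scale, thresholds of 250 and 500) reported later in Section 5.2.2:

```python
# Pressure requirement levels mapped to raw sensor thresholds
# (values follow the prototype described in Section 5.2.2).
THRESHOLDS = {0: 0, 1: 250, 2: 500}

class PressureGatedButton:
    """A push button that toggles only when enough pressure is applied."""

    def __init__(self, level: int):
        self.required = THRESHOLDS[level]
        self.state = False  # off

    def on_touch(self, pressure: int) -> bool:
        """Return True if the tap operated the button."""
        if pressure >= self.required:
            self.state = not self.state
            return True
        # Sub-threshold press: the interface would instead show partial
        # motion feedback (e.g. the button sinking slightly inwards).
        return False

button = PressureGatedButton(level=2)
print(button.on_touch(300))  # below the level-2 threshold: no toggle
print(button.on_touch(620))  # above the threshold: toggles on
```

The same threshold check gates the slider knob, rotary switch, and image viewer prototypes; only the sub-threshold feedback differs per interface.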
5.2. Experiment method
5.2.1. Setting and procedure
The experiment was performed in a room with a MacBook Pro computer and an iPod touch device. The prototype user interfaces were presented on the iPod touch, and the MacBook was used to control the experiment procedure and transmit pressure sensor signals to the iPod touch device. Thirty graduate students (15 male and 15 female, aged 23 to 39) participated in the experiment. After a short tryout period with the prototype user interface, the three pressure requirements of a user interface were presented to the participants in random order. Once the participants felt they had interacted with a user interface sufficiently to perceive its affective quality, they were asked to rate how it was perceived in terms of the eleven bipolar affective quality pairs on a questionnaire. This procedure was repeated for each user interface. At the end of the experiment, a short interview was conducted about the experience of pressure-requiring touchscreen user interfaces and preferences regarding their implementation.
5.2.2. Prototypes
The Weight factor of the finger action was implemented by detecting the pressure applied to the user interface in real time. A pressure sensor was embedded in our prototype to detect the amount of pressure being applied. The pressure sensor used here was a force sensing resistor (FSR) with a thickness of 0.46 mm, a diameter of 12.7 mm, and a pressure sensitivity range of 0.1–10 kg/cm². With blocks of sponge material for cushioning, this sensor was placed between the iPod touch device and an acrylic case covering the device, and was connected to an Arduino USB board. Arduino is an open-source electronics prototyping platform used for physical interaction design (Arduino, 2010).

The signals from the pressure sensor were sent to the experiment management program on the MacBook Pro by serial communication, and these pressure level data were then wirelessly transmitted to the iPod touch device through a Wi-Fi connection (Fig. 11). The Wi-Fi connection was set up using the Mac OS X AirPort sharing function, which made the MacBook Pro act as a Wi-Fi base station. This enabled the data packets to be transmitted directly to the iPod touch device without delay.

The three switch interfaces developed are shown in Fig. 12. The push button and rotary switch interfaces toggled on/off, and the slider knob interface moved vertically. In the image viewing application, images of luxury sedans from a single brand were used to simulate a digital catalog. Through a pilot study, three levels of pressure requirement were set to fully operate these interfaces. The pressure levels were read in device-specific values ranging from 0 to 1024. Level 0 required no pressure (value ≥ 0), level 1 required slight pressure (value ≥ 250), and level 2 required significant pressure (value ≥ 500).

When the pressure applied was less than the required amount, each interface gave feedback in ways similar to its corresponding interface in the physical world. In the push button interface, this feedback was shown by the button moving slightly inwards but not switching the on/off state. In the other switch interfaces, the knob of the slider knob interface stayed still, while the dial of the rotary switch interface bounced back after a slight turn in the direction of the finger movement. Lastly, feedback from the image viewing application interface was given by the image rolling back to the current image when not enough pressure was applied while dragging or flicking. To simulate the heaviness of physical objects, different amounts of roll-back distance were shown according to pressure level by implementing different responding distances. Responding distance refers to the motion feedback distance, which is determined in proportion to the dragging finger action. The responding distance was 66% at pressure level 1 and 33% at pressure level 2. In both cases, roll-back motion feedback occurred when the finger moved across a third of the screen width. In this manner, the interface that required more pressure to fully operate displayed a shorter rolling-back distance when insufficient pressure was applied.
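The responding-distance and roll-back behavior of the image viewer can be sketched as follows. The function name and the screen-width constant are our assumptions; the ratios (0.66 and 0.33) and the one-third-of-screen trigger follow the description above:

```python
SCREEN_WIDTH = 320.0  # assumed display width in points

# Fraction of the finger's drag translated into displayed motion,
# per pressure requirement level (from the prototype description).
RESPONDING_RATIO = {0: 1.0, 1: 0.66, 2: 0.33}

def drag_feedback(level, drag_px, pressure, required):
    """Return (displayed displacement, whether the image rolls back)."""
    shown = drag_px * RESPONDING_RATIO[level]
    if pressure >= required:
        return shown, False  # sufficient pressure: page turns normally
    # Insufficient pressure: the image rolls back once the finger has
    # moved across a third of the screen width.
    rolls_back = drag_px >= SCREEN_WIDTH / 3
    return shown, rolls_back

# For the same drag with insufficient pressure, level 2 shows a
# shorter excursion than level 1, simulating a heavier object.
print(drag_feedback(1, 120, pressure=100, required=250))
print(drag_feedback(2, 120, pressure=100, required=500))
```

A shorter displayed excursion per unit of finger travel is what gives the higher pressure level its "heavier" feel before the roll-back occurs.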
Fig. 11. Prototype hardware diagram devised for the mobile user interface to sense and respond to the pressure applied by the user. Pressure level data are transmitted from the sensor to the iPod touch via an Arduino board and a MacBook Pro computer.

Fig. 12. Prototype switch interfaces: push button (left), slider knob (middle), and rotary switch (right).

5.3. Results

The participants' ratings of the perceived affective qualities were analyzed in relation to each pressure requirement level. The mean ratings from the three switch interfaces and the image viewing application interface are shown in the graphs (Figs. 13 and 14).
From these graphs, we can see that the perceived affective qualities of an interface change drastically just by implementing a pressure requirement, i.e., the Weight factor. ANOVA was performed to investigate the significance of this effect. As a result, interactivity that required pressure (levels 1 and 2) made the user interface feel significantly more heavy, hard, tight, smooth, and imprecise at the visceral level; more complicated, ambiguous, and deep at the behavioral level; and more artificial, refined, and interesting at the reflective level. However, differences between pressure requirement levels 1 and 2 were found to be insignificant in all cases except for the heavy–light pair in the button and slider knob interfaces. This insignificance can be explained in part by the limited pressure sensitivity of the prototype hardware, which made the actual pressure requirement levels uneven by requiring more force to reach pressure level 1 than intended. Yet, the insignificance may also stem from the participants' unfamiliarity with pressure-sensitive interaction on touchscreens. Thus, the experience of a pressure requirement itself seems to have had a strong impact on the participants regardless of the difference in the level of requirement.

Fig. 13. Mean ratings of affective qualities in relation to pressure requirement in each user interface. Error bars represent standard error of the mean.

Fig. 14. Mean ratings of affective qualities from the four user interfaces combined.

Next, a regression analysis was performed to investigate the strength with which the pressure requirement contributes to the perception of the affective qualities. The coefficient of determination shows the proportion of variability that the pressure requirement accounts for. All values are significant at p < 0.01 (Table 6). The coefficient of determination values for all four interfaces combined show that the affective quality pair most greatly influenced by the pressure requirement is heavy–light, with over 65% of its variation explained. Other pairs to whose variation the pressure requirement contributes more than 36% are soft–hard, clicky–smooth, simple–complicated, deep–shallow, and natural–artificial. These results show that adding pressure requirements can strongly influence affective qualities at all three levels of emotion. This influence was greater than expected, since the effect of the pressure requirement had been expected to concentrate on the visceral level affective qualities. It also indicates that the Weight factor should be implemented with caution, since it can impact a wide range of affective qualities.

Among the four user interfaces, the pressure requirement had the highest influence on the following four affective quality pairs in the slider knob interface: heavy–light, soft–hard, clicky–smooth, and deep–shallow. The pressure requirement in the push button interface also showed high influence on heavy–light as well as refined–unrefined. On the other hand, simple–complicated and natural–artificial were highly influenced in the rotary switch and image viewing application interfaces. However, despite these values, the ANOVA result shows that the level of influence of the pressure requirement did not differ significantly according to the user interface type. The only significant difference was found for the heavy–light affective quality pair, where pressure requirement level 2 had a significantly greater effect on the slider knob interface than on the push button or the rotary switch interface.
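For reference, the one-way ANOVA used in these comparisons can be reproduced as follows. The 7-point heavy–light ratings below are hypothetical stand-ins grouped by pressure requirement level, not the study's data:

```python
import numpy as np

# Hypothetical 7-point heavy-light ratings per pressure level
# (1 = heavy, 7 = light); values are illustrative only.
groups = [
    np.array([6, 5, 6, 7, 5, 6]),  # level 0: perceived as light
    np.array([3, 4, 3, 2, 4, 3]),  # level 1
    np.array([2, 1, 2, 2, 1, 2]),  # level 2: perceived as heavy
]

all_x = np.concatenate(groups)
grand = all_x.mean()
k, n = len(groups), len(all_x)

# Between-group and within-group sums of squares, then the F ratio.
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))
print("F =", F)
```

A large F with these made-up groups simply illustrates the mechanics; the study's finding was that levels 1 and 2 mostly did not differ from each other, even where both differed from level 0.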
Table 6
Linear regression analysis of the influence of pressure requirement in each user interface.

Affective quality pairs    Push button       Slider knob       Rotary switch     Image viewer
                           R²      β         R²      β         R²      β         R²      β
Visceral
  Heavy–light              0.693  −1.900     0.746  −2.417     0.586  −1.867     0.614  −2.067
  Soft–hard                0.406   1.367     0.503   1.767     0.408   1.433     0.384   1.467
  Tight–loose              0.108  −0.550     0.152  −0.833     0.241  −1.033     0.072  −0.517
  Clicky–smooth            0.488   1.567     0.610   1.967     0.392   1.417     0.482   1.633
  Precise–imprecise        0.093   0.550     0.101   0.700     0.138   0.717     0.180   0.883
Behavioral
  Simple–complicated       0.357   1.200     0.335   1.350     0.425   1.550     0.433   1.500
  Clear–ambiguous          0.210   0.983     0.182   1.000     0.271   1.200     0.258   1.150
  Deep–shallow             0.419  −1.200     0.486  −1.317     0.249  −0.800     0.366  −0.983
Reflective
  Natural–artificial       0.324   1.233     0.324   1.267     0.423   1.383     0.454   1.600
  Refined–unrefined        0.431  −1.067     0.302  −0.983     0.125  −0.533     0.319  −0.900
  Interesting–dull         0.117  −0.583     0.231  −0.750     0.163  −0.600     0.263  −0.833
6. Discussion and conclusion
6.1. Affective quality by motion feedback
The results of the first experiment show how the affective qualities are perceived through motion feedback according to the three general motion properties in mobile touchscreen user interfaces. In summary, acceleration and responding duration showed significant correlations with all except the precise–imprecise affective quality pair, while overshoot showed significant correlations with five of the affective quality pairs. Moreover, responding duration had the greatest influence on the affective qualities overall, followed by acceleration with moderate influence and overshoot with the least influence.
From the regression analysis, the three motion properties were found to have a relatively higher influence on the visceral and reflective level affective qualities than on the behavioral level affective qualities. One reason for this may be that the prototype application was an image viewer, which users generally find easy to use and familiar in mobile touchscreen user interfaces. Nevertheless, these results provide evidence that motion feedback can be an effective way to design affective qualities at the visceral level, and they also provide a practical guideline for the design of motion feedback in existing mobile touchscreen user interfaces.
6.2. Affective quality by the Weight factor
In the second experiment, the Weight factor of the user's input was added to the interaction framework of touchscreen user interfaces, which previously considered only the Space and Time factors.
The addition of the Weight factor was found to have a significant influence on all the affective quality pairs investigated. The strength of influence on the affective qualities was also significantly greater with the Weight factor than with the motion properties alone, as observed in the first experiment. This provides an additional way of creating a greater sense of physical feel in touchscreen user interfaces by controlling the visceral level affective qualities, such as heavy–light and soft–hard.

Implementation of the Weight factor can also be used as an alternative way to create physical feel. In user interfaces that intrinsically cannot show much motion feedback with these properties, such as a push button, the Weight factor can be an effective method of designing physical feel. In these respects, the efficacy of the Weight factor has been verified as a dimension of interactivity for controlling the affective quality of mobile touchscreen user interfaces.
6.2.1. Influence of user interface type
In relation to the different types of user interfaces investigated, the influence of the Weight factor was not as dependent on the finger action type as expected. The strength and direction of influence by the Weight factor were similar across all four types of user interface, which differed in the required finger action. This similarity can be partly explained by the users' reaction to an unaccustomed type of interaction. Thus, different influence patterns on affective quality by finger action type may emerge as users become familiar and proficient with the Weight factor of interactivity.

Unlike the similarities in the quantitative results, significant differences were discovered in the interview session. Differences in the preference for Weight factor implementation were fairly clear according to the type of user interface. Most participants favored a pressure requirement on the push button and the slider knob, while it was not favored on the rotary switch and image viewer. This implies that users prefer to have the Weight factor considered for user interfaces that operate by tapping and dragging, but not for ones that operate by flicking. This preference was found to depend on three factors: ease of use, intuitiveness of operation, and context.
The push button, which operated by tapping, was the easiest to use and the most intuitive with pressure applied, since push buttons in the physical world also operate by applying force downwards onto their surface. The slider knob and the rotary switch were found to be more complicated to use with pressure applied, because these switches normally operate by applying force to-and-fro or sideways in the physical world, while pressure on touchscreen user interfaces can only be applied downwards. In relation to this restriction, interactions that required more control over different dimensions lowered the ease of use. Unlike the push button, which simply required downward pressure, many participants had difficulties using the rotary switch with pressure, which required the finger action to first push down on the dial and then follow through from drag to flick. Moreover, this complexity made it difficult to differentiate between the two levels of pressure requirement. Nevertheless, this basic preference was found to vary with contextual factors, such as content. Some participants commented that, although it was more difficult to use, they preferred the image viewer application interface with a pressure requirement because it suited the content, images of luxury sedans. In other words, the affective qualities induced by the pressure requirement (heavy, deep, refined, etc.) were appropriate and consistent with the content, making the experience positive.
6.2.2. Guidelines for implementation
Overall, the participants commented that pressure-requiring user interfaces felt unnatural because they were not accustomed to this new dimension of interactivity when using touchscreen user interfaces. Also, a major limitation that becomes even more conspicuous with pressure applied is the lack of tactile feedback. Nevertheless, the results and observations of this experiment show that this unnatural feeling can be overcome by sufficient experience on the user's side and by proper motion feedback and hardware implementation on the designer's side. The following four aspects have been identified for consideration on the designer's side: motion feedback implementation, pressure requirement level, physical surface of the touchscreen, and context of use.
First is the use of motion feedback to substitute for tactile feedback. Since touchscreen user interfaces cannot provide tactile feedback, motion feedback needs to be designed as a substitute to provide sufficient usability and appropriate affective quality. To support usability and make the interaction feel natural, it is important to show motion feedback occurring as pressure is applied to the user interface. Moreover, following this feedback, it is important to provide a cue showing the user that the required amount of pressure has been reached. These are two basic requirements for substituting tactile feedback with motion feedback when the Weight factor is involved in touchscreen user interfaces.

The second aspect concerns the amount of pressure required. From the qualitative data, the natural feel of pressure-requiring user interfaces was found to depend on the amount of pressure required and the amount of pressure that the user can comfortably apply. For this reason, some participants expressed that pressure requirement level 2 felt more natural and was preferred over level 1. Therefore, it is important to implement an amount of pressure that the user can comfortably apply on a touchscreen user interface. Finding this point can be tricky: a number of female participants were observed to have difficulties applying a pressure level that male participants had no trouble with. Moreover, the comfortable amount of pressure was observed to differ according to how the mobile device was held, for example between holding the device in one hand and using the thumb for input, and holding the device in one hand and using the index finger of the other hand for input.

The third aspect for consideration is the physical surface of the touchscreen. Several participants noted that the interface felt stiff when the pressure requirement was at its highest setting. Also, a participant commented that the affective quality of the interface felt increasingly hard with higher pressure requirements because he could feel the hard surface of the touchscreen more strongly. These observations show that the physical surface plays a significant role in forming the affective quality of a touchscreen user interface when the Weight factor is implemented. The physical surface was important in terms of usability as well. Some finger actions, such as dragging, were difficult to perform with high pressure applied because of the increased friction with the surface. Thus, the texture and friction level of the physical surface should be considered when implementing the Weight factor.

Lastly, there were noteworthy comments on the implementation of the Weight factor related to the context of use. Many participants acknowledged that implementing a pressure requirement would be useful in game control and in preventing unintended operations on a touchscreen interface. This contextual factor is also relevant to affective quality, as mentioned above in the case of preferring the pressure-requiring interface for viewing images of luxury sedans.

Through quantitative and qualitative analyses of our experiment, we were able to gain a better understanding of how the Weight factor can be used to design the affective quality of motion feedback, and we identified several aspects for a successful implementation of the Weight factor in mobile touchscreen user interfaces. Although further research is required for a more practical understanding, these findings can be used as basic guidelines for the design of the pressure-sensitive mobile touchscreen user interfaces to come in the near future.
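The first guideline above (motion feedback that tracks the building pressure, plus a distinct cue once the requirement is reached) can be sketched as follows; the function name, travel distance, and feedback dictionary are our illustrative assumptions, not the authors' implementation:

```python
def press_feedback(pressure, required, max_travel_px=6.0):
    """Map a raw pressure reading to visual feedback parameters.

    The control element (e.g. a button) travels proportionally to the
    applied pressure, substituting for tactile feedback; 'activated'
    signals the moment the required pressure is reached, at which point
    a distinct cue (highlight, state change) should be shown.
    """
    progress = min(pressure / required, 1.0) if required else 1.0
    return {
        "travel_px": progress * max_travel_px,  # element sinks as pressure grows
        "activated": pressure >= required,      # threshold-reached cue
    }

print(press_feedback(125, 500))  # partial travel, no activation cue
print(press_feedback(500, 500))  # full travel, activation cue fires
```

Driving both signals from the same pressure reading keeps the partial motion feedback and the threshold cue consistent, which is what the guideline requires of a motion-feedback substitute for tactile feedback.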
7. Future work
The methods applied in this study face several limitations for the design of motion feedback in terms of affective quality. A major part of future work will need to focus on overcoming these limitations to provide a more elaborate and holistic way to design motion feedback. This would involve investigating factors that have not been scrutinized in this study, such as the context of the motion feedback. This includes the representation of the object that displays the motion, the narrative in which the motion is positioned, and other sensory cues that can occur at the same time. These contextual factors are likely to have significant influences on the perception of affective quality, as they are addressed in the elements of an experience, such as anticipating, connecting, and interpreting (Wright et al., 2005). This issue also relates to the wider perspective of emotion in HCI, which takes social and temporal aspects into account (Palen and Bødker, 2008). Thus, additional research is required to determine which contextual factors come into play and how much influence they have on the perception of motion feedback with respect to different user groups. This is an important aspect of future investigation for the implementation of the Weight factor as well.
Another area of future work concerns investigating the relationship of motion feedback with a wider range of affective qualities. As an initial stage of research on this topic, the affective qualities examined in this study were limited to selected types for the investigation of different categories of perception. Nevertheless, studies on a wider range of affective qualities, encompassing the perceptions prevalent in user interfaces, are required to raise the practicality of our understanding.
In terms of the Weight factor, a more detailed investigation of the levels of pressure is expected to provide greater insight into its potential. In analog interfaces, diverse pressure requirement levels play a significant role in how the affective quality of an interface is perceived (Jun et al., 2010). From the results found in this study, we can expect this to be feasible in touchscreen user interfaces as well. Thus, future work with a more precise and sensitive hardware prototype will be able to provide a better understanding of an effective implementation of the Weight factor.
Overall, the future work described above centers on two categories of investigation: one is to deepen our understanding of the factors involved in how motion feedback influences affective qualities, and the other is to expand our understanding of how motion feedback influences the wider variety of affective qualities perceivable in touchscreen user interfaces. With these understandings and the implementation of the Weight factor, designers will gain significant leverage in their control and communication of interactivity in touchscreen user interfaces towards positive emotions and an enhanced UX.
References
Arduino. <http://www.arduino.cc> (accessed August 29, 2010).
Blasko, G., Feiner, S., 2004. Single-handed interaction techniques for multiple pressure-sensitive strips. In: Proceedings of the CHI '04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, April 24–29, pp. 1461–1464.
Brewster, S., Hughes, M., 2009. Pressure-based text entry for mobile devices. In: Proceedings of the 11th International Conference on Human–Computer Interaction with Mobile Devices and Services, Bonn, Germany, September 15–18, Article No. 9.
Chi, D., 1999. A Motion Control Scheme for Animating Expressive Arm Movements. Ph.D. Dissertation, Computer and Information Science, University of Pennsylvania, USA.
Choi, W., 2008. A Study on the User Interface Design of Touch Screen Mobile Phone. M.S. Thesis, Kookmin University, Korea.
Crawford, C., 2003. The Art of Interactive Design: A Euphonious and Illuminating Guide to Building Successful Software. No Starch Press, San Francisco.
Desmet, P., Overbeeke, C., Tax, S., 2001. Designing products with added emotional value: development and application of an approach for research through design. The Design Journal 4 (1), 32–47.
Desmet, P., Hekkert, P., 2007. Framework of product experience. International Journal of Design 1 (1), 57–66.
Forlines, C., Shen, C., Buxton, B., 2005. Glimpse: a novel input model for multi-level devices. In: Proceedings of the CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, USA, April 2–7, pp. 1375–1378.
Graham-Rowe, D., 2010. Mobile touch screen could soon feel the pressure. Technology Review. <http://www.technologyreview.com/communications/24414/> (accessed October 31, 2010).
ISO FDIS 9241-210, 2009. Ergonomics of Human System Interaction—Part 210: Human-Centred Design for Interactive Systems (formerly known as 13407). International Organization for Standardization (ISO), Switzerland.
Jacob, R., Girouard, A., Hirshfield, L., Horn, M., Shaer, O., Solovey, E., Zigelbaum, J., 2008. Reality-based interaction: a framework for post-WIMP interfaces. In: Proceedings of the 26th International Conference on Human Factors in Computing Systems, Florence, Italy, April 5–10, pp. 201–210.
Jun, C., Choo, H., Park, S., Kim, L., Shin, S., 2010. Torque profile measuring and sensibility evaluation of a haptic device. Transactions of the Society of CAD/CAM Engineers 15 (3), 222–233.
Kim, J., Lee, L., Choi, D., 2003. Designing emotionally evocative homepages: an empirical study of the quantitative relations between design factors and emotional dimensions. International Journal of Human-Computer Studies 59, 899–940.
Laban, R., Lawrence, F., 1974. Effort: Economy in Body Movement. Plays Inc., Boston.
Lim, Y., Lee, S., Lee, K., 2009. Interactivity attributes: a new way of thinking and describing interactivity. In: Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, USA, April 4–9, pp. 105–108.
Lim, Y., Donaldson, J., Jung, H., Kunz, B., Royer, D., Ramalingam, S., Thirumaran, S., Stolterman, E., 2008. Emotional experience and interaction design. Lecture Notes in Computer Science 4868, 116–129.
Miyaki, T., Rekimoto, J., 2009. GraspZoom: zooming and scrolling control model for single-handed mobile interaction. In: Proceedings of the 11th International Conference on Human–Computer Interaction with Mobile Devices and Services, Bonn, Germany, September 15–18, Article No. 11.
Nezu, T., 2010. Sony's New Touch Panel Detects Amount of Pressure. Tech-On, June 7. <http://techon.nikkeibp.co.jp/english/NEWS_EN/20100607/183263/?P=2> (accessed October 31, 2010).
Norman, D., 2005. Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books, New York.
Park, D., Lee, J., 2010. Investigating the affective quality of motion in user interfaces to improve user experience. Lecture Notes in Computer Science 6243, 67–78.
Peratech. <http://www.peratech.com/> (accessed October 31, 2010).
Palen, L., Bødker, S., 2008. Don't get emotional. Lecture Notes in Computer Science 4868, 12–22.
Robinson, S., 2006. Touch Screen Phones Ready for Take Off. Strategy Analytics, June 28. <http://www.strategyanalytics.com/default.aspx?mod=PressReleaseViewer&a0=2970> (accessed October 31, 2010).
Russell, J., 2003. Core affect and the psychological construction of emotion. Psychological Review 110 (1), 145–172.
Schenkman, B., Jonsson, F., 2000. Aesthetics and preferences of Web pages. Behaviour & Information Technology 19, 367–377.
Stewart, C., Rohs, M., Kratz, S., Essl, G., 2010. Characteristics of pressure-based input for mobile devices. In: Proceedings of the 28th International Conference on Human Factors in Computing Systems, Atlanta, USA, April 10–15, pp. 801–810.
Stins, J., Beek, P., 2007. Effects of affective picture viewing on postural control. BMC Neuroscience 8, 83.
Stone, D., Jarrett, C., Woodroffe, M., Minocha, S., 2005. User Interface Design and Evaluation. Morgan Kaufmann, Amsterdam.
Svanæs, D., 2000. Understanding Interactivity: Steps to a Phenomenology of Human–Computer Interaction. Ph.D. Dissertation, Norges Teknisk-naturvitenskapelige Universitet.
van der Heijden, H., 2003. Factors influencing the usage of Web sites: the case of a generic portal in the Netherlands. Information and Management 40, 541–549.
Vermeeren, A., Kort, J., Cremers, A., Fokker, J., 2008. Comparing UX measurements, a case study. In: Law, E., Bevan, N., Christou, G., Springett, M., Larusdottir, M. (Eds.), Proceedings of the International Workshop on Meaningful Measures: Valid Useful Experience Measurement, Reykjavik, Iceland, June 18, pp. 72–78.
Voorhees, S., 2008. Touch Screen Market to Reach $3.3 Billion by 2015. DisplaySearch, May 8. <http://www.displaysearch.com/cps/rde/xchg/displaysearch/hs.xsl/new_displaysearch_touch_panel_market_report_analyzes_fast_growing_area.asp> (accessed October 31, 2010).
Wellings, T., Williams, M., Tennant, C., 2010. Understanding customers' holistic perception of switches in automotive human–machine interfaces. Applied Ergonomics 41, 8–17.
Wensveen, S., 2005. A Tangibility Approach to Affective Interaction. Ph.D. Dissertation, Delft University of Technology, The Netherlands.
Wikipedia. <http://en.wikipedia.org/wiki/Interactivity> (accessed October 31, 2010).
Wilson, G., Stewart, C., Brewster, S., 2010. Pressure-based menu selection for mobile devices. In: Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services, September 7–10, Lisbon, Portugal, pp. 181–190.
Wright, P., McCarthy, J., Meekison, L., 2005. Making sense of experience. In: Blythe, M., Monk, A., Overbeeke, C., Wright, P.C. (Eds.), Funology: From Usability to User Enjoyment. Kluwer, Dordrecht, pp. 43–53.
Yook, H., 2009. A Study on the Types of Interactive Motions in Mobile Touch Interface. Ph.D. Dissertation, Hongik University, Korea.
Zhang, P., Li, N., 2005. The importance of affective quality. Communications of the ACM 48, 105–108.