Evaluating Search Interfaces
Marti Hearst
UC Berkeley
Enterprise Search Summit West
Search UI Design Panel
Evaluating Search Interfaces
This is very hard to do well. First, a recap of iterative design and evaluation; then I’ll present some do’s and don’ts.
Interface design is iterative
Design → Prototype → Evaluate → (and repeat)
Discount Testing vs. Formal Testing
Discount testing:
- Fast
- A small number of participants (5)
- Test mock-ups and prototypes in addition to finished designs
- Learn about what doesn’t work, a bit about what does, and maybe new good ideas for future iterations

Formal testing:
- More time-consuming
- Need many participants (often still too few); a sample-size sketch follows this list
- Test particular components or principles to be used by others
- Learn whether something is better than something else, and by how much
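For a rough sense of why formal tests need many participants, here is a minimal sketch of a standard power calculation, assuming Python with statsmodels; the effect size is a hypothetical “medium” effect, not a figure from the talk.

from statsmodels.stats.power import TTestIndPower

# Participants needed per group for a two-group comparison that can
# detect a medium effect (Cohen's d = 0.5) at alpha = 0.05 with 80% power.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64

Dozens of participants per condition just to detect a medium effect is exactly why discount methods dominate early iterations.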
Qualitative Semi-Formal Studies
After the design has been mocked up, evaluated, and redesigned several times:
- Evaluate the system holistically or in parts with a large user base
- Watch the participants use the system on their own queries
- Use Likert scales to get subjective responses to different features (a scoring sketch follows this list)
- Find bugs
- Find features/tasks that need to be streamlined
- Determine the next round of useful features
Then refine and test again.
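To make the Likert-scale step concrete, here is a minimal sketch of tabulating subjective responses, assuming Python with pandas; the feature names and ratings are hypothetical.

import pandas as pd

# Hypothetical 1-5 Likert responses, one row per participant per feature.
responses = pd.DataFrame({
    "feature": ["facets", "facets", "facets", "preview", "preview", "preview"],
    "rating":  [4, 5, 3, 2, 3, 2],
})

# Report the median alongside the mean: Likert data is ordinal,
# so the median is the safer summary.
print(responses.groupby("feature")["rating"].agg(["median", "mean", "count"]))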
Do Use Motivated Participants
Participants need to know and care about the search goal
(Jared Spool, UIE.com)
Do Longitudinal Studies
Have people use the system for their needs for several weeks or months.
Observe changes in behavior and in subjective preferences.
Do Add New Features Gradually
If you’re doing something new with search, start simple, see what works, and then add more features, evaluating each addition as you go.
Beware of Query Sensitivity
In search engine comparisons, variability between queries/tasks can be greater than variability between systems.
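One common way to control for this is to pair the comparison per query instead of pooling scores. A minimal sketch, assuming Python with scipy and hypothetical scores for the same five queries on two systems:

from scipy.stats import ttest_ind, ttest_rel

# Hypothetical scores for the same five queries on two systems.
# Query difficulty varies far more than the systems differ.
system_a = [0.90, 0.30, 0.70, 0.20, 0.80]
system_b = [0.85, 0.15, 0.65, 0.12, 0.72]

# Pooled (unpaired) test: per-query variability swamps the difference.
print(ttest_ind(system_a, system_b))
# Paired test: each query is compared with itself, exposing the
# small but consistent advantage of system A.
print(ttest_rel(system_a, system_b))

The same logic argues for running every system on a fixed, representative query set rather than letting each see different queries.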
Beware of Cool vs. Usable
Some things are eye-catching, but serve best to draw the user in.
Will they really like it over time? Or, if they don’t like it at first, will they learn to like it? (This is rarer.)
Do Compare Against a Strong Baseline
Compare your new idea against the best, most popular current solution.
A good test: “How often would you use this system?”
Subjective vs. Quantitative Measures
Time to complete the task can be a misleading metric.
Subjective impressions are key for determining search interface success.
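Since task time alone can mislead, one option is to report time and subjective ratings side by side. A minimal sketch, assuming Python with pandas and a hypothetical session log:

import pandas as pd

# Hypothetical per-participant results for two interfaces.
sessions = pd.DataFrame({
    "system": ["A", "A", "A", "B", "B", "B"],
    "time_sec": [95, 110, 102, 80, 78, 85],
    "satisfaction": [5, 4, 5, 2, 3, 2],  # 1-5 Likert rating
})

# The faster system is not necessarily the preferred one; report both.
print(sessions.groupby("system").agg(
    median_time=("time_sec", "median"),
    mean_satisfaction=("satisfaction", "mean"),
))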
Summary
Search evaluation is hard because of huge variations in:
- Information needs
- Searchers’ knowledge and skills
- Collection contents

A good strategy is to:
- Add a few features at a time, and test as you add
- Obtain subjective preference information
- Measure over time using longitudinal studies