Site Search Analytics in a Nutshell

  • Published on
    17-Aug-2014

DESCRIPTION

Originally presented at SXSW on March 13, 2011, on a panel with Fred Beecher and Austin Govella. Modified and updated for a Web 2.0 Expo talk (October 12, 2011), the UX Web Summit (September 26, 2012), and Webdagene (September 10, 2013).

Transcript

Site Search Analytics in a Nutshell
Louis Rosenfeld | lou@louisrosenfeld.com | @louisrosenfeld
Webdagene, 10 September 2013

Hello, my name is Lou
www.louisrosenfeld.com | www.rosenfeldmedia.com

Let's look at the data

No, let's look at the REAL data
Critical elements (shown in bold on the slide): IP address, time/date stamp, query, and number of results:

XXX.XXX.X.104 - - [10/Jul/2006:10:25:46 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ud=1&site=AllSites&ie=UTF-8&client=www&oe=UTF-8&proxystylesheet=www&q=lincense+plate&ip=XXX.XXX.X.104 HTTP/1.1" 200 971 0 0.02

XXX.XXX.X.104 - - [10/Jul/2006:10:25:48 -0800] "GET /search?access=p&entqr=0&output=xml_no_dtd&sort=date%3AD%3AL%3Ad1&ie=UTF-8&client=www&q=license+plate&ud=1&site=AllSites&spell=1&oe=UTF-8&proxystylesheet=www&ip=XXX.XXX.X.104 HTTP/1.1" 200 8283 146 0.16

What are users searching?
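The analysis this log excerpt implies is easy to sketch. Here is a minimal Python example using simplified sample lines modeled on the excerpt above; the parameter layout and the trailing result-count field are assumptions read off the slide, not a documented format. It extracts each query and its result count, then builds the query-frequency, percent, and cumulative-percent view the deck describes.

```python
import re
import urllib.parse
from collections import Counter

# Simplified sample lines modeled on the slide's Google Search Appliance-style
# excerpt; the trailing fields are assumed to be status, response size,
# result count, and response time.
LOG_LINES = [
    '1.2.3.104 - - [10/Jul/2006:10:25:46 -0800] '
    '"GET /search?site=AllSites&q=lincense+plate HTTP/1.1" 200 971 0 0.02',
    '1.2.3.104 - - [10/Jul/2006:10:25:48 -0800] '
    '"GET /search?site=AllSites&q=license+plate&spell=1 HTTP/1.1" 200 8283 146 0.16',
]

LINE_RE = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d+) \d+ (?P<results>\d+)'
)

def parse_line(line):
    """Extract (query, result count) from one search-log line, or None."""
    m = LINE_RE.search(line)
    if not m:
        return None
    params = urllib.parse.parse_qs(urllib.parse.urlsplit(m.group("path")).query)
    return params.get("q", [""])[0], int(m.group("results"))  # '+' decodes to ' '

hits = [h for h in map(parse_line, LOG_LINES) if h]
failures = [q for q, n in hits if n == 0]  # zero-result queries = likely failures

# Queries sorted by frequency, with percent and cumulative percent of activity.
counts = Counter(q for q, _ in hits)
total, cum = sum(counts.values()), 0.0
for q, n in counts.most_common():
    cum += 100 * n / total
    print(f"{q:20s} {100 * n / total:6.2f}%  cum {cum:6.2f}%")
print("failed searches:", failures)
```

Zero-result queries, such as the misspelled "lincense plate" in the sample, are exactly the failures the deck asks about.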
How often are users failing?

SSA is semantically rich data, and...
Queries sorted by frequency

...what users want, in their own words

A little goes a long way
A handful of queries/tasks/ways to navigate/features/documents meet the needs of your most important audiences:
- Not all queries are distributed equally
- Nor do they diminish gradually
- The 80/20 rule isn't quite accurate

(and the tail is quite long)
The Long Tail is much longer than you'd suspect

The Zipf Distribution, textually

Some things you can do with SSA
1. Make it harder to get lost in deep content
2. Make search smarter
3. Reduce jargon
4. Learn how your audiences differ
5. Know when to publish what
6. Own and enjoy your failures
7. Avoid disaster
8. Predict the future

#1: Make it harder to get lost

Start with basic SSA data: queries and query frequency
Percent: volume of search activity for a unique query during a particular time period
Cumulative percent: running sum of percentages

Tease out common content types
Took an hour to:
- Analyze the top 50 queries (20% of all search activity)
- Ask and iterate: what kind of content would users be looking for when they searched these terms?
- Add cumulative percentages
Result: a prioritized list of potential content types
#1) application: 11.77%
#2) reference: 10.5%
#3) instructions: 8.6%
#4) main/navigation pages: 5.91%
#5) contact info: 5.79%
#6) news/announcements: 4.27%

Clear content types lead to better contextual navigation
artist descriptions | album reviews | album pages | artist bios | discography | TV listings

#2: Make search smarter

Clear content types improve search performance
Content objects related to products; raw search results

Contextualizing advanced features

Session data suggest progression and context
Search session patterns:
1. solar energy → 2. how solar energy works
1. solar energy → 2. energy
1. solar energy → 2. solar energy charts
1. solar energy → 2. explain solar energy
1. solar energy → 2. solar energy news

Recognizing proper nouns, dates, and unique ID#s

#3: Reduce jargon

Saving the brand by killing jargon at a community college
Jargon related to online education: FlexEd, COD, College on Demand
Marketing's solution: an expensive campaign to educate the public (via posters, brochures)
The numbers (from SSA), by query rank:
#22: online* (*"online" was part of 213 queries)
#101: COD
#259: College on Demand
#389: FlexTrack
Result: content relabeled, money saved

#4: Learn how your audiences differ

Who cares about what?

Why analyze queries by audience?
- Fortify your personas with data
- Learn about differences between audiences:
  Open University enquirers: 16 of 25 top queries are for subjects not taught at OU
  Open University students: search for course codes and topics dealing with completing their program
- Determine what's commonly important to all audiences (these queries had better work well)

#5: Know when to publish what

Interest in the football team: going... ...going... gone. Time to study!

Before Tax Day / After Tax Day

#6: Own and enjoy your failures

Failed navigation? Examining unexpected searching
Look for places where searches happen beyond the main page. What's going on? Navigational failure? Content failure? Something else?

Where navigation is failing (Professional Resources page)
Do users and AIGA mean different things by "Professional Resources"?

Comparing what users find and what they want

Failed business goals? Developing custom metrics
Netflix asks:
1. Which movies are most frequently searched? (query count)
2. Which of those are most frequently clicked through? (MDP views)
3. Which of those are least frequently added to the queue? (queue adds)

#7: Avoid disasters

The new and improved search engine that wasn't
Vanguard used SSA to help benchmark the existing search engine's performance and to help select a new engine. The new search engine performed poorly, but IT needed convincing to delay the launch.
Information architect and dev team, in a meeting:
"Search seems to have a few problems." "Nah. Where's the proof? You can't tell for sure."

What to do? Test the performance of common queries
Before-and-after testing using two metrics:
1. Relevance: how reliably the search engine returns the best matches first
2. Precision: the proportion of relevant results clustered at the top of the list

Old engine (target) and new engine compared ("uh-oh" vs. "better")
Note: low relevance and high precision scores are optimal
More on the Vanguard case study: http://bit.ly/D3B8c

#8: Predict the future

Shaping the Financial Times' editorial agenda
The FT compares:
- Spiking queries for proper nouns (i.e., people and companies)
- Recent editorial coverage of those people and companies
Discrepancy? Breaking story?! Let the editors know! Seed your ...

Can SSA bring us together?
Lou's TABLE OF OVERGENERALIZED DICHOTOMIES (Web Analytics vs. User Experience)
- What they analyze: users' behaviors (what's happening) vs. users' intentions and motives (why those things happen)
- What methods they employ: quantitative methods to determine what's happening vs. qualitative methods for explaining why things happen
- What they're trying to achieve: helping the organization meet goals (expressed as KPIs) vs. helping users achieve goals (expressed as tasks or topics of interest)
- How they use data: measuring performance (goal-driven analysis) vs. uncovering patterns and surprises (emergent analysis)
- What kind of data they use: statistical data ("real" data in large volumes, full of errors) vs. descriptive data (in small volumes, gen...)