
Adobe Analytics White Paper

Reenergizing Your Web Analytics Program
By Adam Greco, Senior Partner at Web Analytics Demystified
Sponsored by Adobe

Introduction
Web analytics, or digital analytics as some now call it, is an exciting field. Over the past decade, the online channel has exploded with almost every business and organization moving online. Whether it be via a traditional website, mobile applications, or social networks, consumers expect that they can connect to your organization any time across any device.

As the use of the Internet grew, along came the need to track how consumers were using the new digital channels. This desire to see click paths, results of online marketing campaigns, and online key performance indicators (KPIs) created the web analytics industry. What started as a way to see when visitors abandoned websites or how many clicks were generated from Google or Yahoo! has evolved into a self-sustaining industry, including web analytics tools, programmers to collect the data, and analysts to provide meaningful insights.

As a reader of this white paper, odds are that you manage a web analytics program or are part of a team related to web analytics. Unless you were part of the team that initially brought web analytics to your organization, it is difficult to describe the genuine excitement that surrounded most web analytics launches during the past decade. For those who have not experienced it, let me re-kindle some of the fervor that existed at that time. Imagine that your organization had spent great sums of money building a website. It involved design, functional specifications, executive approvals and so on. Once the website was up and running, the only way to determine how it was functioning was by going through rows of log files to find clues about website conversion rates and path flows.

A few years later came the emergence of new web analytics tools, like Webtrends, Coremetrics, Omniture SiteCatalyst (the precursor to Adobe Analytics), and Google Analytics. Suddenly, it became possible to use JavaScript tags to collect massive amounts of data and software-as-a-service (SaaS) interfaces to visualize your website data. You could export the data to Microsoft Excel, be alerted when KPIs go up or down, and integrate web analytics data with data from your other marketing systems. These were revolutionary times!

While this experience may not excite everyone, to an online marketer or a data geek, the emergence of web analytics was a watershed moment. Companies were falling over themselves to implement these new web analytics tools. Executives were writing checks left and right, and budgets were aplenty for these new, exciting technologies. Board meetings were filled with web analytics reports as it reached “flavor of the month” status previously enjoyed by customer relationship management (CRM) tools and enterprise resource planning (ERP) tools before them.

So now I ask you: Do you feel that your organization has the same level of excitement about web analytics today as just described? Unfortunately, for most companies that I work with, the answer is a resounding “No!” Somewhere along the way, web analytics has fallen from the spotlight in many (not all) organizations. In most cases, those running web analytics inherited their position from someone else and were never part of the initial web analytics rollout. They never had a chance to feel the enthusiasm and optimism that once existed. It is analogous to inheriting a used car that is pretty worn out and wondering what the car was like when it was brand new.

Many people I meet are left with a sagging web analytics program, relatively poor end-user adoption, low team headcount, and questions about whether additional investment should be made in web analytics staff and tools. If you are using a paid web analytics tool, the topic may arise when your contract is up for renewal. Many companies I work with dread going back to their bosses to ask for renewal money for web analytics tools that aren’t producing the return on investment they used to.

Table of contents

• Introduction

• Why do web analytics programs fail?

• Assessing your current state

• Reenergizing your web analytics program

• Conclusion


Believe it or not, you are not alone. As I travel the world in my consulting business, I find that many of the web analytics programs that I encounter have stagnated. The good news is that it doesn’t have to be this way.

In my time working for leading web analytics vendors, as a director of web analytics at Salesforce.com and as a senior partner at Web Analytics Demystified, I have had the opportunity to help turn around hundreds of web analytics programs. In this white paper, I share some techniques that I use to reenergize web analytics programs. Even if you are fortunate enough to have a fully functioning web analytics program, you might find these techniques useful for taking your program to a higher level of performance.

Why do web analytics programs fail?
Many years ago, when I worked for Omniture (prior to the acquisition by Adobe), I was part of a special team that was sent into accounts that were struggling to get value from their investment in web analytics. For fans of the movie Pulp Fiction, you could think of me as The Wolf (played by Harvey Keitel). I was sent in to clean up situations that had gone bad. While this was a stressful role, it did afford me the opportunity to identify the most common causes of web analytics program failures.

The biggest lesson I learned in this role was that the majority of these failures had nothing to do with the web analytics tool.

I understand how easy it is to blame your stagnating web analytics program on vendors with which you partner. But this is akin to blaming your scale for the fact that you cannot lose weight. When I worked for a web analytics vendor, I would typically get an earful about all the missing features and interface issues that had led to the demise of the web analytics program. But as I walked these organizations through the reenergizing process, they came to realize that the web analytics tool was nothing more than a red herring.

While not an exhaustive list, the following are many of the culprits I have seen derail web analytics programs. Awareness of these potential issues can help you determine to what extent they have impacted your web analytics program and possibly steer clear of them in the future.

Incorrect business requirements
By far, the most prevalent issue I find when dealing with troubled web analytics programs is a misalignment in business requirements. When working with companies, I often interview web analytics stakeholders and find that the questions they have are not congruent with the data and reports being provided by the web analytics team. I attribute this to the fact that many organizations believe that once they implement a web analytics tool, they can check it off the to-do list and then move on to the analysis phase of the web analytics program.

Unfortunately, that is not the case. Businesses change all the time, and with these changes comes a need for different and new insights. Web analytics programs (and their corresponding web analytics implementations) have to constantly adapt in order to stay relevant. This requires the web analytics team to be on the lookout for new business requirements and to have the staff and tools needed to collect new data points that are important to the company.

For example, if your web analytics implementation doesn’t take into account social network likes, shares, follows, and so on, you may not be keeping with the times. If you aren’t tracking mobile or video behavior, you may soon find yourself excluded from key meetings and presentations. While the long-term solution is to put in place processes to re-evaluate the usefulness of existing web analytics requirements, in the next sections, I will provide some short-term steps you can take to avoid business requirement misalignment.

Wrong mindset toward web analytics
What is the purpose of web analytics? Do you know? Does your company know? If you asked your executives and stakeholders to write down the reason your organization has invested in web analytics, how many different answers would you receive? I have tried this and can tell you the results are not pretty. The truth is that different people at your organization have different needs and expectations when it comes to what your team is supposed to provide. Some think web analytics is all about seeing the performance of online marketing campaigns. Others think its primary purpose is to inform web design decisions. No matter how much you fight it, some will just want to know how many hits their pages get (though these folks are probably still using dialup Internet access!).


If your stakeholders look at your team as the “reporting” team that spits out the same numbers each day or week on a massive dashboard, then you are not going to be viewed as a strategic function in the company. It is incumbent upon your team to level with your constituents about why your team exists, and your philosophy toward web analytics should drive all that your team does. In the next sections, I will show how to reestablish your team in the organization and hopefully regain some of the stature it had when web analytics was first introduced.

Lack of executive support
As with any type of business function, if you don’t have executive support, making progress is difficult. Part of the reason that many web analytics programs don’t have executive support is related to items just discussed. If you are not answering relevant business questions and those in the company are unsure of your mission, it is hard to get executives to put their name and reputation on the line for you. Executives are shrewd animals in that they like to attach themselves to successful projects and teams and avoid teams that are not perceived as valuable.

When I joined Salesforce.com as the director of web analytics, the web analytics program was struggling. The team of two was reporting basic web metrics in a routinized manner and little was done with the data. Very few end-users logged in to the analytics tool, primarily because the data was not trustworthy. There were even discussions about replacing the web analytics tool to cut costs.

However, using the techniques that I will cover in this paper, within one year, we reset team expectations, realigned our business requirements, reimplemented the web analytics tool and started adding value to the company. For the first time in years, people were asking (sometimes begging) for a login to see the new data we were collecting. The perception of our team within the organization skyrocketed. We began reporting to a new person in the organization, and before we knew it, we had budget for new tools, and our headcount expanded to five people. Even the chief marketing officer (CMO) took notice of the progress we had made, and we earned the executive support we desperately needed.

Poor or inexperienced resources
Another factor in the success or failure of a web analytics program is the human resources involved. Since the web analytics industry is so new, you won’t find many people who majored in web analytics in college. Most people in the industry started somewhere else and accidentally ended up in web analytics. Adding to this, web analytics professionals are in such high demand that the average employee turnover rate is 18 months. This means that most web analytics teams are staffed by people with little experience, many of whom have not been with the company long enough to truly understand the business. The latter contributes to the misalignment of business requirements discussed previously.

When I work with companies struggling with web analytics, I encounter directors of web analytics who don’t understand the basics of which data can and cannot be collected via JavaScript tags, programmers with little regard for privacy concerns, and web analysts who think the only type of chart is a pie chart. And oftentimes these are not small companies but, rather, Fortune 500 companies. This is equivalent to fielding a major league baseball team with a bunch of guys who have never played at the professional level. You won’t win many World Series championships doing that (as a Chicago Cubs fan, I know!).

Poor data quality
Lower on the list of causes of failure, but still important, is data quality. If you ask your stakeholders to make important business decisions based upon data and insights your team is providing, you had better make sure that your data is correct. Over the years, I have worked with hundreds of web analytics teams and, at the onset, I always take the time to look at the data being collected to see how good or bad it is. While many don’t like to “get into the weeds,” I think it is critical since data is the foundation of your web analytics program.

When I crack open web analytics reports, I commonly see data points that have indecipherable values, incorrect values, garbage data, or sometimes no data at all. While your team might know which data elements can be trusted and which cannot, the distributed nature of web analytics tools is such that end users can log in and look at data without your knowledge. I have seen many cases in which the web analytics team was called into a manager’s office to explain an analysis done by their direct report, only to have to inform the manager that their associates’ analysis was done using invalid data or reports. Imagine how that affects the perception of your web analytics program within the enterprise. You can almost visualize the executives running away from your team.


When I work with clients, I force them to focus on data quality and stress that they need to have the highest confidence in their web analytics data. My data quality mantra is:

“If you aren’t willing to put your name and reputation within the company behind a particular web analytics report, then for all intents and purposes, the data might as well be wrong.”

However, one qualifier I will make is that it is important to keep in mind that web analytics data is inherently flawed due to cookie deletion, do-not-track browser features, and the reliance on JavaScript tags. Web analytics data collection is far from being an exact science. Therefore, my focus on data quality is not about making sure you are collecting all of the data, but rather that the data you are successfully collecting is as correct as possible. I would rather have a web analytics implementation that has 10 data elements than one with 100 data points but with questionable data quality.

I have seen many organizations fall into the trap of trying to make web analytics data perfect and match back-end operational systems. While it is worthwhile to have your web analytics data be as close to trusted operational systems as possible, such as orders in Adobe Analytics matching back-end orders, if the data matches 95% of the time, your web analyses will still be valid. This point also applies to the folks who implement two web analytics tools and spend their days trying to get the two tools to match each other (which rarely happens!).
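As a quick illustration of this rule of thumb, here is a minimal sketch of a reconciliation check between analytics-reported orders and a back-end system. The daily counts and the 5% tolerance are hypothetical, and any real export format will vary by vendor; the point is to flag discrepancies for investigation rather than to chase a perfect match.

```python
def check_discrepancy(reported, backend, tolerance=0.05):
    """Return True if the analytics number is within tolerance of the back-end."""
    return abs(reported - backend) / backend <= tolerance

# Hypothetical daily order counts: analytics-reported vs. back-end system.
analytics_orders = {"2013-05-01": 412, "2013-05-02": 389, "2013-05-03": 450}
backend_orders = {"2013-05-01": 430, "2013-05-02": 401, "2013-05-03": 455}

for day, backend in sorted(backend_orders.items()):
    reported = analytics_orders.get(day, 0)
    gap = abs(reported - backend) / backend
    status = "OK" if check_discrepancy(reported, backend) else "INVESTIGATE"
    print(f"{day}: {reported} vs {backend} ({gap:.1%}) -> {status}")
```

A daily check like this keeps the focus where it belongs: the data you collect should be as correct as possible, not identical to the operational system.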

Insufficient web analytics tool power or knowledge
The last factor I will cover related to web analytics program derailment is a lack of web analytics tool power and knowledge. While it is popular to say that whichever web analytics tool you use and how much you know about it doesn’t make a difference, my experience contradicts this.

The first step is making sure you have the right tools. While most leading web analytics tools have a common set of features, there is still a noticeable disparity amongst them. This disparity can take the form of features one tool has that another does not, or simply the approach that a particular tool takes to accomplish the same task.

While you can debate which tool is best, the truth is that every company has different needs and there may be reasons why one tool is better for your organization than another one. But if you need to collect hundreds of data points and the tool you use only allows you to track 10, you may have a problem. If the interface of the tool you use is difficult for your end users to understand, they probably won’t use it. If you are short-staffed and require a lot of outside help, you should make sure that the tool you use has an ecosystem around it from which you can pull qualified resources if needed.

Regardless of which tool you use, the next important aspect is how well you know how to use your tools. You can have the most feature-rich web analytics tool on the planet, but if you are only using 20% of its capabilities, you still might struggle to succeed. When I was in “The Wolf” mode back in my Omniture days, I would often encounter clients who insisted that all their problems were attributable to deficiencies in the tool. After they were done complaining about the tool, I would ask them how well they thought they knew how to leverage the tool in which they had invested significantly. When they responded, brimming with confidence about their tool prowess, I would challenge them to a facetious bet:

“Would you be willing to take a winner take all bet of $5,000 of your own personal money against $5,000 of mine that you cannot tell me how to use the tool to collect the necessary data to answer three business questions I will pose?”

Suddenly, the room would get very quiet as I withdrew my checkbook and a pen from my laptop bag. I can only recall one instance in which someone at a company was brave enough to take my bet (I let them keep their money). But in the process, they realized that perhaps they hadn’t invested as much in knowing how to use their web analytics tool as they should have. Many hadn’t been through any training since the initial training provided when they first implemented the tool years earlier. Over time, I have found that companies that have people on staff who know how to fully utilize their web analytics tools tend to be more successful.

There are many reasons why web analytics programs can devolve over time. In some cases, just one of the preceding items could cause your downfall, but for the most part, I have seen that it is often a combination of these factors that leads to failure. Like most things in life, you cannot avoid issues completely, but you can take steps to minimize your risk along the way. In the next section, I will show how to determine how bad your situation is and then, in the final section, I will share my advice on how to turn things around as quickly as possible.


Assessing your current state
The first step in reenergizing your web analytics program is to assess where you currently stand. I have seen cases in which web analytics teams think they are doing great, when in reality, end users find their program useless. I have also seen the converse, in which companies expect too much from their web analytics program and are afraid to let anyone use it for fear it isn’t ready yet. Regardless of whether you are a two out of ten or a seven out of ten, it is important to have a realistic understanding of where you are and your team’s perception within the enterprise.

Check the usage logs

I like to start the assessment process by looking at who is logging in to your web analytics tools and how often. While not all stakeholders will use your web analytics tools directly, I find that it is a good proxy for overall web analytics adoption. For example, let’s imagine that your organization has over 200 users with access to log in and view web analytics data. However, upon deeper inspection, it turns out that fewer than 10 users account for 90% of all usage. Granted, it is entirely possible that those 10 users are presenting data to key executives, but this level of tool usage would definitely raise a red flag in my mind. Why do so many people have a login that they never use? Are they not trained? Do they not trust the data? Is the data being collected incongruent with their needs? Is your executive sponsor logging in? If you can’t get the person signing your paycheck to log in, that could be a problem. Conversely, had you found that 90% of users were logging in monthly, it would provide you with a more optimistic viewpoint on how the team is doing.
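This kind of concentration check is easy to automate once you can export login events from your tool. The sketch below is illustrative only; the user IDs and event counts are hypothetical, and the actual export format will differ by vendor.

```python
from collections import Counter

def usage_concentration(events, share=0.9):
    """Return the smallest number of users accounting for `share` of all logins."""
    counts = sorted(Counter(events).values(), reverse=True)
    total, running, users = sum(counts), 0, 0
    for c in counts:
        running += c
        users += 1
        if running >= share * total:
            break
    return users

# Hypothetical login log: one entry per login event, keyed by user ID.
login_events = (["analyst_01"] * 40 + ["analyst_02"] * 30 + ["mgr_03"] * 20
                + ["user_%02d" % i for i in range(4, 14)])  # ten one-time users

print(usage_concentration(login_events))  # three users produce 90% of logins
```

If the number this returns is a tiny fraction of your licensed user base, that is the red flag described above.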

In addition to logged-in usage, the next area you should look at is how many different data points your end users are viewing. If you are collecting 80 data points related to your online properties, but only 12 are used each month, that tells you something. Perhaps your end users aren’t aware of all the interesting data you are collecting. Is training the issue? Are you collecting the right data, or are you just weighing down your web pages unnecessarily? Concurrently, you can look at which information is being shared within the organization. While each tool does this slightly differently, it is usually possible to see how many reports and dashboards are shared publicly and how often each is distributed via email and to whom. It might not bode well if you have only one dashboard shared within the enterprise; nor does it bode well if you have 200 dashboards shared.
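If your tool can tell you which elements were viewed in the reporting period, a simple set difference surfaces the neglected ones. The data-point names below are purely illustrative, not from any particular implementation.

```python
# Hypothetical inventory of collected data points vs. those actually viewed
# in reports over the past month.
collected = {"page_name", "campaign_id", "language", "video_start",
             "social_share", "internal_search_term", "cart_add", "order_total"}
viewed_last_month = {"page_name", "campaign_id", "cart_add", "order_total"}

unused = sorted(collected - viewed_last_month)
usage_rate = len(viewed_last_month) / len(collected)
print(f"{usage_rate:.0%} of data points viewed; candidates for review: {unused}")
```

The "candidates for review" list feeds directly into the questions above: is it a training gap, or data nobody needs?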

I am always amazed at how much you can learn by simply looking at the logs behind your web analytics tool. If you are a web analyst, you should be adept at using data to make more informed business decisions, so it should be second nature for you to check this information regularly and use it to make your web analytics program more effective. You can pull these statistics monthly and report them to your boss as one of your internal team KPIs. If your team is doing a good job at improving the web analytics program, enhancing your implementation, and adding value to the company, your team’s standing should rise and more people will be motivated to log in to your tools.


Voice of customer
Obviously, there is more to assessing your web analytics program than reviewing usage logs, just as clickstream data alone doesn’t always tell the full story in web analytics. Therefore, you should supplement this research with voice of customer data. In this case, I am not referring to Voice of Customer tools, but rather, going out and talking to your stakeholders about what is and isn’t working for them related to your web analytics program. If you feel they will tell you the truth, schedule meetings with them and ask them how you and your team are doing. If you don’t think they will tell you face-to-face, consider doing an anonymous survey. If all else fails, you can hire an outside consulting firm to come in and conduct interviews in hopes that your stakeholders will be more honest with outsiders than with insiders.

Regardless of how you get the information, there are three simple questions that you should ask:

• To what extent do you trust the data you have seen in the web analytics reports?

• How relevant do you feel the data in the web analytics tool is to your job?

• How much value do you derive from the web analytics function at the company?

Getting honest answers to these three questions will tell you most of what you need to improve your web analytics program. The first question tells you whether you have a data quality problem. The second question indicates whether or not your stakeholders think you are collecting the right information to help them do their jobs. The last question addresses a more holistic view of the team’s performance. If your program scores well on the first two questions but poorly on the third, it is likely that stakeholders don’t think your team is doing quality analysis, or you might have a customer service problem. Conversely, if you score well on the third question, but poorly on the others, you might have a great team but poor data quality and a sub-par web analytics implementation. I also recommend adding an open-ended comments field so anyone can optionally add commentary. Conducting a survey like this is inexpensive and takes about five minutes to create. Having already reviewed the usage logs in the previous step, you can strategically target this survey to the right audience. The results are often swift and astounding. Verbatim responses are extremely valuable and often contain examples of times that the data had been incorrect or provide feedback on what could be tracked to add more value.
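If you score the three questions on a numeric scale, the interpretation logic above can be sketched in a few lines. The averages and the cutoff of 5 on a 1-to-10 scale are hypothetical; adjust them to whatever scale your survey uses.

```python
def diagnose(scores, low=5.0):
    """Map low averages on each survey question to the likely problem area."""
    issues = []
    if scores["trust"] < low:
        issues.append("data quality problem")
    if scores["relevance"] < low:
        issues.append("wrong data being collected")
    if scores["value"] < low and scores["trust"] >= low and scores["relevance"] >= low:
        issues.append("analysis quality or customer service problem")
    return issues or ["program scoring well"]

# Hypothetical 1-10 survey averages: strong trust and relevance, weak value.
scores = {"trust": 7.5, "relevance": 7.0, "value": 3.5}
print(diagnose(scores))  # flags an analysis quality or customer service problem
```

Rerunning the same scoring each year against the same questions gives you the baseline trend described below.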

While the overall results can be a tad depressing, I find that it is better to proceed with an accurate understanding of your situation than to live in a fantasy world. You should not be discouraged if the results of this process inform you that things are not going as well as you had hoped. I have found that the act of soliciting this feedback is a vital step in showing your stakeholders that you care and that you want to make things better. The scores you receive will also serve as a baseline for measuring future success. In fact, I recommend that you retain the survey and send it to your stakeholders each year thereafter. By asking the same exact questions you can see how much you have improved in the past year. I even recommend making these scores a portion of your team’s year-end bonus, providing a positive incentive for them to care about data quality, ensure that data being collected is relevant, and to be customer-service focused. And if your research shows that your team is doing great, you can use the data to ask for a raise or more budget!

When in doubt, lock them out
If, for one reason or another, it is not possible for you to pull usage logs or get honest answers from your stakeholders and you think your web analytics program is on life support, there is one other approach, albeit drastic, that you can use to see who is using your data. As the name of this section implies, you can lock end users out of your web analytics tools and unsubscribe them from regularly scheduled reports. I have had several occasions in which I was forced to do this, and it can be a very humbling experience. Imagine locking out all of your users except your own web analytics team and then waiting for the phone to ring and for the complaints to come. Unfortunately, more often than not, the phone doesn’t ring and no one complains. The same people who a couple months earlier made you work nights and weekends to get them information might not even notice that they no longer have access to the data. If you have the stomach for this approach and have no other way to identify who is using your tools, this is a last-ditch way to do research.

If you are not that brave and simply want to see which data elements your end users care about (and you don’t have log data on this), you can disable or hide individual data elements. For example, if you are collecting an attribute for “language” and don’t think it’s being used very often, you can hide this element and wait a few months to see if anyone ever complains. If someone needed it, you would hear about it, and you can then unhide it. I sometimes also use this technique when I am trying to retire esoteric data elements to make room for new ones.


Audit your implementation

For better or worse, a big portion of your web analytics program is your web analytics tool implementation. As part of your current state assessment, I recommend that you consider an implementation audit. Implementation audits are normally conducted by those outside of your organization and allow an objective third party to review what you are tracking and how closely it aligns with industry best practices. I conduct these types of audits regularly and have found that they help organizations identify things they previously didn’t know they could be doing with their implementations. Oftentimes, you “don’t know what you don’t know,” and having an outsider look at your website and your current implementation can help generate new types of ideas. There have been countless times that I have provided new business requirements to clients that have ended up superseding the requirements that the web analytics team and stakeholders identified. This is often because outside consultants and vendors have been involved with hundreds of web analytics implementations, whereas most people at an organization have only been associated with their own.

In addition to identifying new requirements, implementation audits can often identify mistakes in your current implementation or ways to make it more efficient. For example, I may find situations in which a company has different implementations across various brands or geographies. My audit is often the first time that an organization discovers the extent of the issues that exist with its implementation. Depending on how well you know your web analytics tool, having an outsider who specializes in your tool is a great way to ensure that what you currently have implemented is working. This audit process also provides a baseline against which you can compare your new and improved implementation in the future and document the progress your team has made.

Another tool that can be used to audit your current state is a web analytics maturity model assessment. These tools ask you questions about analytics within your organization, show you areas of strength and weakness, and provide suggestions for improvement (a free one provided by Adobe can be accessed here: http://myanalyticsscore.com). If you use these tools, I suggest that you have at least one key stakeholder in your organization complete the assessment to see how your perceptions of your team differ from those of your internal customers.

By using some or all of the preceding steps, you should be in a good position to know where your team stands internally and its perception within the enterprise. This information will help you develop a game plan for improving your stature, and the feedback will be invaluable as you move to the next stage of the reenergizing process.


Reenergizing your web analytics program

So far, we have reviewed some of the key reasons why web analytics programs can stagnate and identified ways to assess your current web analytics program. Now I will dig into the tactical steps that you can take to reenergize your web analytics program using techniques that I have used with clients over the years.

Keep in mind that, based upon the information you found in the assessment stage, not all of the following steps may be applicable to your organization. However, in order to cover all aspects of the reenergizing process, I am asking you to imagine that your web analytics program has hit rock bottom and requires a full resuscitation. Hopefully your situation is not as bad as what I have encountered at some of my clients, but reviewing all the steps involved may still prove useful in the long run.

Thermometer or thermostat?

Whether your web analytics program is doing well or poorly, one important step to take is to reestablish the overall mission of your team. As described in the preceding “Wrong Mindset Towards Web Analytics” section, different stakeholders at your company have very different philosophies about web analytics and about what your team should be focused on. If you think your team is supposed to be identifying conversion issues on the website and your constituents just want reports from you, problems will ensue. Unfortunately, employee turnover is prevalent in the web analytics field, and at companies in general, so it is imperative that you continuously remind those in your organization why your team exists.

Most things in life are about setting expectations and web analytics is no exception. I am shocked to find that many web analytics teams don’t have a team mission statement related to web analytics. If your own team isn’t sure what it stands for, how can your internal customers?

To rectify this, I suggest that you decide what your team’s mission will be and formalize it in a mission statement. If you want to be reporting robots and just produce the same reports each week, that is your prerogative, and you should communicate that accordingly within the organization. Personally, I am not a fan of just providing reports since it is more reactive, or what I like to call a thermometer philosophy. Thermometers are great at telling you what the current temperature is or recording what it was yesterday and the day before. Unfortunately, thermometers cannot change the weather, but rather, can only report it. While it is ok to have a thermometer philosophy, I find it to be very limiting in the long run. Most organizations don’t place a high value on reporting and if this is all you do, it is highly likely that your budgets will be cut, and your resources may be outsourced to a lower cost region.

I recommend that you have loftier goals for web analytics. While not perfect, I like to set the mission of web analytics teams as the following:

The mission of the web analytics team is to collect and use website (or mobile app) data to improve conversion rates.

I find that this simple statement covers most of the work done by web analytics teams and helps avoid the types of work that web analytics teams shouldn’t be doing. This statement is also congruent with the web analytics philosophy I subscribe to: the thermostat approach. Unlike thermometers, thermostats allow you to control the temperature. If you are too cold, turn the dial and make it warmer. Web analytics, when done correctly, provides the ability to identify what is working and not working online so you can take action.

No matter what type of website you have, it has some form of conversion. Business-to-business (B2B) sites are often focused on lead generation. Retail sites are normally revenue based. Regardless of what conversion means to you, you should be using web analytics to improve it through its core features of pathing, correlation, fallout, conversion metrics, and so on. More importantly, under the thermometer approach, your team is a cost center, whereas with the thermostat approach, you can be a profit center. When times get tough, cost centers get cut and profit centers get investment. You want to be seen in the organization as the team that is helping to improve the business, not just report on the business. That’s how you remain relevant, get headcount, and increase budgets.

Once you decide on your web analytics philosophy, you need to take a hard look at where you stand today. Are most of your activities thermometer or thermostat related? If your team decides it wants to be a thermostat but is 90% thermometer today, you have a big change management job on your hands. More often than not, this is the case I see at clients. They want to be thermostats, but are stuck being thermometers. Changing this begins with your team and then branches out to your stakeholders. You need to extol the benefits of your team acting as a catalyst for change.


I find the best way to start is to identify which conversions are the most important to your organization and to create a financial model that allows you to quantify the impact of website improvements. For example, if you are a B2B company, website leads and new accounts are often the primary conversions. Using data provided by finance, you can create a calculator that shows how much each new lead means to the bottom line of the business. Using this calculator, you can identify how a potential website change, identified through web analytics, leads to incremental leads and revenue. This can help you show how much money your team can influence if given the opportunity.
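As a rough illustration, the lead calculator described above can be sketched in a few lines of code. All figures, rates, and function names here are hypothetical assumptions, not values from the paper; substitute the numbers your finance team provides.

```python
def lead_value(avg_revenue_per_account, lead_to_account_rate):
    # Estimated bottom-line value of one website lead: what an account is
    # worth, discounted by how often a lead actually becomes an account.
    return avg_revenue_per_account * lead_to_account_rate

def incremental_impact(extra_leads_per_month, avg_revenue_per_account,
                       lead_to_account_rate, months=12):
    # Projected revenue from a website change that adds leads each month.
    return extra_leads_per_month * months * lead_value(
        avg_revenue_per_account, lead_to_account_rate)

# Hypothetical inputs: $5,000 average account revenue, a 25% lead-to-account
# rate, and a website change expected to add 40 leads per month.
print(lead_value(5000, 0.25))              # value of a single lead
print(incremental_impact(40, 5000, 0.25))  # projected annual impact
```

A calculator like this lets you translate a proposed website change directly into a revenue figure you can bring to your boss.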

Let’s say you identify a few website improvement ideas your team has come up with by looking at your data and then show the financial impact (in this case leads) if your hypotheses are correct. Armed with this information, you can go to your boss and ask whether he’d prefer that your team work on making these improvements to the website or spend its time reporting how the website did last week. You can guess which one most bosses will deem to be more worthwhile, and with his permission, you can work with the design team to modify the website and quantify the results in terms of leads. While not all of these ideas will be winners, hopefully you will have enough success to justify your team’s cost (both employees and tools) relatively easily.

But more importantly, your team will start getting a reputation as a change agent instead of a reporting team. After that, you can approach your boss and propose doing more of this fun “thermostat” activity if you are allowed to shed some of your lower-value thermometer activity. Once your boss is hooked, I suggest that you identify the top thermometer reports your team is currently being asked to provide and build them into a robust dashboard, which can be scheduled and sent to stakeholders on a weekly basis. This one dashboard might answer 80% of the routine questions and allow your team to spend more time on thermostat activities. This is how you get your team to start generating money instead of simply generating reports.

As you can see, something as simple as establishing your general philosophy can have a huge impact. When coupled with the other steps I will discuss, you can begin to see how your internal team perception can change and how your next year’s survey scores can improve. Therefore, I suggest that before you do anything else, you make sure that you establish your team mission and make sure everyone knows it.

There is no spoon

Unfortunately, changing the perception of your team within the enterprise won’t work if your web analytics team isn’t providing the information needed to add value. The preceding example presumed that your team had relevant web analytics data to analyze and generate valid hypotheses for improvement. Hence, to be successful, you need to be sure that the information you have available is relevant to your stakeholders.

As discussed previously, the primary reason I see web analytics programs fail is that teams have incorrect business requirements. Earlier, I discussed why this happens, and now I will show you how to rectify this. Let’s imagine that you have done some internal research, and your stakeholders have indicated that they find little value in the data currently being collected by your web analytics team. This can be extremely frustrating, since it may have been the same people a year earlier telling you that they needed that data. What I have found is that many of your stakeholders don’t know what they want, so they guess about what data they think might be interesting. I often hear a request to track “every link that can be clicked on the page,” which usually means that someone doesn’t really know what they want.

So how do you solve this? You may not know your stakeholders’ business needs like they do, and they most likely don’t understand what can and cannot be collected with web analytics tools. I find that the best way to get over this impasse is to use what I call the “there is no spoon” exercise. Here’s how it works: Ask your stakeholders to come to a meeting and have them imagine that your company has no website at all. Pretend it never existed, as if they were in The Matrix. Moreover, ask them to imagine that building a website would take an exorbitant amount of money. Next, have them imagine that they are responsible for going to the CEO and asking for millions of dollars to create a website and that your CEO hates the Internet and technology in general.

The next step is to have your stakeholders brainstorm and identify the reasons they would use to justify building the website, keeping in mind that they are facing a hostile CEO who is dead set against the idea. Using our B2B example, let’s imagine that one of your stakeholders lists the ability to generate leads around the clock, even when sales people are sleeping. Playing devil’s advocate (in the role of the grumpy CEO), you may tell them that the company likes leads, but only those leads that turn into clients. This could lead to a discussion about whether using leads as a KPI is the right choice, since not all leads are equal. In the end, you may decide that connecting lead submissions with actual account opens and optimizing to account opens instead of leads is what would make web analytics more valuable to the organization. This could drastically change your overall approach to web analysis and may only come about by questioning your current business requirements.

By the end of this exercise, your team may have collected a much smaller but more potent list of business requirements, and I have seen cases in which stakeholders are amazed at how the “spoon” exercise helped them focus on what was really important. As a joke, you can ask the stakeholders if any of them would like to suggest tracking clicks on every link as a top requirement to the CEO, and you can imagine how embarrassed they will be that they had ever requested that in the first place.

If, even after undertaking this exercise, you still have too many business requirements, here is another trick I use to boil them down to the really important ones. When working with your stakeholders, for each requirement, ask them to provide tangible examples of how they will use the data they are asking for. It is very easy to ask for something, but when you make people tell you how they will use the data, it forces them to justify it and may cause them to reconsider their request. For example, let’s say your stakeholder says it’s critical to track User Agent String in your web analytics tool. You have your suspicions that this is yet another data point that will be worked on but never used. Try pushing back a bit and ask them to describe exactly what types of web analysis they will produce with that data to justify adding it to your implementation. If they come up with some good ideas, document them in the requirements spreadsheet. If they can’t, see if you can remove it from your list or at least deprioritize it. If the item is removed, it’s less work for you, and if it remains, you know what your web analysts should start working on as soon as it is implemented. Therefore, taking this “how will you use it” step creates a win-win situation for you and your team.

Over time, it is easy to let new, insignificant things creep into your web analytics implementation. Sometimes less is more when it comes to web analytics programs. End users can get confused when there are too many data elements, and reporting becomes more of a challenge. Unless your team is vigilant about which new data elements it adds, you may find that you have to repeat the spoon exercise every few years. Either way, I have found this exercise to be invaluable, especially in situations where your team receives low scores from stakeholders with respect to collecting relevant data. I encourage you to try it and see if it helps you discover what is really important to your organization.

Play web analytics implementation jeopardy

After you have created a new and improved list of business requirements via the spoon exercise, the next step in the process is to play what I call “web analytics implementation jeopardy.” In the game show Jeopardy, contestants are required to phrase their trivia answers in the form of a question (for reasons I could never understand). Rest assured, I am not going to force you to learn trivia, but what I will encourage you to do is to reframe your current web analytics implementation in the form of business questions.

Whatever state your web analytics implementation is in today, it is collecting various data points. At some point, either your team or your stakeholders asked for the data to be collected. However, most companies I encounter have no record of why data was collected initially.

When I was in my Wolf role, another thing I would do after companies were done complaining was to ask to see their list of business requirements. Most of the time, the only document they could produce was a technical document showing which data elements were to be captured and on which pages. When I pressed them for the reasons why those data points were collected, the silence was deafening. If they ever had actual business requirements, they were long gone and when I did receive requirements, they were nothing like the ones generated from the spoon exercise.

I am a big believer in business requirements, because they help connect stakeholders to the web analytics team. If you have solid business requirements and then deliver data that answers those business requirements, then it would be logical to conclude that you will add value to your organization. If no value is created, then you were given incorrect business requirements or did not fulfill them appropriately, both of which can be fixed if needed.


Therefore, to rectify any issues you are currently having related to stakeholder relevancy, I suggest that you play web analytics implementation jeopardy. To do this, simply review every data element you are currently tracking and attempt to identify all business questions it can help stakeholders answer. When you are done with this process, you will have a list of business questions that your team can currently answer, assuming that the implementation is complete and the data quality is sound. Next, add this list to the new requirements you identified in the spoon exercise and to those coming out of your implementation audit. From there, you can de-duplicate the list as necessary, and assign each a unique requirement number.

I also find it beneficial to associate a department or person’s name with each business requirement. This should be the department or person that you feel cares the most about the requirement. Another step I take is to document which of the business requirements came from the spoon exercise and implementation audit versus the ones that came from your current implementation. This helps later in assessing how many new business requirements you have to implement.

Once this step is complete, you will have a draft list of business requirements that you can begin to socialize. I suggest reviewing this new list of requirements with your stakeholders and asking them to assign a priority to each. Keep in mind that when assigning priorities, there is a tendency to make everything high priority, so go back into your grumpy CEO persona and make sure that your stakeholders would still call something a high priority in front of the CEO. You may have to add a few columns to your business requirements spreadsheet, allowing different stakeholder groups to assign their own priorities. This is fine, since you can simply use a formula to calculate a total priority score or weight the scores accordingly if some departments or people are more politically important than others. Then, all that is left to do is to get official sign-off for your requirements. This sign-off is an important step, since you are getting your stakeholders to agree that if your team provides the answers to the high-priority business requirements you jointly created, they will find value and commit to using them.
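The total priority formula mentioned above can be sketched as follows; the stakeholder groups, scores, and weights below are hypothetical examples, not a prescribed scale.

```python
def total_priority(scores_by_group, weights=None):
    # Combine per-group priority scores (e.g., 1 = low, 3 = high) into a
    # single total; optionally weight politically important groups more.
    if weights is None:
        weights = {}
    return sum(score * weights.get(group, 1.0)
               for group, score in scores_by_group.items())

# One requirement scored by three hypothetical stakeholder groups.
req_scores = {"Marketing": 3, "Merchandising": 2, "Finance": 1}
print(total_priority(req_scores))                              # equal weights
print(total_priority(req_scores, weights={"Marketing": 2.0}))  # Marketing counts double
```

The same formula drops straight into a spreadsheet column, which is where most teams will actually maintain it.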

The process of web analytics implementation jeopardy, combined with the spoon exercise, is a way to open dialog between you and your stakeholders and align your goals. Your team wants to provide relevant information, and your stakeholders want to have important questions answered. I find that organizations that don’t establish this dialog leave the success of their web analytics program to chance, often with disastrous results. By taking a methodical approach of redefining the key requirements and getting agreement from your stakeholders, you are greatly minimizing your risk of failure.

Web analytics implementation take 2

Armed with a new and improved (or first-ever) list of prioritized business requirements, the ball is now in your court to impress your stakeholders. To accomplish this, you need to first determine what percentage of your high-priority business requirements can be answered today. To do this, look at your list of business requirements and determine which data elements in your current implementation can be used to answer each requirement. I tend to document this right in the business requirements spreadsheet and create two columns: one to document which data elements you have, and another to document whether you trust the data in each particular element. By going through this process for each requirement, you will be able to total up how many requirements you can act upon today and how many will have to be implemented or reimplemented if they’re not working correctly. This exercise will also help generate a list of data elements you are collecting that are associated with only low-priority business requirements. These low-priority data elements represent candidates for removal if you want to streamline your implementation and avoid unnecessary clutter.
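The coverage calculation implied by those two spreadsheet columns can be sketched like this; the requirement IDs and statuses are invented for illustration.

```python
# Each requirement: (id, priority, element_implemented, data_trusted).
requirements = [
    ("R1", "high", True,  True),
    ("R2", "high", True,  False),  # element exists but data isn't trusted
    ("R3", "high", False, False),  # not implemented at all
    ("R4", "low",  True,  True),
]

def coverage(reqs, priority="high"):
    # Percent of requirements at a given priority that are answerable today:
    # the data element must be implemented AND its data must be trusted.
    subset = [r for r in reqs if r[1] == priority]
    working = [r for r in subset if r[2] and r[3]]
    return 100.0 * len(working) / len(subset)

print(coverage(requirements))  # only 1 of 3 high-priority requirements works
```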


Soon, you will have the actual percent of requirements that are functioning and the percent that require additional implementation. Most companies I work with are shocked by how few of the newly found high-priority business requirements are currently working. However, if your program isn’t scoring high and doesn’t have high adoption, this makes sense, because if you were consistently able to answer high-priority business requirements, your scores would probably be higher.

If your scores are very low, you may want to consider the drastic step of starting the entire implementation over from scratch. If you find that you are under 20% with respect to satisfying business requirements, it may be better to start with a clean slate than to try to salvage what you have.

Kill two birds with one stone

Unless you are the exception, on completion of the preceding steps, you will most likely be staring at a sizeable work effort to implement new items or reimplement existing ones. To quantify how large an effort this will be, I suggest that you return to the business requirements spreadsheet and estimate how many hours each business requirement will take to implement. You can use column filters to see the hours required for high priority only, high plus medium priorities, and so on. When you have a total figure, you can work through your normal IT processes to get the requirements implemented.
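The hours roll-up by priority tier might look like the following sketch; the requirement IDs and hour estimates are fabricated for illustration.

```python
# Hypothetical per-requirement implementation estimates, in hours.
estimates = [
    ("R1", "high", 8),
    ("R2", "high", 16),
    ("R3", "medium", 12),
    ("R4", "low", 4),
]

def hours_for(tiers):
    # Total implementation hours for requirements in the given priority tiers,
    # mirroring a column filter on the requirements spreadsheet.
    return sum(hours for _, priority, hours in estimates if priority in tiers)

print(hours_for({"high"}))            # high priority only
print(hours_for({"high", "medium"}))  # high plus medium
```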

However, this reimplementation effort represents an interesting opportunity unique to the web analytics industry. For many of you, the web analytics implementation you own was done years ago, before the emergence of tag management systems (TMS). For those unfamiliar with TMS, such as Dynamic Tag Management from Adobe, these tools allow you to deploy web analytics and other tags through a centralized interface. Think of them as content management systems for JavaScript tags. The benefits of a TMS include the ability for marketers to collect data without knowing JavaScript, the ability to collect data once and distribute it to several different tools, such as Adobe Analytics and Google AdWords, and, in many cases, improved page load speeds. As if these benefits aren’t enough, many TMS are free, so there is no additional budget required.

The web analytics reenergizing process described here represents a golden opportunity to kill two birds with one stone by reimplementing your most important business requirements, while at the same time making a huge leap forward in terms of flexibility and future deployment options by implementing a TMS. I always recommend that any company going through this process use it as a time to implement a TMS, since you will already be cracking open your web page code and engaging with your developers.

If you implement a TMS and reimplement your requirements concurrently, you will save time and money in the long run and, when you are done, have much more flexibility to implement new requirements in the future. I can assure you that once your team becomes more relevant, adds more value, and improves its perception, you will be inundated with new business requirements. Nothing is worse than not being able to strike while the iron is hot and turn around new web analyses while there is excitement around your team. Without a TMS, you may have to tell your stakeholders that they will have to wait six months for IT resources to become available before you can provide for any new requests, which would be a change management disaster.


Know your tool

While on the subject of reimplementing your updated list of high-priority business requirements, I’d like to revisit the topic of web analytics tool knowledge discussed earlier. Previously, I described how having a sub-standard tool, or not knowing how to use your chosen tool, can be one of the factors that derail your web analytics program.

When you are reviewing your list of business requirements, one of the key aspects of a successful implementation is the ability to map those business requirements to the optimal method of answering them in your web analytics tool. I have seen many cases in which the wrong product features were used to answer business requirements, resulting in incomplete or incorrect data collection. This in turn leads to faulty analysis, which ultimately damages the credibility of your overall web analytics program.

For example, I once worked with an online retailer who used the Adobe Analytics solution, but didn’t have anyone who knew the product very well (they wouldn’t have taken my $5,000 bet). This retailer was attempting to attribute website orders and revenue to its various onsite promotions to see which promotions were driving sales for each product category. If a visitor clicked on a hero promotion on the home page and ended up purchasing $500 worth of products, that hero promotion would receive the credit. However, when looking at the data, the retailer noticed that most of the success was being attributed to secondary promotion codes, even though it saw a lot of clicks on the most prominent promotion codes, like those on the home and category pages.

After doing some research, I realized that this client wasn’t using Merchandising, a key Adobe Analytics feature, which resulted in the last promotion clicked receiving too much credit for success, instead of each promotion getting credit for the specific products it drove to the shopping cart. This lack of product knowledge, and the failure to understand the inner workings of their web analytics tool, led to incorrect data collection and a lack of trust in the data. A simple fix was put in place and ultimately things got better, but at that point the credibility damage had been done. This is an example of how knowing your product can make a difference when it comes to web analytics and your team’s overall perception.

One time, when working with a B2B client, I encountered the situation described in the spoon exercise section. This company had been tracking leads as the primary KPI for years, only to discover that its stakeholders really wanted to optimize towards opened accounts and revenue. During the requirements brainstorming process, a member of my client’s team responded that tracking website leads was as far as they could go using Adobe Analytics since the rest took place after the website session terminated. I later found out that this topic had surfaced many times over the years and the response was always that it could not be done. Unfortunately, that wasn’t correct.

Based upon my web analytics tool knowledge, I was able to describe the transaction ID feature in Adobe Analytics, which would allow the client to inject opened account and revenue metrics back into Adobe Analytics in a way that closed the marketing and sales loop. This led to a fruitful conversation about the best way to do it and how to import these post-website metrics.

In the end, it revolutionized the way this client’s web analytics team operated. Optimizing to account opens and revenue later showed that many of their previously held web analytics beliefs were incorrect. In many cases, what generated leads did not generate account opens and revenue. All this progress was made possible by having someone on the team who knew all the capabilities of the web analytics tool, which is why I highly recommend that as part of your reenergizing process you double-down on your tool knowledge to increase your odds of success.

Two strikes, and you’re out!

At this point, let’s imagine that you have redefined your business requirements and possibly deployed a TMS to implement many of your high-priority business requirements. You have begun to collect new data and know from your requirements spreadsheet exactly who is interested in each data element. With all this change and excitement, it is natural to want to hurry up and get the data into the hands of your stakeholders. Unfortunately, doing so may be a bit premature.

Page 14: Reenergizing Your Web Analytics Program - Adobe …success.adobe.com/.../49462_reenergizing_web_analytics.pdfweb analytics industry is so new, you won’t find many people who majored


As mentioned several times, data quality is critical to successful web analytics programs. If you are coming off of a problematic web analytics implementation, the last thing you can afford is to roll out a new implementation with data that is not high quality. Unlike in baseball, in this case you already have one strike against you, and a second strike may put you out of the game.

For this reason, I recommend that you use extreme caution when rolling out new data elements during the reenergizing process. Instead of rushing to roll out, I recommend that you spend as much time as you can performing data quality checks. This involves debugging your data collection, opening reports with appropriate metrics to see that data is complete, and looking for unexplained data spikes and omissions.

For most of my clients, data quality is what keeps me up the most at night. To mitigate this risk, I recommend that you create an automated Excel data quality dashboard using a tool like Adobe ReportBuilder, which checks your new data points on a daily basis. Using this tool, you can compare yesterday’s data to a rolling average of the past few weeks and use conditional formatting to highlight any items that are beyond expected standard deviations. Over the first few weeks, you will likely identify many problems that would have confused end users, and you can fix them before rolling out the new reports to stakeholders. While you will never have perfect data quality, the hard work often pays off in the end: a year later, your data quality survey scores can rise dramatically.
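The anomaly check behind such a dashboard is simple enough to sketch. The following Python snippet (an illustrative sketch, not a ReportBuilder feature) compares yesterday’s value for a metric against the rolling average of a trailing window and flags anything beyond a chosen number of standard deviations, which is the same logic the conditional formatting would express in Excel. The sample numbers are hypothetical.

```python
from statistics import mean, stdev

def flag_anomaly(history, yesterday, threshold=2.0):
    """Return True if yesterday's value sits more than `threshold`
    standard deviations away from the rolling average of `history`."""
    avg = mean(history)
    sd = stdev(history)
    if sd == 0:
        # A perfectly flat history: any deviation at all is suspicious.
        return yesterday != avg
    return abs(yesterday - avg) > threshold * sd

# Hypothetical daily counts for a new success event over ~3 weeks.
history = [120, 131, 118, 125, 129, 122, 127, 124, 130, 119,
           126, 128, 121, 125, 123, 129, 127, 122, 126, 124]
print(flag_anomaly(history, 125))  # a typical day
print(flag_anomaly(history, 12))   # a probable tracking outage
```

Running one such check per new data element each morning is cheap, and it catches broken tagging before a stakeholder ever sees the bad numbers.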

As you implement new data elements, build into your process an extensive data quality check and give each data element enough time to prove that it is working before rolling it out in a production environment. Making this part of your team’s culture will help you avoid a relapse and blowing the second chance you have asked for as part of the reenergizing process.

Change management
While on the subject of your new rollout, it is worthwhile to talk about change management. Most web analytics professionals are not experts in change management, but if you undertake this reenergizing process, it’s a skill that you must develop. In any situation where your stakeholders have a neutral or negative impression of your team, it takes time and a concerted effort to change their perception. Over the years, I have found some change management techniques that help tremendously in this process.

I advocate a slow rollout of your new web analytics implementation. In addition to the data quality reasons just discussed, I have found that most stakeholders can absorb only a certain amount of change at one time. While you may have implemented 20 of their high-priority requirements and verified the data quality, there is no reason to move all 20 to production at once. Doing so could overwhelm stakeholders and keep them from understanding how to use all the new functionality.

Even in cases where you have 10 new reports ready to go, I would launch one every two weeks. As you roll out new capabilities, one of the key aspects of adoption is the ability to train your stakeholders on how they should be used. As each requirement launches, you must provide a detailed explanation of the business requirement, the data it uses, and how it can be used for web analysis. I suggest building a training slide for each business requirement, complete with a picture of the report and sample uses. You can then monitor each report to see which stakeholders have used it, and follow up with those that have not.

I find that one new web analytics report is often all that busy stakeholders can handle every few weeks. Conversely, your core web analytics team will have access to all new reports and can begin using them, but, for change management reasons, should not share information related to these reports until they are officially rolled out.

At first glance, it seems counter-intuitive to hold back on great new information your team has successfully implemented. Some might advocate more of a big bang approach, and perhaps this might work in some corporate cultures. But more often than not, I have seen success with the trickle-down approach. Another reason I prefer the latter is that it provides a steady stream of innovation coming from your team. Every few weeks, your stakeholders hear from you and are informed that yet another one of their valued business requirements is available. This demonstrates consistency and that progress is being made. It shows momentum and, in some cases, even creates positive anticipation. I have found that this has a psychological effect on your constituents, who begin to think of your team as one that keeps producing and avoids the “what have you done for me lately” syndrome. I have read studies that show that a steady stream of blog posts or podcasts can lead to an increased fan base. I believe that the same holds true when it comes to internal stakeholders.



Another change management tactic you will want to employ is quick wins. As you roll out your new web analytics program and implementation, you should identify some great new web analyses you can provide that were previously unavailable. You should formalize these wins and find ways to shout them from the rooftops. If there is a meeting you can speak at, go and share the new things your team has learned due to the reenergizing process.

At Salesforce.com, we used an internal product called Chatter to publicize our successes and ended up gaining quite a following. I also got permission to visit industry web analytics conferences like eMetrics and share some of the cool things we were doing. On several occasions, our chief marketing officer stopped me in the hall and said that a colleague at another company had told him about the great things my team was doing in the area of web analytics at Salesforce.com. That kind of external validation helped build internal executive support for our team, almost as much as the internal buzz that we had created.

As you promote the great work that your team is doing, don’t forget to let all members of your team present their findings and quick wins. Nothing motivates a web analyst more than doing the hard work and then being able to share it with managers and executives and see their faces light up at the results. Also, be vigilant that the work being done by your team focuses on “thermostat” type reports and analyses. It will be tempting to fall back into thermometer mode, regurgitating data, but the reenergizing process represents a great opportunity to focus on more value-add analyses. Help your team avoid being pulled back into the old ways!

Another great thing to do is to make sure that your stakeholders know that their help with business requirements was critical to these new improvements, and share this publicly with their bosses. This helps get the stakeholders on your side by letting them share in your success. As noted previously, most business people want to be associated with successful projects. As your team communicates how new requirements, data, and analyses are leading to improved conversion, many will want to jump on the bandwagon. Make room for them, regardless of how helpful they were during the reenergizing process.

Since your web analytics implementation is only a small piece of the puzzle, as your team continues to roll out new functionality and quick wins, take it as an opportunity to build web analytics into your internal processes. Often, web analytics is an afterthought when it comes to new projects. But as your team grows in stature, members might be invited to new project kick-off meetings. Suddenly, various teams might want to get your team’s input on which data should be collected to show the success of new projects and be willing to add web analytics implementation steps to the project plan. You can simply add these new requests to your business requirements document and follow the same process of mapping to the tool, implementing, and then checking data quality. As your requests grow, you will have to prioritize and ultimately get your stakeholders to tell your boss why they think your team needs more headcount, which is always better than asking for it yourself.

Concurrently, you might receive requests from internal employees to get a login to your web analytics tool. Ironically, many of these requests might come from those who had previously had logins that were taken away. Now, with momentum on your side, you should agree to give co-workers logins, but with the condition that they attend a training class. This helps weed out those who are not serious about using the data and provides an opportunity to make sure those with logins know your implementation and how to use it. Since you should have been building educational slides for each of your newly deployed business requirements along the way, creating a training class specific to your web analytics questions is as simple as compiling all of these slides into one presentation.

Before they come to a class, show potential attendees your list of business requirements and ask them to identify those in which they are most interested. This will allow you to customize the training in a way that is most meaningful to them and make the class highly relevant. In addition to implementation-specific training, you should create general analytics tool training as well. This includes common tasks, such as opening reports, adding metrics, breaking down reports, and building segments. By coupling tool knowledge with your customized implementation knowledge, trainees will have all they need to be successful. In some cases, I have created a simple exam in which I walk class attendees through a day-in-the-life script so that they can demonstrate their ability to navigate the tool and perform the tasks they would need when on their own. For those who want more in-depth training, you can offer more advanced classes on complementary web analytics tools, such as Excel, Adobe ReportBuilder, and Ad Hoc Analysis. If you work for a global organization, I recommend that you take training out into the regions or create training videos to make sure that everyone has a chance to learn from your team.



The final change management concept I recommend is the use of internal ambassadors. As your team builds momentum, you might find that the number of requests you receive grows exponentially. Each department might have new and exciting questions, but your team can’t be in all places at all times. To address this, you should identify one person in each department who is passionate about what your team is doing. Once you find this person, ask them if they would be willing to act as your ambassador to their team.

For each ambassador, your team will provide more in-depth training and give them direct access to the team. In addition, let ambassadors suggest new business requirements to your team directly instead of going through a ticketing system process. In return, you should ask ambassadors to mention your team’s capabilities in meetings where you are not present and to act as your team’s first line of defense when web analytics questions arise within their team. This helps your team minimize the amount of day-to-day support it has to provide and allows it to be present vicariously at many more meetings than it could possibly attend in person. You can also hold private brown-bag sessions in which you provide ambassadors with sneak peeks of upcoming functionality and new analyses your team is working on. Overall, I find that ambassadors feel it is a fair deal, especially those in distant regions. But most importantly, ambassadors help evangelize your team throughout the company, and the time investment you make in them usually pays for itself in increased adoption and higher satisfaction scores over time.

Reap the rewards
So what can you expect if you are successful in the steps above? At most companies that I have assisted with this process, I have seen tangible benefits. To start, no one likes to work on a team that is not perceived as adding value to the company. Being part of an under-performing team leads to low morale and job dissatisfaction. I am convinced that the reason there is so much turnover in our industry is that too many people are working in web analytics programs that are under-performing. If you consider the thermometer/thermostat analogy, how many people do you think want to be in a thermometer job? Most people I talk to say that the main contributor to job satisfaction is whether or not they feel like their company cares about their work, even more so than money. If you compare a person providing the same report each week with someone who is identifying potential conversion improvements and then presenting those to executives, I promise you that the latter leads to employee loyalty.

Additionally, if you are successful in reenergizing your web analytics program, you will be given more freedom to hire people and to invest in cool new technologies. When you begin, people may be debating whether your existing web analytics tool should be abandoned, but after successfully reenergizing your program, you might be given funds to add additional tools, like A/B testing tools. These new tools provide career growth and promotion opportunities for team members. All of this leads to a team that feels it is on the upswing and part of something special, which is the best defense against headhunters and recruiters. And for those who have more material interests, following the reenergizing process can lead to promotions and increases in salaries, both of which create new opportunities for those on your team.

Conclusion
Regardless of how good or bad your web analytics program is today, if you are diligent about following the steps provided here, you can improve your chances of success in the future. As I stated earlier, nothing is guaranteed, but it is my hope that the process outlined here will provide a recipe or roadmap that you can use to make improvements to your web analytics program.

By acknowledging and understanding the things that can contribute to failure, like incorrect business requirements or data quality issues, you can be proactive about avoiding them. Performing an honest assessment of where your program is today in the eyes of a third party or your internal stakeholders is a painful but necessary part of the reenergizing process.

Once you decide to engage in the process, it is important that you begin by reestablishing the purpose of your team and setting expectations accordingly with your stakeholders. Following this with a methodical process to identify what business requirements are important to you and your stakeholders will help you improve your relevancy and increase the chances that they will utilize the information you deliver in the future.


By increasing your tool knowledge and possibly deploying a TMS to streamline deployment, you can dramatically improve on the actual data you are providing, which is the cornerstone of web analytics. Coupling this with intense data quality scrutiny and purposeful change management strategies will help your team build a new internal perception. This perception is one of a team that recognized it needed to make a change and took the time to understand exactly what its internal stakeholders needed and delivered on it. Through a steady stream of enhancements, targeted training and the highly visible promotion of quick wins, the word will spread that your team is back in business and ready to help the company achieve its goals. You can make the transition from a reporting cost center to a conversion-improving profit center. Your team will be happier and feel that they are part of something special and look forward to new requests from your stakeholders. While not every team will be able to achieve all the success described here, I have seen it happen time and time again and believe that it is a vision worth striving for.

Adam Greco is a longstanding member of the web analytics community who has consulted with hundreds of clients across every industry vertical. Mr. Greco began his web analytics career managing the website for the Chicago Mercantile Exchange and then became one of the founders of the Omniture Consulting group. While at Omniture, Mr. Greco managed accounts large and small and helped clients maximize their use of Omniture technologies. In 2012, in partnership with Adobe, Mr. Greco published the first-ever book on Adobe SiteCatalyst – The Adobe SiteCatalyst Handbook: An Insider’s Guide.

Web Analytics Demystified, founded in 2007 by internationally known author and former Jupiter Research analyst Eric T. Peterson, provides objective strategic guidance to companies striving to realize the full potential of their investment in web analytics. By bridging the gap between measurement technology and business strategy, Web Analytics Demystified has provided guidance to hundreds of companies around the world, including many of the best known retailers, financial services institutions, and media properties on the Internet.

For more information about Web Analytics Demystified please visit them on the web at www.webanalyticsdemystified.com.

Adobe Systems Incorporated 345 Park Avenue San Jose, CA 95110-2704 USA www.adobe.com

Adobe and the Adobe logo are either registered trademarks or trademarks of Adobe Systems Incorporated in the United States and/or other countries. Java is a trademark or registered trademark of Oracle and/or its affiliates. All other trademarks are the property of their respective owners.

© 2014 Adobe Systems Incorporated. All rights reserved. Printed in the USA.

2/13