Measuring Web 2.0: Web Analytics Define Standards, Suggest Strategies

Quantifying Web traffic is all the rage these days, as businesses do more online to connect with consumers and communicate with stakeholders. Now, these businesses must be able to collect data on how consumers and stakeholders interact with their sites, and translate it into actionable strategies for improving the online experience. That's where Web analytics comes in.

But until recently, no organization had gone so far as to establish a consistent way to take Web analytics - the study of the behavior of Web site visitors - and standardize it to ensure that vendors and clients alike used comparable counting methods. This posed problems, as measurements could be dismissed as irrelevant for lack of a scientific methodology.

The Web Analytics Association (WAA) took strides in reversing this by releasing "Web Analytics Definitions" in August 2007. The report defined key terms in Web metrics, among them unique

visitors, visits/sessions and page views.

While these standardized definitions still have their issues, most industry professionals see them as a first step toward streamlining Web measurement. The definitions - and the report's broader implications for Web measurement - must therefore become part of communicators' working vocabulary, as they add credibility to the metrics that demonstrate PR's overall impact on the organization (not to mention the fact that PR people are the gatekeepers of Web analytics management). Consider the following overview (find the complete report at webanalyticsassociation.org):

*Building Block Terms:

  • Page Views: The number of times a page (an analyst-definable unit of content) was viewed.

  • Visits/Sessions: A visit is an interaction, by an individual, with a Web site consisting of one or more requests for an analyst-definable unit of content (i.e. "page view"). If the individual has not taken another action (typically additional page views) on the site within a specified time period, the visit session terminates.

  • Unique Visitors: The number of inferred individual people (filtered for spiders/robots) within a designated reporting timeframe. Each individual is counted once.
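The visits/sessions rule above is, at bottom, a timeout algorithm. As a rough sketch in Python (the 30-minute timeout is a common industry default, and the log format and names here are assumptions for illustration, not part of the WAA definitions):

```python
from datetime import datetime, timedelta

# Hypothetical page-view log: (visitor_id, timestamp) pairs, in time order.
PAGE_VIEWS = [
    ("alice", datetime(2007, 8, 1, 9, 0)),
    ("alice", datetime(2007, 8, 1, 9, 10)),
    ("alice", datetime(2007, 8, 1, 11, 0)),  # > 30-minute gap: a new visit
    ("bob",   datetime(2007, 8, 1, 9, 5)),
]

TIMEOUT = timedelta(minutes=30)  # assumed session timeout

def count_metrics(page_views):
    """Return (page views, visits, unique visitors) per the rules above."""
    visits = 0
    last_seen = {}  # visitor_id -> timestamp of that visitor's latest page view
    for visitor, ts in page_views:
        prev = last_seen.get(visitor)
        if prev is None or ts - prev > TIMEOUT:
            visits += 1  # a first request, or one after the timeout, opens a new visit
        last_seen[visitor] = ts
    # Each inferred individual is counted once, per the unique-visitors rule.
    return len(page_views), visits, len(last_seen)

print(count_metrics(PAGE_VIEWS))  # (4, 3, 2)
```

In this toy log, alice's late-morning return falls outside the timeout and so opens a second visit, which is why four page views yield three visits from two unique visitors.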

*Other Relevant Definitions:

  • Click-thru Rate: The number of click-thrus for a specific link divided by the number of times that link was viewed.

  • Event: Any recorded action that has a specific date and time assigned to it by either the browser or server.

  • Conversion: A visitor completing a target action; the method of segmenting behavior as visitors interact with a Web property.
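The click-thru rate definition above is simple division, and a conversion rate is commonly derived the same way (conversions divided by visits - a common convention, not part of the WAA definition). A sketch with made-up numbers:

```python
# Hypothetical tallies for illustration.
link_views = 5000   # times the link was viewed
click_thrus = 150   # click-thrus recorded for that link
conversions = 30    # visitors who completed the target action
visits = 1200       # visits in the same reporting timeframe

click_thru_rate = click_thrus / link_views  # 150 / 5000 = 0.03
conversion_rate = conversions / visits      # 30 / 1200 = 0.025

print(f"CTR: {click_thru_rate:.1%}, conversion rate: {conversion_rate:.1%}")
# CTR: 3.0%, conversion rate: 2.5%
```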

When reading the report, it is all too easy to get lost. The language is scientific and convoluted, and the definitions often seem marred by circular reasoning. However, according to Eric Peterson, CEO of Web Analytics Demystified, it is important for PR execs to embed these definitions in their measurement strategies.

"It is not the definition of the standards that makes a difference," he says, "it is the adherence to standards ... that will provide the portability of skills, knowledge and solutions so desired by many in our industry."

Peterson speaks of technology vendors abiding by these new standards, but there are serious implications for communicators as well. For example, what happens when a company switches metrics tools and, with them, both the terms and the standards used to define its analytics?

"Ironically, cost isn't the main problem," Peterson says. "The impact on existing customers who would be forced to learn new definitions and suffer from potentially dramatic changes in data

collection and reporting is the main problem."

Clearly, communications executives must act as gatekeepers when it comes to managing and measuring Web analytics. Having an understanding of the terms vendors use in quantifying data is a start,

but taking proactive steps to collect consistent, relevant data is also required. So, what best practices should be employed?

*Think beyond hits: Whether you use a Web analytics vendor or not, don't just focus on the number of hits to your site. Segment and track page views, entry pages (the first page of a visit) and

keywords, but also delve deeper into Key Performance Indicators (KPIs) - financial and non-financial metrics that quantify objectives to reflect an organization's strategic performance. These

indicators are actually tied to strategies (and often appear on corporate scorecards) rather than just providing data that must then be shaped into actionable items.

*Look to links: Consider referrers (the page URL that originally generated the request for the current page view) and segment them. The WAA standards separate referrers into the following categories: internal, external, search, visit and original. Knowing which type of referrer is driving visitors to your site helps shape strategies for generating meaningful traffic and for identifying holes in your current approach.
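A first pass at this segmentation can be automated from the referrer URL alone. A minimal sketch, in which the site's own domain and the list of search hosts are illustrative assumptions (the WAA's "visit" and "original" categories require session context and are not modeled here):

```python
from urllib.parse import urlparse

OWN_DOMAIN = "example.com"  # assumed: your own site's domain
# Illustrative search hosts only; a real list would be far longer.
SEARCH_HOSTS = {"google.com", "search.yahoo.com", "search.msn.com"}

def same_or_subdomain(host, domain):
    return host == domain or host.endswith("." + domain)

def categorize_referrer(referrer_url):
    """Rough internal / search / external split from a referrer URL."""
    if not referrer_url:
        return "none"  # direct traffic: no referrer header at all
    host = urlparse(referrer_url).netloc.lower()
    if same_or_subdomain(host, OWN_DOMAIN):
        return "internal"
    if any(same_or_subdomain(host, s) for s in SEARCH_HOSTS):
        return "search"
    return "external"

print(categorize_referrer("http://www.google.com/search?q=web+analytics"))  # search
print(categorize_referrer("http://example.com/index.html"))                 # internal
print(categorize_referrer("http://someblog.net/post"))                      # external
```

Counting visits per category over a reporting period then shows which channels actually drive traffic.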

*Segment social media: Once you know where your referrers are coming from, further categorize them by social media, including blogs, other Web sites, search engines and social networks.

Understanding how each segment converts to page views will put your current exposure into context.

*Ask questions: According to Jim Sterne, president of Target Marketing, during a 2007 interview with Stone Temple Consulting's Eric Enge: "People look at the reports that come out of Web analytics

tools, expecting them to have all the answers. But, instead of reading the reports that come out, you should be turning to the Web analytics tool and asking questions. How do I improve this?"

*Know the nuances: For example, don't confuse the page exit ratio with the bounce rate. The former is the number of exits from a page divided by the total number of views of that page, and it offers insight into which pages lose eyeballs most often. The latter is the proportion of visits consisting of a single page view; if it is high, you need to rework your Web site. Says Sterne, "Marketing people are not raised in IT, and they don't know what the different numbers mean. Somebody has to be the connection between IT and marketing; somebody has to be able to understand the importance of the value and the differences in the data, and the business application of those nuances. That's the big challenge." PRN
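The two rates above reduce to different divisions over different tallies; the numbers below are hypothetical, and the site-wide bounce-rate denominator (all visits) is one common convention:

```python
# Hypothetical tallies for illustration.
page_views_of_page = 400   # total views of a given page
exits_from_page = 120      # visits that ended on that page
site_visits = 1000         # all visits to the site
single_page_visits = 360   # visits consisting of exactly one page view

page_exit_ratio = exits_from_page / page_views_of_page  # 120 / 400 = 0.30
bounce_rate = single_page_visits / site_visits          # 360 / 1000 = 0.36

print(f"exit ratio: {page_exit_ratio:.0%}, bounce rate: {bounce_rate:.0%}")
# exit ratio: 30%, bounce rate: 36%
```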

CONTACTS:

Eric Peterson, [email protected]; Jim Sterne, [email protected]

The Accuracy of Web Analytics, Under Fire

In August 2007, Stone Temple Consulting released the results of its 2007 Analytics Shootout, in which the firm compared major Web analytics packages and evaluated their ease of implementation, use and reporting. More interesting to communications executives looking for a Web measurement solution, however, the Shootout compared the accuracy of Web analytics data across a variety of vendors. The results were surprising:

  • Web analytics packages installed on the same Web site and configured the same way produced different numbers;

  • The biggest source of error in analytics is implementation. A Web analytics implementation must be treated like a software development project, with due diligence in terms of scrutiny and testing;

  • Two major factors drive different results: the placement of the JavaScript tracking code on the site (which can result in traffic that isn't counted), and differences in the definition of what each analytics package is counting;

  • Page view counts tend to show a smaller level of variance across packages;

  • To help address accuracy problems, calibrate with other tools and measurement techniques when possible, which helps quantify the nature of inaccuracies; and,

  • One of the basic lessons is learning what analytics software packages are good at, and what they aren't.