Newsosaur: Digital Publishing Metrics—What’s Real?
Posted: 6/12/2014  |  By: Alan D. Mutter
The ecstasy of digital publishing is that it enables the granular measurement of everything from traffic to ad clicks. The agony is trying to figure out which metrics matter. That’s the vexing issue we’re going to tackle today, but, first, let’s get real.            

There are more questions than answers and more opinions than facts. Given ongoing advances in technology and analytics, best practices for audience measurement will not only continue to evolve but also continue to provoke vigorous debate. The latest thinking on audience measurement is described below, but you can be sure it won’t be the last word.

As messy as this topic is, it behooves publishers to pay attention to improving audience measurement so they can manage their businesses effectively and strategically in an ever-more demanding environment.

With that said, here’s what we know about the state of the art—and I do mean art, because audience measurement is anything but an exact science.  

Unique visitors
The most basic metric in measuring traffic is the number of individuals who frequent a digital destination, but the raw number captured by the typical server is deceptively high. The reason “uniques” are overstated is that most servers count the same person once for every browser and device: as one user when reading a site in Firefox on a laptop, as a second when returning to the same site in Safari on a smartphone and as a third when visiting from Internet Explorer at the office. And if the user clears the cookies on any of those browsers, he or she is counted as a new unique all over again.
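To see how quickly the count inflates, here is a minimal sketch in Python, using made-up log entries rather than any real server’s log format, that compares the cookie-based tally with the true number of people:

```python
# Toy illustration of cookie-based unique-visitor inflation.
# Each hit records the person who actually visited and the cookie ID the
# server saw; a new browser, device or cleared cookie means a new ID.
hits = [
    {"person": "reader_1", "cookie": "laptop-firefox-abc"},
    {"person": "reader_1", "cookie": "phone-safari-def"},
    {"person": "reader_1", "cookie": "office-ie-ghi"},
    {"person": "reader_1", "cookie": "laptop-firefox-xyz"},  # cookies cleared, counted anew
    {"person": "reader_2", "cookie": "tablet-chrome-jkl"},
]

server_uniques = len({h["cookie"] for h in hits})  # what the raw server reports
actual_people = len({h["person"] for h in hits})   # what the publisher wishes it knew

print(f"Server-reported uniques: {server_uniques}")  # 5
print(f"Actual people: {actual_people}")             # 2
```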

Given the number of devices that most of us use, the raw figures collected on internal servers are “probably more than five times too high,” says Andrew Lipsman, a vice president of comScore, which sells a widely used service that aims to deliver a more accurate count. Combining data on the actual web activities of 1 million volunteers with additional data and analysis, comScore says it can give a truer count of unique individuals across all digital platforms than is possible by using only raw server data. While comScore’s data is widely accepted in the publishing and advertising industries, it is important to note that its tallies are no more than projections based on a statistical construct. ComScore numbers are more like a public opinion poll taken prior to an election than the actual ballot count itself. As we all know, pre-election polls aren’t always right.  
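The difference between a census-style server count and a panel-based projection can be sketched in a few lines; the figures and the single cookies-per-person factor below are hypothetical, and comScore’s actual model is far more elaborate:

```python
# Toy projection of true uniques from raw server uniques.
# A measurement panel (a made-up one here) suggests how many cookies
# the average visitor generates across browsers and devices.
raw_server_uniques = 5_000_000   # hypothetical monthly cookie count
cookies_per_person = 3.2         # hypothetical panel-derived average

projected_uniques = raw_server_uniques / cookies_per_person
print(f"Projected unique visitors: {projected_uniques:,.0f}")  # roughly 1.6 million

# Like a pre-election poll, the projection is only as good as the panel
# and the assumptions baked into the adjustment factor.
```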

Page views
The most unambiguous way to measure traffic is by counting the number of pages served to consumers. This metric draws perhaps the most relentless focus from digital publishers seeking to maximize revenue from the ads they embed in each page.            
A direct carryover from the volume-driven way that the legacy print and broadcast media have sold advertising since time immemorial, page views can be lofted legitimately by posting valuable new content or artificially through all manner of gimmickry.            

The problem with concentrating on page views, as discussed more fully below, is that neither publishers nor advertisers can be sure that a page served to a consumer actually was viewed by the consumer—or that the consumer paid any heed to the content or ads presented on it.

Social-media shares
In the age of social media, many publishers and marketers put a high priority not only on increasing the number of friends and followers tallied on their Facebook and Twitter pages but also on maximizing pass-along readership.

While word-of-mouth generally is considered to be the most valuable form of endorsement for an article or product, Tony Haile, the CEO of Chartbeat, a traffic-analytics company, took to Twitter earlier this year to say that his research has found that there is “effectively no correlation between social shares and people actually reading” the article they tweet.  On the other hand, Upworthy, a digital publisher that has elevated the viral distribution of grabby articles to a science, reports that people who read to the bottom of an article are more likely to share it than those who scan just the top of it.            

Summing up the kerfuffle over the value of sharing, The Verge, a tech blog, tartly observed: “So if you see someone tweet an article, it likely means they either didn’t really read it, or they read every word.”  

Ad clickthrough 
The most crucial measure for marketers—and the publishers who depend on their patronage—is whether their ads are working. And the chief way ad effectiveness has been measured in the short but intense history of the web is the frequency with which ads are clicked. Unfortunately, there’s plenty of controversy about the accuracy of this widely followed metric.
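The metric itself is simple arithmetic, as the sketch below shows with made-up numbers; the controversy is over how many of the recorded clicks are real:

```python
# Click-through rate: clicks divided by ad impressions served.
impressions = 250_000   # hypothetical ads served
clicks = 500            # hypothetical clicks recorded

ctr = clicks / impressions
print(f"CTR: {ctr:.2%}")  # 0.20%
```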

Solve Media, a company selling anti-fraud technology to advertisers, reported that up to 61 percent of the ad clicks on the web in the final quarter of 2013 were “suspicious,” a sharp advance from the 51 percent rate of questionable clicks it detected in the third quarter. While there is no way of knowing if this assertion is too extreme, tech companies and ad networks widely acknowledge that they are in a never-ending battle with click-through bandits.            

The number of questionable clicks appears to be formidable on mobile devices, too. GoldSpot Media, an ad-tech company, issued a “Fat Finger Report” in 2012 stating that up to 38 percent of static banners were clicked accidentally on mobile devices. The Fat Finger study has not been replicated since companies like Google, a dominant player in the ad-serving business, acted to reduce the susceptibility of mobile ads to inadvertent clicks. So, the number of Fat Finger episodes may be higher or lower today than it was in 2012.

Meantime, comScore advises publishers and advertisers not to worry about weak or errant click-through rates. Saying that banner ads enhance brand awareness and prompt subsequent on- and off-line purchases, comScore asserted in a recent presentation that “the click is a misleading measure of a campaign’s effectiveness.”  

Time on site
Medium, the long-form web publishing platform that is home to Matter, believes that the amount of time an individual stays engaged with its articles is, by far, the most important metric. This also is one of the key metrics monitored at Alexa.com, an analytics service owned by Amazon, which reports that the average time spent on Facebook is 30 minutes per session vs. 3 minutes or less at the typical newspaper site.

Medium measures “every interaction with every post” by tracking how users scroll through stories, explains the publisher in its blog. “We pipe this data into our data warehouse, where offline processing aggregates the time spent reading (or our best guess of it): we infer when a reader starts reading, when they paused and when they stopped altogether. This methodology allows us to correct for periods of inactivity (such as having a post open in a different tab, walking the dog, checking your phone).”             
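A rough sketch of the underlying idea, which is to sum the gaps between a reader’s interaction events while discarding gaps long enough to suggest the tab was abandoned, follows; it illustrates the general technique rather than Medium’s actual pipeline, and the 30-second idle threshold is an assumption:

```python
# Toy engaged-time estimate from timestamped interaction events
# (scrolls, clicks, key presses), given in seconds since the page loaded.
# Gaps longer than the idle threshold are treated as "walked away"
# and excluded from the reading-time total.
IDLE_THRESHOLD = 30  # assumed seconds of silence before we infer inactivity

def engaged_seconds(event_times, idle_threshold=IDLE_THRESHOLD):
    events = sorted(event_times)
    total = 0
    for earlier, later in zip(events, events[1:]):
        gap = later - earlier
        if gap <= idle_threshold:
            total += gap
    return total

# A reader scrolls steadily, leaves the tab open for ten minutes, then returns.
events = [0, 5, 12, 20, 28, 640, 650, 660]
print(engaged_seconds(events))  # 48 seconds of reading, not eleven minutes
```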

The issue with this methodology, as Medium admits, is that it requires a certain amount of inference and statistical massaging. Coming advances in technology may make that inference more precise: Samsung and other smartphone companies are working on screens that will actively track user eye movements to see where they go on a page—and how long they stay.

Total attention measurement
To overcome the inherent limitations of the various individual methodologies discussed above, a small but growing number of digital publishers and technology companies are mixing and matching metrics to develop what they hope will be more authentic views of their audience.            

Chartbeat, a company selling next-generation analytics systems, has created a dashboard that dynamically graphs site activity so editors can see which stories are driving traffic—and why. For a look at how the system is used at the Journal Record in White Plains, N.Y., go to http://tinyurl.com/cbstudy.             

Going beyond the simple aggregation of metrics, Upworthy closely measures and analyzes such behaviors as where users move their mouse, how far they scroll into an article and how long they stick with a video. Illustrating the concept in a recent blog post at tinyurl.com/upattn, Upworthy said different articles attracting a similar number of page views drew wildly disparate amounts of actual and measurable attention.
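One way to picture how such signals might be blended, offered purely as an illustration and not as Upworthy’s actual formula, is a weighted per-article attention score; the signal names and weights here are assumptions:

```python
# Toy attention score that blends several engagement signals.
# The weights and the three-minute time cap are illustrative assumptions.
def attention_score(page_views, avg_scroll_depth, avg_engaged_seconds, video_completion_rate):
    """Combine per-article averages into one comparable number."""
    per_view = (
        0.4 * avg_scroll_depth                       # fraction of the article scrolled (0-1)
        + 0.4 * min(avg_engaged_seconds / 180, 1.0)  # engaged time, capped at 3 minutes
        + 0.2 * video_completion_rate                # fraction of embedded video watched (0-1)
    )
    return page_views * per_view

# Two articles with similar page views but very different attention.
skimmed = attention_score(100_000, avg_scroll_depth=0.2, avg_engaged_seconds=15,
                          video_completion_rate=0.05)
read_through = attention_score(95_000, avg_scroll_depth=0.8, avg_engaged_seconds=150,
                               video_completion_rate=0.6)
print(f"{skimmed:,.0f} vs. {read_through:,.0f}")  # far apart despite similar traffic
```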

As publishers accumulate ever more user data, the most enlightened among them are sharing the information widely with their staffs in the belief that audience engagement is everyone’s job. And they are right to do so. Because it is.   

Alan D. Mutter is a former newspaper editor and Silicon Valley CEO who today serves as a technology consultant to media companies. He blogs at Reflections of a Newsosaur (newsosaur.blogspot.com).