An intriguing ‘publishing platform for readers’
An unprecedented collaboration between two leading newspapers and the non-profit Mozilla tech community aims to build a “publishing platform for readers” that could go a long way toward revolutionizing the way we get and give news.
Because the undertaking was announced only last week by the New York Times, the Washington Post, Mozilla and the Knight Foundation, there’s no way of knowing how ambitious the project will be – or how well it will be executed.
But the project offers the tantalizing prospect of perhaps the best system yet to combine the values, discipline and skills of professional journalism with the equally formidable energy, insights and skills of everyone else.
As discussed in a moment, the project also could help the legacy media catch up to the native digital publishers that have siphoned readers and revenues away from newspapers and magazines. First, the background:
Starting with little more than a general idea of the path they will travel together, the New York Times and Washington Post are joining with Mozilla, the non-profit global technology community that created the popular Firefox browser, to build a system to enable the creation, sharing and monitoring of comments, articles, media and other citizen-generated content. The two-year effort is being funded with a $3.9 million grant from the Knight Foundation to a non-profit entity called Knight-Mozilla OpenNews.
“With the platform, publishers will be able to easily include reader contributions in their content cycle, manage their communities and gather valuable user data,” said a press release from the New York Times. “The platform will also use reputation scores, self-policing and other tools to make it easier for news organizations to monitor comments.”
The platform, which will be shared at no cost with participating publishers, “isn’t another commenting platform,” said Greg Barber of the Washington Post in the press release. “It is a publishing platform for readers.”
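Neither the press release nor the grant announcement explains how reputation scoring or self-policing would actually work. Purely as a thought experiment, here is a minimal Python sketch of the idea; the class, weights and threshold are invented for illustration and reflect nothing about the actual OpenNews design.

```python
# Hypothetical illustration of reputation-based comment moderation.
# Nothing here comes from the OpenNews project; the class, weights and
# threshold are invented solely to make the concept concrete.

from dataclasses import dataclass

@dataclass
class Contributor:
    approved_posts: int = 0   # contributions accepted by editors
    helpful_votes: int = 0    # upvotes from other readers
    flagged_posts: int = 0    # contributions flagged by the community

    def reputation(self) -> int:
        # Weight editorial approval most heavily and penalize flagged material.
        return 5 * self.approved_posts + self.helpful_votes - 10 * self.flagged_posts

def needs_premoderation(author: Contributor, threshold: int = 20) -> bool:
    """Route low-reputation contributors to a human moderation queue."""
    return author.reputation() < threshold

# A reader with a clean track record posts freely; a frequently flagged
# account is held for review -- "self-policing" doing the first pass.
regular = Contributor(approved_posts=6, helpful_votes=12)
flagged = Contributor(approved_posts=1, flagged_posts=4)
print(needs_premoderation(regular))  # False
print(needs_premoderation(flagged))  # True
```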
If OpenNews lives up to its mission statement, we could be on the way to seeing a platform that expands the scope and scale of traditional journalism by enfranchising, encouraging and empowering the voice of anyone who wants to contribute to the public discourse. At the same time, the project could well advance the quantity, quality and credibility of content generated by non-professional contributors.
Because it would be easier for citizens to contribute to media sites, the number and caliber of contributions would increase. Because multiple media sites would use the platform, citizen contributions would gain wider audiences at the same time the media sites were able to host fuller discussions than are generally available today. With more opportunities for their contributions to be seen and heard, more citizens would take the time to post them.
While boosting the interactivity of mainstream media sites undoubtedly will lead to unpleasant kerfuffles from time to time, the platform could well offer the cleanest and best-lit place yet for constructive and sustainable community journalism.
But wait, there could be more:
If the initial incarnation of the platform were successful, it could be the basis for solving some of the most pressing (pun slightly intended) problems facing the legacy media.
Because most media companies lack the financial resources and technical chops to build state-of-the-art digital publishing systems, their sites and apps pale (and typically trail) in comparison to those powered by the superior engagement tools and analytics enjoyed by well-heeled digital natives like Vox Media, Bleacher Report, BuzzFeed, Upworthy and others. (Fun fact: BuzzFeed’s more than 130 million monthly unique visitors nearly match the collective 161 million uniques visiting all the 1,300-plus newspaper websites in the land.)
If the initial phase of OpenNews were successful, here’s how it could further help the technically straggling legacy media:
:: Content management – Assuming OpenNews were architected to responsively display all manner of media across all known and likely platforms (and it’s hard to imagine it wouldn’t be), the system could be extended not just to manage comments but also to serve as a default content-management system for publishers great and small. As such, it would house all media assets in one place for ease in discovery, editing and publication across platforms ranging from smartphones to smart crockpots.
:: Content curation – Because the future of digital publishing requires nearly all comers to augment their own offerings with relevant content aggregated from elsewhere, the process of discovering and curating material could be made more efficient – and more effective – if the content from multiple publishers were archived in a single, readily searchable, cloud-based system.
:: Content promotion – With publishers easily mixing and matching content with one another, each would help to vastly expand the audience for all of their work, providing broader content for publishers, a better experience for readers and more premium advertising opportunities.
:: Content personalization – If the activities and interests of readers could be tracked across the many sites and apps that they visit, the user experience could be taken to the next level by customizing content to suit an individual’s particular needs. This feature would require privacy safeguards and, ideally, a provision requiring consumers to opt in to such a service (a rough sketch of the opt-in idea follows this list). Assuming safeguards are in place, think how slick it would be to click a button to personalize your Firefox browser by linking it to your contacts, calendar, stock portfolio and Amazon wish list.
:: Content merchandising – If the platform were built to protect copyrighted content, it could serve as the basis for an inter-publisher payment system to allocate syndication fees among the originators of premium content wherever their work appears on the web. The system could enable not only paywalls but also newsletters, special reports, databases, multimedia packages and more.
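None of the five functions above exists yet, so any implementation detail would be speculative. As a rough illustration of the opt-in personalization item, here is a minimal Python sketch; the consent flag, interest counter and ranking rule are all assumptions, not features of any announced system.

```python
# Speculative sketch of opt-in content personalization. The data model and
# ranking rule below are assumptions made for illustration only.

from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ReaderProfile:
    opted_in: bool = False                       # nothing is tracked without consent
    interests: Counter = field(default_factory=Counter)

    def record_view(self, topic: str) -> None:
        if self.opted_in:
            self.interests[topic] += 1

    def top_topics(self, n: int = 3) -> list:
        return [topic for topic, _ in self.interests.most_common(n)]

def personalize(front_page: list, profile: ReaderProfile) -> list:
    """Float stories matching the reader's declared interests to the top."""
    if not profile.opted_in:
        return front_page                        # default, non-personalized order
    favorites = set(profile.top_topics())
    return sorted(front_page, key=lambda story: story["topic"] not in favorites)

# A consenting reader who follows school and transit news sees those
# stories ranked first; everyone else gets the stock front page.
reader = ReaderProfile(opted_in=True)
for topic in ["schools", "schools", "transit"]:
    reader.record_view(topic)

page = [{"headline": "Council vote delayed", "topic": "politics"},
        {"headline": "New bus routes announced", "topic": "transit"},
        {"headline": "Budget cuts hit classrooms", "topic": "schools"}]
print([story["headline"] for story in personalize(page, reader)])
```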
With the above functions capturing detailed data about individual users as they visit participating sites, legacy publishers finally could become serious digital competitors at a time when marketers increasingly are using segmentation algorithms and exacting analytics to deliver individually crafted messages to precisely targeted consumers. (Another fun fact: Procter & Gamble aims to buy 70%-plus of its advertising this year through data-centric, real-time bidding systems.)
Even though granular customer data unquestionably has become the Holy Grail of modern digital marketing and advertising, most legacy media are ill equipped to compete with the native digital publishing and advertising services that are pecking away at them.
Unfortunately, publishers for the most part have a poor track record for collaborating in their own best interests. Because they absolutely must have a modern, nimble, holistic and multi-brand platform to compete successfully with the well-heeled masters of the digital universe, they really need to coalesce around OpenNews – or something awfully similar to it.
Digital publishing metrics: What’s real?
The ecstasy of digital publishing is that it enables the granular measurement of everything from traffic to ad clicks. The agony is trying to figure out which metrics matter. That’s the vexing issue we’re going to tackle today, but, first, let’s get real:
There are more questions than answers and more opinions than facts. Given ongoing advances in technology and analytics, best practices for audience measurement will continue not only to evolve but also to provoke vigorous debate. The latest thinking on audience measurement is described below, but you can be sure it won’t be the last word.
As messy as this topic is, it behooves publishers to pay attention to improving audience measurement so they can manage their businesses effectively and strategically in an ever more demanding environment.
With that said, here’s what we know about the state of the art – and I do mean art, because audience measurement is anything but an exact science.
Unique visitors
The most basic metric in measuring traffic is the number of individuals who frequent a digital destination, but the raw number captured by the typical server is deceptively high. The reason uniques are overstated is that most servers count a user as one person when she uses the Firefox browser to access a given site on her laptop, as a second person when she goes back to the same site on the Safari browser on her smartphone and as a third person when she visits the site from the Internet Explorer browser at her office. If the user clears the cookies on one or more of her browsers, she can be counted as a new unique all over again.
Given the number of devices that most of us use, the raw figures collected on internal servers are “probably more than five times too high,” says Andrew Lipsman, a vice president of comScore, which sells a widely used service that aims to deliver a more accurate count. Combining data on the actual web activities of 1 million volunteers with additional data and analysis, comScore says it can give a truer count of unique individuals across all digital platforms than is possible by using only raw server data. While comScore’s data is widely accepted in the publishing and advertising industries, it is important to note that its tallies are no more than projections based on a statistical construct. ComScore numbers are more like a public opinion poll taken prior to an election than the actual ballot count itself. As we all know, pre-election polls aren’t always right.
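To make the overcounting and the adjustment concrete, here is a toy Python illustration. The cookies-per-person factor is invented for the example; comScore’s actual model is far more elaborate and proprietary.

```python
# Why cookie-based "uniques" overstate real people, and how a panel-derived
# factor can deflate them in principle. The 2.6 figure is invented; it is
# not comScore's number.

# One real person seen through three different browser cookies:
server_log = [
    {"cookie": "laptop-firefox-abc"},   # Firefox on her laptop
    {"cookie": "phone-safari-def"},     # Safari on her smartphone
    {"cookie": "office-ie-ghi"},        # Internet Explorer at the office
]

raw_uniques = len({hit["cookie"] for hit in server_log})
print(raw_uniques)                      # 3 "uniques," but only 1 person

# A measurement panel whose members' real identities are known lets an
# analyst estimate how many cookies the average person generates, and
# deflate the raw server count accordingly.
panel_cookies_per_person = 2.6          # assumed, derived from panel data
estimated_people = raw_uniques / panel_cookies_per_person
print(round(estimated_people, 1))       # ~1.2 estimated real visitors
```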
Page views
The most unambiguous way to measure traffic is by counting the number of pages served to consumers. This metric draws perhaps the most relentless focus from digital publishers seeking to maximize revenue from the ads they embed in each page. A direct carryover from the volume-driven way that the legacy print and broadcast media have sold advertising since time immemorial, page views can be lofted legitimately by posting valuable new content or artificially through all manner of gimmickry.
The problem with concentrating on page views, as discussed more fully below, is that neither publishers nor advertisers can be sure that a page served to a consumer actually was viewed by her – or that she paid any heed to the content or ads presented on it.
Social-media shares
In the age of social media, many publishers and marketers put a high priority not only on increasing the number of friends and followers tallied on their Facebook and Twitter pages but also on maximizing pass-along readership.
While word-of-mouth generally is considered to be the most valuable form of endorsement for an article or product, Tony Haile, the CEO of Chartbeat, a traffic-analytics company, took to Twitter earlier this year to say that his research found that there is “effectively no correlation between social shares and people actually reading” the article they tweet. On the other hand, Upworthy, a digital publisher that has elevated the viral distribution of grabby articles to a science, reports that people who read to the bottom of an article are more likely to share it than those who scan just the top of it.
Summing up the kerfuffle over the value of sharing, The Verge, a tech blog, tartly observed: “So if you see someone tweet an article, it likely means they either didn’t really read it, or they read every word.”
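Chartbeat has not published the underlying analysis, but testing a claim like Haile’s comes down to correlating, article by article, share counts with an engagement signal such as scroll depth or read time. A toy Python sketch with invented figures:

```python
# Toy check of the shares-versus-reading relationship: correlate per-article
# share counts with average scroll depth. All figures are invented.

from statistics import correlation  # requires Python 3.10+

articles = [
    # (social shares, average scroll depth as a fraction of the article)
    (1200, 0.4),
    (90,   0.3),
    (3400, 0.5),
    (150,  0.6),
    (2100, 0.2),
]

shares = [s for s, _ in articles]
scroll_depth = [d for _, d in articles]

# A coefficient near zero is what "effectively no correlation" would look
# like; this toy data set prints roughly -0.07.
print(round(correlation(shares, scroll_depth), 2))
```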
Ad clickthrough
The most crucial measure for marketers – and the publishers who depend on their patronage – is whether their ads are working. And the chief way ad effectiveness has been measured in the short but intense history of the web is the frequency with which ads are clicked. Unfortunately, there’s plenty of controversy about the accuracy of this widely followed metric.
Solve Media, a company selling anti-fraud technology to advertisers, reported that up to 61% of the ad clicks on the web in the final quarter of 2013 were “suspicious,” a sharp advance from the 51% rate of questionable clicks it detected in the third quarter. While there is no way of knowing if this assertion is too extreme, tech companies and ad networks widely acknowledge that they are in a never-ending battle with clickthrough bandits.
The number of questionable clicks appears to be formidable on mobile devices, too. GoldSpot Media, an ad-tech company, issued a Fat Finger Report in 2012 stating that up to 38% of static banners were clicked accidentally on mobile devices. The Fat Finger study has not been replicated since companies like Google, a dominant player in the ad-serving business, acted to reduce the susceptibility of mobile ads to inadvertent clicks. So, the number of Fat Finger episodes may be higher or lower today than it was in 2012.
Meantime, comScore advises publishers and advertisers not to worry about weak or errant clickthrough rates. Saying that banner ads enhance brand awareness and prompt subsequent on- and off-line purchases, comScore asserted in a recent presentation that “the click is a misleading measure of a campaign’s effectiveness.”
Time on site
Medium, the long-form web publisher that is home to Matter, believes that the amount of time an individual stays engaged with its articles is, by far, the most important metric. This also is one of the key metrics monitored at Alexa.com, an analytics service owned by Amazon, which reports that the average time spent on Facebook is 30 minutes per session vs. 3 minutes or less at the typical newspaper site.
Medium measures “every interaction with every post” by tracking how users scroll through stories, explains the publisher in its blog. “We pipe this data into our data warehouse, where offline processing aggregates the time spent reading (or our best guess of it): we infer when a reader starts reading, when they paused and when they stopped altogether. This methodology allows us to correct for periods of inactivity (such as having a post open in a different tab, walking the dog, checking your phone).”
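Medium’s post describes the inference but not the code. A simplified Python sketch of the same idea sums the gaps between a reader’s events (scrolls, clicks, visibility changes) and discards gaps long enough to suggest she walked away; the 30-second idle cutoff is an assumption, not Medium’s published parameter.

```python
# Simplified estimate of time spent reading, in the spirit of what Medium
# describes: count the gaps between successive reader events, but skip gaps
# long enough to suggest inactivity. The 30-second cutoff is an assumption.

IDLE_CUTOFF_SECONDS = 30

def estimated_read_seconds(event_timestamps: list) -> float:
    """event_timestamps: client-side event times, in seconds, for one post view."""
    events = sorted(event_timestamps)
    total = 0.0
    for earlier, later in zip(events, events[1:]):
        gap = later - earlier
        if gap <= IDLE_CUTOFF_SECONDS:
            total += gap           # active reading
        # else: tab left open, dog walked, phone checked -- not counted
    return total

# A reader scrolls steadily, disappears for ten minutes, then finishes.
events = [0, 8, 20, 27, 35, 635, 642, 650]
print(estimated_read_seconds(events))  # 50.0 active seconds; the idle gap is ignored
```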
The issue with this methodology, as Medium admits, is that it requires a certain amount of inference and statistical massaging. Upcoming advances in technology may improve the prospects and outcomes for this type of analysis. Samsung and other smartphone companies are working on screens that will actively track user eye movements to see where they go on a page – and how long they stay.
Total attention measurement
To overcome the inherent limitations of the various individual methodologies discussed above, a small but growing number of digital publishers and technology companies are mixing and matching metrics to develop what they hope will be more authentic views of their audience.
Chartbeat, a company selling next-generation analytics systems, has created a dashboard that dynamically graphs site activity so editors can see which stories are driving traffic – and why. For a look at how the system is used at the Journal Record in White Plains, NY, see this.
Going beyond the simple aggregation of metrics, Upworthy closely measures and analyzes such behaviors as where a user moves her mouse, how far she scrolls into an article and how long she sticks with a video. Illustrating the concept in this recent blog post, Upworthy said different articles attracting a similar number of page views drew wildly disparate amounts of actual and measurable attention.
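Upworthy has not published its exact weighting, but the contrast it describes – similar page views, wildly different attention – can be shown with a toy aggregation; the visit data below are invented.

```python
# Toy contrast between page views and total engaged time ("attention").
# The per-visit engagement figures are invented for illustration.

from collections import defaultdict

# (article_id, engaged_seconds_for_one_visit), e.g. derived from scroll,
# mouse and video-play signals captured on the client.
visits = [
    ("quiz-piece", 5), ("quiz-piece", 4), ("quiz-piece", 6), ("quiz-piece", 5),
    ("long-video", 180), ("long-video", 240), ("long-video", 150), ("long-video", 200),
]

page_views = defaultdict(int)
attention_seconds = defaultdict(float)
for article, seconds in visits:
    page_views[article] += 1
    attention_seconds[article] += seconds

# Both articles log 4 page views, but one earns roughly 13 attention
# minutes and the other well under one minute.
for article in page_views:
    print(article, page_views[article], "views,",
          round(attention_seconds[article] / 60, 1), "attention minutes")
```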
As publishers accumulate ever more user data, the most enlightened among them are sharing the information widely with their staffs in the belief that audience engagement is everyone’s job. And they are right to do so. Because it is.
© 2014 Editor & Publisher