Scott zooms in on engagement (time spent) and loyalty (repeat visits by the same person) as performance metrics for news sites, and points out that the Drudge Report scores higher than CNN, Yahoo News, Gannett newspapers, and the NY Times on both counts. The reason, Scott ventures, is that the Drudge is mainly an aggregation play, while the other sites are not and receive much of their traffic from other sites.
I agree. As I've argued many times in this blog, brand loyalty is built through aggregation and not through content (unless you accept my motto that 'aggregation is content'). But this doesn't mean that mainstream sites have fewer readers than the Drudge. It only means that loyal readers account for a smaller percentage of their audience – and they are able to monetise both. The important question is which of the two models is more profitable (in terms of rate-of-return) and scalable.
(Scott also ventures that mainstream sites' traffic comes largely from aggregation sites like the Drudge and search engines like Google. From my own experience in these matters I suspect the latter is significant while the former is less so. But this is not essential to his argument.)
While the Drudge's overall audience is far smaller than that of CNN et al, its average reader is presumably more profitable. And the same may be true of it as an operation in terms of rate-of-return ($54m pa with one member of staff, by Scott's reckoning). Further, Scott points out that the Drudge's audience is one of the largest among web-only news-only sites.
This supports the argument in my previous post. To recap, this was that web-only plays have the flexibility of restricting their content to only what they can't find elsewhere while linking to others for everything else. By contrast, outlets with a print (or broadcast) operation can't do this, because they need to produce (or buy) content for the old-media version. I concluded that "the game for incumbents is managing to produce enough content to publish a decent print paper even as the associated revenues decline, without being able to face the disruptors at their own game." Comparing newspapers' and the Drudge's rates of return underscores this point.
But then I think Scott makes a mistake. He recommends that mainstream sites "put a continuously updated news aggregation on the homepage," so that readers can monitor these pages and stay across "the best stories" from mainstream sites in addition to a site's own content. Left unspecified, this could mean that simply adding a machine that algorithmically trawls news sites and somehow selects "the best" would be enough. Even if that were possible, you would be competing with Google News on the basis of technology – and there you have no chance. Further, as Tim Buden implies in a comment to Scott's post (in my reading of it), you would be diluting your brand. A newspaper is known for its editorial voice, not for giving you "the best".
What the Drudge offers is not "the best" links. It is a distinct editorial voice (idiosyncratic, tongue-in-cheek) aimed at a specific readership, just like any newspaper does, but by using only curation. From the reader's point of view this is not that different from a site that did this while relying only on its own content (although from the publisher's point of view the difference is enormous). And it is because of this that mainstream sites have a chance at aggregation (as Scott implies), but only if they set their best editors to the task. I repeat: aggregation is content.
There are many more connections to be made between my previous posts and what Scott and his commentators say, but if this interests you I suggest you read both blogs and draw your own conclusions.
Update: Well, I was wrong. In a later post, Scott Karp makes it clear that when he calls for newspapers to "put a continuously updated news aggregation on the homepage" he is not talking about algorithms. He notes: "Giving over the function of choosing links, of filtering the web, to an algorithm is an implicit devaluation of the quality of human judgment, of what makes an individual editor's perspective so interesting". I agree with that, but I don't share his concern (later in his post) that human editors will find it difficult to compete with machines on exhaustiveness (which leads him to call for collaboration between journalists so that, between them, they can trawl the web). I don't think that is the point.

What human link-editors offer is not a 'service' to monitor the web or the world, but a view that reflects a community's concerns, not the world. Millions of events happen every minute, more and more of them are getting reported somewhere on the web, and for different people different events are 'news' while others aren't. The view comes first, the events second. And the view has to do with being a certain kind of person, a member of a certain community, with certain shared concerns. Link journalism is not about offering endless choice, but about keeping your readers in the loop about a few key things that reflect (or set) a community's key concerns – and these are limited, because a community can't be talking about a million things. In my humble view, the notion that journalism is just about relaying information is a myth, a myth that is necessary to the ethics of journalism (how could you talk about honesty otherwise?), but a myth nonetheless. "Focussed and honest storytelling" is closer to the mark.