
Snowpocalypse: by the numbers

August 20, 2011

A week of wild NZ weather has wrapped up. How did the country’s news websites perform? Let’s take a look.

First up, here’s a chart of the weekday domestic Unique Browsers on New Zealand news sites. Just like in my analysis of royal wedding traffic, websites which focus on news generally did better than portal sites like MSN and YahooXtra. That suggests again that when something major happens, these are the sites people visit to find out what’s going on.
Monday, 15 August 2011:     41,347 | 171,740 | 266,992 | 439,907 | 349,123 | 137,191
Difference from average:    17,396 | 33,856 | -3,524 | 90,135 | 79,720 | 39,461
% above average:            73% | 25% | -1% | 26% | 30% | 40%

You can see while Stuff had the biggest raw increase, the NZHerald wasn’t far behind. Again, TV3 saw a traffic spike that was proportionately huge, but in raw terms pretty minor.

Yahoo’s news site did a lot better than MSN here. Yahoo has been doing more of their own news lately, and the numbers suggest this helped on Monday but not on the following days (UBs fell back to about normal). MSN recorded lower UBs than an average weekday – not a good sign on the biggest news event so far this month.


11 tips for getting your first media job

August 13, 2011

Nurul Izzah Anwar surrounded by reporters during the Malaysian general election, 2008

Looking to eject from your current job into the precarious, exciting world of journalism? Here are some tips.

1. You don’t need to go to journalism school – You can pick up the skills you need by doing it. Experience is the best teacher, and you’ll be judged by your portfolio and recommendations.

2. Know your niche – You won’t get a job as a general reporter; there’s just too much competition from recent journo graduates. You have to build up specialist knowledge in an area you’re passionate about – preferably one where there’s lots of advertising. Do this:

  • Know the people – Try to meet as many people as possible, both in person and online. They’ll be the people tipping you off in future. They’ll be the people you can turn to when you need help. So hang out in the forums and the real-world events and network, network, network.
  • Know the field – Understand the field you’re covering as well as you can – both the technical aspects and the major players and issues. You’ll be able to ask better questions, get better stories, make fewer mistakes, and you’ll be more respected by the community you’re covering (people prefer to talk to reporters who know their stuff).
  • Know the publications and the journalists – Who’s good, who sucks, and in what aspects. Which publications in your chosen niche cover what and how their angles differ. You’ll be asking these companies for jobs, and they expect you to be able to talk intelligently about their work, their staff and their competition.

3. Practice, practice, practice – Approach the free community papers and websites and ask them what you can do. You’ll write your fair share of dull-as-dugong articles, but it’s all good practice. If the editor thinks you’re worth keeping, they’ll help you hone your craft until you can land a paying gig.

4. Promote, promote, promote – Start a blog talking about your chosen niche and interact with the community. Connect with people in the niche on Twitter and promote the hell out of yourself. Comment on other blogs in your chosen field.


5 things the media needs to get good at

June 11, 2011
Counting to 5


As the media landscape changes ever faster, companies have to adapt to survive. Here are 5 areas that will shape the media business in future.

1. Product development – People are getting their news in increasingly different ways, and there’s a new device every few years for companies to get their heads around. A company that can’t build and tweak products quickly, efficiently and smartly is going to be left behind.

If your company doesn’t have a good process for generating ideas, then bringing them to market, it’s going to lose out to companies that do.

Just look at how tech companies like Google and Facebook have out-manoeuvred publishers, and sometimes completely displaced them, in areas that have traditionally belonged to the publishing industry.

2. Data – Both for their readers (eg database journalism) and of their readers (eg targeting ads and content). Publishers have to be able to give readers the content they’re after, and advertisers the readers they’re after, to survive in an age where search engines and social networks watch every word you type.

3. Research and Analytics – Part of giving customers what they want is knowing what they want. Publishers need to collect as much data on how people use their products as they can, through both web tracking and surveys.  That’s the best way to make their products better, and when your product is increasingly only one of many ways to get what the customer is after, having this info is key. That means investing in the technology and staff to gather and interpret the information.

4. Marketing services – SEO, social media, web marketing strategies – online publishers have had to get good in these areas. And they’re all areas where most normal businesses don’t have a clue.

That’s why a wide range of publishers – including niche ones like IDG, and also big ones like Conde Nast and The Tribune Group – have started their own divisions to meet these needs. Publishers are already the first port of call for many of these businesses, and they’re best placed to sell these types of services.

Publishers can leverage what they’re good at – online advertising – to create a whole new revenue stream. If they don’t, a middleman will jump in, and publishers will have a hard time catching up in an increasingly specialised area.

5. Failure – It’s still a dirty word in much of the publishing world, because publishers are usually large public corporations that use business models suitable for large public corporations. There’s not a lot of experimenting being done at most companies because it’s expensive, has uncertain rewards, and may end up damaging the existing business. CEOs at big companies generally won’t take risks until they’re forced to.

Experimentation and failure are inherent to a startup, but it’s a hard mentality for many corporate types to understand. Very few managers are encouraged to fail, and in an area that’s moving as fast as online publishing, that’s actually a big problem.

These companies won’t take the risks necessary to get into emerging spaces.

Revenues from traditional media are declining, and they’ll keep declining for a while yet. The way out isn’t to manage that decline, it’s to develop new ways of funding high-quality content.

What big data means for media

May 22, 2011

Big data is changing the ways companies do business. So how will it affect the media?

THE COLOURS: A data visualisation of Wikipedia editing from IBM via Wikipedia

Two ways:

1. Journalism – as more datasets are released and software to query and visualise them becomes more common, data analysis will drive more and more real journalism. We’ll see:

  • More data spawning investigative articles
  • More data providing context to articles (eg crimes in different areas)
  • More data as standalone journalism. A lot of times the reader doesn’t actually need a story to provide context, and the data visualization does a better job by itself. Let the reader and their social media buddies decide what’s interesting instead of a reporter.

2. Business models – Data is following its own version of Moore’s Law right now. Collecting, storing and analysing information is becoming cheaper and faster, and at an accelerating pace. That means it’s going to get easier and more economical to track more of what you read/watch/hear/do online. It’s also going to get easier for companies to link up their independent databases to gain new insights into consumer behaviour, and to deliver extremely targeted advertising.

The data should be anonymised – in both meanings of the word “should” – but you’re not being paranoid if it makes you uneasy.

Datafication will also accelerate the changes encouraged by the internet, where page views and other metrics were brought down from abstractions to provide real insight (and KPIs) to ordinary journalists. Before the internet, editorial judgement and occasional reader surveys were the way to gauge the popularity of different content. Now you can see what rates and what doesn’t in real time. There are upsides and downsides to this, as we’ve all read about.

But there’s a more fundamental change on the horizon. Being able to track everything each reader does on your site – and maybe some info on what they do on others – is another nail in the coffin of traditional editorial judgement.

In the bad old days it was the editors’ jobs to figure out what went on what newspaper page. Nowadays it’s web editors deciding what should lead the site or a section homepage. But when everything is tracked, the data does that for us. Web editing will increasingly become low-skill content loading, which will quickly be replaced by technology. I’m a programming noob but know enough to write an algorithm which ranks stories based on their recent clicks.
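A click-ranking algorithm really is that simple. Here’s a minimal sketch of one (the story data, half-life value and function names are all hypothetical, and this is one obvious approach rather than any site’s actual system) – it weights each click by how recent it is, so a burst of fresh clicks outranks a larger pile of stale ones:

```python
import math
import time

def rank_stories(stories, half_life_hours=6, now=None):
    """Rank stories by recent clicks, weighting newer clicks more heavily.

    `stories` is a list of dicts, each with a 'title' and a list of click
    timestamps (seconds since the epoch). Each click's weight decays
    exponentially with age, halving every `half_life_hours`.
    """
    now = now if now is not None else time.time()
    decay = math.log(2) / (half_life_hours * 3600)

    def score(story):
        # Sum of exponentially decayed click weights
        return sum(math.exp(-decay * (now - t)) for t in story["clicks"])

    return sorted(stories, key=score, reverse=True)

# Hypothetical example: story B's clicks are fresher, so it ranks first
# even though story A has more clicks in total.
now = 1_000_000.0
stories = [
    {"title": "A", "clicks": [now - 86400] * 100},  # 100 clicks a day ago
    {"title": "B", "clicks": [now - 600] * 30},     # 30 clicks 10 min ago
]
ranked = rank_stories(stories, now=now)
print([s["title"] for s in ranked])  # → ['B', 'A']
```

With a six-hour half-life, a day-old click counts for only about 6% of a fresh one, which is why story B wins here.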

Within 10 years I expect online newsrooms to start whittling down their web editing staff and putting more resources to reporting – exactly how it should be. There should be as few staff as possible between a reporter and their audience. Web editors will be there to massage the system and give it personality, a veneer of humanity over a machine that handles all the grunt work.

Another point: news will soon enough be completely personalised, with each person served a different set of stories. At that point, editorial judgement for a mass audience won’t be a very marketable skill anymore.

So what’s a web editor to do? Knowing how to load stories and extra content won’t be enough to keep you in a job. You have to learn skills that technology won’t be able to easily duplicate in the medium-term, like growing community and encouraging discussions that keep readers on site.

Also expect the commodification of news websites serving similar audiences. As designers understand what makes people click, sites will increasingly optimise their designs to extract every last click, every last second of engagement from users. The same way most newspapers fit into a handful of design categories, news websites will too soon enough. Even more so, because the traditional editor’s discretion (“I like that and not this.”) won’t count for squat. The data will prove what works best.

Right now the media world has hundreds of different content management systems to handle its content, because there are so many different ways media sites want to show that info. But as design starts to homogenise, these CMSes will get whittled down to a handful of major players who can best (usually most cheaply) serve up content into those popular templates. This will slow down news design innovation online until things are broken up by the next revolution.

And once the design and CMS approach have been cemented into only a few real choices, you’ll be able to buy your major news website out of the box as a managed service, tweaking the colours to suit your brand.

The structure of individual news articles will also change to optimise readership and revenue. The technology is already here to serve up two different versions of a page to see what rates best; it’s just not really economical yet to do it for specific articles. But as data costs drop, it will become common for companies to experiment with different article forms, and decide which forms are most efficient (lowest cost/revenue) for each subject.
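The mechanics of serving two versions are straightforward. A minimal sketch (hypothetical user IDs, traffic numbers and function names – this shows the general bucketing idea, not any vendor’s product): each reader is deterministically hashed into a bucket, so the same person always sees the same version, and you compare click-through rates afterwards.

```python
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    """Deterministically bucket a user into a variant by hashing their ID,
    so the same reader always sees the same version of the article."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def click_through_rate(clicks, views):
    """Fraction of page views that led to a click on the promoted link."""
    return clicks / views if views else 0.0

# Hypothetical results after a week of the experiment
ctr_a = click_through_rate(clicks=420, views=10_000)  # 4.2%
ctr_b = click_through_rate(clicks=510, views=10_000)  # 5.1%
winner = "A" if ctr_a > ctr_b else "B"
print(winner)  # → B
```

In practice you’d also want enough traffic in each bucket for the difference to be statistically meaningful before declaring a winner.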

Subjects too will become increasingly tailored to revenue rather than raw audience size, as they are now. For example, covering rugby in New Zealand is guaranteed to bring in lots of readers, but is it worth taking one of your several reporters away to cover ice hockey once a week? That’s easy to find out if you know the average output of each reporter and ad returns of each topic. We can do this now, but it’s usually not economic. In the future though, the numbers will decide what topics journalists cover, and how much they are covered.
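The sums involved are trivial once you have the data. A sketch with entirely made-up figures (the reporter cost and per-story ad yields are assumptions for illustration, not real numbers from any newsroom):

```python
def weekly_return(stories_per_week, revenue_per_story, reporter_cost_per_week):
    """Net weekly return of assigning one reporter to a topic:
    ad revenue from their stories minus what they cost to employ."""
    return stories_per_week * revenue_per_story - reporter_cost_per_week

# Hypothetical figures: rugby draws far more ad revenue per story
rugby = weekly_return(stories_per_week=10, revenue_per_story=300,
                      reporter_cost_per_week=1500)
ice_hockey = weekly_return(stories_per_week=10, revenue_per_story=80,
                           reporter_cost_per_week=1500)
print(rugby, ice_hockey)  # → 1500 -700
```

On numbers like these, the weekly ice hockey round-up runs at a loss, and that’s exactly the kind of calculation the data will make routine.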

That might paint a bleak picture of the future of journalism for you, but datafication is not a bad thing. Computers aren’t going to replace human beings except in areas where they should replace them. Computers are good at repetitive tasks, and not at all good at understanding how people think and behave, or understanding what a story means.

And those are the kinds of skills online journalists of the future will need – more psychology than programming (though as much as possible you should know both). Find a field and become expert enough to provide context, and your job will be safe.

All the grunt work, that can and should and will be outsourced to machines.

Where’s all my Twitter traffic?

May 9, 2011

Interesting study out from Pew on referral traffic to news sites, and sure enough, Facebook comes up trumps in social media. More interesting for me, though, is the traffic from Twitter – or the lack of it. For most sites in the Pew study, Twitter “barely registers as a traffic source”, says the AFP article in the link.

I love Twitter and think it’s a great source for the news. Day by day it’s encroaching on my other news-fetching habits, and I’m not alone. Hell, I found out about this study from Twitter (thanks @Lavrusik).

So where the murgatroid is the traffic? If Twitter has 200 million users generating 65 million tweets a day, why isn’t it rivalling traffic from Facebook’s 600 million active users?

It’s a dirty little secret many media companies have. Twitter, despite all its general wonderfulness at spreading the news, refers very little traffic to most news sites (only one in the Pew survey cracked 1%). Some niche news sites actually do a lot better out of Twitter, but these target digitally savvy people (eg online journalists).

For a mainstream news site, Facebook is a much bigger chunk of delicious traffic-berry pie.

Why? What’s going on?

For one, Facebook quotes active user numbers. Twitter doesn’t. Lots of accounts on Twitter are tweet bots (and not the good kind), a much higher proportion than Facebook, I’d wager. I don’t know how many of those 65 million daily tweets are spam, but if Twitter won’t say, I’d bet it’s too high for their comfort.

Some sneaky estimating puts the number of active tweeters at about 21 million. That’s nothing to sneeze at, but nowhere near Facebook. That means, though, that 9 in 10 Twitter accounts are inactive or spam bots.

But wait, there’s more. This report shows the makeup of a sample of tweets from 2009. Only about 4% relate to the news – just less than spam. The biggest categories by far are “conversational” and “pointless babble”, which combined make up more than 75% of tweets.

Granted, that was two years ago – but it was also after the Mumbai attacks, where Twitter played such an important role in breaking the news, so news tweets were hardly a novelty even then.

Now some quick maths: 65 million tweets * 4% news = 2.6 million tweets about the news. A lot of these will also be retweets that will appear a few times in people’s streams. And if most tweeters are like me, they use Twitter to hunt for interesting stories, ignoring most of the prey but feasting when they find a juicy morsel.

I don’t have any lovely numbers to prove it, but my hunch is that automated feeds from news sites are largely ignored. We mostly find out news on Twitter from other people instead – one account run by a member of the community is worth 100 automated feeds, because an endorsement from a person you know is the best reference a story can get. Most of the automated accounts are firing tweets unread into the ether.

Now the numbers are starting to make sense, and Twitter’s real role comes into focus – spreading news among the digital savvy. You have to be online a lot to get good payback out of Twitter; you have to want to be plugged into the worldwide info stream. Normal people just don’t see the point, and probably never will.

The lesson: Concentrate on humanising your Twitter profile, and using the service as an info source and community builder instead of a major referral source.

Smart traffic strategy is more than a numbers game.

Royal wedding – winners and losers in NZ media

May 4, 2011
royal family on balcony post wedding

Photo by Magnus D, via Wikipedia

The royal wedding was a right royal media pimp-out. Here’s how NZ’s media did.

What did they do? Twitter widgets, live streams, message walls, and photos photos photos. Also, psychic predictions from tea leaves, because English people drink lots of tea, you know. So Wills, be careful of your lower back on your honeymoon…

Lots of dedicated royal wedding sections, no surprises there, and Stuff went all out with a pretty static html landing page (Disclosure: I work for Stuff).

How did they do? TV3 and Stuff impressed by experimenting with Facebook chats, which let people log in with their Facebook profile to comment inside a widget. It’s basically unmoderated too, so a bit scary, but really the only way to keep up with livestreams without someone moderating full-time. Here’s Stuff‘s, now unfortunately closed off, but it was fun while it was going.

The NZHerald took a different approach, using the Cover It Live software to host a reader chat. Stuff uses these all the time (like yesterday) so I dipped in a few times to see how it was going. Too many Herald staff, not enough readers for my taste, but I like the idea of sitting on your laptop chatting about what you’re watching on TV.

It’s tough to mix talking about the event with coverage of the event. But in this case, since most people were apparently watching it on TV, I think a lot of sites worried too much about the play by play, not enough about providing places for people to talk about what they were watching.

Also the NZHerald’s live stream wasn’t live streaming for most of the event, as far as I could tell, despite being prominently promoted. So plus one but minus two for the NZHerald, I’m afraid.

So did all this whizbangery bring in lots of punters on the big night?

No. No it didn’t.

Traffic to the big NZ news websites was actually down on Friday vs previous Fridays. Here are the lovely numbers from Nielsen/NetRatings, showing domestic unique visitors:
Friday, April 29th:              457,019 | 261,996 | 20,039 | 307,872 | 248,272 | 95,389
Difference from average Friday:  -405.375 | 1231.875 | -237.875 | -1273.63 | -9225.5 | -1691.25

So what gives? My hypothesis is that people were at home, so they watched the “historic” event live on their TVs, instead of tuning in online. If the event had been during work hours, many of these people would have livestreamed it, and the internet in this country would have collapsed like the hearts of so many young women when William said “I do”.

It was an average Friday, with no big news stories to lure people to the sites, so traffic ended up below average when people didn’t come that night, instead choosing to spend the evening in front of the telly. I just hope they didn’t get brain damage from listening to the vapid chatter that bookended the TV3 broadcast of the event. I think I’ve been permanently stupided by it.

But the story doesn’t end there. Here’s Saturday’s domestic unique visitors from Nielsen/NetRatings:
Saturday, April 30th:              388,595 | 215,786 | 17,800 | 236,263 | 186,516 | 82,035
Difference from average Saturday:  6925.125 | 6049 | -3455.5 | 7112.25 | -23447.1 | -918.75

So traffic to some sites was up on Saturday, significantly higher than average. What’s going on?

Well people watched the event on TV, but TV isn’t the internet. It’s inferior in many ways, because all it can send you is the words and pictures. There’s slim to no interactivity.

Most importantly, you can’t look at extra content if you’re interested. TV only has time for a small slice of the video clips available. And photos? Forget about it.

So people tuned out on Friday, then came back in droves after the event was over to watch extra clips, view photo galleries and read the pundits. That’s when the effort and depth of a site really mattered.

Beforehand we see the same kind of thing. Domestic unique visitors on Wednesday and Thursday were up at most sites vs average:
Wednesday:                          500,264 | 288,198 | 19,251 | 324,036 | 259,979 | 94,052
Difference from average Wednesday:  12385.6 | 7653.71 | 2234.71 | 3979.43 | -4312.1 | 4876.71
Thursday:                           498,263 | 286,706 | 19,220 | 326,447 | 262,735 | 100,212
Difference from average Thursday:   17019.1 | 10039.7 | 1280.86 | 9291.86 | 1320 | 10110.1

I wish I could release Stuff’s hourly stats publicly, because they clearly show the dips and peaks that mark when people tuned out and came back. But sadly, duty prevails over neato!-ness.

What’s the lesson in all this for online media? For one, don’t worry so much about live streaming or live blogging events that are on TV outside of work hours. The vast majority of your audience has TVs and prefers to watch there (it’s also cheaper – my livestream of the event Tysoned my data cap).

People are more interested in content that complements what they’re watching, or have watched earlier, on TV – photos in particular.

It’s the extra content before and after an event like this that your readers want. Plan your strategy around this to win big.

What the atomisation of news means for NZ

April 28, 2011

In New Zealand, as in other small countries, web traffic is a bit unusual. If you’re interested in daily news from the country, you have to tap into Stuff, NZ Herald, TVNZ, TV3, Radio NZ, or a similar site. There just aren’t many other sources out there.

That’s why such a large proportion of NZ site traffic is direct through the homepage, vs referred by social media or search engines. The homepages of these sites are destinations where people go to get NZ news.

But there’s a worrying sign on the horizon: more and more people are using other sites and services as news portals, instead of a media site’s homepage. They rely on Facebook, Twitter, RSS feeds or apps that curate and filter news web-wide. The trend is called “atomisation”, since people are consuming news not as a package of stories put together by a site’s editor, but one story from this site, one from another, etc. It’s one step closer to the holy grail of online news: the Daily Me – stories individually selected for my interests and location from every publication on the web.

I’m a big fan of iPad atomiser apps like FlipBoard and Zite, less so of News360 and Newsy, and I rarely use Pulse or other RSS apps anymore. I’m interested in quite specific things – online media, science, maths and NZ news – that no one destination site caters for. These apps give me what I’m after in a format that’s really pleasant to read. Zite is almost scary, the way it finds articles from all over the web that really do interest me.

Atomisation both frightens and fascinates publishers. On one hand, apps like Zite (or the slightly scary Chrome plugin Super Google Reader) strip out ads and so incur the legal wrath of media orgs. On the other, it’s a great way to get your news. Readers love it, and because of that it’s not going away anytime soon.

But there’s another issue. The more readers who atomise your content, the fewer that hit your homepage. After all, that’s really all a homepage is: an aggregation of your site’s best content, as chosen (usually) by humans. It’s a package of news. Atomising it is just using a different package.

But the homepage is the plum in the fruit basket of a media website. It attracts the most ad dollars, it’s the page that reaches most of your readers, and it’s the most important page for branding.

So fewer readers on your homepage means fewer ad dollars. How will publishers cope?

They’ll need to move ad spend away from the homepage, toward more targeted advertising that commands better rates. They need to find out what readers want and deliver packages and ads to suit. They need to find out what advertisers are after and create or buy the technology to deliver it.

But wait, aggregators can do those same things. In fact, they can do them better, because they’re technology companies, usually start-ups. That means they can adopt new tech faster, are happier taking risks, and have no legacy relationships with advertisers (or newspapers) to worry about protecting.

An aggregator also knows who you are. Most ask for your Facebook and Twitter profile, which means they can track every story you read, what your friends are sharing with you, and any other info you’ve made public. Compared to a mass-market site like Stuff or the NZ Herald, which either has no login or offers little incentive to log in, that’s a huge advantage because it lets the apps target ads. Even apps that don’t have your Twitter login can still passively track what you read, because your device ID is unique to your phone or tablet, and that’s a much more accurate identifier than the cookies used to denote a “unique browser”. The device ID persists over time, allowing apps to build up a picture of what you’re interested in.

The rarity of news is no defence against atomisation – in fact, it could work against sites in small countries like NZ, because they can’t compete with the amount of news offered by big overseas competitors.

Sites that service tech-savvy niche verticals are already seeing this. More and more traffic is coming via RSS and Twitter. That’s how their readers consume the news. The homepage isn’t yet an anachronism but is heading that way.

And where you live, well, that’s just another niche vertical.

So NZ publishers are at quite a big disadvantage around atomisation. The question now is where things will stabilise. Will any NZ readers treat news sites as destinations in 10 years’ time, or will everything be atomised?

One thing is clear: atomisation is only going to get more popular. All news sites – but especially those catering to small countries or niche topics – need to embrace it as a core part of their business in years to come.

If they plan now for the changes on the horizon, they’ll be able to adapt when their homepage traffic gives way. If they rely on legal threats and reader inertia, they’ll be caught out and left clinging to another out-dated business model.