... or why you should take Twitter lists with a pinch of salt. There's nothing a geek likes more than a good list, and as Twitter is full of geeks, there's nothing we geeks like more than a good list about Twitter. It's pretty common to see lists of top Twitterers on certain topics or locations.
Of course, the lists can also provide a useful guide to who's who and who's getting it right, especially where brands are concerned, particularly as more and more companies realise it's worth being on Twitter.
Earlier today Brand Republic released a list of the most mentioned brands on Twitter. It was interesting stuff and looked like a pretty comprehensive list of who was getting Twitter right.
Except it wasn't. It was a useful snapshot, but shouldn't be viewed as the be-all and end-all, as there were more than a few flaws.
A quick disclosure at this point, as the following may sound like sour grapes on my part. The company I work for, ITV, wasn't on the list, whereas the BBC and Channel 4 (3rd and 28th respectively) were.
This struck me as slightly odd. We've been on Twitter for over a year now (unlike many of the brands in the list), and have 4,778 followers. This is more than Amazon, Ford and eBay, all of whom appear in the top 15 (of course, followers don't necessarily equal mentions).
What's more, I know ITV gets between 50 and 100 mentions on a quiet day because I have assorted Tweet Beep alerts set up. Even allowing for a very quiet few days, I'd comfortably expect us to be above Dulux on 208 mentions.
Again, at the risk of sounding like a sulky teenager who realises there's a party they're not invited to, it does seem there are some serious flaws in this research. For a start, there's no sign of Facebook anywhere on the list, which is an even more surprising omission than ITV.
First of all, there's no word on what the methodology is, so it's difficult to work out how Jam, the agency that carried out the research, decided who to monitor and who to leave out. What qualifies as a brand and what doesn't?
Also, there are thousands of brands out there, so it would be useful to know the scope of research and monitoring. Were they just given 100 brands to monitor? 200? What were the parameters? There's a wide and varied range of companies on the list, so it's safe to assume the scope was pretty wide.
Then there's the way the brands were monitored - over three days in April this year. This is also problematic. The short timescale and lack of repetition increase the likelihood that a fluctuation in Twitter mentions for a brand - really just an anomaly - ends up in the Top 100.
For example, at the height of the Swine Flu panic, you'd expect Tamiflu to pick up quite a few mentions. If you're including Chelsea FC as a brand (which I would), they'd trend very highly this week. When Woolworths went into administration, mentions alone would probably have placed it in the top ten.
The research doesn't allow for rinsing out these random results. If the timescale were longer - say three months rather than three days - you'd probably get a more accurate picture of which brands were mentioned the most. Or you could repeat the three-day monitoring over, say, three weeks and see which brands consistently trended higher. The point is, a brand that finds itself in the news - unexpectedly or otherwise - will probably make it onto this list.
These are the main flaws, although - and this probably goes beyond the rather narrow parameters of the research commissioned - the list itself is probably more useful to the brands not on Twitter than those who already are. Mentions alone don't tell you much about how a brand engages on Twitter.
Sure, they may get plenty of mentions, but is the brand passive or active? It's also impossible to tell whether the mentions are good or bad. For example, Gmail had a brief hiccup earlier today. It would probably have caused a significant spike in mentions of Google, which would a) most likely be negative and b) be beyond Google's control on Twitter.
Again, I'm well aware this sounds like moaning - and, yes, that does somewhat colour it. But it ties into a more general problem I have with these kinds of lists.
Brand Republic's Top 100 is useful as a snapshot, providing we accept the flaws. It may also provide the catalyst for some slightly sounder, more detailed research. But it's also slightly misleading.
The list itself doesn't mention the three-day limit until right at the end, and below an advert. It would be easy enough for people to look at the list, see ITV aren't on there and assume we're doing nothing on Twitter compared with the BBC and Channel 4, which then gives our online reputation a bit of a dent.
There's nothing wrong with these types of lists - they're interesting, useful and generate a good amount of discussion both within and outside the brand. But if there's no preamble to place them in context, there's a danger they could be taken the wrong way.
It also touches on a pet grumble of mine - badly designed surveys and data collection. I'm a bit of a stats geek and number cruncher, and I firmly believe that if you're going to do research, you should at least open up your methodology and let the rest of us poke around for holes and flaws.
Ok, so it's not exactly hard science, but there's still science in there and if you give the research a good going over, you can either make it stronger or disprove it.
Which is a somewhat lengthy way of saying there's potential for some significant objective research into brands on Twitter (it would be tricky, but there's no reason, with the right design, why it couldn't be done) - as opposed to a list like this, which is interesting but not very useful as a piece of research.
And even then, I'm convinced I've seen Twitter accounts for a few of the brands on the list that aren't meant to have a Twitter presence.