Klout vs PeerIndex: An Experiment

Photo by Paul Clarke

Over the past several weeks I’ve been conducting an experiment. Ethically I probably ought to have mentioned this to my twitter followers but I didn’t, as I rather suspect no-one was harmed in the experiment in any lasting way. But I did it as a response to several client queries (and social media practitioner speculation) about the value of services like Klout and PeerIndex in measuring individuals’ authority online. Actually what first triggered the idea was a challenge I never had any intention of meeting, but which intrigued me to investigate further. It was back in February at the Social Media Week Like Minds panel on social influence that PeerIndex founder Azeem Azhar threw out the challenge for anyone to manually raise their score to 80 by the end of March (or maybe it was the end of June, I don’t remember exactly).

The very idea of seeking to increase one’s ranking on any kind of list just seems twee to me. But I liked the idea of manually comparing the differences between what Klout measures and what PeerIndex measures. This seemed to me a good way of informing myself (and my clients) about what was actually being measured and what that would mean in terms of communication strategies, public relations and business process improvement.

To be honest the last few months have been spectacularly busy for me so I haven’t had a chance to explore the issue very far. But it has been bubbling along in the back of my mind. And a few weeks back I decided to act in a small way to start testing.

Theory

Now neither Klout nor PeerIndex has specified precisely what contributes to your score. They give rather generic (almost Barnum/Forer effect) advice on how to improve your ranking, but they don’t specify what’s actually happening under the hood. So the only way I could test what influences ranking has been through trial and error.

Of course, I started with theories. I looked at the feeds of people who were ranked highly at the start of the experiment, and I looked at those who were rated perhaps 20 points lower. Besides frequency of content creation, I noticed that those with a higher Klout score seemed to be those who shared a lot of links to content. The same thing seemed to be happening on PeerIndex. I theorized that sharing linked content was treated as value creation, and so as a basis for boosting perceived authority. For my experiment, I figured this was as good a place to start as any.

As a result, about 6 weeks ago I began to do what I call ‘locking and loading’. This involves doing a small amount of research on the RSS feeds of sites I trust, and using Google News to search for recent news items on keywords of interest. I then take between 6 and 12 news items of interest, load them as tweets into Hootsuite, and let them seep out over the course of a working day on a schedule. I respond to comments as they come in throughout the day, but essentially I stop doing research early in the morning and let the tweet scheduling do the ‘hard work’.
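For anyone curious what that looks like in practice, here is a rough sketch of the workflow in Python. The feed URLs, keywords, posting times and the CSV layout for bulk scheduling are all placeholders invented for illustration; Hootsuite’s actual bulk-upload format may differ, and nothing here is specific to Klout or PeerIndex.

```python
# A rough sketch of the 'locking and loading' workflow described above.
# Feed URLs, keywords, timings and the CSV layout are assumptions, not
# details from the post or from Hootsuite's documentation.
import csv
import feedparser  # pip install feedparser
from datetime import datetime, timedelta

FEEDS = [
    "https://example.com/trusted-site/feed",   # placeholder RSS feeds
    "https://example.org/industry-news/rss",
]
KEYWORDS = ["social media", "public relations"]  # placeholder keywords


def gather_items(feeds, keywords, limit=12):
    """Collect up to `limit` recent items whose titles match a keyword."""
    items = []
    for url in feeds:
        for entry in feedparser.parse(url).entries:
            title = entry.get("title", "")
            if any(k.lower() in title.lower() for k in keywords):
                items.append((title, entry.get("link", "")))
    return items[:limit]


def write_schedule_csv(items, path="scheduled_tweets.csv",
                       start_hour=9, gap_minutes=45):
    """Write a bulk-upload CSV: one tweet every `gap_minutes`, starting at `start_hour`."""
    start = datetime.now().replace(hour=start_hour, minute=0,
                                   second=0, microsecond=0)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i, (title, link) in enumerate(items):
            when = start + timedelta(minutes=i * gap_minutes)
            writer.writerow([when.strftime("%d/%m/%Y %H:%M"), title[:100], link])


if __name__ == "__main__":
    write_schedule_csv(gather_items(FEEDS, KEYWORDS))
```

The point of the sketch is simply that the daily effort is front-loaded: a few minutes of curation in the morning, then the scheduler drips the links out while I get on with other work.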

Result

The results of the experiment have been telling, not least because I’ve actually been too busy to do much blogging over this period. This is an important point and will become clear later.

Let’s start with Klout. Over the past 8 weeks my Klout score has risen by 5 points, 3 of them in the last 30 days. The only period where my Klout score dropped was when I was teaching in India and didn’t have much of an internet connection, so I couldn’t share much in the way of links.

JJ Klout score

Now let’s look at PeerIndex. When Azeem first put out the challenge, my PeerIndex score was 61 (see top photo in this post). It’s now down to 58, and importantly the ‘Authority’ element in PeerIndex is down to 55.

JJ PeerIndex score

While PeerIndex still notes that I’m more likely to tweet my own website as a source, I haven’t been blogging much lately, so I haven’t been tweeting my own website’s content. I have, however, been tweeting content from a broad range of sources. And of course, PeerIndex scores activity over an extended period (4 months), so it’s possible that the recent activity Klout has already registered will show up in PeerIndex in 3 months’ time.

Conclusions

Of course, any conclusions I reach from this experiment are seriously limited, not only by the haphazard methodology and the limited data generated by testing only my own account, but also by the fact that I can only speculate about possible causes for changes to scores. However, I still think it’s worth reporting here, as it may act as a means of crowd-sourcing others’ experiences of influence scoring services and further testing the factors that contribute to influence.

So this is what I speculate about the two dominant influence scoring facilities:

1. Klout operates on daily influence, whereas PeerIndex scores are calculated over extended periods. If you’re tracking an account for corporate purposes, you may want to see how a user operates over both short-term and long-term periods. But it does make PeerIndex a difficult prospect for tracking short-term campaigns (probably a good thing).

2. Klout score seems to be influenced by the ‘quality’ of the content you’re sharing, as defined by its likelihood of being retweeted or commented upon by others. The source of that shared content doesn’t seem to affect that ‘quality’.

3. PeerIndex seems to tie authority to the amplification of content shared from a few key sources. So if you are sharing a lot of content from a single source and that content gets commented upon or retweeted, this may raise your authority.

4. I suspect that PeerIndex ranks the sources of shared content as much as individuals. So if you share content from sources regarded as reputable, your PeerIndex score is more likely to rise over time, while too much tweeting from a source that is not highly regarded may in fact reduce it. Sharing content from a wide range of sources may also negatively affect scores.

Other factors may have affected the reduction in the PeerIndex score – irregularity of tweets, increased numbers of PeerIndex users (normalising scores somewhat), lack of conversation, and so on. But I suspect that the source of content posted on twitter has more of an influence in PeerIndex than it does in Klout.
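To make that contrast concrete, here is a toy model of the speculation above. It is emphatically not the real Klout or PeerIndex algorithm (neither is published); it simply contrasts a short-window, source-agnostic engagement score with a longer-window, source-reputation-weighted one, with all the weights and windows invented for illustration.

```python
# A toy illustration of the speculation in this post -- NOT the real
# Klout or PeerIndex formulas. It contrasts a short-window score that
# ignores sources with a long-window score weighted by source reputation.
from datetime import datetime, timedelta

# Each tweet: (timestamp, source_domain, retweets + comments it attracted)
Tweet = tuple[datetime, str, int]

# Hypothetical reputation weights for content sources (pure invention).
SOURCE_REPUTATION = {"bbc.co.uk": 1.0, "myownblog.example": 0.6}
DEFAULT_REPUTATION = 0.3


def klout_like(tweets: list[Tweet], now: datetime, window_days: int = 30) -> float:
    """Average engagement per tweet over a short window, ignoring sources."""
    recent = [t for t in tweets if now - t[0] <= timedelta(days=window_days)]
    return sum(t[2] for t in recent) / max(len(recent), 1)


def peerindex_like(tweets: list[Tweet], now: datetime, window_days: int = 120) -> float:
    """Engagement weighted by source reputation, averaged over a longer window."""
    recent = [t for t in tweets if now - t[0] <= timedelta(days=window_days)]
    weighted = sum(t[2] * SOURCE_REPUTATION.get(t[1], DEFAULT_REPUTATION)
                   for t in recent)
    return weighted / max(len(recent), 1)
```

Under a model like this, a burst of well-retweeted links would move the short-window score quickly regardless of where the links came from, while the long-window, source-weighted score would respond slowly and depend on which sites you were amplifying. That pattern is consistent with what I observed, but the model itself is only a guess.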

I plan to continue this experiment, and will report again in a couple of months. But at this stage there’s much food for thought on what is being measured.

Personally, I’m still not convinced that these scoring systems are particularly meaningful for business purposes. But at least now when people ask me what they measure, I have some ideas about the differences between the systems.
