Category Archives: Measuring Social Media

After the Tweet: Exploring Twitter Click Through Rate Benchmarking to Measure Engagement

One of the things that struck me when I broke down three months of Zappos tweets is that while it was interesting to see how they used Twitter to engage users, by some measures less than 1% of followers seemed to interact.

Here’s the data point I was talking about – Zappos tweeting about an Ellen video on April 24th: “Fun video that a few employees put together to try to get Zappos.com on the Ellen show – http://bit.ly/zapposellen”. On that day, Zappos had 487,448 followers (based on Twitter Counter – see chart below) and the video has only 4,132 views to date. That would imply roughly a 0.85% click-through or interaction rate.

But we know that denominator is probably high for two reasons:

  1. Followers typically don’t unfollow (unless you tweet bad content or tweet too often), so what percentage of the followers are really actively following? And
  2. how many followers are actually still on Twitter? Churn is reportedly as high as 60% for Twitter – far greater than for Facebook and MySpace.

Twitter Counter also shows that follower growth has been fairly consistent over the last three months, averaging 4,068 new followers per day (no real hockey stick), which makes it reasonable to assume that, after churn, only about 40% of Zappos’s Twitter followers are retained, giving Zappos just under 195,000 “active” followers on April 24th. That takes us to about 2.1% of active followers engaging with a tweet.
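This churn-adjusted calculation can be sketched in a few lines. The 40% retention figure is the assumption discussed above, not a measured number:

```python
# Churn-adjusted engagement for the Zappos/Ellen tweet (April 24th).
followers = 487_448   # followers per Twitter Counter on April 24th
views = 4_132         # video views to date
retention = 0.40      # ASSUMED share of followers still active after churn

raw_rate = views / followers                 # naive rate over all followers
active_followers = followers * retention     # estimated "active" followers
active_rate = views / active_followers       # churn-adjusted rate

print(f"Raw interaction rate:      {raw_rate:.2%}")          # ~0.85%
print(f"Estimated active followers: {active_followers:,.0f}") # ~194,979
print(f"Adjusted interaction rate:  {active_rate:.2%}")       # ~2.12%
```

The same two-line adjustment (followers × retention, then views ÷ active followers) is applied to each row of the table below.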

Some more data points on company posted items via Twitter and the implied engagement:

| Tweeter | Date | Tweet | “Active” Followers | Views | Interaction Rate |
| --- | --- | --- | --- | --- | --- |
| @Zappos | 5/3/09 | Employees made a rap video about the Zappos golf cart. You know, just another day at Zappos offices – http://bit.ly/zgolfcart | 215,600 | 2,607 | 1.21% |
| @Zappos | 5/20/09 | http://twitpic.com/5khqc – Headshaving day at Zappos! Employees shaving each others’ heads. I will be completely bald later! | 249,108 | 4,805 | 1.93% |
| @JetBlue | 5/21/09 | Boarding our inaugural flt to Montego Bay, Jamaica. Everyone’s excited to get to the white sand beaches! http://yfrog.com/15p1nj | 217,597 | 30 | 0.01% |
| @SouthwestAir | 6/2/09 | Picking shirt designs for LGA launch! http://twitpic.com/6gulu | 12,426 | 386 | 3.11% |
| @SouthwestAir | 5/29/09 | Yet another video shoot! This time no puppies or kitties! http://twitpic.com/66x74 | 12,008 | 314 | 2.61% |
| @ZyngaPoker | 5/27/09 | Why do we love Zynga Poker!? Check out what some of us on the team have to say about it! http://tinyurl.com/p98nad | 13,000* | 2,190 | 16.85% |
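The interaction rates in the table can be reproduced with a simple loop over (handle, active followers, views); ZyngaPoker’s follower count is the unadjusted figure, for the reasons discussed below:

```python
# (handle, "active" followers, views) taken from the table above
rows = [
    ("@Zappos 5/3/09",        215_600,  2_607),
    ("@Zappos 5/20/09",       249_108,  4_805),
    ("@JetBlue 5/21/09",      217_597,     30),
    ("@SouthwestAir 6/2/09",   12_426,    386),
    ("@SouthwestAir 5/29/09",  12_008,    314),
    ("@ZyngaPoker 5/27/09",    13_000,  2_190),  # no churn adjustment applied
]

for handle, followers, views in rows:
    # interaction rate = views / "active" followers
    print(f"{handle:24} {views / followers:6.2%}")
```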

For the most part it looks like 1-3% of the “active” followers actually click through to view the content (pictures or videos). Of course there are a lot of caveats: the items may be promoted through other channels; the content of some 140-character tweets may be more enticing than others; we are looking at videos and photos, not offers or deep links into the company’s website; and we’ve applied an across-the-board churn rate that likely differs across brands. That said, two things stand out: the near-zero 0.01% for JetBlue and the nearly 17% for ZyngaPoker.

Zappos, SouthwestAir and JetBlue have all had their accounts for about two years, but their posting frequency differs: since the accounts’ inception, Southwest has posted about 100 times per month, Zappos 70 and JetBlue a scant 34. Additionally, JetBlue’s tweets focus largely on travel tips, some alerts and occasional product mentions, whereas SouthwestAir lets more personality and peppy attitude come through – they’re the ones you feel more of a connection to and are more likely to click through to see what they are doing.

On the other end of the spectrum is ZyngaPoker, the Twitter account for users addicted to Zynga’s Texas Hold’em application on Facebook, MySpace, iPhone and other platforms. This account is brand-spanking new (there’s no history on either Twitter Counter or Twitterholic) and has posted only 15 tweets since coming online May 20th. So in this case I did NOT reduce the number of active users by the 60% churn (I consider them all active), and I also verified that there wasn’t co-promotion on the Zynga application fan page on Facebook (there wasn’t).

Interestingly, the quick jump to over 14,000 followers in a couple of weeks might be a good proxy for understanding the Facebook–Twitter intersection. According to Developer Analytics, the Zynga application has 2.46 million DAILY active users and the fan page has 1.46 million fans. Estimating the number of users who saw the promotion on Facebook requires a similar exercise: account for churn among those 1.46 million fans (40% according to Nielsen) and estimate what percentage opted in to receive fan-page alerts in their news feed (let’s assume 90%), which brings us to a potential 788,400 people who might have seen the promotion of the Twitter account. With 14,516 followers, that suggests 1.84% of the fans who saw the message on Facebook had (or established) a Twitter account and began following Zynga. Again, lots of caveats here, but the range of engagement is similar to that coming from tweets.
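The funnel arithmetic above can be laid out explicitly. The 40% churn and 90% feed opt-in are the assumptions stated in the text, not measured values:

```python
# Estimating how many Facebook fans likely saw the Twitter promotion.
fans = 1_460_000          # fans of the Zynga fan page (Developer Analytics)
fan_retention = 0.60      # 1 - 40% churn (Nielsen figure, applied as-is)
feed_opt_in = 0.90        # ASSUMED share receiving fan-page alerts in feed
twitter_followers = 14_516

reached = fans * fan_retention * feed_opt_in   # potential audience: 788,400
conversion = twitter_followers / reached       # Facebook-to-Twitter rate

print(f"Potential audience:             {reached:,.0f}")
print(f"Facebook-to-Twitter conversion: {conversion:.2%}")  # ~1.84%
```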

I realize this is a very small sampling and there are a lot of variables here, but I believe this provides a good starting point for benchmarking on-going active engagement metrics from social media campaigns. With these sort of benchmarks in place (ideally automated by social media campaign tools), marketers will be able to dive beyond gathering a number of followers or fans and start assessing, prioritizing and optimizing social media tactics to drive lifetime customer value.

Different Input, Different Results: Is it Real-Time User Sentiment without Facebook?

While I was blogging about the huge negative sentiment in the number of posts for Adam Lambert just before the American Idol results were announced, another company, using a much better methodology, was showing a much smaller backlash in the making. Crimson Hexagon was featured on CNN (link to video) and showed true sentiment for the contestants – a lead for Adam Lambert over Kris Allen on the positive side, and a scant 3% negative impression for Adam Lambert, with no real backlash at all for Kris.

Why did I, using a very rudimentary love/hate modifier with Vitrue’s Social Media Index, see a huge backlash in relative terms for Adam vs Kris while Crimson Hexagon saw a much smaller difference?

For one, my modifiers were rudimentary – there are many more ways to express whether you are strongly for, for, against or strongly against something. My sample was only a very small slice of the posts out there, and the data in the Vitrue Social Media Index (SMI) isn’t designed to show positives and negatives.
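To illustrate just how rudimentary a love/hate modifier is, here is a sketch of that kind of keyword bucketing. The keyword lists and sample posts are hypothetical, not the actual terms I used:

```python
import re

# A deliberately crude sentiment tagger: bucket a post as positive or
# negative based on whether it contains any "love"- or "hate"-type words.
# Keyword lists are hypothetical examples.
POSITIVE = {"love", "awesome", "amazing"}
NEGATIVE = {"hate", "robbed", "awful"}

def tag(post: str) -> str:
    words = set(re.findall(r"[a-z']+", post.lower()))
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

posts = [
    "I love Adam Lambert",
    "Kris was robbed? No, Adam was!",
    "Watching the finale tonight",
]
print([tag(p) for p in posts])  # ['positive', 'negative', 'neutral']
```

Two word buckets miss negation, sarcasm and intensity entirely, which is one reason a crude modifier and a trained categorization approach can disagree so sharply on the same event.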

But what the SMI does appear to have, that Crimson Hexagon does not, is Facebook data.

I’m a big fan of Crimson Hexagon’s methodology for defining user sentiment (see why I think automating post sentiment is such a critical component of managing social media). But without Facebook status and wall feeds as part of its data input, I believe it misses some critical understanding of real-time sentiment among the masses.

Yes, Twitter is part of the micro-blogging component of real-time sentiment, but I don’t think users are as addicted or connected to the medium as they are to Facebook. In general, I believe you say things among your friends (e.g. within your Facebook circle, inside the walled garden) that you may not want to broadcast from the top of a hill for all to hear (e.g. on Twitter). I would also argue that for the great wave of newbies, Twitter is much more about gathering and reading information (its founders have described it as more of an info-dissemination tool) than about truly interacting with other users.

Thus to really get a feel for real-time sentiment, Facebook is a critical component. I believe the lack of Facebook feeds contributed to Crimson Hexagon’s negative measures being more muted than what I saw with the Vitrue SMI, and it illustrates why marketers need to understand the underlying inputs of any system or technology they choose to leverage for viewing real-time sentiment.

Automating Classification of User Sentiment is the Key to Unlocking Social Media

Rush Limbaugh has 33% more buzz online than Jay Leno, according to the Vitrue Social Media Index. Social-game company Zynga is tweeted about by posters with three times as many followers as its competitor Playfish, according to the last 30 days of data from Radian6. But none of it really means anything unless you know what percentage of that buzz, or of those tweets, is positive or negative.

Sentiment is the guiding light that helps marketers and their organizations know what to do with the mountain of social media growing exponentially through Facebook and Twitter. We need to understand our current positive-to-negative ratios, drill down to uncover what’s driving them, then put programs in place to mitigate the negative and foster growth of the positive. Just as we determine the ROI on capital improvements, with an accurate measurement of positive and negative sentiment we can measure the return on investing in programs to improve consumer sentiment about our brands.

So the vision is great, but the reality of going through thousands of posts and tweets to determine sentiment is the biggest impediment keeping marketers and their organizations from moving forward. Based on a conversation with Radian6, the industry-average accuracy in automatically categorizing the sentiment of user posts is about 60%.

Radian6 is a great workflow tool, but today it only offers users the ability to hand-code the sentiment of each post. Its goal is to introduce automated sentiment attribution into the Radian6 dashboard this summer, targeting 70-75% accuracy.

Crimson Hexagon has shown, through research by co-founder and Harvard professor Gary King, that its approach to automatically categorizing the percentage of posts in each sentiment category is more accurate than hand-coding or strictly counting words. “Crimson Hexagon doesn’t count words, which can mislead; it amplifies human judgment to give the percentage in each category accurately,” noted King in a recent tweet. One of King’s colleagues noted that Crimson Hexagon reaches roughly 80-88% accuracy for positives and negatives with this approach.

Very few marketers and organizations will spend the resources to hand-code responses (in fact, King’s research suggests that one shouldn’t, and that returns typically diminish after about 500 hand-coded posts). We will rely more and more on automated tools to do this work for us. Thus when the time comes to pick vendors, agencies and tools to help us measure sentiment, it is critical to understand the underlying data, methodology and resulting accuracy rates.

To date Crimson Hexagon’s methodology seems to provide the most promise. What other tools are you using to identify positive and negative sentiment of your brand and what is their underlying methodology?