Social media listening tools can provide powerful insights when they’re used to find answers to really good actionable questions.
But recently I’ve noticed a trend towards making absolute statements based on such analysis. I highlighted one such case earlier this year in relation to the UK general election. Some people even suggested Twitter could predict the outcome. They were wrong.
The thing is, as much as social data can be powerful and seem vast in scope, you still need to keep a sense of perspective.
It’s been estimated that people speak an average of around 16,000 words per day. With this in mind I thought I’d make a quick estimate of the proportion of people’s conversation in North America and the UK that social media monitoring data represents.
Answer? 0.16 per cent*
And that’s before we get into issues like spam accounts, bias towards power users’ output, questions about whether tweets and posts are truly an authentic reflection of what people think and feel, demographic bias and the online disenfranchised.
I based my estimate on Twitter and Facebook, as they represent the majority of conversation that such tools access. We could add Reddit, blog posts, comments on online articles and YouTube videos, forums etc, and if anyone fancies doing so, be my guest! But I don’t expect you’ll get to a much bigger number.
Particularly as on the other side of the equation we could add to what people say other forms of conversation that aren’t accessible to social listening: emails, messaging apps and collaboration tools like Slack to name a few.
So does this make social listening as an insight tool a waste of time?
No, of course not. I’ve spent enough time buried deep in social data to know that it can provide hugely valuable insights. But to achieve this you need to be extremely focussed.
Ask good questions
Structure questions that take into account the limitations of the data. “Who does Twitter conversation suggest is going to win the UK general election?” does not fall into this category. Also ensure the answer doesn’t lead to a “so what” moment, but provides a genuine basis for action.
Say no to pretty noise
Pretty dashboards that pluck results out of the ether aren’t the answer. Make sure you understand exactly who you’re listening to – who is behind the data. You need this audience perspective to be confident that what you’re seeing is real insight and to address what I call the four (f)laws of social listening.
Challenge unexpected answers
Sometimes social media analysis gives you an answer you didn’t expect, one that differs from your existing world view. It’s crucial you don’t dismiss such answers, as they could be the most valuable insights you’ll ever get. Equally, don’t naively accept them at face value. Challenge them. Try to triangulate the answer from another source, or ask the question in a different way and compare the answers. Sometimes you can be surprised.
* You can see my back of an envelope calc here. The estimated variables are editable in the “Try your own” sheet (highlighted in blue) so you can have a play to work out your own figures. In simple terms we’re comparing:
Talking: c. 422 million people across US, Canada and UK using 16,000 words per day = 6.75 trillion words.
Twitter: c. 137 million tweets per day (assuming N. American and UK users account for 27.5 per cent of the 500 million tweets posted daily), each tweet assumed to average 25 words = 3.4 billion words.
Facebook: c. 707 million posts per day (assuming N. American and UK users account for 16.4 per cent of the 4,320 million posts made daily), each post assumed to average 50 words = 35 billion words. Only 20 per cent of these posts are assumed to be accessible to social listening tools. I have no specific basis for the level of this last assumption, but it is clearly the case that social listening tools can’t access all Facebook data, although Datasift’s PYLON offering provides a potential solution to this privacy issue. Even if you assume all posts are accessible, the result only increases to 0.57 per cent.
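For anyone who’d rather not open the spreadsheet, the calculation above can be sketched in a few lines of Python. All the figures are the rounded estimates quoted in this post (422 million people, 16,000 words, 137 million tweets, 707 million posts, and the assumed 20 per cent Facebook accessibility), so treat the output as approximate:

```python
# Back-of-envelope estimate: what share of daily North American + UK
# conversation do social listening tools actually see?
# All inputs are the rounded estimates from this post, not measured data.

spoken_words = 422e6 * 16_000      # 422M people x 16,000 words/day ≈ 6.75 trillion

twitter_words = 137e6 * 25         # 137M tweets/day x 25 words ≈ 3.4 billion

facebook_words = 707e6 * 50        # 707M posts/day x 50 words ≈ 35 billion
facebook_accessible = facebook_words * 0.20  # assumption: only 20% visible to tools

share = (twitter_words + facebook_accessible) / spoken_words
print(f"Accessible share: {share:.2%}")      # -> 0.16%

# Sensitivity check: if every Facebook post were accessible
share_all = (twitter_words + facebook_words) / spoken_words
print(f"If all posts accessible: {share_all:.2%}")  # -> 0.57%
```

Changing any single input by a factor of two or three still leaves the answer well under one per cent, which is the point: the conclusion is robust to the rough assumptions.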