Social Media Sentiment Monitoring: how accurate is it and should it be automated?
When looking at brands with a large amount of coverage, the sheer volume you have to deal with matters. If you are dealing with very large brands in a fast-moving environment, waiting for sentiment to be assessed by humans could take a long time and you could be overtaken by events. Nickburcher.com sets out why automated social media sentiment monitoring can be useful: http://www.nickburcher.com/2008/10/brandwatch-social-media-sentiment.html
Social media monitoring and sentiment tracking are becoming more important for brands. Anyone can publish anything now, and if a citizen journalist mistakenly declares that your CEO has had a heart attack (Apple’s share price tanked before climbing back up) or a random reporter mistakenly informs Bloomberg that you are going into bankruptcy (United Airlines’ company value dropped by $1bn), then you need to know quickly and respond accordingly.
There are questions worth asking of any automated sentiment report:
– Volume of posts analysed is different at each point, so it is important to consider how this affects results
– What are the sources? Is it across the whole blogosphere (where conservatives tend to be more active) or is it just across Labour-supporting blogs?
– Is there an opinion weighting factor? (Based on links and audience, are some opinions given more credit than others? There is a rough sketch of this idea after the list.)
– Are the news sources UK only or global?
– What is the context of the sentiment? Is it around personality, policy or events?
– Is there one particular thing driving positive / negative sentiment or is this an average across all mentions?
– Competitive context is also very important (and perhaps the most important factor). Is this good or bad vs David Cameron / Nick Clegg etc.?
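To make the weighting question concrete, here is a rough sketch (my own illustration, not how Brandwatch or any other tool actually works) of giving some opinions more credit than others by weighting each mention by the author's audience and inbound links. The field names and the weight formula are assumptions purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Mention:
    sentiment: float    # -1.0 (negative) to +1.0 (positive)
    audience: int       # e.g. followers / monthly readers (assumed metric)
    inbound_links: int  # links pointing at the author's blog (assumed metric)

def weighted_sentiment(mentions: list[Mention]) -> float:
    """Average sentiment where louder voices count for more.

    The weight formula (audience plus a bonus per inbound link) is an
    illustrative assumption; a real tool would tune or replace it.
    """
    total_weight = 0.0
    weighted_sum = 0.0
    for m in mentions:
        weight = m.audience + 50 * m.inbound_links
        weighted_sum += m.sentiment * weight
        total_weight += weight
    return weighted_sum / total_weight if total_weight else 0.0

# A big, well-linked blog's negative post outweighs two small positive tweets.
mentions = [
    Mention(sentiment=-0.8, audience=40_000, inbound_links=120),
    Mention(sentiment=+0.6, audience=300, inbound_links=2),
    Mention(sentiment=+0.9, audience=150, inbound_links=0),
]
print(round(weighted_sentiment(mentions), 2))  # comes out negative overall
```

The design choice to debate is whether an influential blogger's opinion really should count for more than a hundred quieter customers, which is exactly why the question belongs on the list above.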
Amber Naslund said:
We all crave the technology that can automatically tell us whether a post we read and track is positive/negative/neutral (and the holy grail would be something that could make next step recommendations). ………
What I mark as positive:
- Blatant and direct compliments or recommendations, without competitors mentioned. Can include product compliments or positive statements about service and support.
- Posts that contain superlatives in direct reference to our company or product (good, great, awesome)
- Reviews that are clearly complimentary, even if they contain a few improvements we could make
- If the post is a Digg, Stumble, or Delicious (someone found it valuable enough to vote on or bookmark)
- Retweets or links to any of the above posts
- Retweets of our events or publicity (implied endorsement)
- Posts that announce/feature our inclusion in a list, ranking, or otherwise, including along with competitors
- Posts that recommend us alongside competitors
- Inquiries about getting a demo and/or trialing the product (implies good enough impression to ask to see more)
- Retweets or links to any of the above types of posts
What I mark as neutral:
- Any tweets that are company outreach (from our employees). This helps to not sway the snapshot of what our community is saying, for better or worse
- Links to our website with no commentary at all
- Passing mentions of us in conversation unless they meet pos/neg criteria
- Statements like “checking out Radian6” without other commentary
- Factual information about our product/brand without reaction or comment (including retweets)
- Links or retweets to our blog, events, etc. that don’t include commentary.
- Troubleshooting inquiries that are simply technical in nature
What I mark as negative:
- Retweets or links from the community to third-party posts that contain criticism (passive endorsement of the negative content)
- Posts that contain criticisms of our product or service coupled with compliments or positive statements, if the negative seems to outweigh the positive
- Sarcastic comments that allude to a negative experience but without a blatant callout
- Troubleshooting inquiries that include statements of frustration
- Clear criticisms or complaints about our product or service. These are usually pretty obvious.
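Amber's criteria read almost like a rule set, so here is a toy sketch of how a tool might encode a handful of them. The keyword lists, the bookmark-source check and the ordering are my own simplifications for illustration, not Radian6's actual logic.

```python
# A toy rule-based classifier loosely following the criteria above.
# Keywords, sources and ordering are illustrative assumptions only.

POSITIVE_CUES = ("good", "great", "awesome", "recommend", "love")
NEGATIVE_CUES = ("broken", "terrible", "frustrated", "complaint", "horrendous")
BOOKMARK_SOURCES = ("digg", "stumbleupon", "delicious")

def classify(post_text: str, source: str = "", from_employee: bool = False) -> str:
    text = post_text.lower()

    # Company outreach is kept neutral so it doesn't sway the snapshot.
    if from_employee:
        return "neutral"

    # A Digg / Stumble / Delicious vote counts as positive on its own.
    if source.lower() in BOOKMARK_SOURCES:
        return "positive"

    pos = sum(cue in text for cue in POSITIVE_CUES)
    neg = sum(cue in text for cue in NEGATIVE_CUES)

    # Mixed posts: a crude "does the negative outweigh the positive?" count.
    if pos and neg:
        return "negative" if neg > pos else "positive"
    if pos:
        return "positive"
    if neg:
        return "negative"

    # Links, retweets and passing mentions with no commentary fall through here.
    return "neutral"

print(classify("Radian6 is awesome, totally recommend it"))   # positive
print(classify("Frustrated, the export is broken again"))     # negative
print(classify("Checking out Radian6"))                        # neutral
```

Even this toy version shows where the trouble starts: the counting rule for mixed posts has no idea which part of the post actually matters, and it has nothing at all to say about sarcasm.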
The thing is, as a blogger, and from speaking with the bloggers I know, bloggers like to give a balanced review, the positive and the negative. How do you automate deciding what is important in the conversation?
“Let me put it another way: at Monitoring Social Media 09 (MSM09), Giles Palmer's talk was by all accounts exceptional; I only got to see the first 10 mins, which was good. On the other hand, his moustache for charity was horrendous.” Note: look out for an interview with Giles here tomorrow.
Now, if you took this to have both positive and negative sentiment, you would be right. However, who cares what I thought of his moustache? It is his talk that is important. How would an automated sentiment monitoring tool deal with that?
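As a rough illustration of the problem (my own sketch, not any particular vendor's approach), a naive tool that averages clause-level polarity scores the quote as mixed, while one that only counts clauses about the thing you actually care about, the talk rather than the moustache, gets closer to the right answer. The clause scores and the list of relevant terms are hand-picked assumptions.

```python
# Clause-level polarities for the Giles Palmer quote, assigned by hand for
# illustration; a real tool would have to produce something like this itself.
clauses = [
    ("his talk was by all accounts exceptional", +0.9),
    ("I only got to see the first 10 mins which was good", +0.4),
    ("his moustache for charity was horrendous", -0.8),
]

# What the reader actually cares about in this context.
RELEVANT_TERMS = ("talk", "mins", "presentation")

def naive_score(clauses):
    """Average every clause equally -- the quote comes out roughly neutral."""
    return sum(score for _, score in clauses) / len(clauses)

def aspect_aware_score(clauses):
    """Only count clauses about the aspect that matters (the talk)."""
    relevant = [score for text, score in clauses
                if any(term in text for term in RELEVANT_TERMS)]
    return sum(relevant) / len(relevant) if relevant else 0.0

print(round(naive_score(clauses), 2))         # ~0.17: looks mixed / neutral
print(round(aspect_aware_score(clauses), 2))  # ~0.65: clearly positive
```

The hard part, of course, is that the tool, unlike a human reader, has no obvious way of knowing that the moustache is beside the point.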
Do you have experience of automated Social Media Sentiment Monitoring? Do you find it reliable?