Each quarter Snoo.ws tackles a new series topic to provide our audience with insight, intelligence, examples and ideas. This month is Monitoring Month where we will cover everything from basic monitoring 101 to the top monitoring tools to monitoring’s role in your overall strategy. So, stay tuned – there’s plenty to come.
Messing up is a part of learning. In general, mistakes are bound to happen from time to time. While aiming for perfection might be too lofty a goal, constantly making errors isn't a great business model either. Learning from the mistakes others have already made is a great way to ramp up your own abilities. This article aims to help you educate yourself on some of the more commonly committed monitoring oopsies.
Generally speaking, you need to have the right tools to monitor. Have a plan going into it: what keywords will you use? What are you trying to gain by monitoring? What platforms will you be using and do you know how to properly use them?
I found a great white paper online, "Listening 2.0," from Converseon, available for free download. It's an excellent resource that served as inspiration for this article, so go check it out here on your own! A white paper from Canada Newswire helped too, and finally I drew on this article from Listen Completely – all in conjunction with my own knowledge and experience (just want to give credit where credit is due, right up front).
Poor Choice of Keywords
Sometimes a search keyword is too vague. Another mistake that's easily made is choosing a keyword that could be connected to something else – another company, a different product, or just part of casual conversation. Sorting through the masses of hits you get from monitoring can be really overwhelming, and we often look at the total number of mentions as an indicator of a term's popularity online. If your keyword attracts all kinds of mentions that aren't focused on your company, that total could be misleading.
Using a truly unique keyword can therefore vastly improve the quality of your monitoring. It's also important to take the context of each mention into consideration, and to think about how your customers actually refer to your brand. If your product has an official name but customers often use a nickname or a misspelled version, that's worth tracking too – like "Coke" instead of "Coca-Cola," or "McD's" and "Micky D's" instead of "McDonald's."
Most tools let you adjust the specificity of your search terms. A search that's too simplistic can produce an overabundance of results, catching far too many irrelevant mentions. Likewise, a search that's too detailed and too specific can omit a lot of useful information.
I think it's worthwhile to do some test searches and get a feel for what kinds of queries yield what kinds of results. The Converseon paper words it well: "someone on your team must learn to create advanced Boolean queries." Boolean search is based on logic rather than math: operators like AND, OR, and NOT determine whether each piece of content matches (true) or doesn't (false). It's widely considered the most reliable method of data searching, and most monitoring platforms use a Boolean basis for how content is pulled.
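To make that concrete, here's what an advanced Boolean query might look like for the soda example above. The exact syntax varies from platform to platform, so treat this as a hypothetical sketch rather than something you can paste into any particular tool:

```
("Coca-Cola" OR "Coke" OR "CocaCola")
AND (drink OR soda OR bottle OR ad OR commercial)
NOT ("coke oven" OR "coke fuel")
```

The first group catches the official name plus common variants, the AND group narrows results to beverage-related conversation, and the NOT group excludes the industrial meaning of "coke" – exactly the kind of context problem a too-simple keyword runs into.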
Inaccurate Sentiment Analysis
Did you read the exclusive interview I had with Tim Shier of BrandsEye? In chatting with Tim I learned so much. BrandsEye is a super innovative company with a lot going on in the world of monitoring; they're totally revolutionizing the way brands are able to listen to the web. They crowdsource the sentiment analysis of the mentions they pull via automated searches. Why? Because robots and computers can recognize words, but not always the context of what's been said. Slang, regional uses of a word or phrase, sarcasm, jokes, satire, misunderstanding – all of these are ways software-based monitoring can give you the wrong idea regarding sentiment.
You need to spend some time assessing samples and finding platforms you can trust. When someone talks about your brand in a sardonic manner, you don't want to count that as a good, positive mention. It's misleading and can skew your results.
Spambots may use one of your keywords, and even though what they're posting sounds good and seems positive, it's just spam and doesn't really reflect what actual consumers think about your brand. "Existing automated sentiment solutions yield 60% sentiment accuracy at best, despite vendor claims to the contrary" (Listening 2.0, Converseon white paper, 2010). The article on Listen Completely offered similar stats: "Most, on average, seem stuck in the 60-70% accuracy range through Boolean keyword matching. Others promise closer to 80-90%, but these often require natural language processing (NLP) technology that doesn't yet exist in most SMM products."
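To see why keyword-based sentiment scoring stumbles on sarcasm, here's a minimal sketch of the naive approach. The word lists and the example mention are purely illustrative, not drawn from any real monitoring product:

```python
# Tiny illustrative word lists; real tools use much larger lexicons.
POSITIVE = {"great", "love", "awesome", "perfect"}
NEGATIVE = {"broken", "hate", "terrible", "refund"}

def naive_sentiment(mention: str) -> str:
    """Score a mention by counting positive vs. negative keywords."""
    words = {w.strip(".,!?").lower() for w in mention.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# A sarcastic complaint: keyword matching sees "great" and labels it
# positive, even though a human reader would flag it as negative.
print(naive_sentiment("Great, my phone died again after one day. Just great!"))
```

The sarcastic mention comes back "positive" because the word "great" appears, which is exactly the kind of skew that makes human review of samples worth the effort.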
Don’t Monitor in a Vacuum
You have to take everything you discover in the monitoring data and apply it to the real world. What happened that could be influencing a spike in mentions? Did you just release a product? Was your brand mentioned in the press? Did a blogger bring up your name and now others are just replying? Getting raw data is excellent, but you have to be able to see how the pieces fit into the whole puzzle. Listen Completely asks, "Can you correlate sentiment trends with other activity?"
Likewise, it's important not to use any one monitoring tool in a vacuum. I love using multiple tools to perform a search and then comparing the results to get a fuller, more accurate picture of what's really being said. It's so easy to get tunnel vision, to focus in on one thing at a time – but being a successful listener is all about balance and fairness.
During a crisis or during a peak period of business, it’s easy to get caught up in what’s right in front of you. Crisis management is super important, but what about customers who weren’t exposed to the issue – are they still happy? Are people only talking about you because you launched a new version of your product? It’s a great idea to compare data sets to previous searches. Try and keep a progression of results so that you’re familiar with trends, and you’re more readily able to notice something odd, new, or uncommon.
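One simple way to keep that progression of results working for you is to compare the current period's mention count against a baseline of previous periods. This is a hypothetical sketch – the counts and the 2x threshold are made up for illustration, not pulled from any real data set:

```python
def flag_spike(history, current, threshold=2.0):
    """Return True if `current` exceeds `threshold` times the
    average of previous periods' mention counts."""
    baseline = sum(history) / len(history)
    return current > threshold * baseline

# Illustrative weekly mention counts from previous searches.
weekly_mentions = [120, 135, 110, 128]

print(flag_spike(weekly_mentions, 310))  # an unusual spike worth investigating
print(flag_spike(weekly_mentions, 140))  # within the normal range
```

A flagged spike doesn't tell you *why* mentions jumped – that's where the real-world context questions above come in – but it tells you when to start asking.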
Of course, there are many, many mistakes that can be made in this type of work. Social media itself is rather new, and all of us professionals working with social platforms, data harvesting, and digital marketing – we're still learning. Naming every possible error would take days. But hopefully I've pointed out some of the bigger, more commonly made mistakes so that you can better prepare your own monitoring activities and try your best to avoid them. At the very least, when you're assessing your data, you'll know to look out for certain trends or explanations for wacky results.