Market Research + Sentiment Analysis = New Insight

Q & A with Tom Anderson, Anderson Analytics and OdinText


Posted October 24, 2012

Tom Anderson is Founder & Managing Partner at Anderson Analytics - OdinText

Tom is one of four authorities who graciously agreed to respond to a series of questions exploring the role of sentiment analysis, text analytics, and emerging social-intelligence technologies in support of next-generation market research. The context is a market-research focus at the upcoming Sentiment Analysis Symposium, October 30, 2012 in San Francisco. Read Tom's responses to questions from Seth Grimes, then click here for the full set of interviews.

Q1> What impact is sentiment analysis having in the market-research world, whether applied for surveys or to social media?

Tom Anderson> Well, it seems to be very important and popular in social media monitoring, I guess because there is so little other data. If we consider that most 'social' data currently comes from Twitter, which is just 140 characters (minus tags and URLs) plus a time and date stamp, then automated sentiment becomes rather important.

On other types of data, from survey research to call center logs, we often have much richer text accompanied by structured data, so sentiment in these cases often becomes less relevant and frequently simply isn't needed.

Of course there are many other types of unstructured data, and in some cases, like more detailed content analysis, sentiment again can be useful. However, we've usually found in those cases that rather than sentiment (positive/negative), various emotional and/or psychographic text analysis measures can be far more interesting. Arguably this could also be called 'sentiment', but it's more complex than the type I usually see applied in social media monitoring, which is limited to negative, positive, and neutral or some derivation thereof.
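The coarse negative/positive/neutral scoring Tom describes can be sketched as a simple lexicon count. This is an illustrative toy only, not OdinText's or any vendor's actual method, and the word lists are made-up samples:

```python
# Toy polarity classifier of the kind common in social media monitoring.
# The word lists below are tiny illustrative samples, not a real lexicon.
POSITIVE = {"great", "love", "excellent", "good", "happy"}
NEGATIVE = {"bad", "hate", "terrible", "poor", "angry"}

def polarity(text: str) -> str:
    """Label text 'positive', 'negative', or 'neutral' by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Note how little this captures: a tweet labeled "negative" tells you nothing about the emotion, topic, or psychographics behind it, which is the limitation Tom is pointing at.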

Q2> How effective is text analytics at getting at sentiment, and to what extent should researchers continue to rely on human coding, whether by experts or crowd-sourced? Could you describe one or two things you or your clients have learned, via automated text/sentiment analysis, that you wouldn't have discovered otherwise?

Tom> To me, or at least in the work we tend to focus on most, automated sentiment is just one part of the text analytics engine. Comparing human coding to automated coding is really not possible. To understand this you have to really think about the data and the output.

I recall our first validation project for OdinText. I hired a company that focused solely on coding market research survey open-ends. We sent over two very large studies which we were using OdinText to analyze and wanted to see how human coding would compare. Thinking back, I should have known better, as I'm so familiar with that type of data and coding, but at the time I purposefully gave no instructions. I just said, "please code these data as you would for any normal client."

More or less what came back in each case was a table of 50-100 "likes" and 50-100 "dislikes". It looked completely different from the richness that was available via OdinText, and we really couldn't do anything with it. It was so limited. We couldn't run any math on it. It was just what stood out to the analyst and nothing else. This, along with the inconsistency of human coders, is why I'm rather strongly against human coding these days.

I would say that in the vast majority of cases we really no longer need human coders. That's not to say we don't need human analysts who read and understand the data, working iteratively with the software. But technically you need only one highly skilled individual; the days of coding shops are gone, in my opinion.

It's also not to say that, when setting up an important project for a client, you would not want to check, or 'triangulate', for accuracy, something I have previously written about at length. But this is different from using humans to code.

Human coding can be useful on small projects. I wouldn't bother with text analytics (with some exceptions) if I had 300 comments to analyze. But the difference is rather like using an abacus instead of an Excel spreadsheet. Human coding is so much more limited. Once you do text analytics you never go back.

Q3> What factors should companies consider when selecting text and sentiment analysis tools for marketing research?

Tom> For marketing research, or for any other category, domain expertise is critical. I'm not really talking about industry expertise; that is something the client has. I'm talking about the genesis and purpose of the software. If it was designed for social media monitoring, then that's probably what it does best, or all it can do. If it was designed to detect fraud or terrorism, then it's not going to be very useful for understanding and monitoring what certain customer groups think about certain issues (the latter is obviously something text analytics software for market research should do well).

Also, please think about your data. If you want something enterprise-wide that dumps any unstructured data into one engine, then you're probably not looking for what I would call a text analytics solution; you're looking for some sort of intranet search tool. Text analysis of totally disparate sources is likely to be an extremely fruitless boondoggle. For some reason it's still happening, though, especially when purchasing is involved in the decision, partly because they lack the skill set to identify domain-specific value.

If you're a consumer insights professional, you cannot select the best tool for you, PR, accounting, legal, human resources, and your CRM team all at once. The best tools have specificity.

Return to the full set of interviews.
