Wednesday, May 20, 2015

Making Sense of Customer Words

Image courtesy of Pierre Metivier
I originally wrote today's post for Confirmit in May 2013. I've made some modifications.

How do you make sense of your customers' words?

Not only are there a ton of different customer listening posts these days, but the types of customer data are equally varied and voluminous. Data come in all shapes and sizes: structured, unstructured, solicited, unsolicited…oh my! A lot is written about survey data and analyzing structured quantitative data, but let’s take a look at unstructured data.

What is unstructured data?

According to Wikipedia, unstructured data is: information that either does not have a pre-defined data model or is not organized in a pre-defined manner. Unstructured information is typically text-heavy, but may contain data such as dates, numbers, and facts as well. This results in irregularities and ambiguities that make it difficult to understand using traditional programs as compared to data stored in fielded form in databases or annotated (semantically tagged) in documents.

Techopedia puts it into simpler terms: Unstructured data represents any data that does not have a recognizable structure. It is unorganized and raw and can be non-textual or textual.

We know this much: unstructured data comes from a variety of sources, e.g., customer feedback (surveys, etc.), employee feedback about their own experience or about the customer experience, call center interactions, account manager conversations, blogs, tweets, shares, online reviews, medical records, books, and more.
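
To make the distinction concrete, here's a small, purely illustrative sketch (the field names and the comment are made up): the structured record slots neatly into predefined fields, while the verbatim does not.

```python
# Purely illustrative: a structured survey record vs. an unstructured verbatim.
# Field names and the comment text are hypothetical.

# Structured: every value sits in a predefined, typed field.
structured_response = {
    "respondent_id": 1042,
    "survey_date": "2015-05-20",
    "nps_score": 9,               # 0-10 scale
    "wait_time_minutes": 12,
}

# Unstructured: free-form text with no predefined data model;
# meaning has to be extracted before it can be analyzed.
unstructured_comment = (
    "The rep was friendly, but I waited twelve minutes on hold "
    "and had to repeat my account number three times."
)
```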

You have a ton of great data from, and about, your customers, but how do you make sense of it all? How do you glean insights from all of the unstructured data that you’ve amassed?

The answer: get yourself a great text mining or text analytics tool. At their simplest, text analytics tools turn your qualitative data into quantitative data, thereby allowing you to use that data for cross-tabbing, filtering, and a variety of other analytical approaches. Text analytics tools are not a manual approach to making sense of the data; they take a machine approach to categorizing comments and identifying the sentiment of customer comments and other unstructured text.

I think it's pretty fair to say that I’ve simplified the definition and that there’s much more to it than that.
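
To give a rough sense of what that simplified view looks like in practice, here is a deliberately naive Python sketch. Real text analytics tools rely on trained linguistic models; the categories, keyword lists, and comments below are made up purely for illustration.

```python
# A deliberately naive sketch of comment categorization and sentiment scoring.
# Real text analytics tools use trained linguistic models; the categories,
# keywords, and comments here are hypothetical.

CATEGORIES = {
    "wait_time": ["wait", "hold", "queue", "slow"],
    "staff": ["rep", "agent", "friendly", "rude"],
    "billing": ["invoice", "charge", "refund", "bill"],
}

POSITIVE = {"great", "friendly", "helpful", "easy", "love"}
NEGATIVE = {"slow", "rude", "waited", "frustrating", "never"}

def categorize(comment: str) -> list[str]:
    """Return every category whose keywords appear in the comment."""
    words = comment.lower().split()
    return [cat for cat, keys in CATEGORIES.items()
            if any(k in words for k in keys)]

def sentiment(comment: str) -> int:
    """Crude sentiment: count of positive words minus negative words."""
    words = set(comment.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

comments = [
    "The rep was friendly but I waited forever on hold",
    "Billing charged me twice and the refund was slow",
]

for c in comments:
    print(categorize(c), sentiment(c), c)
```

Even this toy version produces something you can count, filter, and trend, which is exactly the point of the qualitative-to-quantitative step.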

Other than the obvious "making sense of something that doesn't make sense" reason, why else use text analysis tools?
  1. You can shorten your surveys by asking open-ended questions, knowing that you’ll have some systematic (and not manual) way to transform and analyze the data.
  2. In exchange for the shorter survey, you get more robust feedback in the respondent’s own words, rather than in words that you selected.
  3. Once open-ended data is categorized, it can then be used for deeper analysis alongside your existing quantitative data (see the cross-tab sketch below).
  4. When you’re analyzing call center or social media conversations, for example, you may identify current or emerging issues long before they would have ever been uncovered otherwise.
  5. Most importantly, follow-up, open-ended survey questions are necessary to understand why something happened and to hear, in the customer's own voice, what would make the experience better. We need to continue to ask these open-ended questions, but we need a simpler, automated way to analyze the responses.
There's a caveat and a balance with all of these. There really is nothing like reading verbatims to get the tone, the pain, the delight, the rich detail of the experience. I would strongly advise continuing to do that. But I also know that when there are thousands of data points, that's difficult to do.
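
To show what that deeper analysis can look like once comments are categorized, here's a minimal sketch, assuming pandas and hypothetical column names; the data is invented for the example.

```python
# Hypothetical sketch: cross-tabbing categorized verbatims against an
# existing structured field (here, an NPS segment). Column names and
# data are made up for illustration.
import pandas as pd

df = pd.DataFrame({
    "nps_segment": ["Promoter", "Detractor", "Detractor", "Passive"],
    "comment_category": ["staff", "wait_time", "billing", "wait_time"],
    "sentiment": [1, -2, -1, 0],
})

# How often each theme shows up by NPS segment.
print(pd.crosstab(df["comment_category"], df["nps_segment"]))

# Average sentiment per theme.
print(df.groupby("comment_category")["sentiment"].mean())
```

At that point, the verbatims behave like any other variable: you can cross-tab them against segment, loyalty, churn risk, or whatever structured fields you already collect.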

So, let me shift to surveys for the moment and say: just because you have a way of analyzing and categorizing your qualitative data doesn’t mean you should ask more open-ended questions on a survey. You still need to be conservative with your approach here and, more importantly, ask direct questions that elicit direct responses, i.e., responses that actually tell you what you need to know rather than vague and ambiguous ones. The “garbage in, garbage out” rule still applies.

The words. Why did they have to exist? Without them, there wouldn't be any of this. -Markus Zusak, The Book Thief


4 comments:

  1. Hi Annette,
    You say that text analytics tools can help with "making sense of something that doesn't make sense". Forgive me for being slightly contrary here but if you asked all of the customers who provided the feedback they would say that what they said made perfect sense to them.

    Is there a danger with generalised analytics that you only pick up sentiment, mood, tone, frequency of words used etc etc ? (Don't get me wrong....that can all be good stuff) But, by only using analytics to deal with a large volume of qual feedback don't we risk losing the potential 'gold' that is in the actual fully formed and considered individual response?

    Adrian

    Replies
    1. Thanks, Adrian. The spirit of that thought is more along the lines of simplifying for some audiences who don't have the time to read all comments. Also, it's hard to cross tab and run various analyses on verbatims, but if we can condense for meaningful analysis, that's important. I don't advocate asking open ends and then not distributing them to the appropriate people to read and act.

      Even if we use text analytic tools and report based on their outcomes, I like to share representative verbatims to really add color, emotion, etc.

  2. Annette, We use text mining from time to time. I have to say I think it always reinforces the blindingly obvious (we are too expensive and slow or something similar).

    So my challenge is what are people doing with the information they have rather than how can they get more

    James

    Replies
    1. I agree...most important is what is done with the feedback. How do they act on it? Or do they even act on it?
