Tuesday, January 31, 2012

A Comparison of Successful vs. Failed VOC Initiatives

I thought it might be helpful to come up with a comparison of successful and failed VOC initiatives. The table below shows the things I've jotted down so far.


Let me know if I've missed anything!

Friday, January 27, 2012

Take a CX Field Trip

Have you ever wondered what retailers do with the feedback they gather from those receipt-based post-transactional surveys?  I have! And I work in this industry!

Any time you buy a product or use a service, you are asked to complete a survey about your experience. Whether you're buying shoes, staying at a hotel, getting your garbage picked up, or going for a cab ride, everyone wants to know if you were satisfied with the experience.

I took a little field trip today.

I confess. Occasionally, I shop at Kohl's. I have a Kohl's credit card. I am a Kohl's MVP. I get the 15/20/30% off coupons in the mail from them every week. I've gotten Kohl's cash. Wow! Look at all they do to acquire a customer, get you in the store, and make a sale! But what do they do for you once you get in the door? Quite frankly, it's a mixed bag. Sometimes the experience is good; other times, it's exhausting. But, hey, I can get all of my kids' school clothes and a new vacuum cleaner for great prices minus 30%, so what do I care?!

Well, I actually do care. And not only because of my profession. You see, there's a Target about two minutes down the street from Kohl's. I can get a pretty sweet deal on school clothes there, too... and get my groceries at the same time!

(Sidebar: JC Penney has been competing heavily with Kohl's and their marketing tactics over the last several months, but to finally trump Kohl's, they are now getting rid of their sales and slashing prices by 40%! And it appears that they are thinking about the customer experience, too.)

OK, back to my field trip.

I went to Kohl's today to talk to the store manager about how they use my feedback. (I haven't taken their survey recently, but I have a couple of times in the past.) I arrived at the store and decided to go upstairs to the Customer Service desk to ask for the manager. 

(Sidebar: I have always wondered why they have the Customer Service desk upstairs, in the furthest back corner of the store. Are they trying to make it difficult for people to return items or to get questions answered?)

On my way to the escalator, I passed a sales associate, and she asked me if I needed help finding anything. (Yay for that!) I politely declined and went up the escalator and to the far back corner. At the Customer Service desk, I asked the associate behind the counter where I could find the manager. She asked me why I wanted to speak with him and then paged him. While she was waiting for him to call back, she started to tell me about how they track numbers... ring, OK, the manager is downstairs at the cash registers.

Back down the escalator I go, and sure enough, the manager was at the cash registers! He was also the only person at the cash registers, checking out a line of waiting customers! I got in line because, despite that, I figured I'd just ask my questions quickly and take no more of his time than it would take to ring up a customer who was buying something. Luckily, another cashier came along, but there were six people behind me at that point.

When it was my turn, I quickly walked up to him and asked him how he uses the feedback that customers provide when they complete the survey that is presented on the sales receipt. He seemed momentarily tongue-tied, like no one had ever asked him that before (probably not, though more of us should), and then he proceeded to tell me that:
  • they use a 3rd-party vendor to collect the feedback
  • the vendor parses out the feedback for each individual store
  • there is no incentive for customers to complete the survey because they feel that this gives them true feedback (as a researcher, I read into that: "not biased")
  • they review the feedback in their weekly huddles
I appreciated that much information but wanted to get to the meat of it, while still trying to be mindful of the people waiting in line to pay. I threw out a couple of doozy questions for him, and before he answered them, he needed to know who I was. Why was I asking? I told myself before entering the store that I'd stick to the "interested customer" character, but I ended up letting him know that I work in this industry and was curious to find out how companies are using our feedback down at the store level.

I asked him if he followed up with individual customers about their feedback, and he responded with, "Yes, depending on the situation." (If I had more time and was truly conducting a fuller interview with him, I'd have gotten clarification on which situations.) Note that Macy's follows up with every single one of their respondents, over a million a year.

I also asked him if he created action plans to resolve recurring issues, and he said he did. The example of a recurring issue I posed was long lines at the registers, which was the situation while I was talking to him, but I didn't feel right questioning him about the effectiveness of his action plans at that point.

The bottom line: I wish I had had more time to speak with him, but in that brief conversation, he gave me "the right answers." (Sorry if I sound skeptical; recall, I noted earlier that the experience there is hit or miss, not consistently good.) The one concern that I have is that they are likely just focused on the number.
  • Reviewing the feedback weekly is great, but honestly, it needs to be reviewed daily. If you're not reviewing it daily, then you're missing out on hot spots as they're happening. If you're reviewing feedback weekly, then that leads me to suspect that you're focused on the number. Set the number aside and look at what customers are saying happened in the store yesterday/today. Take corrective actions immediately.
  • The Customer Service desk associate started to tell me about how they look at the numbers. I'd much prefer that frontline staff tell me that they review the feedback, discuss the issues, consider root causes, and talk about what they'll need to do to deliver a better customer experience. After all, that's why they are collecting the feedback. 
Have you ever stopped to ask a store manager, who is asking you to complete a survey about your experience, how your feedback is being used? We should all do that. And better yet, companies should be sharing with us regularly what changes and improvements they've made as a result of our feedback!

Wednesday, January 25, 2012

Delivering the Ultimate Customer Experience

A while back, I wrote about trust and the customer experience. Trust is just one aspect of the customer experience. I'll add three more attributes and then share an experience that exemplifies all of the above.
  1. Personalized
  2. Consistent
  3. Memorable
The catalyst for this blog post comes from my recent experiences at my favorite restaurant, The Winery Restaurant & Wine Bar in Tustin, CA. I'll share with you my last two experiences there, and you'll quickly understand why I call out those three attributes as being key to delighting customers and creating great customer experiences.

I've been to The Winery Restaurant four or five times in the last year or so, and every time I've been there, I know exactly what to expect:  friendly and professional staff, impeccable service, excellent food, relaxed atmosphere, and the best wine. It's always the same, and I love that!

My birthday was at the end of December. My parents were visiting for the holidays, and I was excited to take them to The Winery Restaurant for my birthday dinner. It's not the type of place they normally dine at; they live on a farm in Ohio, and there aren't any restaurants like this close by! In the middle of the meal, my dad said, "The next time we come here..." Aha! Hooked! My parents enjoyed it, too. I knew they would.

When you enter The Winery Restaurant and announce your arrival at the front desk, the staff is warm and inviting. I've never had to wait more than a minute or two to be seated. Once seated, if you're wearing black pants, they immediately replace the white napkins on the table with black ones, which I love. Until last night, they have always seated me (and whoever I'm with) at the same table (now "my table," perhaps?), and we've always had the same waiter (Kevin). Kevin remembers me by name and, since he has previously taken the time to ask questions to get to know me, he now asks about my situation, my kids, etc. every time I'm there, even if it's been months in between visits. 

On my birthday, Kevin wasn't our main waiter, but he came by and chatted and helped out if our main waiter was busy. We literally had a team of people waiting on us, including the restaurant's Managing Partners, William Lewis (who is also the Sommelier) and JC Clow. When we ordered wine, William presented it, gave us a little bit of history on the wine, and poured it. After the meal, we had coffee with our dessert, and JC got the cups and saucers and poured our coffee!

The food was amazing, as always. It was plated wonderfully (if you're an Iron Chef America fan, I gave 20 points for plating!), and the portions were perfect. There's a bit of presentation that goes into the offering of the condiments, including various flavors of salt. (It's not pretentious, just fun.) Our main waiter was honest about what she preferred and didn't like about the food. I like to get those kinds of candid recommendations, as it makes it easier to think about what you probably don't want to order.

When you make your reservations or when you are first seated, they ask if you are celebrating a special occasion. Since they had learned that it was my birthday, they prepared a beautifully-plated and personalized dessert for me. It was really a surprising treat and a fun way to end a fabulous evening. 


Before we walked out the door, I texted my best friend and told her we'd be going there for her birthday. It was a memorable evening for us all. When I called my mom yesterday to tell her I was having dinner there again for my friend's birthday, I could hear perhaps a hint of "wish I was there" in her response. 

And that brings me to last night's experience. My friend and I were both so excited to go to The Winery Restaurant for dinner. As I mentioned earlier, I always know exactly what to expect there, and last night did not disappoint. The experience was no different from the previous times we'd been there. And that's a good thing!

I learned a little more about The Winery with this latest visit, particularly how they are able to personalize the experience. Normally I make my reservations through Open Table; however, Open Table didn't show any availability for the time that we wanted to go, so I called the restaurant directly to make the reservation. They asked me for my last name, and before I even finished spelling it, they said, "Annette, right?" Aha, I'm in their computer. Well, their secret is safe with me. Personalization makes it that much better!

We sat at a different table and had a different waiter last night, but Kevin was there and stopped by to say "hello" and to chat briefly. He remembered my last visit a month ago, who I was with, why I was there, etc. Love that.

The Winery also does other unexpected things. There's a certain salad my friend and I had two years ago that they no longer have on the menu. We asked about it, as we'd done in the past, and they made it for us!  They removed rules (in this case, a menu) from constricting the customer experience. They did the unexpected! Give the customer what she wants.

The service was amazing. My water glass was never empty, and neither was my wine glass. They anticipated my needs, and I never had to ask for anything. And, of course, there was a lovely birthday dessert, beautifully plated, for my friend at the end of the meal. It was another memorable evening!

I can say without a doubt that I am a raving fan of The Winery Restaurant! If you have time, visit their website; you'll see that the experience begins there. And if you're ever in the OC, have dinner at The Winery Restaurant. You won't regret it!

Tuesday, January 24, 2012

Creating a Service Culture

I was looking through some old files last week and came across something I had written a couple of years ago about how to go from collecting feedback to creating a service culture, especially among your frontline staff/teams.

My thoughts haven't really changed much since then. Let's get started!

Hire and build it.
  • Hire the right people - those passionate about customer service.
  • Like Zappos, ensure there's a good culture fit, too.
  • Train and empower your staff.
  • Encourage ownership of execution at the front lines; ask staff to define the ultimate customer service experience.
  • Remove employees who don’t want to adopt this new culture.
  • Treat your employees the same way you want your customers treated.
  • The customer experience is driven by the employee experience.
Announce and socialize it.
  • Make sure everyone understands why this initiative is so important.
  • Create a “greater cause” mentality in all staff segments.
  • Create a language around your VOC initiative; brand it, give it a name, etc.
Live it.
  • Mystery shop your own products and your service. You. Yes you!
  • Call your customer service line to see what your customers are experiencing.
  • Ask yourself, “Would I enjoy being, or want to be, a customer of this store?”
Operationalize it.
  • Prioritize key metrics and communicate them to the team.
  • Define your ultimate customer experience - taking into account feedback from your customers.
  • Outline your "Truly Outstanding Customer Contact" and train the team on what that means; better yet, let them define what it means so they own it (see above).
  • Realize that this is not about quick fixes; it’s a life-long endeavor (for you and your customers).
  • It is a way of doing business, not just the initiative of the day.
  • Talk about your scores in every team meeting.
  • Be sure to share comments with your frontline and let the voice of the customer be heard!
  • Communicate process changes internally and externally; close the loop with customers and let them know what process changes have been implemented as a result of their feedback; communicate changes to employees.  Make “visible” your commitment to listening to and acting on feedback.
  • Look for ways to be proactive in communicating with customers about new services, products, etc.
  • Be proactive in correcting an issue; don’t wait for a customer to call it to your attention.
  • Provide great service to everyone:  prospects, customers, employees, vendors/partners.
  • Fix issues quickly, and close the loop with all involved.
  • Create and maintain a best practices log/manual. Document it all: suggestions, solutions implemented, culture designed by employees, etc. Use it for coaching, training, onboarding, etc.
  • Treat customers as you would like to be treated!
  • Remember that customers are the reason that you are in business.
  • (Over-)deliver on the brand promise!  Every day.  Every interaction.
Celebrate it.
  • Incentivize key staff for improvements.
  • Celebrate great service! Reward, recognize, and share examples.
  • Develop a competitive spirit! Have fun!
  • Find ways to show your appreciation… for customers and for staff.

“A customer is the most important visitor on our premises. He is not dependent on us. We are dependent on him. He is not an interruption in our work. He is the purpose of it. He is not an outsider in our business. He is part of it. We are not doing him a favor by serving him. He is doing us a favor by giving us an opportunity to do so.” - this quote has been attributed to Mahatma Gandhi

Tuesday, January 17, 2012

14 Tips for Creating Your Best Survey Emails

Last week, I posted a mini-series on maximizing survey response rates in which I mentioned the importance of survey invitation and reminder content, deliverability, and timing. In today's post, I'll delve a bit deeper into content and deliverability and list some guidelines to follow when creating your survey emails. (I'm assuming these emails are going to your customers, not to a third-party list or panel.)

The content of your emails impacts deliverability (gets to the recipient, doesn't get stuck in spam filters/folders), readability (ensures the recipient wants to read it), and response rates (entices the recipient to click the link to participate in the survey). The emails are the gatekeepers to the success of the survey campaign!

1. Ensure the From Name is recognizable and not one that recipients will likely ignore; the same holds true for the From Email Address. And the From Email Address can't be sent through an open relay; when the receiving server does a reverse lookup and the domains don't match, your message will bounce or get stuck in a spam filter. Use a real email address.

2. Write a compelling subject line that is truthful and free from words that are considered spam flags.

For both #1 and #2, you'll likely want to run some tests to see what works best for you.

3. The email should be personalized, i.e., when the recipient opens the message, they should see "Dear Jane Doe," not "Dear Valued Customer." (How valued does THAT make you feel?)
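As a rough sketch of what that personalization might look like in code (the field names and fallback greeting here are hypothetical, not from any particular survey platform):

```python
# Hypothetical sketch: build a personalized salutation for a survey invite,
# falling back gracefully when the name is missing rather than sending
# "Dear Valued Customer."

def greeting(recipient: dict) -> str:
    """Return a personalized salutation for the invite email."""
    first = recipient.get("first_name", "").strip()
    last = recipient.get("last_name", "").strip()
    if first and last:
        return f"Dear {first} {last},"
    # Better to omit the name entirely than to use a generic placeholder
    return "Hello,"

print(greeting({"first_name": "Jane", "last_name": "Doe"}))  # Dear Jane Doe,
print(greeting({}))                                          # Hello,
```

The key design point is the fallback: if your list has gaps, a neutral greeting does less damage than an obviously-templated one.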

4. The content should be customized to the individual. Include information relevant to the trigger event or to your relationship that (a) sets the stage for the recipient and (b) confirms what they should be thinking about for the survey that follows.

5. Your message should be compelling, i.e., compel the reader to want to take the survey. You'll want to outline why (topic, objective) you are conducting the survey, why they should participate, and what you'll be doing with the feedback. Customize the message based on the audience; that might mean writing several different versions of content.

If you have been conducting the survey for a while and have made changes to the survey, think about refreshing the content of the emails, as well. And include a mention that you've made changes to the survey; this will keep respondents from saying, "Oh, not that survey again."

6. The email should state how long the survey will be available. When does the survey link expire?

7. State how long the survey will take to complete, either by providing the number of questions or the estimated length in minutes. If you choose the latter, give an honest assessment; you'll only frustrate your customers if you tell them a 20-minute survey will take them only three minutes to complete! This does impact completion rates and future response rates. It's about trust!

8. If the survey will be confidential and/or anonymous, indicate that in the email. If it will not be, don't state it!

9. I used to recommend that clients send their invites as text rather than HTML, but HTML is fine; go "light" on the code and don't use a ton of crazy images and unnecessary HTML. Make it look nice, preferably using branding that is seamless with the rest of your organization's branding.

10. Avoid spam flags, i.e., words that will likely get your message caught in spam filters/folders. This applies to the subject line and to the text of the body. Limit the use of words like "free," "winner," "opinion," and even "survey." (I had a client who wrote an entire invite without using the word "survey" once!) And don't use crazy symbols, excessive exclamation marks, or all caps. I've also learned that using "click here," especially excessive use of that phrase, will land an email in a spam filter.

Run the content through an email deliverability or "spam checker/scoring" tool to ensure there are no "offensive" words or triggers that you'll need to avoid. Pilot test the content with your team or a small subset of your target audience.
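A simple pre-flight check along these lines might scan the subject and body for known flag words before you send (the word list below is a small, illustrative sample drawn from the tip above; real spam filters score on far more signals than this):

```python
# Illustrative sample of words/phrases called out as spam flags above.
# This is a rough sketch, not a substitute for a real deliverability tool.
FLAG_WORDS = {"free", "winner", "opinion", "survey", "click here"}

def spam_flags(text: str) -> list:
    """Return the flag words/phrases found in the text (case-insensitive)."""
    lowered = text.lower()
    return sorted(w for w in FLAG_WORDS if w in lowered)

subject = "We'd love your opinion - take our survey!"
print(spam_flags(subject))  # ['opinion', 'survey']
```

A check like this catches the obvious offenders early; you'd still want to run the final content through a proper spam-scoring tool before launch.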

11. You'll need to ensure that your emails comply with the CAN-SPAM Act. In order to do so, I always make sure that clients' emails include the following:
  • Company name and physical address
  • Privacy policy links for both the client and the survey vendor
  • Opt-out link where recipients can unsubscribe from receiving further surveys (Be sure to opt them out; don't just have a link for the sake of having a link.)
It's good to revisit the guidelines now and then to ensure you're meeting the standards.

12. The email signatory should be relevant to the trigger event or reason for the survey. An example of a disconnect would be a post-support survey whose invitation is signed by the VP of Marketing.

13. Translate the email into the recipient's preferred language.

14. Keep the email short and sweet.

You might have some other tips to follow when creating emails for your surveys. I'd love to hear them!

Thursday, January 12, 2012

Maximizing Survey Response Rates - Part 2: 10 Tips to Achieve Your Goal

This is the second part of my mini-series on how to maximize your survey response rates. Thanks for reading the first part and coming back for this one!  Let's dive right in.

If you execute well on the following 10 items, you should have great success in achieving optimal response rates for your survey and, subsequently, feeling confident about your findings and recommendations. These tips apply to any type of survey you conduct: employee, partner, customer, etc.

1. Your List/Audience 
  • Survey the right/relevant audience
  • Ensure list quality, accuracy, and validity
  • Realize that different audiences elicit different response rates (for a variety of reasons), including B2C versus B2B

2. Pre-Survey Communication
Before you launch your survey, let customers know what your objectives are:
  • Why is their participation/feedback important?
  • What will you be asking them about?
  • How will the survey be delivered/conducted?
  • When will they get the survey - and how frequently?
  • How will you be using the feedback and will you close the loop?

3. Survey Invitation
The survey invitation is critical to the success of your survey deployment. I'll devote a blog post to your invites in the very near future, but in the meantime, the key aspects to keep in mind are:
  • Maximize deliverability
  • Customize the content
  • Optimize the timing of the deployment (hour, day, date, etc.)

4. Touch Rules/Timing
Touch rules refer to when and how often you will survey someone. Consider how often you will conduct the survey, the interval or time between surveys, and the different sources of surveys within your organization. Your touch rules may vary by survey type and number of surveys in the initiative.

Another aspect is timing of the survey; this relates to recency of the event and freshness of the experience. Both impact the likelihood of a customer to respond to the survey.

5. Survey Design
If you have followed the tips I gave in my earlier blog post on survey design, you're well on your way. All of the items outlined in that post drive response rates; in addition, consider:
  • Personalizing/customizing the survey to make it more relevant
  • Minding the survey's appearance; it counts
  • Giving respondents adequate time to respond
  • Considering alternate modes of data collection to supplement your online survey efforts

6. Pilot Test
Conduct a pilot test of your survey before going out to the larger audience in order to get a preview of potential response rates and to test:
  • Invitation deliverability
  • Overall design
  • Navigation
  • Data integrity

7. Survey Reminders
Reminding customers and employees about outstanding surveys is critical to maximizing response rates; I've seen response rates as much as double (over what was achieved with the invitation alone) after the reminder was sent.

For reminders, you should consider the same things as noted in #3. Here are some general guidelines on number of reminders and timing when conducting transactional or relationship surveys. If you're conducting other types of marketing surveys, I would remain conservative on the number of reminders.

Transactional Surveys
# reminders = 1
Interval = 3-5 days after invite

Relationship Surveys
# reminders = 1-2
Interval = 5-7+ days after invite
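Those guidelines can be sketched as a simple schedule calculation. The offsets below fall within the intervals stated above (day 4 for transactional, days 6 and 13 for relationship); the exact values chosen within those ranges are illustrative:

```python
from datetime import date, timedelta

# Reminder offsets (days after the invite), drawn from the guidelines above:
# 1 reminder at 3-5 days for transactional surveys; 1-2 reminders spaced
# 5-7+ days apart for relationship surveys. Specific days are illustrative.
REMINDER_OFFSETS = {
    "transactional": [4],
    "relationship": [6, 13],
}

def reminder_dates(survey_type: str, invite_date: date) -> list:
    """Return the dates on which reminders would be sent."""
    return [invite_date + timedelta(days=d) for d in REMINDER_OFFSETS[survey_type]]

invite = date(2012, 1, 12)
print(reminder_dates("transactional", invite))  # [datetime.date(2012, 1, 16)]
```

In practice you'd also want to drop anyone who has already responded from each reminder batch, so completers never get nagged.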

8. Refresh
If you've been conducting the same survey(s) to the same people using the same invitation(s), etc. for the last several years, it may be time for a refresh. As a matter of fact, you'll likely want to revisit survey (and email) content regularly to ensure that you are still asking relevant questions, capturing metrics on emerging trends, keeping it fresh, etc. Your communications should announce the changes so that respondents are aware that it's not the "same ol' survey again."

9. Incentives
I'd put incentives dead last on this list if putting them at #9 wasn't a good setup for #10 - or I would exclude them completely. But I do list them because if I don't, some would say it was a glaring omission and question my sanity. OK, so I'll just address my opinion on incentives here and now.

First this: Incentives are more common for B2C surveys than for B2B surveys, and they are used for certain types of surveys or data collection methodologies (e.g., panel, focus groups) more often than others.

My stance is this: try conducting the survey without the incentives first. Incentives can/do increase response rates, but at what cost? Response bias? And literally at what cost? They can be expensive to manage and administer.

Know this: The best incentive is to act on the feedback, make improvements, and close the loop!

And that's a good segue into #10.

10. Act, Improve, Close the Loop
Close the loop with your respondents. If they feel that their feedback is being heard and used, they will provide feedback again. Follow up on their feedback. Use it for service recovery. Make product and process improvements. But most important, you must let your respondents know what you've done! Communication is key!

After all, why are you collecting feedback, if not to create and deliver superior customer experiences?


Wednesday, January 11, 2012

Maximizing Survey Response Rates - Part 1: Defining Concepts

Yesterday I wrote about guidelines for proper survey design. Today's post is the first in a two-part series about how to maximize your returns on that well-designed survey. I've written on this topic before, namely for an article for CustomerSat's blog several years ago. I also gave a presentation on maximizing response rates at last year's Allegiance Engage Summit. For those of you who missed all that, here it is in black and white.

First, let me clarify two terms: response rate and response volume. They are certainly related, but statistical validity is based on the number of responses (volume) and not the rate. Clients always ask me what the best response rates are, but consider this: when you have a 50% response rate on a sample of 100 versus a sample of 10,000, it's going to mean two different things for your confidence in those findings.

So, assuming you have a solid population size and sampling method, I feel comfortable using the term "rate" going forward. Good response rates are essential for accurate, useful results. Low response rates and insufficient sample sizes:

  • Erode the validity of your results
  • Force you to qualify your reports and conclusions
  • Lower confidence in your findings and recommendations

Second, let me clarify the difference between response rate and completion rate.

  • The response rate is the percentage of people who respond to your survey, whether they submit just one page of the survey or all pages. 
    • To calculate: # responses / # invited*
  • The completion rate is the percentage who complete the entire survey—that is, they answer all questions relevant to them and submit the last page of the survey. Completers are a subset of responders, since not all respondents will complete the entire survey. 
    • To calculate: # completed responses / # invited*
* # Invited will likely exclude bounces, wrong numbers, etc. (depending on your data collection methodology). You can certainly calculate the rates both ways, with those excluded or included; either way, I'd footnote your approach.
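A quick worked example of those two formulas, excluding bounces from the denominator as the footnote suggests (the counts here are made up for illustration):

```python
def rates(invited: int, bounces: int, responses: int, completes: int):
    """Compute response and completion rates, excluding bounces from the
    denominator (footnote your approach if you report it the other way)."""
    delivered = invited - bounces
    response_rate = responses / delivered
    completion_rate = completes / delivered
    return response_rate, completion_rate

# Hypothetical campaign: 10,000 invites, 400 bounces,
# 2,400 responses, of which 1,920 completed every page.
resp, comp = rates(invited=10_000, bounces=400, responses=2_400, completes=1_920)
print(f"response rate:   {resp:.1%}")   # response rate:   25.0%
print(f"completion rate: {comp:.1%}")   # completion rate: 20.0%
```

Note that the completion rate is always less than or equal to the response rate, since completers are a subset of responders.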

The reasons for differentiating response rates and completions rates are varied; and on the heels of my post about survey design, I'll focus on the impact survey design and respondent engagement (your relationship with the customer combined with how you've chosen to survey the customer) have on achieving your response goals. The graphic below shows their relationship. I think it's self-explanatory and points out that if you focus on good survey design combined with good respondent engagement, your response and completion rates will be high.


Both rates are important and ultimately determine how confident you can be in your results, including how representative your data are of your population. When you're confident in your results, you will also be confident about:

  • the investment your company has made to test its hypotheses
  • presenting your findings to executives and to various other stakeholders
  • making recommendations and basing strategic initiatives and direction on your findings
So, the obvious next question is, "What do I need to do to ensure that I get the maximum response rates for my survey?"  Good question. Stay tuned for tomorrow's post when I'll outline 10 tips to help you achieve your goal.

Tuesday, January 10, 2012

22 Tips for Proper Survey Design

Need help designing a survey? Look no further. I've compiled a fairly comprehensive set of guidelines to get you on your way!

I realize that there are other variables to consider depending on the type of survey or the data collection methodology, but these general guidelines should apply regardless.

General Survey Guidelines
1. First and foremost, define and know your objective! As the saying goes, "garbage in, garbage out." If you don't have an objective in mind, your survey initiative will fail. Think about how you will analyze the responses and ask the questions in an appropriate manner.

2. Open your survey with a brief introduction, and I would state your objective (in customer-friendly terms) here, as well. Respondents want to know why you're conducting this survey and what you're going to be doing with their responses. Don't set expectations about actions and follow-up here that won't be executed. Also give an honest indication of how long the survey is or how long it will take.

3. Think about survey/question flow. Start with questions that warm up the respondent to the topic. As you dive into the survey, put questions in a natural, logical flow and in sections rather than jumping around in some illogical sequence. For example, in a post-transactional survey, ask questions in the flow of the experience; and when you are conducting brand awareness surveys, they come with their own set of requirements for how questions should be asked.

4. Know the reason for, and impact of, question placement. If you ask overall satisfaction at the beginning of the survey, you are getting a top-of-mind rating. If you place the question at the end of the survey, you have taken the respondent through the experience again via the flow of the survey and the questions asked, so the overall satisfaction rating will reflect that experience. You will get two different scores, depending on placement. Several years ago, I tested this theory on seven different surveys for seven different clients, and when the overall satisfaction (osat) question was asked first, the score was always lower. I had a client who insisted on moving the question from the end of the survey to the beginning after years of having it at the end. I warned that the score would drop if we did that; the client still chose to move the question, and in the end, their osat score dropped one full point (on a 10-point scale) from the previous year! (Know that any discussion around placement of the osat question can be a "religious" one, and there could be a variety of differing views and opinions on this topic.)

5. Be mindful of survey length. Transactional surveys can be brief, e.g., 10-15 questions max, whereas relationship surveys can be a bit longer, e.g., 50 questions (where respondents only see those questions relevant to them, in essence making the survey shorter). Other methodologies may call for longer surveys. Use attribute grids to group questions that logically belong together and share the same rating scale. And don't forget progress meters to let respondents know where they are.

6. Ask a mix of closed-ended and open-ended questions. It is not necessary to ask an open-ended question after every closed-ended question, e.g., every rating question. As a matter of fact, I strongly suggest you limit the number of open-ended questions in your survey. You need to have at least one, but don't have 20!

7. Question relevance also impacts survey length; each of the following will help with that.
  • Don't ask things you already know about the customer, e.g., last purchase date, product purchased, date of support call, etc. 
  • Only ask questions that are relevant to that customer and his/her experience. For example, if you know the customer owns Product X and Product Y and recently called about support for Product X, don't ask questions about Product Y, too. Or don't ask questions about marketing materials in a support post-transactional survey.
  • Don't allow other groups or departments to commandeer the survey by adding questions that are not relevant to the survey objective. 
  • Use smart survey techniques to skip questions not relevant based on responses to previous questions.
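The skip-question technique above can be sketched in code. This is a minimal, hypothetical illustration; the question IDs and branching rules are invented for this example and are not taken from any real survey platform.

```python
# Map (question_id, answer) pairs to the question to jump to, skipping
# sections made irrelevant by an earlier response. All IDs are hypothetical.
SKIP_RULES = {
    ("contacted_support", "No"): "overall_satisfaction",  # skip the support section
    ("owns_product_y", "No"): "closing_comments",         # skip Product Y questions
}

def next_question(question_id, answer, default):
    """Return the next question to show, jumping past irrelevant sections."""
    return SKIP_RULES.get((question_id, answer), default)

# A respondent who never contacted support skips straight ahead:
print(next_question("contacted_support", "No", "support_quality"))
# A respondent who did contact support continues in order:
print(next_question("contacted_support", "Yes", "support_quality"))
```

Real survey tools express the same idea declaratively (branch/skip rules attached to each question), but the underlying lookup is this simple.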
8. Don't use company or industry lingo/language that your customers don't know or understand. Just like the customer experience, think about the survey from the customer's perspective. If you must use such jargon, be sure to define it in customer terms.

9. Speaking of language, if your survey is going out to a global audience, be sure to offer respondents the option to take the survey in their preferred languages.

10. Remember that you cannot collect personal information from anyone under 13 without parental consent.

When in doubt about general survey and sampling guidelines, follow the CASRO Code of Standards.

Question Writing Guidelines
1. Don't ask double-barreled or compound questions. That means, keep your question to just one thought and not a couple. For example, if you ask about "quality and timeliness of issue resolution," I'm not really sure how to answer that. You have just asked me about two concepts: quality and timeliness. What if the quality was great, but it took you forever to resolve the issue?

2. Make sure your questions are not ambiguous. Write questions clearly. If a respondent pauses and asks, "What do they mean by that?" then the question is poorly constructed.

3. Ensure that the questions are actionable. Ask yourself, "If someone rated that question poorly, what would I fix as a result of that?" If you can't answer that question, then throw out the question.

4. Similarly, every question should have an owner. If you can't attribute the question to a department or individual who owns its response or rating, pitch it. You're just asking for the sake of asking. (Granted, there will be some questions, e.g., demographics, that don't fit that requirement and will be needed to make the survey analysis more robust and the data actionable.)

5. Your question response choices and rating scales should be mutually exclusive. And do your homework; make sure you provide a complete list of response choices. I hate when the one answer that should be there is missing. Be sure to provide an "Other (please specify)" when appropriate.
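A quick way to catch non-mutually-exclusive choices is to check numeric brackets programmatically. The bracket values below are hypothetical, chosen only to show the classic mistake of repeating a boundary value in two ranges.

```python
# A small sketch that flags overlapping or incomplete numeric response
# brackets. Each bracket is an inclusive (low, high) tuple, sorted by low.
def check_brackets(brackets):
    problems = []
    for (lo1, hi1), (lo2, hi2) in zip(brackets, brackets[1:]):
        if hi1 >= lo2:
            problems.append(f"overlap: {lo2}-{hi1} appears in two brackets")
        elif lo2 - hi1 > 1:
            problems.append(f"gap: values {hi1 + 1}-{lo2 - 1} have no choice")
    return problems

# "$0-$25,000" and "$25,000-$50,000" both contain $25,000 — not mutually exclusive.
print(check_brackets([(0, 25000), (25000, 50000), (50001, 100000)]))
```

The same discipline applies to non-numeric lists: read each pair of choices and ask whether any single answer could honestly fit both.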

6. Don't ask leading or biased questions. "We know you loved our new soft drink. How much did you love it?"

7. Randomize response choices to avoid position bias, but use this judiciously; it doesn't make sense for every response-choice list.
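One common refinement when randomizing: keep "anchor" choices like "Other (please specify)" or "None of the above" fixed at the bottom, since respondents expect them there. A minimal sketch, with hypothetical choice text:

```python
import random

# Shuffle the substantive choices but keep anchor choices in their
# customary position at the end of the list.
def randomize_choices(choices, anchored=("Other (please specify)", "None of the above")):
    movable = [c for c in choices if c not in anchored]
    fixed = [c for c in choices if c in anchored]
    random.shuffle(movable)  # in-place shuffle of the movable choices
    return movable + fixed

choices = ["Price", "Quality", "Selection", "Other (please specify)"]
print(randomize_choices(choices))  # "Other (please specify)" always appears last
```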

8. Use proper grammar and make sure you spell check!

9. Offer an "out" for questions, where appropriate. For example, not everyone wants to tell you their household income or about their children, and you may ask some questions for which they genuinely don't have an answer. Similarly, do not make every question in the survey required. This really makes for an awful respondent experience.

10. For open-ended questions, be specific. Ask exactly what you want to know, e.g., "What can we do to ensure you rate us a 10 on overall satisfaction next time?" Or, "Tell us the most important reason you recommended us to your friends."

11. And, last but certainly not least, I'll briefly address question scales. Like placement of the osat question, question scales are a religious discussion. Get 10 researchers in a room and get 10 different views of which scale is best and when. My point on scales will be this: be consistent on your use of scales within a survey. Clients have handed me surveys to review that have five different scales within each survey. That's a disaster for a variety of reasons, not the least of which is the respondent experience.

12. Don't forget to thank your respondents for their time at the end of the survey!

I hope these tips are helpful. The main thing to keep in mind... as CX professionals, we know we need to think about the experience with a company from the customer perspective. The survey design process is no different: think about the customer experience as you design the surveys. After all, surveys in their simplest form are just another touchpoint that you'll want to execute flawlessly.

Come back for my next post, when I outline how to maximize response rates.

Wednesday, January 4, 2012

Key Components of a CX Framework - Links to 5-Part Series

In case you missed my five-part series on the "Key Components of a CX Framework" and are looking for the links, here they are:

Part 1: Set the Stage: create awareness and get buy-in, not only from the top but from across the organization
Part 2: Define CX: outline the customer experience from a variety of angles
Part 3: Gather Data: this stage is about more than just surveys; it's about any and all data that pertains to the customer and the customer experience
Part 4: Engage Employees: remember that the employee experience drives the customer experience - if your employees aren't happily engaged, it will be very difficult for them to delight your customers
Part 5: Put it to Work: time to centralize, socialize, analyze, strategize, and operationalize - transform the organization

Components of CX Framework

I'd love to hear your thoughts on the full series. Stay tuned for more-detailed posts on various items mentioned within each component.  Lots more to come in 2012!

Key Components of a CX Framework - Part 5

This is the fifth post in a five-part series about the key components of a CX framework. 

It's time to put the finishing touches on this five-part series and write about the fifth and final component of the CX transformation, Putting it to Work.

Put it to Work
Once you've covered your bases on the steps I've previously outlined...
  1. Set the Stage
  2. Define CX
  3. Gather Data
  4. Engage Employees
... it's time to put it all to work. It's time to centralize, socialize, analyze, strategize, and operationalize. It's time to transform your organization.  Let's take a look at each one of those.

Centralize
When I say it's time to centralize, I don't mean that the effort is ethnocentric and that the HQ of a multinational organization has a say in all aspects. Actually, I mean quite the opposite. By centralizing, I am referring to one body that oversees the initiative - and this body consists of a cross-functional/cross-BU/cross-regional team that manages its execution. This initiative is not owned by one person - it can't be - it does, however, require that every corner of the organization is represented in this centralized body.

This is the group that keeps the effort moving and cohesive. They ensure that actions are taken in their respective departments, BUs, etc. This body is critical to the success of each of the following items I will outline below, as well as the success of the overall initiative.

(By the way, this is a good time to think about a centralized tool/platform for all of your data, as well.)

Socialize
The central body of individuals will also help to socialize the initiative. While the executives may have done their part at the outset, and at a higher level perhaps, the centralized team will socialize the efforts on a daily basis. Socializing means the members of the centralized body will communicate, educate, and build ownership and buy-in among employees in their departments. They'll energize grassroots efforts among employees to come up with their own suggestions on how to improve the customer experience.

A communication plan, as well as adoption strategies, will be critical to successfully socialize these efforts. I'll do a more-detailed post on both of these in the very near future, but they are a must to successfully engage the troops in these efforts.

Analyze
While you are gathering data from the many sources outlined in Part 3, the two critical tools you'll need are (1) a repository or a platform to bring all of this data together, and (2) a way to analyze the data.

Analysis will come in many forms, simply because there will be many different types of data. You'll need a way to crosstab, predict, identify key drivers, and prioritize improvements with survey data; mine and analyze text/comments; and track and review social media inputs and influencers. You'll conduct linkage analysis, to link customer and employee data, customer feedback with operational metrics, and all data to financial measures.

And, finally, one of the most important things you'll be doing is a root cause analysis. A popular technique to conduct such an analysis is the Five Whys. It's an iterative process that drills down to the root of the issue to help you fix the real problem. It involves deep thinking through questioning and can be quite effective.
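The Five Whys drill-down described above can be sketched as a simple iterative loop. The problem statement and answers below are hypothetical, invented purely to show the mechanic of asking "why?" until no deeper cause emerges.

```python
# An illustrative sketch of the Five Whys: keep asking "why?" until we
# run out of answers or hit the customary depth of five.
def five_whys(problem, ask, depth=5):
    chain = [problem]
    for _ in range(depth):
        answer = ask(f"Why? ({chain[-1]})")
        if not answer:  # no deeper cause offered; stop drilling
            break
        chain.append(answer)
    return chain  # the last entry approximates the root cause

# Simulated analyst answers for a hypothetical support problem:
answers = iter(["Agents lack product training", "Training budget was cut", ""])
chain = five_whys("Support calls take too long", lambda prompt: next(answers))
print(chain[-1])  # the deepest "why" reached
```

In practice the "ask" step is a facilitated conversation, not a function call; the value of the technique is in forcing the team past the first, superficial answer.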

Strategize
Closed-loop processes, continuous improvement efforts, and strategic initiatives as a result of feedback and analysis must be defined. To strategize means to define your strategy, which can involve both tactical and big picture efforts.
  • Tactical measures are how you will respond to each and every customer, each and every piece of feedback, on a daily basis. These include both service recovery and customer appreciation efforts. Closed-loop processes are typically defined as tactical measures.
  • Strategic improvements are how the business will respond. What operational, product, and process changes will you make to address the bigger picture? These involve a bit of heavy lifting (human and capital resources) and often take several months (or years) to execute and implement.
Action plans outline the sequence of steps to be taken for a strategy to succeed. Action plans include the specific tasks that need to be completed and by whom, in what time frame, and how much it will cost. Action planning is necessary to help make resource allocation decisions and to provide a roadmap.

Operationalize
Getting the right feedback at the right time from the right customers, then gleaning insights, creating action plans, and driving it all back to the right departments and right employees who take action at the right touchpoints at the right time... and then measuring those efforts to start all over again. (Close the loop on your own change management process.)

Making the customer experience part of every employee's daily routine and thought processes - that's operationalizing. As the saying goes, "Customer service is not a department; it's an attitude." And - it's everyone's "job."