Integrating Survey and Social Media Research
Understanding the Voice of the Customer (VOC) in mobile devices using the Maritz Product Launch Monitor study
Michael House, Vice President, Text Analytics, Maritz Research
Sentiment Symposium, April 12, 2011
Today’s Session • Briefly review how organizations struggle when it comes to “putting the voice of the customer (VOC) to work” • Discuss a best practice in VOC integration, a Uniform Set of Customer Experience Categories, that enables comparing and contrasting multiple VOC sources • Examine preliminary results from the PLM study • Sentiment of Sources • A Look at Two Brands: Blackberry and HTC • Wrap Up
The struggle to “Put the Voice of the Customer (VOC) to Work” 1
Assessing Organizations' Success in Using VOC • Maritz conducted the Maritz Voice of the Customer Practices & Challenges Survey* in 2009 with managers • ‘Blue Chip’ companies • 131 managers responded • Multiple industry sectors * Presented by Dr. D. Randall Brandt, Vice President, Customer Experience & Loyalty, Maritz Research, American Marketing Association, May 12, 2009
Where Organizations Still Struggle • Integrating multiple sources of VOC data to define priorities for improvement • Demonstrating the link between customer and financial metrics • Linking the VOC to internal operational and service metrics • Integrating the VOC and the Voice of Employees • Clarifying survey-based action items so that their owners know what to do or fix • Pinpointing practices or business processes that must be improved • Communicating action plans to address VOC-driven issues • Getting managers/partners to act on VOC-driven issues
In a Nutshell... • Managers want to know the positive and negative aspects of a product or service to drive improvements and marketing activities • But most organizations are not where they want to be when it comes to integrating and deploying insights drawn from the Voice of the Customer [Figure: VOC maturity steps – Capture VOC Data → Analyze VOC Data → Link VOC to Business Results → Integrate VOC Data; built on customer survey metrics]
Our Vision for an Integrated CEM/CGM Offering • Customer experience sources today can broadly be grouped into: primary customer research and other ‘internal’ sources, and public media, including social media • Each source has its biases and limitations; none has an exclusive license to the “truth” • The best, most robust picture emerges from the integration of all sources • This integration should be designed into a customer experience measurement and management process (Integrated Architecture): • Built upon a holistic understanding of business processes and how those processes produce measurable business results (Blueprinting and Linkage) • Supported by a common framework for combining findings across categories – a Uniform Set of Customer Experience Categories
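As a minimal sketch of what a Uniform Set of Customer Experience Categories does in practice, the snippet below maps source-specific topic labels (from a survey coder and a social media monitor) onto one shared category list so results can be compared side by side. All labels and mappings here are hypothetical illustrations, not Maritz's actual taxonomy.

```python
# Hypothetical uniform category set shared by all VOC sources
UNIFORM_CATEGORIES = {"battery life", "keypad", "ease of use", "price of device"}

# Each source tags text with its own labels; we normalize them to the uniform set
SOURCE_TO_UNIFORM = {
    "survey":       {"battery": "battery life", "keyboard": "keypad"},
    "social_media": {"charge lasts": "battery life", "qwerty": "keypad"},
}

def normalize(source: str, label: str):
    """Map a source-specific label to a uniform category, or None if unmapped."""
    mapping = SOURCE_TO_UNIFORM.get(source, {})
    return mapping.get(label, label if label in UNIFORM_CATEGORIES else None)

print(normalize("survey", "battery"))        # battery life
print(normalize("social_media", "qwerty"))   # keypad
```

Once every source reports against the same categories, volume and sentiment can be compared across survey comments and social media directly.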
Bottom Line… Maritz's perspective is that social media monitoring coupled with survey research can help managers and marketers maximize success and make well-founded, informed decisions.
A Look at VOC Sources Using a Uniform Set of Customer Experience Categories 3
The Smartphone Product Launch Monitor • Maritz monitored online media and surveyed over 3,000 US consumers in the evolving Smartphone market during November and December 2010 • Maritz and evolve24, a Maritz Research company, used a combination of: • Social media identification (looking for mentions of interest in or buying Smartphones on Twitter or Facebook) • Surveys done among consumer panels • Topics include awareness, consideration, and product likes/dislikes of Smartphones among those who recently purchased or plan to purchase in the near future • Devices included those currently available, as well as those announced but not yet released • We developed a list of common categories using social media and our expertise in the market to compare text results
The Survey Open-End Questions – Driving Sentiment into the Comments • “You said that you eliminated [Smartphone] from those that you had been thinking about.” (Model Considered) • “What did you like about it that made you consider it in the first place?” • “And what didn’t you like about it that made you eliminate it from your list?” • “What model did you purchase/would you buy?” (Model Purchased) • “What do you like about it that makes you choose this model?” • “Are there things that you don’t like about this model?” For these comments and for social media, we examine the volume and sentiment of responses by category.
We Also Asked Which Categories Were Most Important in the Consideration Phase • “Among the list below, please indicate the item you think is most important to your choice of Smartphone.” We look at the percentage of respondents. • Categories: Smartphone price • Price of voice plan • Price of data service plan • Wireless coverage quality • Phone call sound quality • Picture-taking capabilities/camera • Dropped calls • Video-taking capabilities • Music-playing services and capabilities • Video-playing services and capabilities • Availability of video games • Look and feel of the device • Quality of the applications loaded • The keypad • Brand name • Operating system • Ease of use • Number of available applications loaded • Technical performance • Ability to synchronize • Office applications • Quality of the available technical support • After-sales service • Battery life • Screen resolution • GPS capabilities • Size of the screen • Ability to access e-mail • The app store that comes with the device
Comment Sentiment • For survey questions, focusing on dislikes and likes of Smartphones yields the expected sentiment • The overall average across the four questions is 0.63 • For Model Purchased, the average is slightly higher at 0.66 • This approach can be valuable in online, mobile, and mail survey modalities for identifying both positive and negative information • For social media, we extract sentiment at the topic level • The average is moderately lower than the survey likes/dislikes average
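The averaging step described above can be sketched as follows: each open-end response carries a sentiment score, and scores are averaged per question and overall. The question labels, scores, and the 0-to-1 scale here are invented for illustration only and are not the study's actual data.

```python
from collections import defaultdict

# Hypothetical (question, sentiment score) pairs for open-end responses
responses = [
    ("model_considered_like",    0.8),
    ("model_considered_dislike", 0.4),
    ("model_purchased_like",     0.9),
    ("model_purchased_dislike",  0.3),
]

# Group scores by question, then average per question and overall
by_question = defaultdict(list)
for question, score in responses:
    by_question[question].append(score)

averages = {q: sum(s) / len(s) for q, s in by_question.items()}
overall = sum(score for _, score in responses) / len(responses)
```

With real data each question would hold many responses; comparing the per-question averages is what surfaces the Model Purchased lift the slide reports.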
Brands: Blackberry HTC 3B
Approach • For the ‘likes/dislikes’ questions, we identified the top 5 categories assigned by brand • In all cases, the 5 categories accounted for more than 70% of the total number of assignments • For the survey quantitative approach, 10 or more categories are needed to cover the same share of respondents • We also look at the top 5 categories by brand for social media
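The top-5 coverage check above can be sketched in a few lines: count category assignments for a brand's comments, take the five most frequent, and compute what share of all assignments they cover. The assignment counts below are hypothetical.

```python
from collections import Counter

# Hypothetical category assignments for one brand's like/dislike comments
assignments = (
    ["ease of use"] * 30 + ["keypad"] * 25 + ["e-mail"] * 20 +
    ["look & feel"] * 15 + ["battery life"] * 10 + ["price"] * 5 +
    ["gps"] * 3 + ["camera"] * 2
)

counts = Counter(assignments)
top5 = counts.most_common(5)                          # five most frequent categories
coverage = sum(n for _, n in top5) / len(assignments)  # share of all assignments

print(top5)
print(f"top-5 coverage: {coverage:.0%}")
```

A coverage value above 0.70 matches the slide's observation that five categories capture over 70% of assignments in the likes/dislikes comments.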
Brands: Blackberry 3B
4 out of 5 categories are consistent • User interface stands out: • Ease of use • Keypad • Look & feel • E-mail also consistent
Fewer categories are consistent • The user interface also stands out: • Look & feel • Screen size • Keypad • Ease of use • Some concern about price and performance
From the Quantitative Perspective… • We also see the user interface may be an issue, but it doesn’t stand out: • Keypad • Ease of use • Price shows up more • E-mail, while not on top, does show up
BTW: This increase demonstrates the need for monitoring social media to catch escalating topics… • For Nov/Dec, the survey timeframe, we see ‘ease of use’ is the only user interface category – it may be getting lost in the mix • E-mail is consistent with what customers ‘like’ about Blackberry • The discussion of other features points to a more ‘functional’ focus
A large number of tweets refer to getting ‘free’ products…
For Blackberry… • User interface shows up as an important area in what consumers like/dislike in the survey • UI does show up in the most important consideration, but not a top issue • In social media, less emphasis on UI and more focus on features – may be related to less ‘hands on’ experience
Brands: HTC 3B
4 of 5 categories consistent • The user interface shows up: • Look & feel • Screen size • Performance is important • Technical • OS • Price of device is also important
Very consistent results • UI is still a concern: • Look & feel • Keypad • Size of screen • Battery life surfaces • Price of device is a concern
Performance surfaces as a top issue (technical, OS) • Price does show up more • UI and battery life do not show up at all
Same issue with HTC as Blackberry… • UI does not surface • Performance shows up in OS • Battery life is consistent with comments in survey, but not the checklist
For HTC… • User interface shows up as an important area in what consumers like/dislike only in the survey comments • OS may be a core area for HTC – shows up in all three sources • Battery life surfaces in comments and social media, and may be an area to look into
Wrap Up 4
Summary… • A Uniform Set of Customer Experience Categories provides a basis for comparing information from multiple text sources • Survey data is a balanced approach for discerning topic areas impacting products and services • Asking likes/dislikes helps drive sentiment into comments to get a balanced view • A consideration for ‘rating websites’? • Social media measured dynamically is critical for identifying escalating topics and measuring the effects of actions taken • All three sources add value and insight into the ‘picture’ for a given Smartphone