
Enhancing Nielsen's Measurement Systems & Data Quality

Opening comments from Billy McDowell and George Ivie on MRC's work with Nielsen, suggestions for improving data collection & integration, and optimizing the use of measurement assets. Focus on Nielsen's R&D and collaborations.


Presentation Transcript


  1. CRE Quarterly Meeting, June 16, 2016

  2. Opening Comments: BILLY MCDOWELL

  3. Words from the MRC: GEORGE IVIE

  4. A Few Suggestions
     Prepared for: The Council for Research Excellence, June 16, 2016

  5. MRC’s Work with Nielsen
     Core Product Examinations:
     • NSI LPM, NSI Set-Meter (not accredited for demographics), NSI Diary (not accredited), NPM, ACM Ratings
     • Nielsen Audio PPM (26 markets accredited), Audio Diary, RADAR (not accredited), Nationwide (not accredited)
     • Nielsen Online DAR (accredited, children removed)
     • Nielsen Scarborough (not accredited)
     Beyond the Core, Dynamic Changes:
     • Large-Scale Improvement Activities: Local Viewer Assignment, UAA, National Sample Expansion and Viewer Assignment (NPX)
     • New Technologies: GTAM, STPM, Code Reader, Mobile SDK, Watermarks
     • Add-On Panel Collection: Extended Home
     • Other Digital Work: DPR, CTVR, MTVR, Nielsen Social
     • Others: Monitor Plus, Nielsen Total Audience

  6. Formation of the CRE
     Issue: Nielsen’s R&D on Measurement Improvement
     Committee on Commerce, Science and Transportation, U.S. Senate; Hearing on S. 1372, the FAIR Ratings Act; Wednesday, July 27, 2005
     Susan Whiting: “I do want to mention one initiative that came about from our work with both our task force and the MRC, the creation of a special Council for Research Excellence. We created this council in order to involve the industry in setting the direction of basic research and development. Nielsen has committed an additional two and a half million dollars annually for special research as recommended by the council. It is comprised of 40 clients representing the entire television industry and chaired by Mark Kaline, Global Media Manager for Ford Motor Company, one of the largest buyers of television advertising time in the United States.”

  7. Actions
     • Focus on assisting and improving Nielsen: TV, Audio, Digital, Print, International, Multi-Media/Cross-Media
     • Revisit critical CRE work to push implications to Nielsen: non-response bias, local measurement effectiveness, device usage, etc.
     • Develop and maintain a list of Nielsen improvements driven by CRE
     Are we delivering on the role that Susan committed to Congress?

  8. A Few Suggestions (withholding MRC audit findings/issues)

  9. Suggestions
     1. Improving Measurement of Children and Teens
     • Difficult area; legally restricted
     • A very significant issue in digital, but it exists everywhere
     Specific Areas:
     • Gaining cooperation/permission
     • Sufficiency of data collected; sample sizes
     • Device sharing, co-viewing
     • Collection compliance; “gamification” of UIs
     • Ongoing need for child specialists; family specialists

  10. Suggestions
     2. How Much Financial Incentive Is Too Much?
     • Linked with measurement burden, for example for the PPM, other metering devices, digital meters or SDK usage
     • Linked with potential untoward influence: inauthentic media usage, non-compliance or cheating, etc.
     Specific Areas:
     • What are the limits? Financial? Respondent contacts?
     • Allocation of improvement efforts to areas other than financial payments
     • Quality of interviewers/installers; in-person coaching and contact; refusal specialists; “no say, no play”; DST; lessening respondent task; matching interviewer to respondent; focusing on more successful times; quality of ongoing relationships with respondents; multi-mode contact processes, etc.
     • Detection of issues with respondents

  11. Suggestions
     3. Integration of Return Path Data (RPD)
     • Difficult process to control
     • Data quality and adjustment processes are critical
     • Nielsen is embarking on this process
     Specific Areas:
     • What calibration benchmarks should be retained from the meter panels? (See the sketch below.)
     • Compensating for missing MVPDs: should these be derived from the panel or from collected RPD?
     • Data adjustments: should these be derived from the panel, from collected RPD, or from a lab?
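
To make the calibration question concrete, here is a minimal sketch (not Nielsen's actual methodology; all names and numbers are hypothetical) of one common approach: derive per-daypart adjustment factors from a period where panel and RPD estimates overlap, then apply them to raw RPD from a new period.

```python
# Hypothetical ratio calibration of return path data (RPD) to a meter panel.
# Benchmark period: average-audience estimates available from both sources.
panel_benchmark = {"early_fringe": 4.2, "prime": 9.8, "late_night": 3.1}
rpd_benchmark = {"early_fringe": 3.6, "prime": 11.0, "late_night": 2.5}

# New period: only raw RPD is available (e.g., an unmetered market).
rpd_new = {"early_fringe": 3.9, "prime": 10.4, "late_night": 2.8}

# Per-daypart factor = panel benchmark / raw RPD benchmark.
factors = {dp: panel_benchmark[dp] / rpd_benchmark[dp] for dp in panel_benchmark}

# Scale the new period's raw RPD toward the panel-calibrated level.
calibrated = {dp: round(rpd_new[dp] * factors[dp], 2) for dp in rpd_new}

print(calibrated)  # {'early_fringe': 4.55, 'prime': 9.27, 'late_night': 3.47}
```

The open question on the slide is exactly which benchmarks (dayparts, demos, station groups) justify retaining panel data for this role rather than deriving adjustments from the RPD itself.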

  12. Suggestions
     4. Optimizing the Use of Nielsen and Partner Measurement/Information Assets
     • Cataloguing the direct and indirect (partner) assets at Nielsen’s disposal, with an assessment of how each might contribute to the core currency services by improving quality or utility
     • Recent MRC suggestion about IVT filtration (leveraging FB, IAS and Nielsen assets); a sketch of the idea follows this list
     Specific Areas:
     • Development and validation of transaction data sets
     • Attributing audience or other targeting characteristics
     • Recruitment sources or frames
     • Common processes for linking information
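
As a rough illustration of cross-asset IVT (invalid traffic) filtration, the sketch below combines independent vendor flags on each impression and drops those that a majority of sources mark invalid. The field names and voting rule are hypothetical, not the MRC's or any vendor's actual process.

```python
# Hypothetical multi-source IVT filter: each impression carries flags from
# several independent detection assets (e.g., partner and in-house signals).
impressions = [
    {"id": "imp-001", "flags": {"vendor_a": False, "vendor_b": False, "vendor_c": False}},
    {"id": "imp-002", "flags": {"vendor_a": True,  "vendor_b": True,  "vendor_c": False}},
    {"id": "imp-003", "flags": {"vendor_a": True,  "vendor_b": False, "vendor_c": False}},
]

def is_invalid(flags, min_votes=2):
    """Treat an impression as IVT when at least min_votes sources flag it."""
    return sum(flags.values()) >= min_votes

valid = [imp["id"] for imp in impressions if not is_invalid(imp["flags"])]
print(valid)  # ['imp-001', 'imp-003']
```

The voting threshold is the design choice: a single-flag rule maximizes removal, while a majority rule trades some IVT leakage for fewer false positives on legitimate audience.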

  13. Suggestions
     Other areas:
     • Diary service retirement strategies
     • BBO household integration in local; UE estimation
     • Techniques for ratings stabilization
     • Better leveraging of the ACS in Nielsen custom-UE processes
     • OOH measurement of television: fusion, direct measurement, etc.
     • Extended home (students): developing more stable measures
     • Returning in-car measurement to audio metrics
     • Value and exploration of the social metrics planned by Nielsen
     • Improving reporting products and user interfaces
     • Learning from international processes and experiences

  14. George Ivie: givie@mediaratingcouncil.org (212) 972-0300 Thank You!

  15. Nielsen R&D Update: CHRISTINE PIERCE

  16. Nielsen R&D Update, Christine Pierce, 6/16/2016

  17. Local TV Measurement: Why Are We Evolving?
     EXTERNAL FACTORS
     • Increased fragmentation
     • Expanded consumer choices (VOD, SVOD, tablets, smartphones)
     INDUSTRY NEEDS
     • More stability
     • Fewer zero ratings
     • More frequent updates
     Measurement will need to include solutions beyond the panel and use sophisticated techniques to solve for missing data and link data sets.

  18. Local TV Measurement Vision
     • Currency-grade panels (210 DMAs)
     • All devices
     • Big data (all 210 DMAs)
     • OOH (all locations)
     • Consumer data

  19. LOCAL MEASUREMENT STRATEGIES

  20. CRE AGENDA 2017 AND BEYOND

  21. STEERING COMMITTEE CERIL SHAGRIN

  22. CRE FINANCES RICHARD ZACKON

  23. INSIGHTS TO PRACTICE PETE DOE

  24. NEUROMETRICS HOWARD SHIMMEL

  25. Media consumption in an age of distraction Proposal for In-Home Consumer Research Nielsen Consumer Neuroscience June 15th, 2016

  26. Business Applications for Phase 2 In-Home Research
     HYPOTHESIS STATEMENT: As more households include multi-platform devices as part of their regular viewing behavior, the industry definition of TV viewing may need to expand to accommodate additional behaviors beyond the current “watching” and “listening”.
     CURRENT SITUATION:
     • Current definitions of attention to TV include: audible exposure “in-room”, “watching in room”, and finally “listening or watching”, where being in the room is not required.
     • How does the proliferation of multi-platform devices impact future descriptions of “watching” or “listening”?
     BUSINESS APPLICATION: This research explores the current definitions of “watching” or “listening” and identifies ways to improve in-home media measurement and inform a model of distraction.

  27. Research Questions for Phase 2 In-Home

  28. Informing a Model to Measure Distraction
     Phase 1, In-Lab (Benchmark):
     • In-lab study to build a model of distracted behavior in a clean environment
     • Provide a detailed breakdown of ads vs. content and also attribute emotional response
     • Target audiences include all levels of distraction (solo, solo w/2nd screen, co-viewing, co-viewing w/2nd screen, and People Meter panelists w/2nd screen)
     Phase 2A, In-Home (Pilot):
     • Inform executional considerations for Phase 2 via a limited recruit with full deployment of all technologies among Convergence Panelists & Non-Convergence Households
     • Tracked recruitment process, panelist compliance, technology installation, and data capture to identify barriers & considerations for full Phase 2B
     Phase 2B, In-Home (Full Deployment):
     • Objective is to explore the current definition and identify improvements in measurement in a distracted viewing environment
     • Use in-lab data as a benchmark to compare against and to uncover differences
     • In-home observations of viewing behavior in a household over a 2-day period
     • Compare behavior from People Meter households vs. non-People Meter households

  29. In-Lab Insights Feed Demand to Dive Deeper In-Home
     • Variations of “watching” and “listening” influence attention to branding & emotional engagement with content during the viewing experience.
     • Instructions for People Meter button pushing may need to change to account for other elements of distraction while watching or listening.
     • Distractions occur regularly, from second to second, while watching content & ads.
     • It is important to see if these behaviors are duplicated in the home and to determine the incidence and occurrence of each.

  30. Building a Model of Distraction
     Incidence of these events helps inform variations in “watching” & “listening” at home:
     • HEAD DOWN: attention on 2nd screen
     • HEAD OVER: attention to co-viewing partner, but TV is on
     • HEAD OVER & DOWN, & DELAYED CO-VIEWING: attention to co-viewing partner & 2nd screen
     • HEAD UP; CHANGING THE CHANNEL; LOGGING IN TO PEOPLE METER DEVICE; ENTERING OR EXITING THE ROOM: TV is still on
     Observations will be made for primary & secondary members of the household. A coding sketch follows below.
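
The behavioral states above amount to a second-by-second coding scheme. A minimal sketch of how such a timeline could be stored and summarized (the data structures and the toy 60-second window are illustrative, not the study's actual scheme):

```python
# Hypothetical second-by-second behavioral coding for one participant.
from enum import Enum
from collections import Counter

class Behavior(Enum):
    HEAD_UP = "attention on TV"
    HEAD_DOWN = "attention on 2nd screen"
    HEAD_OVER = "attention to co-viewing partner, TV on"
    HEAD_OVER_AND_DOWN = "attention to co-viewer and 2nd screen"
    CHANGING_CHANNEL = "changing the channel"
    LOGGING_IN = "logging in to People Meter device"
    ENTERING_EXITING = "entering or exiting the room"

# One code per observed second (a toy 60-second window).
timeline = ([Behavior.HEAD_UP] * 40
            + [Behavior.HEAD_DOWN] * 15
            + [Behavior.HEAD_OVER] * 5)

# Incidence of each state as a share of observed seconds.
counts = Counter(timeline)
for state, n in counts.items():
    print(f"{state.name}: {n / len(timeline):.0%}")
# HEAD_UP: 67%  HEAD_DOWN: 25%  HEAD_OVER: 8%
```

Aggregating these shares across a sample cell is what lets the incidence of each state inform the expanded definitions of "watching" and "listening".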

  31. Evening of TV Viewing Behavior Captured for One Individual
     Participant-level data captured during the in-home pilot test.
     [Chart legend: Head Up; Changing Channels; Head Down; Head Over (adult co-viewer(s)); Head Over & Down (adult co-viewer(s)); Other; No Television]

  32. TV Viewing Behavior Deep Dive, Second by Second
     Participant-level data captured during the in-home pilot test. During in-home Phase 2B, all data will be aggregated and presented across a sample cell.

  33. 2nd Screen Behavior During a Typical Viewing Evening
     Data output for one individual during the in-home pilot; this data represents Android-level data. During in-home Phase 2B, all data will be aggregated and presented across a sample cell.

  34. Sample People Meter Button Pushing Behavior
     Participant-level data (log-ins) captured during the in-home pilot test. During in-home Phase 2B, all data will be aggregated and presented across a sample cell.

  35. Research costs have increased to accommodate the cost of recruiting Convergence Panel households. Pricing assumes the proposed study design and is subject to change with revisions to the research design and additional details regarding the sample.

  36. THANK YOU!
     Dr. Carl Marci, Chief Neuroscientist, Carl.Marci@Nielsen.com
     Bill Moult, Consultant, 617.877.3191, Bill.Moult.Consultant@nielsen.com
     Patti Wakeling, Sr. Vice President, 203-770-7548, Patti.Wakeling.Consultant@Nielsen.com
     Naomi Nuta, Vice President, Client Services, 617-904-3306, Naomi.Nuta@nielsen.com
     Leah Christian, Director, Data Science, Leah.Christian@Nielsen.com
     Kelly Bristol, Manager, Data Science, Kelly.Bristol@Nielsen.com
     Megan Sever, Senior Data Scientist, Megan.Sever@Nielsen.com

  37. Research Questions for Phase 2 In-Home
     Influence of distraction via 2nd screens & co-viewing:
     • How do co-viewer distractions influence television consumption & ad exposure?
     • How do distraction patterns differ during content relative to ads?
     • What are the differences in viewing behavior and conversations between co-viewing household members, and what impact do they have on ad exposure, ad avoidance and program content?
     • How does emotional response change under conditions of watching vs. watching & listening?
     Viewing behavior during transitional hours compared to prime time:
     • How do behaviors change during transitional hours of viewing (5-8) compared to prime time hours (8-11)?
     Digital consumption on 2nd screen devices:
     • What apps & websites are used more frequently under periods of watching compared to listening?
     People Meter button pushing:
     • What is the frequency of button pushing in a distracted vs. non-distracted environment?
     • How often do members of a Convergence Household log in & log out when watching or listening to TV?
     • Is there a delay between lights flashing & confirmation at the 42-minute reminder moment?

  38. Participant Experience: In-Home Prime Time Viewing Behavior
     Equipment installation & training (Day 1): Participants are recruited in two cells featuring new homes representative of People Meter HH* and Nielsen Convergence Panel or FTO homes.
     Natural environment (Days 1-2): Record behaviors during transitional & prime time only. Observe the HH’s People Meter usage and media behaviors over 2 consecutive days to understand the incidence of watching & listening (whether solo, co-viewing or distracted), multi-device behavior, and usage of the People Meter.
     Conscious feedback: Participants respond to short survey questions throughout the experience to record their attitudes associated with the devices and content to which they have been exposed.
     * Actual People Meter HH will not be used for this research. If CRE is interested in recording behaviors throughout the day, the timeline and cost will need to be revisited.

  39. Summary of In-Home Experience: Transitional & Prime Time
     Two cells of participants will be recruited to represent the differences between qualified PM homes and homes without PM experience.

  40. Phase 2 Timeline for In-Home Experience
     Timeline is for estimation purposes only; a detailed timeline will be created upon project approval.

  41. People Meter Household Participation
     Watch TV as normal.
     • Log in and out: whenever you start or stop watching or listening to television, simply press your People Meter button.
     • Respond to reminder messages: the People Meter periodically prompts to verify who is still watching or listening. Make any changes and then confirm who is watching.
     Reminder: log in to the People Meter if you are watching or listening, even if you are in a different room or doing other things such as cooking dinner or using the computer.

  42. Behavioral coding at the second-by-second level records watching & listening

  43. Phase 2B: Execution and Cost Implications (collected from People Meter households in the pilot study)

  44. BIG DATA STACEY SCHULMAN

  45. CRE Modeled Target Validation Study: Goals and Benefits
     In-Market Test Goals:
     • Determine the ability of modeled targets to deliver true, authenticated target audiences
     • Evaluate how true target delivery varies across intermediary targeting vendors, primarily digital networks and ad exchanges
     • Deepen understanding of the modeled target process and its key drivers
     Benefits:
     • Provide insight into key drivers of modeled target quality
     • Inform advertisers of best practices for modeled target development that lead to superior targeting and ad effectiveness
     • Direct advertiser involvement enhances CRE stature

  46. CRE Modeled Target Validation Study
     1. Cookie-based Truth File matching
     • Equal impressions across all targeting partners
     • Cookie match with Truth File database
     • High Truth File match rate = superior target modeling
     2. Survey
     • Additional target/demo validation
     • Brand/category consumption
     • 5-6 questions
     Modeled target placement (auto intender incidence), ad networks/exchanges A-D: 500,000 ad impressions each; Truth File matches (uniques) of 22,699, 32,568, 10,997 and 13,478; reported Truth File match rates of 3%, 1%, 5%, 7%, 2% and 3%. A match-rate sketch follows below.
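
The match-rate arithmetic implied above is simple: unique cookie matches against the Truth File divided by a denominator per ad network/exchange. A minimal sketch using the slide's impression and unique-match figures (the rates the slide reports differ from this gross-impression calculation, so its denominator is presumably something else, such as unique audience):

```python
# Truth-file match rate per targeting partner, computed against gross
# impressions served. Impression and match counts are from the slide;
# the network-to-count pairing is illustrative.
impressions = 500_000  # equal buy per ad network/exchange
unique_matches = {"A": 22_699, "B": 32_568, "C": 10_997, "D": 13_478}

for network, matches in unique_matches.items():
    rate = matches / impressions
    print(f"Network {network}: truth-file match rate {rate:.1%}")
# Network A: 4.5%  B: 6.5%  C: 2.2%  D: 2.7%
```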

  47. ADVERTISER PARTNERS ANA + CRE = Advertiser Engagement

  48. Revised Timing
     • Colgate and Aetna: committed to the project; July/August prep, September execution, October results
     • Mastercard and Subway: still in discussions/awaiting sign-off; prep expected to begin in August, October execution, November results

  49. Revised Costs
     Project costs:
     • Media display impression allocation (5 targeting partners, 4 brands)
       - Previous: $112,000 (16 million display impressions @ $7 CPM)
       - Revised: $70,000 (10 million display impressions @ $7 CPM)
     • Brand survey (5 targeting partners, 4 brands): n = 2,000 (100 per targeting partner x 5 partners x 4 brands); $145,000
     • Additional funding requested: $102,000 (the difference between the savings from reduced display advertising and the additional cost of the brand survey); the arithmetic is sketched below
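
For reference, the cost arithmetic can be reproduced in a few lines (figures from the slide; the raw difference comes to about $103,000 versus the $102,000 requested, presumably reflecting rounding or an unstated adjustment):

```python
def media_cost(impressions, cpm):
    """CPM is cost per thousand impressions, so divide by 1,000."""
    return impressions / 1_000 * cpm

previous = media_cost(16_000_000, 7)   # $112,000 previous display budget
revised = media_cost(10_000_000, 7)    # $70,000 revised display budget
display_savings = previous - revised   # $42,000 saved on display

survey_cost = 145_000                  # n=2,000: 100 per partner x 5 partners x 4 brands
additional_funding = survey_cost - display_savings

print(f"Display savings: ${display_savings:,.0f}")
print(f"Additional funding implied: ${additional_funding:,.0f}")  # slide cites $102,000
```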

  50. Break
