“The State of Set-Top Box Viewing Data as of December 2009”
Set Top Box Committee of the Council for Research Excellence
March 11, 2010
Set Top Box Committee
Agenda
• Mission and Membership Review
• Project Time Line
• Participants: Who Was and Who Wasn’t
• Key Findings
Set Top Box Committee
Our Mission
• Obtain learning as to the viability of using STB data as a measure of video tuning behavior.
• Conduct a comprehensive examination of how STBs capture and report tuning.
How Will the Industry Respond?
• Broader use and acceptance?
• Innovative ways of applying data?
• Practical guidelines?
Set Top Box Committee Members
Through countless phone calls and meetings, this determined group always contributed thought-provoking ideas, time and energy!
• Michele Buslik*, TargetCast TCM, SVP, Director Media Research
• Alex Corteselli*, Cox Reps, VP, Research & Programming
• Susan Cuccinello*, TVB, SVP, Research
• Jon Cogan, OMD, Dir, Investment Res. & Insights
• Colleen Fahey Rush*, MTV Networks, EVP, Research
• Nancy Gallagher*, NBC Universal, SVP, News, Sports, Affil Res.
• Pat Liguori (Chair)*, ABC Owned TV Stations, SVP, Res & E-Measurement
• Susan Nathan, Turner Corporate, VP, Media Currency
• Lyle Schwartz*, Mediaedge:CIA, SVP, Media, Dir of Broadcast
• Noreen Simmons*, Unilever, Dir, Media Strategy & Operations
• Ira Sussman*, CAB, SVP, Research & Insight
• Bruce Tyroler, Scripps Networks, VP, Research Analytics
• Richard Zackon, CRE, Facilitator
* CRE Board member
Thank you!!!
“The State of Set-Top Box Viewing Data as of December 2009”
Project Time Line
• Apr-May ’08: Formulate and finalize Mission Statement
• Jul ’08: Create RFP
• Aug ’08: RFP finalized and released 8/15/08
• Sep ’08: Proposals due mid-Sep ’08
• Oct ’08: Subcommittee review of proposals
• Nov ’08: (a) Project approved and funding granted by CRE; (b) Project awarded to Stu Gray/Tim Brooks and Jim Dennison
• Dec ’08: Strategy meeting with consultants; questionnaire formulated
• Jan ’09: Ongoing review of questionnaire
• Feb ’09: Segment subject companies by function
• Mar ’09: Finalize 4 questionnaires & field to key executives at 30 companies
• Apr-Jun ’09: Follow up with key executives at each company
• Sep ’09: (a) Extend contracts with consultants to 12/31/09; (b) Status: 14 Yes / 11 No / 5 Undecided
• Nov ’09: Review report outline/format
• Dec ’09: Final report due 12/31/09
• Feb ’10: Final Report delivered 2/24/10
“The State of Set-Top Box Viewing Data as of December 2009”
Companies Invited to Participate (* agreed to participate)
• AT&T
• Brighthouse
• Cablevision
• Canoe Ventures*
• Charter
• Comcast
• Concurrent*
• Cox Media*
• DirecTV
• Donovan Data Systems*
• Echostar/Dish
• Google*
• Guideworks
• IMS*
• Invidi
• itaas
• Navic
• The Nielsen Company*
• OpenTV*
• Rentrak
• Rovi*
• Star Media Enterprises*
• Telmar*
• Time Warner
• TiVo*
• TNS
• TRA Global*
• TVWorks*
• Verizon
• Visible World*
A debt of gratitude is owed to those who participated!
“The State of Set-Top Box Viewing Data as of December 2009”
Aggregator Findings: Content and Activity
• Every type of data that could be obtained from a set-top box is currently being obtained and processed by at least one aggregator, though no aggregator is processing everything. For example:
  • Channel changes: processed by all
  • Muting, program guide, VOD: processed by at least half
  • Internal DVR playback, fast-forward, pause, rewind and PIP: processed by half or fewer
  • External DVR, games, DVDs and VCRs: not processed by any
• Most but not all aggregators can identify tuning to commercials with second-level granularity. A subset of these can differentiate between national and local ads, though the process by which this is done is not known.
• Most aggregators keep track of program and channel lineups, some employing external sources (TNS, Tribune, syndicated sources) and others using internal records.
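As a rough illustration of the first finding above, the core of channel-change processing is turning a box’s event log into tuning intervals. The sketch below is a simplified assumption, not any aggregator’s actual method, and the channel names, timestamps, and log layout are hypothetical:

```python
from datetime import datetime

# Hypothetical channel-change log for one STB: (timestamp, channel).
# None marks the box powering off. Real aggregator formats differ.
events = [
    (datetime(2009, 12, 1, 20, 0, 0), "WABC"),
    (datetime(2009, 12, 1, 20, 12, 30), "WNBC"),
    (datetime(2009, 12, 1, 20, 12, 33), "WCBS"),
    (datetime(2009, 12, 1, 21, 0, 0), None),
]

def tuning_intervals(events):
    """Pair each channel change with the next event to get tuning spans."""
    spans = []
    for (start, channel), (end, _) in zip(events, events[1:]):
        if channel is not None:
            spans.append((channel, start, end, (end - start).total_seconds()))
    return spans

for channel, start, end, secs in tuning_intervals(events):
    print(channel, secs)
```

Note that the three-second WNBC span survives here; whether such brief "channel-surfing" tunings count as viewing is exactly the kind of edit-rule question the report raises.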
“The State of Set-Top Box Viewing Data as of December 2009”
Aggregator Findings: Demos and Characteristics
• Household characteristics (HH Size, Kids/No Kids, etc.) are reported by most aggregators, with 80% doing so through the use of modeling or ascription.
• Seasonal HHs are not being identified by those who answered the question.
• Tuning from residential and non-residential locations is combined by all who answered the question. Just one aggregator identifies the data in a way that permits the user to differentiate residential from non-residential tuning.
• The location/room in which tuning occurs is not identified.
• Resident demographics are reported by half of the aggregators.
• Viewer demographics are reported by only one aggregator; the others have no plans to do so at this time.
“The State of Set-Top Box Viewing Data as of December 2009”
Aggregator Findings: Processing
• Frequency of data updates or uploads varies greatly. While most responded “at least daily,” other responses included “once per week,” “continuously” and “several times a day.”
  • Most aggregators indicated they could change the frequency.
• Weighting is done by only one of the five aggregators responding to this question.
• Privacy concerns might limit the availability of subscriber characteristics.
• Most aggregators indicated there are quality control measures in place; the aggregators themselves determine the criteria for data usability.
  • Little detail was provided about edit rules, time zone adjustments, data loss, etc.
• Few provided information about the percentage of total STBs that contribute daily tuning data; responses given were “one-seventh of the sample each day,” “80-85%” and “configurable.”
• While a few keep records of STB malfunctions, only one publishes errors in reports.
“The State of Set-Top Box Viewing Data as of December 2009”
Aggregator Findings: Reporting
• Shortest reported tuning duration is one or two seconds.
  • Half indicated this could be adjusted in the reporting system, while the other half said it could be done through reprocessing of the data.
• Reporting varies from Household to DMA levels; only two project data to larger universes.
  • Universe Estimates are updated monthly, quarterly or annually.
• Most do not report data as “ratings.”
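The minimum-duration rule described above amounts to a simple edit applied at reporting time. This sketch assumes a flat list of (channel, seconds) tuning records and a two-second threshold; both are illustrative assumptions, not any aggregator’s actual rule or data layout:

```python
# Hypothetical tuning records: (channel, duration in seconds).
records = [("WABC", 750.0), ("WNBC", 3.0), ("WCBS", 2847.0), ("WPIX", 1.0)]

# Aggregators reported one- or two-second minimums; two is assumed here.
MIN_DURATION = 2.0

def apply_minimum_duration(records, threshold=MIN_DURATION):
    """Drop tuning events shorter than the reporting threshold."""
    return [r for r in records if r[1] >= threshold]

# Keeps WABC, WNBC and WCBS; drops the one-second WPIX tuning.
print(apply_minimum_duration(records))
```

Whether this filter runs in the reporting system or requires reprocessing the stored data is exactly the split the aggregators reported.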
“The State of Set-Top Box Viewing Data as of December 2009”
Key Considerations
• Disclosure and Third-Party Verification
  • Necessary for valid and reliable data for use as currency
  • Media Rating Council
• Data Relevance and Utility
  • Carefully consider what data are meaningful and actionable
  • “Just because it’s there” doesn’t justify the use of some data
  • Too much data/too few people
• Availability of Delivery Systems that Can Process Data
  • Involve TPP and software developers as soon as practical
  • Keep them informed of potential new data sources, paradigm shifts, etc.
  • Try to anticipate future needs or applications
“The State of Set-Top Box Viewing Data as of December 2009”
Congratulatory Words from Nielsen

From: Bhatia, Manish [mailto:Manish.Bhatia@nielsen.com]
Sent: Tuesday, March 09, 2010 2:03 PM
To: Liguori, Patricia A; Richard Zackon
Cc: Donato, Paul; Link, Michael W; Boehme, Jeff
Subject: Congratulations on releasing the CRE STB Study

Dear Pat and Richard,

On behalf of my colleagues at Nielsen, I’d like to thank you and the CRE for your important work on the State of Set Top Box Viewing Data report. Your findings are consistent with our experience in working with STB data, and we believe the report advances the industry’s understanding of the challenges and opportunities presented by this resource. We look forward to working with the Set Top Box committee and others in the industry to implement your recommendations. Please convey our congratulations and high regard to the rest of the committee and the entire CRE.

Sincerely,
Manish
“The State of Set-Top Box Viewing Data as of December 2009” Thank you!