Metadata Maturity Survey Findings

Presentation Transcript


  1. Metadata Maturity Survey Findings Taxonomy Community of Practice Call Series July 15, 2009

  2. Presenters • Paul Wlodarczyk • Director, Solutions Consulting, Earley & Associates • Ron Daniel • Principal, Taxonomy Strategies LLC

  3. Housekeeping • Calls last 60 minutes • We will take questions at the end of the call • Lines will be muted during the call, but you can email rebecca@earley.com or Skype Rebecca.M.Allen with immediate questions • Use *6 to mute/unmute your phone

  4. Metadata Maturity: Benchmarking Best Practices. Survey Results, Analysis, and Conclusions. Paul Wlodarczyk, Director, Solutions Consulting, Earley & Associates. Ron Daniel, Principal, Taxonomy Strategies LLC. 13 July 2009

  5. Introduction • This presentation covers the results of the 2009 survey of practices in Search, Metadata, Taxonomy, and Information Development. • Audiences: • Survey participants: See the results of the survey and how your organization compares to others. • Organization information specialists: See how to benchmark your organization against others of your size and/or industry. Use that comparison to justify catching up, or taking the next step, to your management. • Information scientists: See the current state of practice and how it varies from previous surveys, or by size and/or industry of the organization.

  6. Background – Metadata Best Practices Survey • Goals for the 2004 and 2005 metadata best practices surveys: • Help develop a Metadata Maturity Model (MMM) • Keep the MMM “tied to reality” through best practices • The MMM was a potential method to: • Identify search and metadata best practices and their pervasiveness • Predict potential problems in consulting engagements • Help clients identify projects that are within their reach (and those that aren’t) • In 2009 we updated the survey • Added questions on information development practices (e.g. content reuse) • Goals for the 2009 survey: • Spot trends in best practices since 2005 • Determine which practices impact Findability and Information Quality • Help validate and refine the MMM

  7. Previous Metadata Maturity Model

  8. CMMI: The Best-Known Maturity Model • Developed for software engineering; funded by the DoD. • ~400 Processes, in 22 Process Areas, keyed to 5 Maturity Levels. • Process Areas contain Specific and Generic Practices, organized by Goals and Features, and arranged into Levels. • Process Areas cover a broad range of practices beyond simple software development. • CMMI Axioms: • Individual processes at higher levels are AT RISK from supporting processes at lower levels. • A Maturity Level is not achieved until ALL the Practices in that level are in operation.

  9. Towards a Metadata Maturity Model • Caveats: • Maturity is not a goal; it is a characterization of an organization’s methods for achieving its core goals. • Mature processes impose expenses that must be justified by consequent cost savings, revenue gains, or service improvements. • Nevertheless, Maturity Models are useful as collections of best practices and stages in which to try to adopt them. • We are looking for a model to help clients decide what to do; we are not looking to audit and certify service providers. A simpler model is appropriate.

  10. Second Metadata Maturity Model (ca. late 2005)

  11. Changes to the MMM from ‘04 to ‘05 • Basic structure simplified from five maturity levels to four. • Several processes moved up a level of sophistication as evidence showed they were not used at the basic level. • More practices added, especially around staff and training. • The processes and their arrangement were arrived at through a mix of survey results and educated opinion. • How have things changed in the last four years?

  12. Population statistics

  13. Participants by Organization Size [chart] (2009 N=217; 2005 N=60)

  14. Participants by Job Role [chart; percentage labels not recoverable without the chart] (2009 N=217; 2005 N=60)

  15. Participants by Industry [chart] (2009 N=217; 2005 N=60)

  16. The State of Best Practices in 2009

  17. Search Practices, 2009 vs. 2005 [chart; percentage labels not recoverable without the chart] (2009 N=129; 2005 N=54)

  18. Metadata Practices, 2009 vs. 2005 [chart; percentage labels not recoverable without the chart] (2009 N=127; 2005 N=51)

  19. Taxonomy Practices, 2009 vs. 2005 [chart; percentage labels not recoverable without the chart] (2009 N=128; 2005 N=52)

  20. Comparing MMM to Reality (1) • The MMM predicts that practices will be adopted in a certain sequence. How accurate is that prediction? • We expect the percentage yet to adopt a practice to increase as the practice becomes more sophisticated. • We see that pattern in search practices, but not in the other areas. • We overestimated the uptake of query log examination. [chart: percentage yet to adopt each practice]

  21. Comparing MMM to Reality (2) • How much growth in maturity is due to 1-person practices vs. group practices? • Solo practices are flagged in yellow in the chart. • Most change is due to group practices.

  22. Information Development Practices, 2009 [chart] (2009 N=175; no 2005 data)

  23. Analysis: IM Challenges by Company Size

  24. Key Question: Do mature organizations perform better? • We used the survey responses to develop a “Metadata Maturity Quotient” • Scoring for individual survey responses: • 0 = not practiced • 1 = under development • 2 = practiced in one unit • 3 = practiced in more than one unit • Separate scores for search, metadata, taxonomy, and information development • The overall Metadata Maturity Quotient is the sum of the Search, Metadata, and Taxonomy scores (see the sketch below) • Eliminated respondents with a large number of “Don’t Know / NA” responses
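
A minimal sketch of this scoring in Python. The answer labels, the per-area response lists, and the cutoff for dropping respondents with too many “Don’t Know / NA” answers are illustrative assumptions; the deck does not specify them exactly.

```python
# Scores for the four answer levels described on the slide.
RESPONSE_SCORES = {
    "not practiced": 0,
    "under development": 1,
    "practiced in one unit": 2,
    "practiced in more than one unit": 3,
}

def area_score(answers):
    """Score one practice area (search, metadata, or taxonomy).

    Returns (score, unknown_count); any answer that does not match a
    known label is treated as "Don't Know / NA".
    """
    score, unknown = 0, 0
    for answer in answers:
        key = answer.strip().lower()
        if key in RESPONSE_SCORES:
            score += RESPONSE_SCORES[key]
        else:
            unknown += 1
    return score, unknown

def maturity_quotient(respondent, max_unknown=5):
    """Overall quotient = Search + Metadata + Taxonomy area scores.

    Respondents with too many Don't Know / NA answers are dropped
    (returned as None); the max_unknown threshold is an assumption.
    """
    total, unknowns = 0, 0
    for area in ("search", "metadata", "taxonomy"):
        score, unknown = area_score(respondent.get(area, []))
        total += score
        unknowns += unknown
    return None if unknowns > max_unknown else total
```

Note that, as on the slide, information development is scored separately and left out of the overall quotient.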

  25. Size Guides our Analysis • Preliminary analysis showed that size matters • Divided respondents into groups by organization size: • > 10,000 employees • 1,000 – 9,999 employees • 100 – 999 employees • < 100 employees • Then into quartiles based upon Maturity Quotient (sketched below): • Best in Class (top quartile) • Majority (middle two quartiles) • Laggards (bottom quartile)
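
A sketch of the grouping logic, using Python's statistics module for the quartile cut points. The handling of exact band boundaries (e.g. exactly 10,000 employees) is an assumption.

```python
import statistics

def size_band(employees):
    """Bucket an organization into the slide's four size groups."""
    if employees >= 10_000:   # boundary handling is an assumption
        return "> 10,000"
    if employees >= 1_000:
        return "1,000 - 9,999"
    if employees >= 100:
        return "100 - 999"
    return "< 100"

def quartile_labels(quotients):
    """Label each Maturity Quotient: top quartile is Best in Class,
    bottom quartile is a Laggard, and the middle two quartiles are
    the Majority."""
    q1, _q2, q3 = statistics.quantiles(quotients, n=4)
    return [
        "Best in Class" if q > q3 else "Laggard" if q <= q1 else "Majority"
        for q in quotients
    ]
```

Since the slide divides respondents by size first, the quartile labels would be computed within each size band rather than over the whole sample.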

  26. Maturity Quotient Score, by Organization Size [chart] • Maturity increases with organization size, driven by greater IT investment and greater need • Small organizations can get there faster with less complex deployments • Selection bias: the small companies in our sample are more content-intensive…

  27. Maturity Quotient Score, by Industry [chart] (N=128). Industry groupings: Knowledge / content-intensive; Transactional; Archival; Product / Technical; Operational

  28. Respondents by Industry, Org Size [chart: number of respondents vs. average industry metadata maturity]. Small firms are more likely to be in content-intensive industries that are more mature with respect to metadata best practices.

  29. Document Types by Organization Size [chart]. Mature firms are more document-intensive.

  30. Metadata Search, by Organization Size [chart]. Mature firms are more search-intensive.

  31. Do mature organizations perform better? • We assessed responses to “Information Management Challenges” for each group of respondents (by size and maturity level) • Impact / importance of issues varies dramatically by organization size • Generally, more mature organizations perform better on issues that matter, across all size groups

  32. “Significant Impact to the Business” [chart]. Larger companies have more intense IM needs.

  33. “Significant Impact to the Business” [charts for each size band: >10,000 employees; 1,000–9,999 employees; 100–999 employees; <100 employees; percentage labels not recoverable]. Mature companies fare better than laggards regardless of organization size.

  34. Summary • Larger companies, and those in certain information-intensive industries, have more pain from managing larger volumes of unstructured content • Generally speaking, the bigger the organization, the worse the pain • The organizations with the most pain have adopted best practices in the areas of information development, search, metadata, and taxonomy • Our findings show that those who adopt these practices have less pain

  35. Next steps • Use the findings to refine the Metadata Maturity Model • e.g. move query log examination out of the basic level, add spelling correction to the basic level, add more staff practices, etc. • Is there interest in collaborating with us on the next revision? Contact Rebecca Allen (rebecca@earley.com). • Benchmark your organization • Take the survey and score the practices: 0 for not practiced, 1 for under development, 2 for practiced in one unit, 3 for practiced in multiple units (a small worked example follows below). • Sum the scores and compare to others in this presentation. • Is there interest in an online tool for this, or in more in-depth diagnostic assessments based upon the MMM?
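
A hypothetical worked example of that self-benchmark; the number of practices per area and the individual scores are made up for illustration.

```python
# Hypothetical self-benchmark: one score per practice, on the 0-3
# scale above (0 = not practiced ... 3 = practiced in multiple units).
my_scores = {
    "search":   [3, 2, 1, 0, 2],  # illustrative answers, one per practice
    "metadata": [2, 2, 1, 0],
    "taxonomy": [1, 1, 0],
}
my_quotient = sum(sum(area) for area in my_scores.values())
print(f"My Maturity Quotient: {my_quotient}")  # compare to the survey quartiles
```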

  36. Community of Practice Calls
  SharePoint IA Group: http://tech.groups.yahoo.com/group/SharePointIACoP/
  Taxonomy Group: http://finance.groups.yahoo.com/group/TaxoCoP
  Search Group: http://tech.groups.yahoo.com/group/SearchCoP
  Upcoming calls:
  August 5, 2009 – Conducting a Search Audit
  September 2, 2009 – DITA
  October 7, 2009 – Taxonomy Usability Testing
  November 4, 2009 – Developing an Ontology
  December 2, 2009 – Applications for Topic Maps
  January 6, 2010 – Taxonomy Management

  37. Please fill out the survey that should be in your inbox. Let us know what topics you are interested in and how we can improve the series. Seth Earley seth@earley.com www.earley.com 781-820-8080

  38. Questions?
