
Measuring What Matters: Case Studies in the Value of Metrics



Presentation Transcript


  1. Measuring What Matters: Case Studies in the Value of Metrics Presenters: Daniel Chin - Spartan Software Inc. • Eric Peterson - Tableau Software • Oleksandr Pysaryuk - Achievers • Daniel Sullivan - Tableau Software • Erik Vogt - Moravia

  2. A case for data-driven decision making • The data is often available (or inferable) • Data is cheap to store and fast to process • It can save thousands of dollars • It can help justify investment and balance budgetary priorities • It can be good for your career • And it’s not that hard…

  3. OK, sometimes it’s a little bit hard…

  4. So the most important questions… …are the ones where the answer would change a decision. Such as: How much capacity should I plan for? Should I invest in…? Where can my budget be most useful?

  5. Davenport-Kim Model

  6. Basic ROI calculations often involve 2 scenarios • Option A (status quo): $25 per transaction, $35 per translation, 85% pass rate, cost per job = $69 • Option B (invest $20,000): $10 per transaction, $35 per translation, 85% pass rate, cost per job = $51.75
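
The cost-per-job figures above are consistent with reading the 85% pass rate as 15% of jobs needing one round of re-work. Below is a minimal PHP sketch of that reading, plus the break-even volume it implies for the $20,000 investment; the break-even figure is derived here, not taken from the slide.

```php
<?php
// Cost per job = (transaction + translation) * (1 + re-work rate), reading the
// 85% pass rate as 15% of jobs being re-done once -- an assumption that
// reproduces the slide's $69 and $51.75 figures.
function costPerJob(float $transaction, float $translation, float $passRate): float
{
    return ($transaction + $translation) * (1 + (1 - $passRate));
}

$optionA = costPerJob(25, 35, 0.85);  // status quo               -> $69.00
$optionB = costPerJob(10, 35, 0.85);  // after $20,000 investment -> $51.75

$savingsPerJob = $optionA - $optionB; // $17.25 saved per job
printf("Break-even volume: %d jobs\n", ceil(20000 / $savingsPerJob)); // ~1,160 jobs
```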

  7. Employee Success Platform

  8. Investment in Terminology Research • 170 terms (e.g. Boost, recognition, achievement) • Training and certification for terminologists • Research, translation, back translation • Simulated A/B testing of a few prominent terms: the terminologist suggests multiple term translations; 2 other specialists offer opinions, notes, and preferences; the terminologist evaluates and decides on a translation • Achievers + language service partner act as consultants • Total investment: $1,900

  9. Scope of Re-work • 10%, or 17 terms, need updates • 300 occurrences of “Boost”, 1,600 of “recognize”, 350 of “achievement” • Avg. 300 occurrences of each term, 1 term per segment • 17 terms x 300 occurrences = 5,100 segments need term updates • 1 segment = 6 words, so 5,100 segments x 6 words = 30,600 words to review • 30,600 x 70% = 21,420 “weighted” words • Cost of re-work: ~$6,650 (21,420 words x $0.28 per word = $5,997, plus engineering and PM fees)

  10. Total cost of re-work: ~$6,650 • Original investment: $1,900 • Potential savings: $4,750 (71%)
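
A short sketch reproducing the re-work estimate from slide 9 and the savings from slide 10; the gap of roughly $650 between the $5,997 word cost and the ~$6,650 total is assumed here to be the engineering and PM fees.

```php
<?php
// Re-work cost for the terminology case study (slides 9-10).
$termsToUpdate   = 17;    // 10% of 170 terms
$occurrencesAvg  = 300;   // avg. occurrences per term, 1 term per segment
$wordsPerSegment = 6;
$weighting       = 0.70;  // "weighted" word discount from the slide
$ratePerWord     = 0.28;

$segments      = $termsToUpdate * $occurrencesAvg;   // 5,100 segments
$words         = $segments * $wordsPerSegment;       // 30,600 words
$weightedWords = $words * $weighting;                // 21,420 weighted words
$wordCost      = $weightedWords * $ratePerWord;      // ~$5,997
$reworkCost    = 6650;                               // word cost + eng./PM fees (slide total)

$investment = 1900;
$savings    = $reworkCost - $investment;             // $4,750
printf("Savings: $%d (%.0f%%)\n", $savings, 100 * $savings / $reworkCost); // ~71%
```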

  11. Investment in i18n Training • i18n training for translators (Achievers Academy) • Certification exam on PHP Intl message formatting • Total investment: $80 (1.5 training hours)
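
Slide 11 refers to PHP Intl (ICU) message formatting; the snippet below is a hypothetical illustration of the kind of plural pattern such a certification covers, not material from the Achievers exam. It requires the PHP intl extension.

```php
<?php
// A translator who hard-codes "# recognitions" instead of preserving both
// plural branches breaks the output for count == 1.
$pattern = '{count, plural, one {You received # recognition.} other {You received # recognitions.}}';

$fmt = new MessageFormatter('en_US', $pattern);
echo $fmt->format(['count' => 1]), "\n";  // You received 1 recognition.
echo $fmt->format(['count' => 5]), "\n";  // You received 5 recognitions.
```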

  12. Scope of Re-work • Assume a subset of 300 segments • 80%, or 240 segments, translated incorrectly • Productivity: 2 min per segment • Time spent: 2 min x 240 segments = 480 min (8 hours) • Hourly rate: $60, so 8 hours x $60 = $480 • Cost of re-work: ~$720, including engineering and PM fees

  13. Total cost of re-work: ~$720 • Original investment: $80 • Potential savings: $640 (89%)
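
The same arithmetic applied to the i18n case study on slides 12-13, again assuming the gap between the $480 labour cost and the $720 total is the engineering and PM fees.

```php
<?php
// Re-work cost for the i18n training case study (slides 12-13).
$badSegments = 300 * 0.80;              // 240 segments translated incorrectly
$hours       = ($badSegments * 2) / 60; // 2 min per segment -> 8 hours
$labour      = $hours * 60;             // $60/hour -> $480
$reworkCost  = 720;                     // labour + eng./PM fees (slide total)
$investment  = 80;

printf("Labour: $%d, savings: $%d (%.0f%%)\n",
       $labour, $reworkCost - $investment,
       100 * ($reworkCost - $investment) / $reworkCost);  // $480, $640 (89%)
```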

  14. Terminology research: potential savings 71% • i18n training: potential savings 89% • Other areas to invest in: • In-context translation • Localizability studies • Mobile first • Query management system • Automation where possible

  15. CMS and TMS Integration: Take-aways • Increased project visibility • Iteration at higher velocity • In-context review and quality control as part of the translation process reduced post-production error rates • Demonstrable return on investment from tools integrations

  16. Localization Platform Technology Stack

  17. Project Management

  18. Website Project Management

  19. Before KitchenSync (2015) • AVG of 12 pages/month for all languages in 2015 • In-context review was challenging and resource-intensive • Each page required manual pull, manual load, manual download, manual input • Build source file: 10 minutes • Create project in TMS: 10 minutes • Download files from TMS: 10 minutes • Load and QA content: 40 minutes • 70 minutes of human processing time required per page • AVG 12 pages/month x 70 mins = TOTAL 840 minutes (14 hours) • Iteration for one language per page was a 20-30 minute process • 24+ hour response time from editorial team

  20. Post-KitchenSync Universe (2016) • AVG 78 pages/month in 2016 (+550%) • Push a single page or multiple pages: 2 minutes • Pushes back to Drupal and iterations based on in-context QA are controlled by editors: 30 seconds per push • QA by project manager and pushing pages live: 10 minutes • AVG 78 pages/month x 15 mins = TOTAL 1,170 minutes (20 hours) • Iterations can happen as often as needed based on results from in-context review • Any issues the editors cannot control are managed through a responsive issue management process

  21. 2015 vs. 2016: processing volume +550% • 2015: AVG 12 pages/month x 70 mins = TOTAL 840 minutes (14 hours); iteration for one language/page was a 20-30 minute process; 24+ hour response time from editorial team • 2016: AVG 78 pages/month x 15 mins = TOTAL 1,170 minutes (20 hours); iterations can happen as often as needed based on results from in-context review; any issues the editors cannot control are managed through a responsive issue management process
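
Slides 19-21 compare human processing time before and after KitchenSync; this sketch recomputes the monthly totals and the pages-per-human-hour figures they imply (the per-hour productivity numbers are derived here, not stated on the slides).

```php
<?php
// Monthly human processing time, before and after KitchenSync.
$before = ['pages' => 12, 'minsPerPage' => 70];  // 2015
$after  = ['pages' => 78, 'minsPerPage' => 15];  // 2016

foreach (['2015' => $before, '2016' => $after] as $year => $p) {
    $totalMins = $p['pages'] * $p['minsPerPage'];
    printf("%s: %d pages x %d min = %d min (%.1f h), %.1f pages per human-hour\n",
           $year, $p['pages'], $p['minsPerPage'], $totalMins,
           $totalMins / 60, $p['pages'] / ($totalMins / 60));
}
// Volume change: 78 / 12 = 6.5x the throughput (+550%) for roughly 40% more human time.
```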

  22. Quality at 10,000 Meters • Use industry standards (or something close to them) • Monitor quality levels of goods received • Isolate potential issues and address them immediately
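
One way to act on slide 22 is to track the error density of delivered translations against a threshold; the sketch below is illustrative only, and the delivery data and the 1-error-per-1,000-words threshold are assumptions rather than Tableau's actual standard.

```php
<?php
// Flag deliveries whose error density exceeds a threshold (illustrative value).
$threshold = 1.0;  // allowed errors per 1,000 words -- an assumed figure

$deliveries = [
    // language => [words delivered, errors found in review] -- sample data
    'de' => ['words' => 12000, 'errors' => 7],
    'ja' => ['words' =>  9500, 'errors' => 21],
];

foreach ($deliveries as $lang => $d) {
    $density = $d['errors'] / ($d['words'] / 1000);
    printf("%s: %.2f errors per 1,000 words %s\n",
           $lang, $density, $density > $threshold ? '-> investigate' : '-> OK');
}
```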

  23. In-context Review: KitchenSync

  24. In-context Review: KitchenSync

  25. Website UAT and T9N Bugs: Usersnap/GitHub/Workfront

  26. Localization at Scale

  27. You can’t measure what you’re not doing

  28. Build → Measure → Learn

  29. Vendors IT Product / Engineering Agencies

  30. Stability & Security Sales / Roadmap Product Features Resource Utilization

  31. How to Scale: Empower the people most accountable for a change to implement and measure that change themselves.

  32. Take stock of your systems landscape: Social, TMS, CRM, WCMS, DAM

  33. Integrate Systems: Data First (TMS, DAM, Social, CRM, WCMS)

  34. TMS Drupal Drupal

  35. TMS Drupal

  36. TMS Drupal Eloqua

  37. CRM Analytics TMS Google Analytics Drupal Eloqua

  38. Localization at Scale at Tableau

  39. KitchenSync Generic PHP Interface Entity XLIFF FTP Eggs’n’Cereal PHP XLIFF Serialization Library HTML ←→ XLIFF WorldServer Integration

  40. KitchenSync Generic PHP Interface Entity & Field-Aware Entity XLIFF Entity XLIFF FTP Eggs’n’Cereal PHP XLIFF Serialization Library Drupal Implementation Paradigm-agnostic HTML ←→ XLIFF WorldServer Integration

  41. KitchenSync Generic PHP Interface Entity & Field-Aware Entity XLIFF Web Services Entity XLIFF FTP Eggs’n’Cereal PHP XLIFF Serialization Library Platform-agnostic Drupal Implementation Paradigm-agnostic HTML ←→ XLIFF WorldServer Integration

  42. Generic PHP Interface Entity & Field-Aware Entity XLIFF Usersnap Workbench Moderation Web Services Entity XLIFF FTP Eggs’n’Cereal PHP XLIFF Serialization Library Platform-agnostic Drupal Implementation In-Context Review Draft Quality Assurance Paradigm-agnostic HTML ←→ XLIFF WorldServer Integration In-Context Review

  43. Elomentary Litmus Test E-mail Deploys Platform Middleware Eggs’n’Cereal PHP XLIFF Serialization Library Eloqua REST API In-Context Review Draft Quality Assurance WorldServer Integration In-Context Review

  44. KitchenSync TMS Drupal
