
PS Benchmark Review: SAMPLE




Presentation Transcript


  1. PS Benchmark Review: SAMPLE Updated July 2011. Bo Di Muccio, Ph.D., Vice President, Professional Services

  2. Benchmark Highlights for SAMPLE
  • Overall, this exercise reveals that SAMPLE's processes and performance levels are more on target than off target
  • Based on a core peer group comparison against other large, product-centric PS organizations ("Product Providers"), SAMPLE looks largely as one would expect
  • Benchmark results are especially on target for Services Engineering and Partner Management
  • Sales, Delivery, and Operations functions all compare favorably to current industry and peer group benchmark standards
  • The main area of concern is financial performance, which, even compared to very product- and partner-centric PS businesses, benchmarks rather poorly
  • Detailed findings and recommendations attempt to zero in on high-impact ways to align better with industry and peer group benchmarks

  3. PS Benchmark Dashboard: SAMPLE Reminder: This is a benchmark comparison, not a judgment about whether you’re “good or bad” at these things or whether your practices are “mature.” It’s simply a comparison of your practices and metrics to industry and peer group benchmarks using the logic and methodology previously articulated.

  4. TSIA and Benchmark Methodology Overview

  5. Technology Services Industry Association: Why, How, What? “The most interesting economics in technology are now being generated by the right combination of products and services.”

  6. Professional Services

  7. TSIA Community: Sample TSIA PS Members
  • 300+ TSIA Members
  • 105 PS Members as of 7/1/2011
  • 40% growth in 3 years
  • This is our benchmarking community

  8. 2011 PS Research Activities
  • Studies: Service 50, Cloud 20, Europe 20
  • Topics: Rates, Comp, Project Performance, Solution Centers, Partner Mgt, Cloud and PS, Partner Enablement
  • Basis for Benchmark Reviews

  9. Core PS Benchmark Study Overview
  • Perpetual study since 2005
  • Hundreds of PSOs benchmarked
  • 100+ validated completes in most recent snapshot
  • Ability to segment in multiple ways: company/TPSO size; HW versus SW; traditional SW vs. SaaS model; services intensiveness; PS charter; PS performance
  • Basis for: data mining, advisory/inquiry, PS benchmark reviews

  10. Benchmark Review Logic
  • This is a benchmark review, NOT a "maturity assessment"
  • No theoretical/arbitrary standards; rather, how you look against others
  • Entries carefully validated to ensure benchmark-quality data
  • SAMPLE benchmark compared to 2 core groups:
  • "Industry": 100+ companies benchmarked and captured in the Q410 snapshot; a cross section of larger/smaller (50/50) and software/hardware (65/35); virtually all technology product companies with services in the portfolio
  • "Peer Group" (Product Providers): 60+ PSOs in this group; revenue mix and PS size make it the best current-state comparison; the peer group is more heavily weighted in scoring
  • Scoring is meant to highlight areas for focus or possible initiatives
  • 100% confidential in every way

  11. Service Strategy Profiles

  12. Key Goals of Service Strategy Profile Approach
  • Help PSOs understand true priorities
  • Help align the executive management team
  • Profile dimensions: Charter of PS • Financial Business Model • Growth Targets • Type of Service Offerings • Service Sales Strategy • Core PS Skills • Service Partner Strategy • Scalability Model

  13. Benchmarking Practices: Comparison 1
  • Your practice is checked for alignment with the peer group majority practice: Yes = Better, No = Worse
  • Your practice is checked for alignment with the industry majority practice: Yes = Good, No = Bad
  • (The slide diagram also flags required practices)

  14. Comparison 2: Practice "Zones"
  • Practices are business processes that help companies achieve target results
  • Zones range from Required Practice to Differentiated Practice

  15. Metrics/Results Zones
  • Metrics are analytical measurements intended to quantify the state of a business. They are independent variables or leading indicators for the performance of a PS business. Examples are attach rates, billable utilization, and project duration.
  • Results are analytical measurements intended to quantify the state of a business. These are dependent variables or lagging indicators for the performance of a PS business. Results are produced by practices and metrics. Examples are project margin, field margin, and net operating income.
  • Zones: within +/- 10% of the benchmark average is "On Target"; above that band is Off Target (High); below it is Off Target (Low)
  • Important exceptions:
  • Results or metrics that are off target relative to both the industry and peers, but in a clearly positive way, are rated "differentiated" (ex: extremely high project margins)
  • Key metrics or results that are off target relative to both the industry and peers, and in a clearly negative way, are rated "critical off target" (ex: extremely low project margins)
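To make the +/- 10% zone rule concrete, here is a minimal sketch in Python; the function name, signature, and example figures are illustrative assumptions, not part of TSIA's methodology or tooling:

```python
def classify(value: float, benchmark_avg: float, band: float = 0.10) -> str:
    """Place a metric or result in a zone relative to a benchmark average.

    Within +/- 10% of the average counts as "On Target"; anything outside
    the band is off target on the high or low side.
    """
    low = benchmark_avg * (1 - band)
    high = benchmark_avg * (1 + band)
    if value > high:
        return "Off Target (High)"
    if value < low:
        return "Off Target (Low)"
    return "On Target"

# Hypothetical example: billable utilization of 66% against a 75% average.
print(classify(66.0, 75.0))  # -> "Off Target (Low)" (below 67.5, the -10% edge)
```

Note that whether "high" is good or bad depends on the metric; the "differentiated" and "critical off target" exceptions above are judgment calls applied on top of this mechanical banding.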

  16. Evaluation Framework
  • Each self-reported practice was compared to the majority practices of the industry and the target peer group
  • On target industry practices were worth 3 points; on target peer group practices were worth 5 points
  • Each self-reported metric and result was compared to the average metrics and results of the industry and the target peer group
  • On target or better industry metrics and results were worth 3 points; on target peer metrics and results were worth 5 points
  • Differentiated practices/metrics: points taken off the table
  • Missing or off target critical practices/metrics: points doubled
  • Scores are expressed as a % of total possible points and assigned color coding as follows: 0%-24% RED; 25%-49% ORANGE; 50%-74% OLIVE; 75%-100% GREEN
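As a reading aid, here is a minimal sketch of the scoring arithmetic described above; the Item structure and the example items are invented for illustration and are not TSIA's actual evaluation tooling:

```python
from dataclasses import dataclass

INDUSTRY_POINTS = 3  # on target vs. the industry benchmark
PEER_POINTS = 5      # on target vs. the peer group benchmark (weighted higher)

@dataclass
class Item:
    """One self-reported practice, metric, or result."""
    on_target_industry: bool
    on_target_peer: bool
    differentiated: bool = False  # points taken off the table entirely
    critical: bool = False        # possible points doubled if missed/off target

def score(items: list[Item]) -> tuple[float, str]:
    earned = 0
    possible = 0
    for item in items:
        if item.differentiated:
            continue  # differentiated items are removed from the denominator
        missed = not (item.on_target_industry or item.on_target_peer)
        weight = 2 if (item.critical and missed) else 1
        possible += (INDUSTRY_POINTS + PEER_POINTS) * weight
        earned += INDUSTRY_POINTS * item.on_target_industry
        earned += PEER_POINTS * item.on_target_peer
    pct = 100.0 * earned / possible if possible else 0.0
    if pct < 25:
        color = "RED"
    elif pct < 50:
        color = "ORANGE"
    elif pct < 75:
        color = "OLIVE"
    else:
        color = "GREEN"
    return pct, color

# Hypothetical example: one fully on-target item, one missed critical item,
# and one differentiated item -> 8 of 24 possible points = 33.3%, ORANGE.
print(score([Item(True, True), Item(False, False, critical=True),
             Item(True, True, differentiated=True)]))
```

The doubling of possible points for a missed critical item, and the removal of differentiated items from the denominator, mirror the mechanics illustrated on the next slide.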

  17. Putting It All Together: An Example
  • Overall rating: points earned divided by total possible
  • Practice or metric missing or off target: possible points unchanged
  • CRITICAL practice or metric missing or off target: possible points doubled
  • Differentiated metric or practice: points taken off the table

  18. PS Engine Total Score
  • Areas: Customer Experience, Services Sales, Services Marketing, Services Delivery, Business Results, Partners & Sourcing, Product Impact, Partner Mgt, Services Engineering, Services Operations

  19. PS Benchmark Dashboard: SAMPLE Reminder: This is a benchmark comparison, not a judgment about whether you’re “good or bad” at these things or whether your practices are “mature.” It’s simply a comparison of your practices and metrics to industry and peer group benchmarks using the logic and methodology previously articulated.

  20. TPS Business Model (chart: Overall Bus. Model)

  21. Business Model: PS Pace Setters
  • Observation: the top 20% of PS organizations are very profitable, but the bottom 80% are just north of break-even
  • Yet revenue mix for these two groups is not very different
  • For the vast majority of PSOs, this data explodes the 20% Net OI "myth"
  • Source: TSIA PS Benchmark Data, Q1 2011

  22. PS Business Model Observations
  • Pretty close fit for a classic Product Extender
  • Very heavy emphasis on product over service in the revenue mix
  • PS revenue contribution precisely on the peer group average

  23. PS Business Model Observation
  • Dashboard shows that SAMPLE is truly a classic Product Provider in virtually every way
  • The exception is financial performance; even Product Providers do better on project margin
  • Field costs are about average, so pressure on field margin comes from low project margins

  24. PS Business Model Observation
  • Total below-the-line OpEx is pretty much on target
  • The one potential miss: SAMPLE is spending 2X the average on sales
  • G&A spending is ½ the industry average. Is this high efficiency or a lack of critical investment?

  25. PS Business Model Observation
  • The combination of low project margins and moderately high costs means that SAMPLE's business model looks very different from the industry and from the Product Provider peer group
  • The Cost Center model was more common 10 years ago; it's actually very uncommon now
  • That doesn't mean it isn't the right model for you

  26. PS Business Model Observation
  • Same data, different view
  • This view highlights the consequences of lower project margins coupled with higher overall costs

  27. Overall Observations and Recommendations

  28. PS Benchmark Dashboard: SAMPLE Reminder: This is a benchmark comparison, not a judgment about whether you’re “good or bad” at these things or whether your practices are “mature.” It’s simply a comparison of your practices and metrics to industry and peer group benchmarks using the logic and methodology previously articulated.

  29. Delivery
  • Delivery in the green, but not by much:
  • Delivery staff spend a lot of time on presales, which likely contributes to higher sales costs
  • Revenue metrics are off the charts high, so that's not the problem
  • However, rate realization and discount rates (from the Sales Module) are huge misses
  • What's puzzling is that the actual rates are squarely in range, indicating the rate card is skewed
  • Most companies have far higher involvement of direct FTE project managers

  30. Business Model
  • The other of only two rating areas NOT in the green:
  • If this business model is documented in the strategy and represents a deliberate decision to run PS as a cost center, then this has been achieved
  • The message is that even Product Providers have moved to a profit center model for PS, including ones with a heavily partner-centric model
  • A 5% to 10% increase in project margins would move SAMPLE squarely into the expected profile for Product Providers
  • This will be difficult as long as PS reports to Sales, which is likely concerned only about product margins
  • Question: Do you have a "seat at the table" with this economic model in place? Would you have a better seat if you could demonstrate good "services hygiene"?

  31. Framework for Prioritizing Initiatives
  • 2x2 prioritization matrix; axes: TPSO Ability to Change and Positive Impact on PS Financial Performance
  • Quadrants: Higher Impact/Easier to Do, Lower Impact/Easier to Do, Higher Impact/Harder to Do, Lower Impact/Harder to Do

  32. Priority Initiatives for SAMPLE (plotted on the same matrix: TPSO Ability to Change vs. Positive Impact on PS Financial Performance)
  • Review Rate Card and Discounting Policy
  • Review Project Margin Performance
  • Solution Center
  • Overall Pyramid/Labor Costs
  • Consider Formal Project C-Sat Program
  • Consider Implementing PMO Practice
  • Validate Product Provider Model as Foundation of Service Strategy
  • Review and Validate Sales Model

  33. Appendix: Additional Function Detail

  34. Total Score

  35. Practice Zones: SAMPLE
  • Missing/Off Target Practices: PSE reports to Sales Exec; PSO w/o final pricing authority; Sales not comp'd on PS bookings; no project C-Sat program; partners have primary customer rel.; resourcing done locally only
  • Differentiated Practices: documented PS strategy; formal sales methodology; formal skills assessment; defined career paths; SvcsEngin releases new PS offers; formal partner certification
  • Good balance of missing/off target and differentiated practices

  36. Key Metrics and Results: SAMPLE
  • Below Target: project margin; field gross margin; net operating income; attach rate; PS-product revenue ratio; proposal hit ratio; days quote to closure; % projects with C-Sat survey; rate realization; % delivery corp PS direct; % delivery remote direct; avg project duration; % projects with PM; avg days onboard to billable; avg days to source new PS; engineering target utilization; mktg spend: awareness; mktg spend: demand gen
  • On Target: PS revenue contribution; PS EPS contribution; field costs; PS 1 yr growth; project size; target utilization; actual utilization; realized rate ($); % delivery local PS direct; % delivery global partners; voluntary attrition; % projects with post-mortem; G&A costs; % delivery time on training; new PS concept to delivery; marketing costs; margin on sub resources
  • Above Target: total OpEx; total costs; PS 3 yr growth; sales costs; avg rate discount; % engagements Fixed/NTE; delivery % time on presales; revenue per consultant; revenue per total PS HC; % delivery subcontracted; % delivery local partners; engineering costs; % offers repeatable packages; mktg spend: content; mktg spend: analysis
  • Relative balance of below target, on target, and above target metrics and results

  37. Overall Result vs. Recent Reviews (chart: SAMPLE practices and results vs. peers and vs. industry)

  38. Practices and Results on Target by Functional Area: Business Model

  39. Practices and Results on Target by Functional Area: Sales and CRM

  40. Practices and Results on Target by Functional Area: Delivery

  41. Practices and Results on Target by Functional Area: Operations

  42. Practices and Results on Target by Functional Area: Engineering

  43. Practices and Results on Target by Functional Area: Marketing

  44. Practices and Results on Target by Functional Area: Partner Management

  45. Questions?
  Bo Di Muccio, Ph.D.
  Vice President, Professional Services
  Technology Services Industry Association
  Office: 724-946-9798
  Mobile: 724-877-0062
  E-mail: bo.dimuccio@tsia.com
  Blog: DiMuccio's DataViews Blog
  Website: www.tsia.com
