Pilots can provide valuable feedback, springboard into paid engagements, and support sales and marketing. Or they can waste time, lead nowhere, and actually hinder growth. We're going to walk through an exercise in how to screw up your pilots, so that it doesn't happen to you in real life.
All About Pilots
Scott Brewster, scott@hatsandladders.com
Mitch Weisburgh, mitch.weisburgh@academicbiz.com
What is a pilot?
A school or teacher trying a product to decide if they are going to buy it?
Part of a research effort to show efficacy?
A type of quality assurance where a class or school uses a product and provides feedback?
PILOT DEFINED: A small-scale preliminary study conducted in order to evaluate feasibility, time, cost, and adverse events, and to improve upon the study design prior to performance of a full-scale research project.
Why we pilot. Why we evaluate.
Reasons most of us immediately think about...
To know a product really works
To get data so you can improve your product
Exposure, so potential customers can get to know who you are
Why we pilot. Why we evaluate. (Cont'd)
To show who your product works for and under what conditions
To show potential purchasers that the product works
To get early adopters hooked so you can later convert them to paying customers (freemium model)
To get researchers and users interested so they become product evangelists for you
To add to the professional literature
To show investors your product is effective and has users
To get preliminary data to support further research
What Have Been the Models? Academic Research | Gold Standard | Pharma
One long study to see if an intervention works
A randomized controlled trial taking place over several years
The focus is on the overall average effect of the intervention
Why Past Models May Not Work Today
Can't wait for years
Summary data alone is not good enough
It's not necessarily accurate anyway; what other factors are at play?
Too expensive
Four Levels of Research (US Dept. of Education)
More emphasis on "lower levels" of evidence defined in the Every Student Succeeds Act (ESSA)
LEVEL 1, strong evidence of impact: an experiment that randomizes students & teachers. Slow; requires advance planning; not useful for product improvement.
LEVEL 2, moderate evidence of impact: a comparison study of schools or students. Fast, lower-cost evidence; best for pilots in multiple school districts.
LEVEL 3, promising evidence of impact: a correlational study with statistical controls. Find out which parts of your product are most effective and who your product works for.
LEVEL 4, provides a rationale for expecting impact: a logic model is the rationale for why it might work. Create a rationale based on learning science; the basis for more evidence gathering.
More Realistic Today
Informed product development & QA
A form of "try before you buy"
Typical end users can influence your product development and evolution
Create and publish white papers and studies
Establish a cadre of evangelists
Use aggregate data generated by the product
Types of Pilots
Observation Pilots: when you want to see how real users interact with the product in their own setting
Usability Testing Pilots: when you want to learn whether users can navigate and use the product as intended
Efficacy/Outcomes Pilots: when you want evidence that the product actually improves outcomes
Evaluation Tools
Surveys: when you need quantitative data and you know the likely response choices to your question
Focus Groups: when you need qualitative data, when you want reactions to a presentation or concept
In-depth Interviews: when you need qualitative data and you want deep follow-up questions
Also Important
Competitor Analysis: to understand the competitive environment, when you are thinking of a new product, or to look for features
Pilot Periods
Audience: Teacher, Grade, School, District
Length: 1 time, 1 week, multi-week, month, quarter, school year
Rigor: Just to try it; usage/reporting requirements; implementation requirements; independent evaluator/designer
More Realistic Today
A school or teacher trying a product to decide if they are going to buy it.
Questions to prepare for a pilot
1. What are the criteria for success?
2. What are you comparing to?
3. What and how are you measuring, and how are you getting that information?
4. What reporting are you going to do?
5. What do the teachers/schools/districts have to do?
6. How will the teachers/schools/districts be able to use it after the pilot?
7. How are you going to implement and support?
8. What do you want them to report back to you?
9. What's in it for the teachers/schools/districts to participate?
10. What are your target district or school characteristics?
11. Are you going to have a third party conduct or write up the results?
12. What are the IRB requirements?
Pilot Models
Digital Promise Model
SIIA Model
Financial Considerations
Paid vs. not paid pricing
Free: teachers love free; administrators know that free just means the cost is hidden
Freemium: free, then leading to payment after the pilot period
Reduced Price: a financial incentive, but skin in the game
Challenges
Long sales cycles
Technology moves fast
How Do You Get Schools to Pilot?
1st-Level Issues:
Providing a clear benefit
Large enough for them to care about
How will they know if it's successful?
What happens at the end?
2nd-Level Issues:
Actually getting their attention / rising above the noise
Aligning with their problems
Reducing their risk
Data security, data privacy, accessibility
How Do You Get Their Attention?
Reach them through someone they already trust (connection)
Campaigns (email, social media, etc.)
Talk to them at conferences & events
Organizations that run/manage pilots
Questions & Discussion
Scott Brewster, scott@hatsandladders.com
Mitch Weisburgh, mitch.weisburgh@academicbiz.com