Planning and Sustaining Evaluation of Instructional Technology Support Programs • Yvonne Belanger, Duke University • Joan Falkenberg Getman, Cornell University • Lynne O’Brien, Duke University • http://dls.cornell.edu • http://cit.duke.edu
Our assumption • Schools want to know if IT services, organizations & projects are effective, but have limited resources for evaluation. • How can we build a culture of evaluation, so that many people contribute to evaluation? • How can we provide a context for evaluation strategies and results? • How can we conduct evaluation that helps with decision making?
Overview • Key issues in evaluation planning • Early planning for evaluation at Cornell • General approaches to evaluation at Duke • Case studies: Duke iPod project, Duke Faculty Fellows • Resources and templates
Assessment v. Evaluation • Assessment is an ongoing process aimed at understanding and improving student learning. • Evaluation is a judgment or determination of the quality of a performance, product, or use of a process against a standard. Did it work in terms of the needs being addressed or the system goal? Article: Differentiating Assessment from Evaluation as Continuous Improvement Tools by Peter Parker, Paul D. Fleming, Steve Beyerlein, Dan Apple, and Karl Krumsieg
Why Evaluate? “Research is aimed at truth. Evaluation is aimed at action.” Michael Patton
Cornell University • Private, 4-year, Research I • New York State land-grant institution • Partner of the State University of New York • 11 schools: 7 undergraduate and 4 graduate/professional • 19,600 FTE students, 1,540+ faculty • http://www.cornell.edu
Cornell Computing Environment • IT centrally and locally supported • Undergraduate education is campus based • CIT strategic plan encourages selective innovation • President’s “Call to Action” • Provost’s Distributed Learning Initiative
Provost’s Distributed Learning Initiative • Core technologies development • Faculty development and training • Faculty Innovation in Teaching (new) • Lynx Student Assistant Program (new)
Key Issues in Planning an Evaluation • Who are the stakeholders and what do they want to know? • What is success? • How will you measure success? • Do you have an evaluation team or partner? • What is the most effective way to report evaluation findings? • Are you allowing for discovery as well as confirmation? • How is the data going to guide decision-making and improvements?
Different Stakeholders, Different Interests • Provost • President • Vice President of Information Technology • Faculty • Students • Deans • Dean of Faculty • Dean of Students • Library • Center for Learning and Teaching • Cornell Adult University • Faculty Advisory Board on Information Technology • IT staff and Helpdesk staff • Executive Budget and Finance Committee
Defining Success • Identify different dimensions or domains for evaluation. • Identify indicators of success in those domains. • Data collection method and source of data will vary with indicators of success.
Measuring Success Project: Student response systems (polling) in large enrollment class. Goals: Improve learning, implement inexpensive, low-maintenance technology with specific functionality, increase student engagement, short learning curve for faculty, adoption of polling by other large enrollment classes. Domains: • Instructional: strategies, learning outcomes • Technology: functionality, reliability… • Student experience: attitude, use of technology… • Faculty experience: attitude, use of technology… • Programmatic impact • Cost
Balanced View of Success Domains: • Instructional: strategies, learning outcomes • Technology: functionality, reliability… • Student experience: attitude, use of technology… • Faculty experience: attitude, use of technology… • Programmatic impact • Cost Outcomes: • Students: liked it in several ways and self-reported improved learning • Faculty: spent too much time on preparation and found the technology did not meet their needs, but still liked the idea • IT staff: user support for faculty and facilities taxed limited staff time • Finance Office: clicker replacement and a new projection system were beyond budget Was the project a success?
Evaluation Models and Standards • Scientific inquiry and experimental models: emphasize values established by the research community • Management-oriented models: emphasize decision-making (e.g., Stufflebeam’s CIPP model) • Qualitative and anthropological models: emphasize discovery of values based on description • Participation-oriented models: emphasize values being "socially constructed" by the community
Stufflebeam’s CIPP Model: Context, Input, Process, and Product evaluation • Focus: decision-making • Purpose: facilitate rational and continuing decision-making • Evaluation activity: identify potential alternatives, set up quality control systems
Action Research Action research is deliberate, solution-oriented investigation that is group or personally owned and conducted. It is characterized by spiraling cycles of problem identification, systematic data collection, reflection, analysis, data-driven action taken, and, finally, problem redefinition. The linking of the terms "action" and "research" highlights the essential features of this method: trying out ideas in practice as a means of increasing knowledge about and/or improving curriculum, teaching, and learning (Kemmis & McTaggart, 1982).
Permissions and Partners • Check with your institution’s research office for policies on human subjects research. • Be creative: put together an evaluation team or partnership, and involve stakeholders for credibility.
Reporting Evaluation Results • Format your information and customize your report to stakeholders so that it meets their interests and style. • Narrative • Video interviews • PowerPoint presentation • Excel spreadsheets • Images, graphical representation of numerical data • Include unexpected outcomes • Use benchmark studies for additional context
Focus on the Intent of Evaluation • Evaluation uses a combination of data to present a comprehensive picture • Return to the original purpose of the evaluation and the types of decisions the data will inform • It is possible for a project or program to have some components that succeeded and others that did not
Duke University • Private, 4-year, Research I • 9 schools: undergraduate and professional • 12,000 FTE students, 2,350 faculty
Duke Center for Instructional Technology • Established 1999 in response to a general needs assessment on instructional technology at Duke • Goals: increase faculty and student use of technology, leverage resources, coordinate planning
Duke CIT Context • IT is both central and school-based • Growing interest in distance education in professional schools • Undergraduate education is campus-based classroom teaching • Strategic plan encourages IT experimentation
Experiments with laptops, Blackboard, PDAs, iPods, and other technologies
The big questions • Is the CIT doing a good job? • Do students learn more when they use iPods? • What is the best way to help faculty make good use of technology? • Is Blackboard a success?
Answerable questions • Is CIT making positive changes in the areas identified by the original needs assessment? • Do iPods improve course logistics and increase student access to a rich set of course materials? • Are faculty satisfied with the IT development programs they use? • How widely used is Blackboard, and what new kinds of teaching does it enable?
Tools for Structuring Evaluation • CIPP: Context (environment and needs), Input (strategies and resources), Process (monitoring implementation), Product (outcomes, both quality and significance) • Logic Modeling
CIPP View of Institutionalized Evaluation (Stufflebeam, OPEN, 2003)
CIPP approach recommends… • Multiple observers and informants • Mining existing information • Multiple procedures for gathering data, cross-checking qualitative and quantitative results • Independent review by stakeholders and outside groups • Feedback from stakeholders • Appropriate circumspection in generating and reporting conclusions
Faculty Fellows Program Goals: • Faculty development • Department development Program components: • Intensive orientation • Occasional meetings • One-on-one consulting • Showcase presentation
Evaluating the Fellows Program • Stakeholder and staff input to clarify program goals • Developing consistent reporting tools • Distributing effort • Stakeholder review of outcomes • Participant responsibility for disseminating results
Re-envisioning the Fellows • Full week of orientation → 1-2 days plus 4 additional short meetings • Single project focus → multiple small-scale activities • Customized individual project → theme-based offering
Duke iPod First-Year Experiment Project goals: • Technology innovation • Student life, campus community • Academic impact
Distributed 1,599 20 GB iPod devices to first-year students on Aug. 19, 2004
Evaluation Challenges • Baseline information unavailable • Uneven implementation of instructors’ course evaluation plans • How best to capture academic projects outside of CIT’s purview • Quick start and experimental focus: emergent outcomes rather than predefined goals • Establishing a correlation between iPod use and improved course outcomes
Focusing the evaluation of academic iPod use • Feasibility of using iPod to support teaching and learning • Improving logistics of course delivery • Enhancing student learning and interest
Sharing preliminary information • Crucial to have an early understanding of project lessons • Matrix of evaluation strategies • Grouping uses into similar cases • Examples: summary of iPod projects and their evaluation strategies; early feedback on uses and lessons learned • Available at http://cit.duke.edu/evaluation
Other Resources & Templates • http://cit.duke.edu/evaluation • Annotated bibliography by Cornell and Duke • Sample CIT reports • CIT Logic Model example and template • http://www.innovation.cornell.edu • How can we all share more information about our activities and learn more from one another’s successes and failures?
Summary • Understand what success is for your efforts • Reframe questions to be answerable • Focused rather than comprehensive evaluation • Build culture through distributed team approach • Bring context and input into evaluation • Take a formative view
Thank You! • Yvonne Belanger, Program Evaluator, Duke Center for Instructional Technology, yvonne.belanger@duke.edu • Joan Getman, Assistant Director, Distributed Learning Services, Cornell Information Technologies, jmf4@cornell.edu • Lynne O’Brien, Director, Duke Center for Instructional Technology, lynne.obrien@duke.edu