
Epiphany vs. Evidence Panel Discussion



Presentation Transcript


  1. Epiphany vs. Evidence Panel Discussion Carl Berger Lev Gonick Gilbert Gonzales K.C. Green Lynne O’Brien Educause ~ Orlando, FL ~ Oct. 19, 2005

  2. Overview • Goals and evaluation for 2004-05 Duke iPod project • Evolution to Duke Digital Initiative with new goals and evaluation plan

  3. Context for our iPod initiative • One of the nine goals in our Univ. Strategic Plan was to foster wider use of technology in all aspects of campus life • Over 5 years we encouraged instructional initiatives with laptops, Blackboard, PDAs, streaming media, wireless and more… • The decision to use iPods was a convergence of a readiness to try something new, opportunities with Apple, and the growing popularity plus newness of the iPod

  4. iPod Project goals for 2004-05 • Mostly an experiment, “scattering seeds” • As experiments got underway, different kinds of goals emerged, making a structured evaluation difficult • Another challenge was trying to decide early in the spring whether to continue, when the experimentation was still evolving

  5. Evaluation focus • Was iPod a stimulus to innovation in teaching and learning? • Which uses were most fruitful and should shape planning for academic technology? • How feasible was it to use iPods, and should the experiment be continued, dropped or changed?

  6. Evaluation strategies • Classroom observations • Faculty / deans / staff discussion sessions • Staff focus groups • Faculty project reports • Faculty and student questionnaires • Faculty and student focus groups Full end-of-year report available at: http://cit.duke.edu/pdf/ipod_initiative_04_05.pdf

  7. Findings: Yes, they used them! • iPod use in 15 fall and 33 spring courses* • 75% of first-year students said they used an iPod in a class or for independent support of their studies* • Recording = most widely used feature for academic purposes, although all features were used in at least some ways* • Of course they listened to music! * Includes formally designated iPod courses plus independent use

  8. Findings: Yes, iPods drove innovation • Faculty ideas and interest exceeded expectations • Innovation with iPods prompted exploration of other new technologies • “Fun factor” and low learning curve drove class use • Faculty who previously had not experimented with IT tried new things

  9. Findings: Internal impact • Fast pace of implementation meant that some infrastructure wasn't ready – this made it hard to know whether some uses would have been more educationally valuable with more robust support • Increased collaboration among Duke IT groups • Prompted much discussion among faculty, staff, students and administrators about technology in teaching

  10. Findings: External impact • Significant and unanticipated publicity • Many inquiries from and opportunities for collaboration with other educational institutions • New partnerships with publishers, hardware and software vendors • Increased visibility for Duke as technology innovator

  11. DDI – Overarching Goals Based on what we learned, Duke is now focusing on extending the parts of the experiment that seemed most promising. • Innovative and effective teaching • Curriculum enhancement • Infrastructure development • Knowledge sharing

  12. Evaluation strategy for DDI • Earlier and more discussion with faculty and administrators about goals, so we have more agreement about what to evaluate • More structured and consistent evaluation, to make it more likely that faculty and students will complete evaluation activities • Asking each faculty member with a project to choose one thing they want to know about the outcome of using technology in their course, then helping them figure out how to measure it – to get more faculty to buy into evaluation

  13. Tech innovation is now a 3-year cycle • Year 1 – Experimentation, open-ended • Year 2 – Select what seems most promising and extend it more broadly • Year 3 – Transition to established support models for the things that prove worthwhile • Each year, some technologies will be at each of these stages.

  14. Useful web sites • Full evaluation report on the first year of the iPod initiative: http://cit.duke.edu/pdf/ipod_initiative_04_05.pdf • DDI website with this year's goals and some evaluation information, plus pointers to what is new and experimental this year: http://www.duke.edu/ddi/ • CIT website with a list of instructional projects for last year and this year: http://cit.duke.edu/about/ipod_project.do
