Presentation Transcript


  1. Everything you want to know about surveys… And aren't afraid to ask! Carl Berger, January 30, 2001. All sources are available; see the last slide.

  2. But it’s really not about surveys… (Gratuitous comments and/or insights…on the bottom of some slides) • It’s about helping Presidents, Provosts, and CIOs make decisions! But… you've got to know the territory!

  3. Why? • Institutional Readiness • Basis of a strategic plan for transformation • Faculty interviews and MERLOT • Not an e-mail survey • Paul Hagner, Interesting Practices and Best Systems in Faculty Engagement and Support, NLII White Paper, January 25, 2001. Paul's paper is a must-read!

  4. NLII Readiness • Readiness and Bridges Task Forces • Entrepreneurial Faculty and “Second-Wave” Faculty Differences in Engagement • These two groups of faculty, while united in their commitment to quality learning environments, are very different in both their technical capabilities and their attitudinal readiness to embrace these new technologies. It would be a serious mistake for administrators to make allocation decisions based solely on the characteristics of the “entrepreneurs,” since their needs and their motivations can differ greatly from the “second-wave” faculty. • … an “enabling environment” is a precondition to institutional change: • universal student access, • reliable networks, • multiple opportunities for training and consulting, and • “a faculty ethos which values experimentation and tolerance of failure.”

  5. But how do you know? • Guess • Traditional Sources • Conventional Wisdom • Anecdotes • Or… • You could ask them!

  6. Even more important… • Helping decision makers • Those who have little time to develop deep knowledge • Present in ways that decisions seem to pop up • Or don’t! Lots of times it's realizing that this data snippet can't help a decision!

  7. Topics • Asking the right question • Asking the question right! • Getting a great response (what is a great response?) • Decision graphics • Hidden meanings • An example and some (surprising?) results

  8. Advantages to asking (a survey) • Real data almost always debunks myths • Understand all groups: early adopters, early majority, late majority, Luddites • Faculty, students, administrators… The real payoff is that they understand that they (faculty, students, administrators) are part of the process!

  9. Disadvantages of asking • Expensive • Time consuming • Will they understand it when we're done? • And worse yet… you may find out what you don’t want to know! About two years ago we invested heavily in NetG (an on-line training program). Good idea?… stay tuned…

  10. Take the plunge: needs, wants, and reality • Enough responses to give credence • Representative • Need to reach the Luddites • Keep cost down • Understandable • And… • Getting the right questions, but even more… • Getting the right analysis and presentation! We chose paper surveys and non-respondent follow-up to make sure we weren't favoring techie types!

  11. On the shoulders of… • Flashlight http://www.tltgroup.org/programs/flashlight.html • UCLA http://www.uncwil.edu/oir/faculty_folder/ucla_survey_99/survey_results.htm • Michigan http://sitemaker.med.umich.edu/cberger/reports • Berkeley, the Cal State System, and others on the way: Stanford, Penn State, Minnesota • But best… YOU! Beg, borrow, or steal good items and question styles. But give credit!

  12. The 12-Step Program to Success • 1. Select audience • 2. Categories • 3. Initial development with faculty, students • 4. Test with a small group • 5. Contact • 6. Follow-up • 7. Data entry • 8. Analysis • 9. Presentation • 10. Distribution • 11. Feedback • 12. They want the next version. Number 12 is the measure of success (along with using the results for decisions).

  13. Asking the right question • Don't jump too quickly to a survey: • The Ehrmann technique (open focus groups and listen, listen, listen) • A majority of three (if you hear the same thing from three folks, unsolicited, then it's a survey item) • Now build a survey • Small-sample trial (try out some survey types) • Single page (a few 1-pagers with several groups) • Check, check, and check again for interpretation and errors • Try for 1 big paper survey every 2 years to prevent being "surveyed to death." Steve Ehrmann of Flashlight has the best question-development technique! Gary Gatien of UM develops excellent questions.

  14. Asking the question right • Don’t ask “Do you use…?” • Ask “How often do you use…?” • Don’t ask “Do you use either a or b?” • Ask “Check all that apply” • Don’t ask “How often did you use it last week…?” • Ask “When you used it the most, how often did you use…?” Make questions do multiple duty. "How often" also tells you yes or no, plus lots more.

  15. Underlying constructs • Think of an underlying scale • Try to avoid yes or no • Use common or natural intervals • Not… often…frequently…sometimes…once…never • But… 1/day…1/week…1/month…1/term…1/yr With a little creativity you can create a log scale from the last one!
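
For instance, here is a minimal Python sketch of that log-scale trick. The events-per-year numbers (and treating a term as roughly three per year) are illustrative assumptions, not figures from the talk:

```python
import math

# Hypothetical mapping from response categories to events per year;
# the counts (and "term" = ~3 per year) are assumptions for illustration.
EVENTS_PER_YEAR = {
    "1/day": 365,
    "1/week": 52,
    "1/month": 12,
    "1/term": 3,
    "1/yr": 1,
    "never": 0,
}

def log_scale(response: str) -> float:
    """Map a frequency category onto a log10 scale; 'never' gets a floor value."""
    rate = EVENTS_PER_YEAR[response]
    return math.log10(rate) if rate > 0 else -1.0

for category in EVENTS_PER_YEAR:
    print(f"{category:8s} -> {log_scale(category):5.2f}")
```

The categories land at roughly even steps (2.56, 1.72, 1.08, 0.48, 0.00), which is what makes them usable as an interval scale.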

  16. Non-respondents • 1. Make a non-respondent list. • 2. Compare it against a representative list. We used our LDAP. • 3. Look for over- or under-represented groups. • 4. Over-sample them in relation to the population (make comparable). • 5. Then follow up as above: send e-mail, campus mail, and finally call. • 6. Don't use phone or personal calls just to get promises of response. • 7. Either carry out the survey over the phone or go to their office. • 8. Stop when you have a 'good feeling.’ • 9. Try long enough to get a significant group, or collapse trying. • 10. Finally, include successful non-respondents to check out some unusual claims. • Well… not quite a 12-step program, but that's about two weeks of work in my evaluation course (without the interactive lab to give a feeling for the frustration of tracking down and analyzing results from non-respondents).
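
A minimal Python sketch of steps 2-4; the unit labels, the toy counts, and the 80% flagging threshold are all assumptions for illustration (in practice the population list would come from something like the LDAP dump mentioned above):

```python
from collections import Counter

# Hypothetical unit labels; in practice these come from the survey file
# and a representative source such as an LDAP directory dump.
population = ["Med", "Med", "Med", "Eng", "Eng", "Arts", "Arts", "Arts", "Arts"]
respondents = ["Med", "Med", "Eng", "Arts"]

pop_share = {k: v / len(population) for k, v in Counter(population).items()}
resp_share = {k: v / len(respondents) for k, v in Counter(respondents).items()}

# Flag groups whose response share falls well below their population share;
# these are the ones to over-sample in the follow-up (steps 3-4).
for group, share in pop_share.items():
    got = resp_share.get(group, 0.0)
    flag = "UNDER-REPRESENTED" if got < 0.8 * share else "ok"
    print(f"{group:5s} pop {share:.0%} vs responses {got:.0%}  {flag}")
```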

  17. Data entry • Enter for analysis and presentation • Use different codes for missing, not applicable, or filled out incorrectly • Look out for multiple missing-data codes. • Try out a sample-set analysis before it is too late. • Do a small amount (25%) of double entry to get reliability measures. Make all missing-data codes 9999. Data-entry persons like 9, 99, 999, but you then can't do a simple search-and-replace for missing data.
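
A small pandas sketch of that recoding step; the codes 9/99/999 follow the slide's example, but the column values and code meanings are made up:

```python
import numpy as np
import pandas as pd

# Hypothetical column mixing real answers with missing-data codes
# (9 = missing, 99 = not applicable, 999 = filled out incorrectly;
# the meanings here are illustrative assumptions).
MISSING_CODES = {9: "missing", 99: "not applicable", 999: "bad entry"}
raw = pd.Series([1, 3, 9, 5, 99, 2, 999])

reason = raw.map(MISSING_CODES)  # why each value is missing (NaN for real data)
clean = raw.mask(raw.isin(list(MISSING_CODES)), np.nan)

print(clean.mean())              # 2.75, computed over real responses only
print(reason.dropna().tolist())
```

Recoding to NaN before analysis is exactly where a single out-of-range code like 9999 pays off: a blanket search-and-replace on 9/99/999 would also clobber legitimate values such as an age of 99.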

  18. Data analysis • Look out if you average; medians may be better. • Report stats sparingly. • But look for variation. • Look for out-of-range values (stats can do this). • Simple stats, and complex ones (factor analysis, MDS, etc.). Outlying data can really move averages. Also, don't be afraid of advanced stats. New computer programs can help you visualize complex data. Try StatView for some real eye-openers.
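
A two-line illustration of why; the hours-per-week numbers are invented:

```python
import statistics

# Hypothetical "hours online per week" responses with one outlier.
hours = [1, 2, 2, 3, 3, 4, 60]

print(statistics.mean(hours))    # 10.71..., dragged up by the single 60
print(statistics.median(hours))  # 3, a far better picture of the typical respondent
```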

  19. Presentation • Minimize tables for comparison • Use graphs. http://www.uncwil.edu/oir/faculty_folder/ucla_survey_99/survey_results.htm You could spend hours looking for conclusions…

  20. Same data as a graph…(unmodified Microsoft graph…bleh…We'll fix this later.) Just converting it to a chart isn't more helpful…

  21. Chart junk… • False 3D • Terrible background • Lousy colors • Poor layout • Worse… no way to see real differences. These errors are caused by accepting Excel defaults. With a little work the results are clean and clear. Tufte would love this one!

  22. Decision graphics • Making data clear with graphics and… • Using graphics to help decision makers • Combines complex chart data • Uses visual design theory • Uses perception theory Decision Graphics started in the '80s to help parents understand Individualized Learning Programs for special-education students.

  23. From tables to a decision graphic (7 steps) Step 1: Start with a table of data. Question 26: Use Tech for… What a table to try to figure out! But at least it is sorted by the total of "Already use," "Would like very much," and "Would like somewhat."

  24. Question 26: Use Tech for… Step 2: Convert the table to an Excel graph. I'll never know why MS starts with such terrible charts!

  25. Step 3: Convert to a bar chart (set the legend below the chart). Question 26: Use Tech for…

  26. Step 4: Change to a stacked bar chart… Question 26: Use Tech for… Aha… now that sorting makes sense as you look along the dark blue bars.

  27. Step 5: Expand the bars (100% wide, with less space between)… Question 26: Use Tech for…

  28. Step 6: Change colors to flow from cool to warm. Question 26: Use Tech for…

  29. Step 7: What the heck… shade those colors… Question 26: Use Tech for… Bottom line… our faculty want course web pages but no distance teaching!
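
The whole sequence can be sketched in a few lines of Python with matplotlib. Everything below is hypothetical: the items, response levels, and counts are invented, and normalizing each row to 100% is one reading of step 5. Only the stacked layout, the legend below the chart, and the cool-to-warm colors come from the steps above:

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented response counts for a "Use tech for..." style question;
# each row is one item, each column one response level.
items = ["Course web pages", "E-mail lists", "Online quizzes", "Distance teaching"]
levels = ["Already use", "Would like very much", "Would like somewhat", "No interest"]
counts = np.array([
    [40, 35, 15, 10],
    [55, 20, 15, 10],
    [10, 25, 30, 35],
    [ 5, 10, 20, 65],
], dtype=float)

shares = counts / counts.sum(axis=1, keepdims=True)        # rows sum to 100% (step 5)
colors = plt.cm.coolwarm(np.linspace(0, 1, len(levels)))   # cool-to-warm flow (step 6)

fig, ax = plt.subplots()
left = np.zeros(len(items))
for j, level in enumerate(levels):                          # stacked bars (step 4)
    ax.barh(items, shares[:, j], left=left, color=colors[j], label=level)
    left += shares[:, j]
ax.set_xlim(0, 1)
ax.legend(loc="upper center", bbox_to_anchor=(0.5, -0.05), ncol=2)  # legend below (step 3)
plt.tight_layout()
plt.show()
```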

  30. Remember the UNC data? (unmodified Microsoft graph)

  31. With color metrics… Conclusion… Not much difference!

  32. Ranking versus top three… • Ranking works with few choices. • Selecting the top three takes care of ranking once you have 4 or more choices. • Selecting the top three is easier to take. • The results are easier to analyze and display.
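
Tallying "top three" responses really is trivial to analyze; a sketch with invented training-method options:

```python
from collections import Counter

# Hypothetical "pick your top three" responses; each set is one respondent's picks.
responses = [
    {"workshops", "one-on-one help", "printed guides"},
    {"online classes", "workshops", "a colleague"},
    {"one-on-one help", "a colleague", "workshops"},
]

# A ranking falls out for free: just count how often each option was picked.
tally = Counter(pick for picks in responses for pick in picks)
for option, n in tally.most_common():
    print(f"{option:18s} {n}")
```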

  33. Easier to take

  34. But the results are revealing… What method do you like to use to learn technology? And guess which University just spent big bucks for on-line computer classes?

  35. A little factor analysis… Yep, we've got leading and second-wave faculty, plus the good old AV types.
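
A skeleton of how such a factor analysis might run with scikit-learn; the random placeholder matrix stands in for the real cleaned responses, and choosing three factors simply mirrors the three groups named above:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Placeholder data: respondents x survey items on a 1-5 scale. Substitute
# the real cleaned response matrix; random data yields meaningless loadings.
X = rng.integers(1, 6, size=(200, 8)).astype(float)

fa = FactorAnalysis(n_components=3, random_state=0)  # e.g. leading, second-wave, AV
fa.fit(X)

# Loadings show which items travel together, the basis for naming the factors.
print(np.round(fa.components_, 2))
```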

  36. The U of Michigan case study • The 1999 Faculty Survey • Distributed in February 1999 • 1,500 faculty, stratified random sample • 19 schools and colleges • 743 responses (about a 50% response rate) • Results to the CIO in August 1999 • Released to the public in March 2000

  37. Survey Categories • Use • Resources • Support

  38. Q29 Base Academic Unit Had to use a log scale. Med School rules! (We had one school with more responses than faculty, what a story!)

  39. Questions 29-35: Demographics…

  40. Question 27: Would you use the web for… Very little use now, but… pent-up demand for next-wave use.

  41. Q10 How often do you use… Surveyed just as our CourseTools was coming on line. Next survey???

  42. Question 15: Concerns… Bottom line… it's time, reliability, and support!

  43. Adding students: Faculty (1999 UM Faculty Survey) vs. Students (2000 UM Student Survey). Not too different; second wave for both?

  44. Credits • CIO and SACUA • CIO Staff • ITD Staff • OIT Staff • ISR • 743 faculty members! Special thanks to: Kati Bauer, Steven Burdick, Jose Marie Griffiths, Gary Gatien, Karen Kost, Nicole Kirgis, Kathleen McClatchey, Eric Rabkin.

  45. Flashlight, part of the TLT Group • Online database • In-depth rather than broad • Excellent source • Part of a broad program • http://www.tltgroup.org/programs/flashlight.html

  46. The Full Monty: The Michigan 1999 Faculty Survey and the 2000 Student Survey • Original blank survey form (.doc) [Faculty and Student] • and… initial results (.pdf) [Faculty] • but wait… there's more… this presentation (.ppt) • all available at www.carat.umich.edu • Follow the links to projects and scroll to the faculty survey… • Or… if you want a special question answered: carl.berger@umich.edu Thanks for coming. We can help decisions and improve learning and teaching!
