Assessment with Alumni Surveys: Administration Tips and Data Sharing Suggestions
Assessment Institute, October 2013
Amber D. Lambert, Ph.D.
Angie L. Miller, Ph.D.
Center for Postsecondary Research, Indiana University
Abstract
Alumni surveys can be important sources of information for institutions, yet many obstacles are involved in their execution. The Strategic National Arts Alumni Project (SNAAP) is a multi-institution online survey of arts graduates from secondary and postsecondary institutions. This presentation shares lessons learned during the first five years of SNAAP, including issues related to accurate alumni contact information, response rates, design factors, and implementation of results. Actual examples from previous SNAAP-participating institutions are shared to promote idea generation among participants for the use of alumni surveys at their own institutions.
Literature Review
• As funding to higher education institutions continues to be cut, colleges and universities are often required to show measures of their effectiveness (Kuh & Ewell, 2010)
• Surveys are used in many areas of higher education (Kuh & Ikenberry, 2009; Porter, 2004)
• Alumni surveys can provide valuable information on student satisfaction, acquired skills, strengths and weaknesses of the institution, and current career attainment
Literature Review
• A major concern with all surveys, and alumni surveys in particular, is low response rates
• Over the last decade, survey response rates have been falling (Atrostic, Bates, Burt, & Silberstein, 2001; Porter, 2004)
• Alumni surveys often have lower response rates than other types of surveys (Smith & Bers, 1987) due to:
  • Bad contact information
  • Suspicion of money solicitation
  • Decreased loyalty after graduation
SNAAP
• As an example, we present some best practices for survey administration and share results from the Strategic National Arts Alumni Project (SNAAP)
• What is SNAAP?
  • An annual online survey designed to assess and improve various aspects of arts-school education
  • Investigates the educational experiences and career paths of arts graduates nationally
  • Findings are provided to educators, policymakers, and philanthropic organizations to improve arts training, inform cultural policy, and support artists
SNAAP Basics: Who is surveyed?
• Participants drawn from:
  • Arts high schools
  • Independent arts colleges
  • Arts schools or departments in comprehensive colleges & universities
• Broad definition of “arts”
• All arts alumni, all years (since 2011)
• 2008-2010: surveyed selected cohorts
Increasing Numbers…
• 2010 Field Test: over 13,000 respondents from 154 institutions
• 2011 Administration: more than 36,000 respondents from 66 institutions
• 2012 Administration: more than 33,000 respondents from 70 institutions
• 2013 Administration: currently underway
• The 2011 and 2012 respondents were combined to create a “SNAAP Database” of over 68,000 respondents; 2013 data will be added after this year
Questionnaire Topics
• Formal education and degrees
• Institutional experience and satisfaction
• Postgraduate resources for artists
• Career
• Arts engagement
• Income and debt
• Demographics
Participating institutions receive...
• Institutional Reports: customized, confidential
• Separate reports for undergraduate and graduate alumni
• Both quantitative and qualitative data
• Special report on “Recent Grads”
• Comparative data with other schools
• Complete data file of responses
• Workshops/webinars on how to use the data
Survey Administration Challenges: Locating the Lost
• Important that contact information is accurate and up-to-date (a simple contact-list cleanup sketch follows below)
• Encourage proactive efforts:
  • Newsletters
  • Websites, social networking
  • Alumni tracking
• Contracted with Harris Connect, a direct marketing firm
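As a concrete illustration of the contact-information point above, here is a minimal Python sketch of the kind of cleanup an institution might run on an exported contact list before fielding a survey. The field names ("name", "email") and the validation rule are illustrative assumptions, not SNAAP's actual process.

import re

# Hypothetical cleanup of an exported alumni contact list: normalize addresses,
# drop duplicates, and flag records that are unlikely to be deliverable.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean_contacts(rows):
    """Return (deliverable, needs_follow_up) lists of contact records."""
    seen = set()
    deliverable, needs_follow_up = [], []
    for row in rows:
        email = (row.get("email") or "").strip().lower()
        if not EMAIL_RE.match(email):
            needs_follow_up.append(row)          # missing or malformed address
        elif email not in seen:                  # keep first copy of an address
            seen.add(email)
            deliverable.append({**row, "email": email})
    return deliverable, needs_follow_up

# Toy data standing in for an institutional export
contacts = [
    {"name": "Jane Doe", "email": "Jane.Doe@example.edu "},
    {"name": "Jane Doe", "email": "jane.doe@example.edu"},   # duplicate
    {"name": "John Roe", "email": "john.roe@"},              # malformed
]
ok, follow_up = clean_contacts(contacts)
print(len(ok), "deliverable;", len(follow_up), "need follow-up")

Even a simple pass like this gives a rough sense of how much of the sampling frame can actually be reached before the first invitation goes out.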
Survey Administration Challenges: Response Rates
• Response rates are directly related to the accuracy of contact information
• Incentives: only minimally effective
• “Open enrollment” features can increase the number of responses
  • Social networking sites
  • Respondents then need to be verified (a simple verification sketch follows below)
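The verification step for open-enrollment respondents can be illustrated with a short, hypothetical sketch: self-reported respondents are matched against an institutional degree roster on a normalized name plus graduation year. The field names and the matching rule are assumptions for illustration, not part of the SNAAP protocol; in practice, matching would need fuzzier rules and manual review.

# Hypothetical check of "open enrollment" responses (e.g., recruited through
# social media) against an institutional degree roster.

def normalize(name: str) -> str:
    # collapse whitespace and ignore case so "Jane  Doe" matches "jane doe"
    return " ".join(name.lower().split())

def verify_respondents(responses, roster):
    known = {(normalize(r["name"]), r["grad_year"]) for r in roster}
    verified, unverified = [], []
    for resp in responses:
        key = (normalize(resp["name"]), resp["grad_year"])
        (verified if key in known else unverified).append(resp)
    return verified, unverified

# Toy data
roster = [{"name": "Jane Doe", "grad_year": 2009}]
responses = [{"name": "jane  doe", "grad_year": 2009},
             {"name": "John Roe", "grad_year": 2011}]
v, u = verify_respondents(responses, roster)
print(len(v), "verified;", len(u), "need manual review")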
Survey Administration Challenges: Response Rates
• Email invitations to participate in the survey: is it better to use HTML or plain text?
• For the 2011 administration, we created visually appealing email invitations in HTML format
Survey Administration Challenges: Response Rates
• For the 2012 administration, we systematically compared the effectiveness of HTML invitations to plain-text invitations across the five email contacts sent to participants (an analysis sketch follows below)
• Results of this experiment suggested that a combination of message types gets the highest response rates
  • Plain text was more effective for the initial contact
  • HTML was more effective for follow-up contacts
• Potential reasons: plain text may reach larger numbers, but HTML may give the project legitimacy
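The comparison described above can be summarized with a very small Python sketch: response rates by invitation format for each contact wave. The counts below are made-up placeholders, not SNAAP results, and the two-wave structure is only illustrative.

# Hypothetical response-rate comparison: HTML vs. plain-text invitations by wave.
sent = {
    ("plain", 1): 5000, ("html", 1): 5000,   # wave 1: initial invitation
    ("plain", 2): 4200, ("html", 2): 4300,   # wave 2: first reminder
}
completed = {
    ("plain", 1): 430, ("html", 1): 350,
    ("plain", 2): 210, ("html", 2): 260,
}

for (fmt, wave), n_sent in sorted(sent.items(), key=lambda kv: (kv[0][1], kv[0][0])):
    rate = completed[(fmt, wave)] / n_sent
    print(f"wave {wave} | {fmt:<5} | {rate:.1%} response rate")

In an actual administration, a significance test on the sent/completed counts per wave (e.g., a chi-square test) would indicate whether the format differences are larger than chance.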
Examples: Sharing on Campus
• Miami University of Ohio: the Institutional Research office published an “Assessment Brief” after receiving the 2012 SNAAP Institutional Report
• Purdue University: the Office of Institutional Research published a four-page report summarizing data from the 2011 Institutional Report, integrating both quantitative and qualitative data
Examples: Alumni and Donor Outreach
• The University of Texas at Austin: the College of Fine Arts promoted SNAAP results in 2011 through the Dean’s letter in its quarterly print publication, thanking alumni for participating and sharing selected findings
• In 2012, UT Austin also used its SNAAP data, integrated with other information sources such as IPEDS, to develop a web page that illustrates some of its findings with infographics
Examples: Alumni and Donor Outreach
• Herron School of Art + Design (IUPUI): created a website to share selected findings and integrated SNAAP data with “alumni profiles” on the site
• Virginia Commonwealth University: mentions SNAAP participation in the VCU Alumni publication and incorporated an option for alumni to update their contact info for future surveys
Examples: Recruitment
• Kent State University: created a flier for prospective visual arts majors using aggregate SNAAP findings
• Herron School of Art + Design (IUPUI): created a recruitment brochure based on its alumni achievements, potential careers, and comments from the SNAAP survey
Examples: Program & Curricular Change
• Virginia Commonwealth University: found discrepancies between the business skills alumni needed for their work and the business skills learned at VCU
  • Introduced a new “Creative Entrepreneurship” minor to address this gap
• University of Utah: found that many alumni were dissatisfied with the career advising their school offered
  • The College of Fine Arts developed an Emerging Leaders Program that offers “high-stakes internships,” mini-grants, and peer-mentoring opportunities designed to prepare students for their transition into the world of work
References
Atrostic, B. K., Bates, N., Burt, G., & Silberstein, A. (2001). Nonresponse in U.S. government household surveys: Consistent measures, recent trends, and new insights. Journal of Official Statistics, 17(2), 209-226.
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1), 1-20.
Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.
Porter, S. R. (2004). Raising response rates: What works? New Directions for Institutional Research, 121, 5-21.
Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and cost-benefit analysis. Research in Higher Education, 27(3), 218-225.

*Special thanks to Miami, Purdue, IUPUI, UT Austin, VCU, Kent State, and U of Utah for sharing their examples with SNAAP!