The Development of a Comprehensive Assessment Plan: One Campus’ Experience Bruce White ISECON 2007
Feedback / accountability • “For society to work […] we must be accountable for what we do and what we say.” – Betty Dowdell • “No person can succeed unless he or she is held accountable.” – Grant Wiggins • “Feedback is the breakfast of champions.” – Ken Blanchard • “You need a culture of assessment, not a climate.” – Gloria Rogers
Overview • Are we teaching what we say we are? • Are students learning? • How can we be more effective in our instruction? SO … • What do we want students to learn? • Why do we want them to learn it? • How can we help them to learn it? • How do we know what they have learned?
Furthermore … • Stakeholders want to see whether we are accomplishing our educational goals. • Possible stakeholders: • Students • Parents • Employers • Board of Regents / State Agencies • Accrediting groups (AACSB / ABET / etc.) • Faculty • Alumni
Our campus program • The Information Systems Management program at Quinnipiac University in Hamden, Connecticut, started its journey toward a comprehensive assessment program in 2003. • Prior ‘assessment’ was informal: • So, ISM faculty – how do you think we are doing? • So, Advisory Board – what advice do you have for us? • So, IS education community – what should we teach (e.g., the IS 2002 model curriculum)? • So, employers – what do our students need to know (or know better)? • Etc.
Our desired outcomes: • Analysis and design of information systems that meet enterprise needs. • Use of and experience with multiple design methodologies. • Experience in the use of multiple programming languages. • Development of hardware, software, and networking skills. • Understanding of data management. • Understanding of the role of IS in organizations.
Possible assessment methods: Direct Assessment Methods: • Simulations • Behavioral Observations • Performance Appraisals • Locally Developed Exams • External Examiner • Portfolios / E-portfolios • Oral exams • Standardized Exams (Source: Gloria Rogers, ABET Community Matters, August 2006)
Indirect Assessment Methods • Exit and other interviews • Archival data • Focus groups • Written or electronic surveys / questionnaires • Senior exit surveys • Alumni surveys • Employer surveys • Other factors: • IS model curriculum • Advisory board
Foundation of Our Assessment Program • We became interested in the CCER IS Assessment test early on • It is a direct assessment instrument based on the IS 2002 model curriculum • It has been thoroughly tested and analyzed • It has been shown to be valid and reliable • Scores are reported in 37 different areas, many of which are relevant to our learning outcomes • The questions are written at the higher levels of Bloom’s taxonomy and use scenarios
More on our assessment process • We also use a senior exit survey (indirect measure) • An advisory board gives input • Informal controls: • Campus decisions (such as number of credits allowed, changes in general education) • Model Curriculum Changes • Employer input • Conferences / technologies
Overall Analysis
Next Step – setting metrics • So … we have a solid direct measure • And … we have a good indirect measure • Now … what? • We are working on setting metrics (especially for our direct measure – the CCER IS Assessment test); a sketch of how such targets might be recorded appears below • From ABET: • “Every student does not have to achieve the desired outcomes, but targets must be defined.”
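To make the metric-setting step concrete, here is a minimal sketch of how per-outcome targets against the test’s area scores might be recorded. The outcome names, area labels, and threshold values are hypothetical illustrations, not the program’s actual targets (which the slides say were still being worked out).

```python
# Hypothetical target table: for each learning outcome, the exam area it
# maps to and the minimum fraction of students expected to score at or
# above a chosen benchmark. All names and numbers are illustrative.
TARGETS = {
    "systems_analysis": {"area": "Systems Analysis & Design", "min_fraction": 0.70},
    "is_in_orgs":       {"area": "IS in Organizations",       "min_fraction": 0.65},
    "data_management":  {"area": "Data Management",           "min_fraction": 0.70},
}

for outcome, target in TARGETS.items():
    print(f"{outcome}: >= {target['min_fraction']:.0%} of students "
          f"at or above benchmark in '{target['area']}'")
```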
Setting expectations • The faculty have considered the goals of the program and feel that it emphasizes systems analysis, the role of IS in organizations, and data management
Now what? • Next year, as students take the CCER IS Assessment test and as we get feedback from our senior survey, we will analyze the data to see whether our outcomes have been reached (a sketch of that comparison follows this slide). • If they have: great – we keep doing what is working. • If they haven’t: we analyze why not – was it poor instruction? Poor students? A poor textbook? Overly optimistic expectations? – and we change in an effort to ‘constantly and continually improve’ our program!
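And a hedged sketch of that analysis step: comparing a cohort’s subscores on each targeted area against a benchmark and flagging areas that miss the target. The benchmark value, cohort scores, and thresholds are made-up assumptions for illustration; the CCER test’s actual score reporting may differ.

```python
# Hypothetical benchmark: an assumed scaled-score national mean per area.
NATIONAL_MEAN = 500

def area_met_target(scores, min_fraction, benchmark=NATIONAL_MEAN):
    """True if at least `min_fraction` of students scored at or above the benchmark."""
    at_or_above = sum(1 for s in scores if s >= benchmark)
    return at_or_above / len(scores) >= min_fraction

# Made-up subscores for one graduating cohort.
cohort = {
    "Systems Analysis & Design": [520, 610, 480, 555, 530],  # 4/5 at or above
    "Data Management":           [505, 470, 440, 515, 460],  # 2/5 at or above
}

for area, scores in cohort.items():
    status = "target met" if area_met_target(scores, min_fraction=0.70) else "needs review"
    print(f"{area}: {status}")
```

Run as-is, this prints “target met” for the first area (4/5 = 80% ≥ 70%) and “needs review” for the second (2/5 = 40%), which is the kind of per-area flag that would feed the “analyze why not” step.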