2. Howard Davies: Background. Faculty of Business at The Hong Kong PolyU; 220 full-time staff; heavy research focus
1. 1 FROM HERE TO THERE: BUT WHERE IS HERE AND WHERE IS THERE? Howard Davies
Professor and Associate Dean
The Hong Kong Polytechnic University
Co-convenor
Hong Kong University Grants Committee
Task Force on Outcome-based Education
2. 2
3. 3 Why the Whimsical Title? When we talk about the Outcome-based Approach, we know that we need to move from “here” to “there”.
But we are often not very clear where “here” and “there” are:
Some colleagues say we are already there and always have been; others say the distance to be travelled is huge, and perhaps not worth the time and effort
4. 4 Topic for Today Conceptual and practical issues which arise with respect to the Outcome-based Approach.
Defining outcome-based education
Issues which arise with respect to the specification of outcomes/learning goals
The alignment of content and pedagogy with outcomes/learning goals
Collection and reporting of evidence for the OBA
The challenge of the OBA construed as organizational change.
5. 5 The “Outcome-based Approach” Entails:
A focus on defining and stating what students will be able to do at the end of the program: learning outcomes
The alignment of teaching and assessment with the achievement of learning goals
The collection of evidence on the extent to which learning goals/outcomes are achieved
‘Closing the loop’ by systematically taking follow-up action where improvements are needed
‘Say what we will do and then do what we say’
6. 6 What’s New About It? ‘We’ve always done that!’ IS IT TRUE?
YES. We have always assessed students on the content of our subjects
Can they interpret a set of accounts? They have an exam in Accounting 101
Do a piece of market research? They pass Market Research 201
BUT NO. (Even within subjects, sometimes we actually assess only a small part of a subject’s content: one essay, one presentation, and 3 exam questions out of 6 can mean only 3 topics assessed)
7. 7 What’s New About It? AND NO. We have not generally assessed explicitly for more broadly written/generic program outcomes.
‘Students will develop Communication Skills’
They do a lot of presentations!!! But we mark them for subject content; we don’t explicitly teach or assess the Communication Skills element; we don’t explain to students what we mean by good communication; and we don’t extract evidence on Communication Skills per se
Program outcomes are key, but how many of your faculty members are REALLY aware of the program outcomes/learning goals?
Program outcomes are written by Program Leaders, agreed by Program Teams, discussed at validation meetings and then forgotten?
Program leaders say what “we” will do but the “we” who do it is a different “we”!!!
8. 8 Step 1: Define Program Learning Outcomes How many to have?
Where do they come from?
9. 9 How many outcomes? In Business 4 to 10 is said to be the norm – ‘the fewer the better’ is commonly heard
But one reason to define outcomes is to help students choose a program
A significant tension here
10. 10 How many outcomes? PARSIMONY has obvious advantages and is often recommended
Easier to assure
BUT if we opt for a very small number, everyone will have:
Critical and Creative Thinking / Communication / Ethics / Working with People / Global Outlook
AND those generic goals would be shared by Business, Engineering, Nursing and Humanities programs!!
11. 11 How many outcomes? If Business/Engineering/Science/ Humanities students are told that the goals for their program are entirely generic, they are likely to be de-motivated.
Students come to university to study something they are interested in. TO FORGET CONTENT IS TO THROW THE BABY OUT WITH THE BATHWATER!
Entirely generic learning goals provide no guidance to program content, and no guidance for student choice
12. 12 How many outcomes? You need to find a balance between:
Defining goals which really express what your BUSINESS/ENGINEERING/SCIENCE program is trying to do, signalling the specific content, AND
Having too many goals to assure effectively
In our case, at PolyU, we have 13 goals for our BBA. Are we over-burdening ourselves? We hope not, because 6 or 7 of them are essentially ‘business content’ goals, addressed and assessed in the normal way
13. 13 Where do the outcomes come from? Your Faculty or Department’s Mission, of course, which should be consonant with the Mission of your University.
We therefore have different levels of outcomes, which makes consistency and coverage difficult to achieve:
University-level
Program-level
Subject-level
Going from program level outcomes to subject level is difficult enough
Do the subject goals simply repeat the program outcomes?
Are faculty members able to write subject level goals which are achievable and at the same time reflect program goals? This is a skill which many do not have.
Are program outcomes separable from subject outcomes? E.g., is critical thinking inherent in learning and applying concepts, or is it something which should be separately assessed?
14. 14 Step 2: Alignment A Curriculum Map and Assessment Plan, which identify where each Program Outcome is addressed and assessed
In subjects
In other activities
Assessment tasks which are suited to the outcomes
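The idea of a curriculum map can be illustrated with a small sketch (the outcome names, subject codes, and data below are hypothetical, purely for illustration): each program outcome is mapped to the subjects where it is addressed (“A”) and/or assessed (“S”), and a simple check flags outcomes that are nowhere actually assessed.

```python
# Hypothetical curriculum map: each program outcome mapped to the subjects
# in which it is addressed ("A"), assessed ("S"), or both ("AS").
curriculum_map = {
    "Communication Skills":    {"BUS101": "A", "BUS205": "AS"},
    "Ethical Reasoning":       {"BUS110": "A"},   # addressed but never assessed
    "Market Research Methods": {"MKT201": "AS"},
}

def unassessed_outcomes(cmap):
    """Return the outcomes that no subject actually assesses."""
    return [outcome
            for outcome, subjects in cmap.items()
            if not any("S" in role for role in subjects.values())]

print(unassessed_outcomes(curriculum_map))  # ['Ethical Reasoning']
```

Even a toy check like this makes the alignment gap visible: an outcome that appears in every subject outline but carries no assessment point anywhere is exactly the kind of drift the next slides warn about.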
15. 15 Step 2: Alignment Easy to describe, difficult to do!!
Are faculty members actually aware of program outcomes and assessment plans?
Do they actually follow through with aligned assessments?
Do they remember next time round?
Are new staff aware of what to do?
Do different faculty members teaching the same subject do the same thing?
16. 16 Step 2: Alignment If the program outcomes to be met by a subject are not clearly set out as outcomes for that subject, in the paperwork which everybody uses – THERE WILL BE SIGNIFICANT DRIFT!!
Faculty members not used to specifying realistic and achievable learning goals
Faculty members not used to seeing assessment tasks as directed onto specific learning outcomes.
What about ‘participation’ grades?
What if students have a choice of essays – which outcomes will be addressed?
If an exam is said to cover many outcomes, there need to be compulsory questions addressing those outcomes.
17. 17 Step 3: Collecting Evidence The ideal evidence for each goal would be:
Be located at the end of the program, but with ‘mile-post’ tests along the way, to assess progress
Involve all students – sampling can have real problems!!
Involve DIRECT tests of whether the student can do what the outcome intends they should do
Have an element of external validity – not just the professors and students say all is OK
Be inexpensive to administer
Be non-intrusive on the program, integrated with the content
18. 18 Step 3: Collecting Evidence The evidence we usually have to hand includes:
Exam Board results for subjects – OK for outcomes which are directly addressed by subjects; not helpful for generic outcomes; no external validity; only graduation results are at the right time
Student feedback results on subjects – tells us if the students believe they have achieved in subjects
Exit Surveys – at the right time, and can ask about program outcomes, but students’ perceptions only
In truth, these traditional sources have limited validity and are often indirect
19. 19 Step 3: Collecting Evidence New types of evidence we might try:
Employer/Supervisor surveys – but very difficult to do after graduation – maybe for Internships during the program
Subject-embedded assessment of generic outcomes – reported separately from subject results – use rubrics to help faculty members?
Additional direct tests of ability:
Language tests
Torrance test for creativity
Defining Issues Test-2 on ethical reasoning
Cross-cultural Adaptability Index for Global Outlook – but is it testing what we want?
Collegiate Learning Assessment – maybe a godsend?
20. 20 Step 4: Reporting and Closing the Loop How to draw together and summarize diverse sources of information, whose timing may be out of synch with reporting cycles?
How to be sure that actions are followed up? So much effort may go into the reporting process that by the time it is complete, it is too late to DO anything!
Faculty members are accustomed to paper exercises which are not followed up
21. 21 Organizational Change for the OBA Most attempts to introduce educational innovations fail
Especially when they come from the top, and for accountability reasons
Faculty members are capable of infinite guile and infinite resistance when faced with a change they do not like or want
Recognition of that problem is key and points to the need to initiate and facilitate organizational change
Seeing the OBA as akin to a change in the assessment regulations and paperwork is a guarantee of failure
22. 22 Assurance of Learning (AoL) as Organizational Change ‘Ironic consent’ is common – we fill in the papers, tick the boxes and get on with what we always did.
Many faculty members see course documents as something which Associate Deans and Program Leaders do, and nothing to do with them; that view has been (almost) sustainable in the past
Often a tradition of ‘telling the good news’ and avoiding blame.
In Asia, maybe especially, ‘say what you do and do what you say’ is not so easy to do
IF AoL IS TO SUCCEED IN ITS OBJECTIVES CHANGES IN ORGANIZATIONAL CULTURE ARE PROBABLY NEEDED
23. 23 How to bring about the change? Don’t try to go too fast! Expect to do everything 3 times before you get it right. PERSIST!
Secure LEGITIMACY amongst colleagues by pointing to top research schools who do it.
Try to make sure the top people understand that a policy paper and a set of guidelines do not mean that anything at all has been DONE.
Try to help the front-line teachers with the paperwork aspects and make sure they know they need to implement. Rubrics can help those who are nervous about grading generic goals
Remember that ‘the best is often the enemy of the good’ – don’t strive for perfection.
Generate and maintain ‘conversations’ within and across schools
24. 24 Good Luck and Thank You !