The Impact of Evaluation - supporting change in VET institutions by action research
Ludger Deitmer, University of Bremen, Institute of Technology and Education
The Programme
Programme administrator: ITB
• 21 pilot projects
• 4 research studies
• 11 single-partner projects
• 8 multi-partner projects
• 2 research projects
• 14 states (Länder), 100 VET schools, 14,000 students, 500 VET teachers, 18 VET research institutes
• Duration: 01.10.1998 to 30.09.2003
• Financed by the Ministry of Education and Research (bmb+f) and the participating federal states (Bundesländer); funds approx. 13 million Euro
What did the programme do? • Change of perspective: from subject-organised curricula and teaching towards work-process-related curricula • Building project networks: several schools work on the same topic and in different states; this should improve cross-Länder transfer and the dissemination of best practice into different regions • Teachers work on the whole context of learning processes in the VET system: occupational analysis in industry, didactical design of (complex) learning situations, curriculum design, implementation into institutional structures, evaluation of learning processes, and the use of multimedia learning arrangements
Some Implications for Modellversuche • Modellversuche rely on a good working partnership among internal and external actors and on strong leadership • Our experience is that project partners pursue different orientations without making these very clear! • Some examples of malfunctioning elements: goal understanding, use and distribution of resources, project management not working, communication is rare, ... • Missing diagnosis of the performance of the Modellversuch • Need: interactive and user-centred evaluation as an integral part of the innovation process
Measures undertaken by action research (AR) in the view of the pilot actors (teachers involved)
R&D Innovation in HRD or VET Inputs (human resources, money, equipment etc.) Outputs (products, patents etc.) Transaction processes (Innovation process as such) Most Evalaution methods refer to: Input or Output (quantitative or quantifiable data): External evaluators doing summative evaluations Focus on the innovation process itself is missing Formative Evaluation to support the innovation process and the actors involved Portfolio of Evaluation methods
Using a reflexive and dialogical approach in the evaluation sessions • A moderated and guided (questionnaire-based) group discussion among participants in a project network took place (self-evaluation) • Those involved in the project evaluate their own work and/or the (interim) results of their project under the direction of the moderator. • Participants from these networks were asked to do an individual and collective weighting of the criteria: how important is the criterion for you (in %)? • Following this, the participants were asked to judge the criteria: how far have these criteria been achieved by the improvements made in the project network? • Different weightings and judgements of the criteria by the project partners are used to start a discussion on consensus or dissent (see the sketch below).
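A minimal sketch, in Python, of how the individual and collective weighting and judging step described above could be handled numerically. The participant names, the 0-10 judgement scale and the helper functions are illustrative assumptions, not part of the original workshop design.

```python
# Hypothetical sketch of the weighting and judging step (self-evaluation).
# Participant data and the 0-10 judgement scale are illustrative assumptions.

criteria = ["A", "B", "C", "D", "E", "F", "G", "H"]

# Each participant distributes 100% of weight across the criteria and
# judges how far each criterion has been achieved (here: 0-10).
participants = {
    "teacher_1": {"weights": [20, 10, 15, 5, 10, 10, 15, 15],
                  "scores":  [7, 5, 6, 4, 8, 3, 5, 6]},
    "teacher_2": {"weights": [10, 20, 10, 10, 15, 15, 10, 10],
                  "scores":  [6, 7, 4, 5, 6, 5, 6, 7]},
}

def collective_view(participants, criteria):
    """Average the individual weightings and judgements per criterion."""
    n = len(participants)
    avg_weights = [sum(p["weights"][i] for p in participants.values()) / n
                   for i in range(len(criteria))]
    avg_scores = [sum(p["scores"][i] for p in participants.values()) / n
                  for i in range(len(criteria))]
    return avg_weights, avg_scores

def weight_spread(participants, criteria):
    """Spread of individual weightings per criterion; large spreads mark
    the points where the consensus/dissent discussion can start."""
    spreads = {}
    for i, c in enumerate(criteria):
        values = [p["weights"][i] for p in participants.values()]
        spreads[c] = max(values) - min(values)
    return spreads

avg_weights, avg_scores = collective_view(participants, criteria)
print(dict(zip(criteria, avg_weights)))
print(weight_spread(participants, criteria))
```

The averaged values would feed the collective view of the network, while the per-criterion spreads indicate where partners disagree and a moderated discussion is most needed.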
Three-step approach • Evaluation workshop (group discussion) with individual and collective weighting and scoring of the main and sub-criteria (self-evaluation as the core element) • Concise summary of the workshop, analysing strengths and weaknesses and preparing quantitative & qualitative data • Feedback on results & working out the prospects for PPP, Modellversuche and OD projects: recommendations for follow-up processes
8 main criteria for innovation objectives (A, B, C, D) and innovation effects (E, F, G, H)
Objectives:
• Work process orientation and the relationship between learning and working (A)
• Self-directed and self-organised learning (B)
• Vocational and shaping competence (C)
• Holistic forms of learning (D)
Effects:
• Internal transfer (E)
• External transfer (F)
• Improving the scientific knowledge base and educational planning (G)
• New teaching practice and professionalisation (H)
Measuring single Modellversuch performance by innovation spider
Main methodological contributions of the evaluation approach to action research • provision of data on Modellversuch performance in relation to important success factors • analysis of the data by synthesising the outcomes into an evaluation spider (see the sketch below) • the spider's shape allows the description of specific strengths and weaknesses • spiders can be compared across projects • relating the empirical examples to existing research on network structures • generating ideal-typical functions and problems of networks
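A minimal sketch of how the averaged criterion scores could be drawn as such an evaluation spider, assuming Python with matplotlib; the scores below are illustrative, not actual Modellversuch data.

```python
# Minimal sketch of an evaluation (innovation) spider: the averaged
# achievement scores for the eight criteria (A-H) are drawn on a polar
# axis so that the shape of the polygon shows strengths and weaknesses.
# The scores are illustrative, not real workshop data.
import math
import matplotlib.pyplot as plt

criteria = ["A", "B", "C", "D", "E", "F", "G", "H"]
scores = [7.0, 5.5, 6.0, 4.5, 6.5, 4.0, 5.5, 6.5]  # averaged judgements (0-10)

# One angle per criterion; repeat the first point to close the polygon.
angles = [2 * math.pi * i / len(criteria) for i in range(len(criteria))]
angles.append(angles[0])
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria)
ax.set_ylim(0, 10)
ax.set_title("Innovation spider (illustrative data)")
plt.show()
```

Plotting several projects on the same axes would support the comparison of spiders mentioned above.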
Factors of the evaluation design: adequateness (1, 2 and 4) and quality (3, 5, 6)
(1) Fit of the criteria definition process
(2) Values of the evaluation
(3) Completeness of the evaluation
(4) Right actors
(5) Timing and time
(6) Competence of the evaluation actors
Enablers and constraints for change via self-evaluation
• Organisational context: goals, motivations, structure, fields of responsibility
• Transferring projects to a wider field
• Communication and co-operation structures in the projects and the organisations
• Resources: time, personnel, finances
Relation of Evaluation & Learning Adequateness Quality standards Evaluation methods Learning Effects Internal transfer External Transfer VET Projects
What was helpful during pilot project realisation? (chart, values in %): co-operation in the pilot team; co-operation with other schools, other pilots, industry; co-operation with researchers; further training; evaluation; more resources.
School-internal co-operation after the pilot project has been finalised (teachers only) (chart, frequencies in %): little, not continuous and intensive co-operation, reported within the own school department/subject, among all school teachers, with school management, with external partners and with others (n per category between 8 and 80).
Interfaces (diagram): project actors and the actors who work in the organisation in which the innovation is going to be implemented.
Conclusion: evaluations can trigger change, but a set of factors has to be met: • The autonomy of participants has to be secured • Different expectations and interests among partners have to be made transparent • The evaluation has to be spread out to the interfaces • Internal organisational barriers and malfunctioning elements have to be made explicit • Pilot projects have to be combined with organisational development throughout the organisation • A quality management process has to be established within the school organisation • Formative evaluation can help to ensure that QM and innovation processes are seen in context
Thank you for your interest. More information about the BLK Modellversuche is available at http://www.itb.uni-bremen.de/projekte/blk/programmtraeger.htm