
  1. Berlin School of Economics and Law Hochschule für Wirtschaft und Recht Berlin Malta, 10-11th April 2014

  2. Piloting the Toolkit
  Tools & instruments chosen for piloting:
  • Self-reflection for learners (questionnaire from the toolkit)
  • Evaluation of teaching and learning – with a toolkit questionnaire
  • Evaluation of teaching and learning – in a group discussion (using the focus-group instrument of the toolkit)
  Whereas the self-reflection questionnaire for learners focuses on the learning process and the learning results of the learner, the tools & instruments for the "evaluation of teaching and learning" concentrate on the quality of the course and the trainers.

  3. Piloting the toolkit – Methodology (self-reflection of learners)
  • The learning progress of bachelor students majoring in business administration/banking was surveyed in their 6th semester during the bank-specific simulation game TOPSIM
  • The piloting was conducted in three different groups of students, each group having approximately 16 students
  • The questionnaire was adapted to the specifics of the simulation-game module:
    • Question 5 of the questionnaire was removed due to the duration of the module
    • A paragraph for the evaluation of the tool itself was added
  • The questionnaire was filled in at the end of the first day of the simulation game. The findings were reported anonymously by the trainer the next day. The main discussion concerned the difficulties and obstacles during the learning process (questions 2/3) and approaches to reducing gaps in the learning progress (question 7).
  • The questionnaire for self-reflection of the learning progress was complemented by a second reflection at the end of the simulation-game module.

  4. Piloting the toolkit – Self-reflection questionnaire for learners

  5. Piloting the toolkit – Results obtained
  • The content, length and structure of the self-reflection questionnaire on the learning progress appear suitable.
  • The evaluation of whether the questionnaire is in principle appropriate for investigating the learning progress did not yield a clear conclusion.
  • Discussion of the findings from the first survey is necessary. Students predominantly found it useful to consider the anonymized findings in the further design of the course, but they were unsure how far this feedback might help to increase the quality of the course. A possible reason for the uncertainty is the shortness of the module (two consecutive days).

  6. Piloting the toolkit – Methodology (course evaluation)
  • Applied in a banking practice workshop in two different ways: a) with a questionnaire and b) in a group discussion.
  • The toolkit questionnaire was adapted to the specifics of the practice workshop:
    • Question 13 (exams) was left out – no exams were required in that workshop
    • Question 18 (opportunities to contact the teacher) was skipped because of the short duration of the workshop
  • The guidelines for the group discussion were adapted:
    • The guidelines were extended with questions for the evaluation of the special format "group discussion"
  • The 15 participants of the workshop were divided into two groups after a common introduction to the evaluation procedure. The survey by questionnaire took place with five participants, in parallel to the group discussion, which was conducted in another room with six participants. The group discussion lasted 30 minutes.

  7. Piloting the toolkit – Guidelines for the group discussion
  • Explanation of the background and the procedure of the workshop evaluation
  • Evaluation of the workshop
    • General evaluation of the workshop (was it interesting / did it meet expectations / was it difficult?)
    • Evaluation of the workshop's content (improved level of knowledge and skills / integration of the theoretical and practical parts / achievement of the intended learning outcomes / preparation for the theoretical semester)
    • Evaluation of the learning conditions (study materials)
  • Evaluation of the teachers
    • Evaluation of the procedures and methods applied (content logically structured / speed / time management / achievement of workshop goals)
    • Evaluation of the relationship between teachers and students (active involvement of students / answering questions / support with autonomous work / learning climate)
  • Other comments / suggestions
    • Which themes regarding the evaluation of the workshop and the teachers have not yet been considered in the discussion? Which suggestions for improvement do you have?
  • Evaluation of the instrument
    • To what extent do you consider the group-discussion instrument suitable for evaluating a workshop or teachers?
    • You already know the evaluation with a questionnaire from the last workshop. Which instrument should we preferably use in future, and why?

  8. Piloting the toolkit – Results obtained (group discussion vs. questionnaire)
  • The assessments of the workshop and the teachers were quite similar. The participants of both groups were mainly satisfied with the quality of the workshop and the teachers.
  • Two suggestions were made in both groups:
    • participants asked for a better combination/integration of the theoretical and practical parts, and
    • for a stronger active involvement of students
  • Evaluation of the tools & instruments: the group discussion was seen as more suitable, because points of criticism are more likely to be addressed. Participants are more motivated to explain critical comments in detail when they are questioned directly.
  • Students recommended using the group discussion rather than the questionnaire for the evaluation of the workshop.

  9. National Consultation Workshops
  • Participants:
    • Commerzbank, Going Public, Sparkassenakademie, the German Banking Institute, bbw (Bildungswerk der Wirtschaft in Berlin und Brandenburg) university of applied sciences
  • Feedback:
    • The participants fully agreed that the proposed QA framework is clear and easy to understand. The structure was described as useful.
    • Areas for improvement were identified, for instance the inclusion of more bank-specific quality assurance characteristics.
    • Another missing point was the comparison of curricula at the input level.
    • The compilation of "ready-to-use instruments and tools" was considered helpful. It was also pointed out that the instruments might be helpful for beginners in quality assurance but not for institutes that already have standardized QA processes or certifications.
  • Instruments that should additionally be included:
    • Comparison of grades in centralized examinations (e.g. nationwide)
    • Quality assurance tools for e-learning
    • Comparison of curricula
    • Forms, checklists and interview guidelines for the implementation, plus a list of software products for planning, managing and billing these quality assurance processes.

  10. National consultation workshops – Strengths and weaknesses of the toolkit

  11. QUADRO Contacts at BSEL erwin.seyfried@hwr-berlin.de tatjana.rabe@hwr-berlin.de
