Detailed analysis of the pedagogical context, review, and evaluation criteria for learning materials, with insights from at least two Peer Reviewers.
MERLOT’s Peer Review Report
• Composed from reports by at least two Peer Reviewers.
• Description Section: provides the pedagogical context (i.e., learning goals, target student population, and prerequisite knowledge) for the learning item.
• Evaluation and Observations: provides the review of the teaching-learning material based on MERLOT’s evaluation criteria.
Description Section > Field 5 of 7: Type of Material
• MERLOT categories used to describe learning materials are:
• Simulation
• Animation
• Tutorial
• Drill and Practice
• Quiz/Test
• Lecture/Presentation
• Case Study
• Collection
• Reference Material
Second Section of the Peer Review: The Evaluation and Observations
• There are three evaluation standards:
• 1. Quality of Content: validity and significance of the learning material.
• 2. Effectiveness as a Teaching-Learning Tool: likelihood of enhancing teaching and learning.
• 3. Ease of Use: likelihood of successful navigation and interaction.
1. Quality of Content
• i. Is the content accurate and current?
• ii. Is the content a prerequisite for more advanced material?
• iii. Does the content cover material that is difficult to teach/learn?
• iv. Does the content stay on target throughout the material?
• v. Is the content flexible, so that it can be readily integrated into curricula?
For example…
• The site is well organized and designed to provide a variety of ….
• There are clear, accurate, and extensive explanations.
• An excellent feature is the opportunity to compare compounds; large problem sets are presented in increasing order of difficulty.
2. Effectiveness as a Teaching-Learning Tool
• The 2nd Evaluation Standard:*
• Are the materials likely to improve teaching and learning?
• * The review is based on the designated type of material (e.g., tutorial or simulation …).
2. Effectiveness as a Teaching-Learning Tool: Learning Goals and Methods
• i. Is the learning material as effective as, or better than, other teaching methods? Is it an innovative, new, original presentation of a concept?
• ii. Are teaching-learning goals easy to identify?
• iii. Are concepts, models, or skills presented with clarity, focus, and organization?
For example…
• The item increases the potential for enhancing student learning. Computer graphics has made the structures of large, complicated biological macromolecules more readily accessible to biochemistry students ….
• The molecular models provide some excellent examples of how these tools can be effectively used to teach structural concepts in biochemistry.
• The item can be used in a variety of ways to achieve teaching-learning goals, which are easy to identify.
2. Effectiveness as a Teaching-Learning Tool: Motivation to Learn and Learning Styles
• iv. Does it appeal to multiple learning styles and learning processes?
• v. Does it engage the learner, create intrigue, or otherwise motivate the learner to achieve?
• vi. Does it engage multiple senses through audio, video, images, and text?
For example…
• The Cameron Virtual Balloon Factory…:
• appeals to many learning styles, with options to jump right into descriptions of the business or explanations of theory, or to spend time in virtual tours or reading about the staff;
• engages students with interactive pictures, activities, and interesting stories, …
2. Effectiveness as a Teaching-Learning Tool: Learner Engagement and Feedback
• vii. Is it engaging, interactive, and/or entertaining?
• viii. If it is interactive, does it provide:
 - effective interactivity?
 - immediate feedback regarding the learner’s response accuracy?
2. Effectiveness as a Teaching-Learning Tool: Conceptual Understanding
• ix. Does it develop/enhance conceptual understanding?
• x. Does it provide examples, when appropriate, that help illustrate concepts?
• xi. Does it have flexibility or versatility of use?
3. Ease of Use
• The third evaluation standard describes how easy it is for students and faculty to use and interact with the learning material.
3. Ease of Use: Navigation
• i. Is the site easy to navigate and robust?
• ii. If it is interactive, does it provide effective feedback for user actions? Will the user know if they are waiting for a response from the system, or if the system is waiting for a response from the user?
• iii. Is it self-contained, or are instructions necessary?
For example…
• “Elemental Spectra”:
• does take some time to load, but once loaded, a quick click on any element in the Periodic Table displays the absorption or emission spectrum of that element;
• is easy to use and simple, yet functional.
3. Ease of Use: Feedback
• iv. Are any instructions or “help” clear, relevant, complete, and available when needed?
• v. If applicable, does it clearly tell users when an error is made and how to continue?
• vii. Is the presentation clearly designed, with no distracting design elements (e.g., color, sound, animation, too much on a page)?
• viii. Are the terms and any new jargon defined?
For example…
• An intuitive piece of simulation software.
• Students are not going to get lost because everything is done from one screen.
3. Ease of Use: Interface and Layout
• ix. Are interface elements (labels, buttons, menus, …) and layout consistent and distinct?
• x. Are related parts of the site clearly related, while parts that offer different content areas or audiences are distinct?
• xi. Are links provided to easily access required plug-ins for downloading?
• xii. Are there any major bugs (e.g., links that do not work)?
For example…
• This Java applet does not work with Internet Explorer.
• There is no visual cue provided for the selected element (i.e., no roll-over, no different shading …).