1. Dealing with Project Evaluation and Broader Impacts (An Interactive, Web-Based Workshop)
Russell Pimmel and Ning Fang
Division of Undergraduate Education
National Science Foundation
April, 15 and 15 201
2. Most of the information presented in this workshop represents the presenter’s opinion and not an official NSF position
Local facilitators will provide the link to the workshop slides at the completion of the webinar.
Participants may ask questions by “raising their hand” during a question session. We will call on selected sites and enable their microphones so that the questions can be asked.
Responses will be collected from a few sites at the end of each exercise. At the start of the exercise, we will identify these sites and then call on them one at a time to provide their responses.
Important Notes
3. Learning must build on prior knowledge
Some knowledge correct
Some knowledge incorrect – Misconceptions
Learning is
Connecting new knowledge to prior knowledge
Correcting misconceptions
Learning requires engagement
Actively recalling prior knowledge
Sharing new knowledge
Forming a new understanding
Framework for the Session
4. Effective learning activities
Recall prior knowledge -- actively, explicitly
Connect new concepts to existing ones
Challenge and alter misconceptions
Active & collaborative processes
Think individually
Share with partner
Report to local and virtual groups
Learn from program directors’ responses
Preliminary Comments: Active & Collaborative Learning
5. Coordinate the local activities
Watch the time
Allow for think, share, and report phases
Reconvene on time -- 1 min warning slide
Ensure the individual think phase is devoted to thinking and not talking
Coordinate the asking of questions by local participants
Facilitator’s Duties
6. Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Short Exercise ------ 5 min
Think individually --------- ~2 min
Report in local group ---- ~2 min
Individual Exercise ----------- 2 min
Questions ----------- 5 min
Participant Activities
7. The session will enable you to collaborate more effectively with evaluation experts in preparing effective project evaluation plans
It will not make you an evaluation expert
Goal for Project Evaluation Session
8. After the session, participants should be able to:
Discuss the importance of goals, outcomes, and questions in the evaluation process
Cognitive and affective outcomes
Describe several types of evaluation tools
Advantages, limitations, and appropriateness
Discuss data interpretation issues
Variability, alternate explanations
Develop an evaluation plan with an evaluator
Outline a first draft of an evaluation plan
Session Outcomes
9. Evaluation and assessment have many meanings
One definition
Assessment is gathering evidence
Evaluation is interpreting data and making value judgments
Examples of evaluation and assessment
Individual’s performance (grading)
Program’s effectiveness (ABET and regional accreditation)
Project’s progress and success (monitoring and validating)
Session addresses project evaluation
May involve evaluating individual and group performance – but in the context of the project
Project evaluation
Formative – monitoring progress to improve approach
Summative – characterizing final accomplishments
Evaluation and Assessment
10.
Project Goals, Expected Outcomes, and Evaluation Questions
11. Effective evaluation starts with carefully defined project goals and expected outcomes
Goals and expected outcomes related to:
Project management
Initiating or completing an activity
Finishing a “product”
Student behavior
Modifying a learning outcome
Modifying an attitude or a perception
Evaluation and Project Goals/Outcomes
12. Goals provide overarching statements of project intention
What is your overall ambition?
What do you hope to achieve?
Expected outcomes identify specific observable results for each goal
How will achieving your “intention” reflect changes in student behavior?
How will it change their learning and their attitudes?
Learning Goals and Outcomes
13. Goals --> Expected outcomes
Expected outcomes --> Evaluation questions
Questions form the basis of the evaluation process
Evaluation process collects and interprets data to answer evaluation questions
Goals, Expected Outcomes, and Evaluation Questions
14. Read the abstract -- Goal statement removed
Suggest two plausible goals
One on student learning
Cognitive behavior
One on some other aspect of student behavior
Affective behavior
Focus on what will happen to the students
Do not focus on what the instructor will do
Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 7 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Identification of Goals/Outcomes
15. The goal of the project is …… The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. The project is being evaluated by … The project is being disseminated through … The broader impacts of the project are …
Non engineers should substitute:
“Organic chemistry” for “statics and mechanics of materials”
“Interactions” for “external forces and internal reactions”
Abstract
17. GOAL: To improve conceptual understanding and processing skills
In the context of course
Draw free-body diagrams for textbook problems
Solve 3-D textbook problems
Describe the effect of external forces on a solid object orally
In a broader context
Solve out-of-context problems
Visualize 3-D problems
Communicate technical problems orally
Critical thinking skills
Intellectual development
PD’s Response: Goals on Cognitive Behavior
18. GOAL: To improve
Self-confidence
Attitude about engineering as a career
PD’s Response: Goals on Affective Behavior
19. Exercise: Transforming Goals into Outcomes
Write one expected measurable outcome for each of the following goals:
Improve the students’ understanding of the concepts in statics
Improve the students’ attitude about engineering as a career
Individual exercise ~ 2 minutes
Individually write a response
21. PD’s Response: Expected Outcomes
Conceptual understanding
Improve students’ conceptual understanding as measured by a standard tool (e.g., a statics concept inventory)
Improve students’ conceptual understanding as measured by their ability to perform various steps in the solution process (e.g., drawing FBDs) when solving out-of-context problems
Attitude
Improve the students’ attitude about engineering as a career as measured by a standard tool (e.g., the Pittsburgh Freshman Engineering Survey)
Improve the students’ attitude about engineering as a career as measured in a structured interview
22. Exercise: Transforming Outcomes into Questions
Write a question for each of these expected measurable outcomes:
Improve students’ conceptual understanding as measured by a statics concept inventory
Improve the students’ attitude about engineering as a career as measured by the Pittsburgh Freshman Engineering Survey
Individual exercise ~ 2 minutes
Individually write a response
24. PD’s Response: Questions
Conceptual understanding
Did the statics concept inventory show a change in the students' conceptual understanding?
Did the students’ conceptual understanding improve as a result of the intervention?
Attitude
Did the Pittsburgh Freshman Engineering Survey show a change in the students’ attitude about engineering as a career?
Did the students’ attitude about engineering as a career improve as a result of the intervention?
25.
Tools for Evaluating Learning Outcomes
26. Surveys
Forced choice or open-ended responses
Concept Inventories
Multiple-choice questions to measure conceptual understanding
Rubrics for analyzing student products
Guides for scoring student reports, tests, etc.
Interviews
Structured (fixed questions) or in-depth (free flowing)
Focus groups
Like interviews but with group interaction
Observations
Actually monitor and evaluate behavior
Olds et al, JEE 94:13, 2005
NSF’s Evaluation Handbook
Examples of Tools for Evaluating Learning Outcomes
27. Comparing Surveys and Observations
Surveys
Efficient
Accuracy depends on subject’s honesty
Difficult to develop a reliable and valid survey
Low response rate threatens reliability, validity, & interpretation
Observations
Time & labor intensive
Inter-rater reliability must be established
Captures behavior that subjects are unlikely to report
Useful for observable behavior
Olds et al, JEE 94:13, 2005
28. Example – Appropriateness of Interviews
Use interviews to answer these questions:
What does the program look and feel like?
What do stakeholders know about the project?
What are stakeholders’ and participants’ expectations?
What features are most salient?
What changes do participants perceive in themselves?
The 2002 User Friendly Handbook for Project Evaluation, NSF publication REC 99-12175
29. Originated in physics -- Force Concept Inventory (FCI)
Several are being developed in engineering fields
Series of multiple choice questions
Questions involve single concept
Formulas, calculations, or problem solving skills not required
Possible answers include distractors
Common errors -- misconceptions
Developing a CI is involved
Identify misconceptions and distractors
Develop, test, and refine questions
Establish validity and reliability of tool
Language is a major issue
Tool for Measuring Conceptual Understanding – Concept Inventory
30. Tool for Assessing Attitude
Pittsburgh Freshman Engineering Survey
Questions about perception
Confidence in their skills in chemistry, communications, engineering, etc.
Impressions about engineering as a precise science, as a lucrative profession, etc.
Validated using alternate approaches:
Item analysis
Verbal protocol elicitation
Factor analysis
Compared students who stayed in engineering to those who left
Besterfield-Sacre et al, JEE 86:37, 1997
31. Tools for Characterizing Intellectual Development
Levels of Intellectual Development
Students see knowledge, beliefs, and authority in different ways
“Knowledge is absolute” versus “Knowledge is contextual”
Tools
Measure of Intellectual Development (MID)
Measure of Epistemological Reflection (MER)
Learning Environment Preferences (LEP)
Felder et al, JEE 94:57, 2005
Suppose you were considering an existing tool (e.g., a concept inventory) for use in your project’s evaluation
What questions would you consider in deciding if the tool is appropriate?
Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 7 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Considering an Existing Tool
34. Nature of the tool
Is the tool relevant to what was taught?
Is the tool competency based?
Is the tool conceptual or procedural?
Prior validation of the tool
Has the tool been tested?
Is there information on reliability and validity?
Has it been compared to other tools?
Is it sensitive? Does it discriminate between novice and expert?
Experience of others with the tool
Has the tool been used by others besides the developer? At other sites? With other populations?
Is there normative data?
PD’s Response: Evaluating an Existing Tool
35.
Questions
36. Interpreting Evaluation Data
37. Interpreting Evaluation Data
38. Data suggests that the understanding of Concept #2 increased
One interpretation is that the intervention caused the change
List some alternative explanations
Confounding factors
Other factors that could explain the change
Individual Exercise ---- 2 min
Individually write a response
Exercise: Alternate Explanation For Change
Students learned the concept out of class (e.g., in another course or in study groups with students not in the course)
Students answered with what the instructor wanted rather than what they believed or “knew”
An external event distorted pretest data
Instrument was unreliable
Other changes in course and not the intervention caused improvement
Characteristics of groups were not similar
PD’s Response: Alternate Explanation For Change
41. Exercise: Alternate Explanation for Lack of Change
Data suggests that the understanding of the concept tested by Q1 did not improve
One interpretation is that the intervention did cause a change that was masked by other factors
Think about alternative explanations
How would these alternative explanations (confounding factors) differ from the previous list?
42.
Evaluation Plan
43. List the topics that need to be addressed in the evaluation plan
Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 7 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Evaluation Plan
45. Name & qualifications of the evaluation expert
Get the evaluator involved early in the proposal development phase
Goals, outcomes, and evaluation questions
Instruments for evaluating each outcome
Protocols defining when and how data will be collected
Analysis & interpretation procedures
Confounding factors & approaches for minimizing their impact
Formative evaluation techniques for monitoring and improving the project as it evolves
Summative evaluation techniques for characterizing the accomplishments of the completed project
PD’s Response: Evaluation Plan
46. Workshop on Evaluation of Educational Development Projects
http://www.nsf.gov/events/event_summ.jsp?cntn_id=108142&org=NSF
NSF’s User Friendly Handbook for Project Evaluation
http://www.nsf.gov/pubs/2002/nsf02057/start.htm
Online Evaluation Resource Library (OERL)
http://oerl.sri.com/
Field-Tested Learning Assessment Guide (FLAG)
http://www.wcer.wisc.edu/archive/cl1/flag/default.asp
Science education literature
Other Sources
47. Identify the most interesting, important, or surprising ideas you encountered in the workshop on dealing with project evaluation
Reflection on Project Evaluation
48.
Questions
49. BREAK
15 min
50.
BREAK
1 min
51. NSF’s Broader Impacts Criteria
52. NSF proposals evaluated using two review criteria
Intellectual merit
Broader impacts
Most proposals
Intellectual merit done fairly well
Broader impacts done poorly
NSF Review Criteria
53.
To increase the community’s ability to design projects that respond effectively to NSF’s broader impacts criterion
Workshop Goal
54. At the end of the workshop, participants should be able to
List categories for broader impacts
List activities for each category
Evaluate a proposed broader impacts plan
Develop an effective broader impacts plan
Workshop Objective
55.
Broader Impacts Categories and Activities
56. TASK:
What does NSF mean by broader impacts?
Individual Exercise ---- 2 min
Individually write a response
Exercise: Broader Impacts Categories
58. NSF Review Criteria
Every NSF solicitation has a set of questions that provide context for the broader impacts criterion
Suggested questions are a guide for considering intellectual merit and broader impacts
Suggested questions are NOT
A complete list of “requirements”
Applicable to every proposal
An official checklist
59. NSF Suggested Questions for Broader Impacts
Will the project
Advance discovery - promote teaching & learning?
Broaden participation of underrepresented groups?
Enhance the infrastructure?
Include broad dissemination?
Benefit society?
NOTE: Broader impacts includes more than broadening participation
60. Will the project
Involve a significant effort to facilitate adaptation at other sites?
Contribute to the understanding of STEM education?
Help build and diversify the STEM education community?
Have a broad impact on STEM education in an area of recognized need or opportunity?
Have the potential to contribute to a paradigm shift in undergraduate STEM education?
TUES Suggested Questions for Broader Impacts
61. TASK:
Identify activities that “broadly disseminate results to enhance scientific and technological understanding”
Pay special attention to activities that will help transport the approach to other sites
Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 7 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Dissemination Activities
63. Dissemination to general public
Applies to research and education development proposals
See handout
Dissemination to peers (other instructors)
Education projects should include strategies for
Making other instructors aware of material and methods
Enabling other instructors to use material and methods
PD’s Response: Two Types of Dissemination
64. Partner with museums, nature centers, science centers, and similar institutions to develop exhibits in science, math, and engineering.
Involve the public or industry, where possible, in research and education activities.
Give science and engineering presentations to the broader community (e.g., at museums and libraries, on radio shows, and in other such venues).
Make data available in a timely manner by means of databases, digital libraries, or other venues such as CD-ROMs.
General Dissemination -- NSF’s Representative Activities I
65. Publish in diverse media (e.g., non-technical literature, and websites, CD-ROMs, press kits) to reach broad audiences.
Present research and education results in formats useful to policy-makers, members of Congress, industry, and broad audiences.
Participate in multi- and interdisciplinary conferences, workshops, and research activities.
Integrate research with education activities in order to communicate in a broader context.
General Dissemination -- NSF’s Representative Activities II
66. Standard approaches
Post material on website
Present papers at conferences
Publish journal articles
Consider other approaches
NSDL
Specialty websites and list servers (e.g., Connexions)
Targeting and involving a specific sub-population
Commercialization of products
Beta test sites
Focus on active rather than passive approaches
PD’s Response: Peer Dissemination Strategy
67.
Questions
68.
Reviewing a Project’s Broader Impacts 68
69. Review the Project Summary & the excerpts from the Project Description
Assume the proposal is a TUES Type 1 with a $200K budget and a 3-year duration and that the technical merit was considered meritorious
Write broader impacts section of a review
Identify strengths and weaknesses
Use a bullet format
(Extra) Long Exercise ---- 9 min
Think individually -------- ~4 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 9 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Reviewing a Sample Proposal
71. Scope of activities
Overall, very inclusive and good
Well done but “standard things”
Did not address the issue of quality
No clear-cut plan
Activities not justified by research base
Dissemination
Limited to standard channels
Perfunctory
Program Officers’ Views – Review Comments
72. Industrial advisory committee a strength
Collaboration with other higher ed institutions
Institutions appear to be quite diverse but use of diversity not explicit
Interactions not clearly explained
Sends mixed message – raises questions about partnership effectiveness
High school outreach
Real commitment not evident
Passive -- not proactive
High school counselors and teachers not involved
Program Officers’ Views – Review Comments (cont)
73. Modules are versatile
Broader (societal) benefits
Need for materials not well described
Value of the product not explained
Not clear who will benefit and how much
Assessment of broader impacts not addressed
Program Officers’ Views – Review Comments (cont)
74. TASK:
Identify desirable features of a broader impacts plan or strategy
General aspects or characteristics
Long Exercise ---- 7 min
Think individually -------- ~2 min
Share with a partner ----- ~2 min
Report in local group ---- ~2 min
Watch time and reconvene after 7 min
Use THINK time to think – no discussion
Selected local facilitators report to virtual group
Exercise: Characteristics of Broader Impacts Plans
76. Include strategy to achieve impact
Have a well-defined set of expected outcomes
Make results meaningful and valuable
Make consistent with technical project tasks
Have detailed plan for activities
Provide rationale to justify activities
Include evaluation of impacts
Have a well stated relationship to the audience or audiences
PD’s Response: Characteristics of Broader Impacts Plan
77.
WRAP-UP
78. Use and build on NSF suggestions
List of categories in solicitations
Representative activities on website
Not a comprehensive checklist
Expand on these -- be creative
Develop activities to show impact
Integrate and align with other project activities
Summary -- Broader Impacts
79. Help reviewers (and NSF program officers)
Provide sufficient detail
Include objectives, strategy, evaluation
Make broader impacts obvious
Easy to find
Easy to relate to NSF criterion
Summary -- Broader Impacts (cont)
80. Make broader impacts credible
Realistic and believable
Include appropriate funds in budget
Consistent with
Project’s scope and objectives
Institution's mission and culture
PI’s interest and experience
Ensure agreement between Project Summary and Project Description
Summary -- Broader Impacts (cont)
81. Identify the most interesting, important, or surprising ideas you encountered in the workshop on dealing with broader impacts
Reflection on Broader Impacts
82.
Grant Proposal Guide
GPG
http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg
Broader Impacts Activities
http://www.nsf.gov/pubs/gpg/broaderimpacts.pdf
REFERENCES
83.
Questions
84.
To download a copy of the presentation, go to:
http://step.eng.lsu.edu/nsf/facilitators/
Please complete the assessment survey -- go to:
http://www.step.eng.lsu.edu/nsf/participants/
Thanks for your participation!