1. LISD Data Camp August 25 and 26, 2009
LISD TECH Center
Jennifer DeGrie and Stan Masters
2. Welcome Necessary Forms
SB-CEUs
SHU Credit
W-9
Security Forms
3. Session 1 Essential Questions Using State Data to Identify School Improvement Goals
How do we use inquiry to identify our statewide assessment strengths and challenges?
How do we use this data to identify school improvement goals?
How do we engage staff in the data analysis process?
4. Session 1 Outcomes Analyze statewide assessment performance to identify strengths and challenges
Identify questions raised by their data
Align data with their school improvement goals
Develop an action plan for engaging staff in the data analysis process
5. School Improvement Process
7. Norms for Our Work Participate actively
Actively listen
Seek application
Press for clarification
Honor time agreements and confidentiality
Keep ‘side bars’ to a minimum and on topic
Take care of adult learning needs
8. FERPA/HIPAA Pre-Test
9. Data Roles What roles will each member of your team play in today’s work?
Identify roles
Describe responsibilities
Hold each other accountable
10. Understanding Statewide Assessment Reports
11. Performance Levels Four levels for all MEAP tests
“Advanced” = “Wow”
“Proficient” = “You’ve Got It”
“Partially Proficient” = “Nearly There”
“Not Proficient” = “Oops”
12. Scaled Scores Represent the stable score on the assessment that is reported for each student.
Psychometricians establish the scaled-score ranges for each performance level.
The floor of the “Proficient” performance level is calculated as: grade level × 100 (see the sketch below)
5th grade “Proficient” floor = 5 × 100 = 500
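A minimal sketch of this convention in Python (the grade × 100 rule comes from the slide above; the function name is hypothetical):

```python
# Floor of the "Proficient" scaled-score range, per the MEAP
# convention described above: grade level times 100.
def proficient_floor(grade: int) -> int:
    return grade * 100

print(proficient_floor(5))  # 500, matching the 5th-grade example
```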
13. Data Driven Dialogue The idea is that from this dialogue, new understandings may emerge. This shared understanding forms the base from which we may begin to design changes that will affect our students in positive ways.
14. Data Driven Dialogue Predictions
Reflect and record several of your preliminary thoughts about what you believe the data will tell.
I assume . . .
I predict . . .
I wonder . . .
Some possibilities for learning that this data may present . . . So, Sarah – what kind of data are we looking at today?
15. Data Driven Dialogue Observations Reflect and record what you see in the table and/or graphic representation
I observe that . . .
Some patterns/trends that I notice are . . .
I can count . . .
I’m surprised that I see . . .
16. Data Driven Dialogue Inferences
Reflect and record what you believe is true that explains what you see in the table and/or graphic representation
I believe that the data suggests . . . because . . .
Additional data that would help me verify/confirm my explanation is . . .
I think the following are appropriate suggestions/solutions/responses that address the needs implied by the data:
Additional data that would help guide implementation of the suggestions/solutions/responses and determine if they are working includes:
17. Level I Inquiry - Content Area Gap Analysis What content area(s) are not meeting state AYP objectives?
What is the trend over time for the identified content areas?
What cohort of non-proficient students appears in more than one identified content area?
Which students have improved/declined from one year to the next?
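The cohort questions above can be answered with a few lines of pandas; a minimal sketch, assuming a hypothetical results table with made-up column names:

```python
import pandas as pd

# Hypothetical extract of statewide assessment results; real column
# names will depend on your student information system.
meap = pd.DataFrame({
    "student_id":   [101, 101, 102, 103, 103],
    "content_area": ["Reading", "Math", "Reading", "Reading", "Math"],
    "proficient":   [False, False, True, False, False],
})

# Cohort of students non-proficient in more than one identified
# content area (the Level I cohort question).
non_prof = meap[~meap["proficient"]]
counts = non_prof.groupby("student_id")["content_area"].nunique()
print(counts[counts > 1].index.tolist())  # [101, 103]
```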
18. Level II and III - Strand Gap Analysis and Benchmark Gap Analysis
How did students perform on strands in the identified content areas?
What is the trend over time for the identified strands?
How many students in strands with low performance are proficient in the content area?
What cohort of students appears in more than one strand with low performance?
How did students perform on GLCE in underperforming strands?
What is the trend over time for the identified GLCE?
How many students underperforming in the GLCE were proficient in the strand?
What cohort of students appears in more than one GLCE with low performance?
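The strand-level and GLCE-level questions share one shape: compare performance at the lower level against proficiency at the higher level. A minimal sketch with hypothetical per-student flags:

```python
import pandas as pd

# Hypothetical flags: low performance on a strand versus overall
# proficiency in the content area.
strands = pd.DataFrame({
    "student_id":         [101, 102, 103],
    "low_on_strand":      [True, True, False],
    "content_proficient": [False, True, True],
})

# Students who scored low on the strand yet are proficient in the
# content area overall.
low_but_proficient = strands[strands["low_on_strand"]
                             & strands["content_proficient"]]
print(len(low_but_proficient))  # 1 (student 102)
```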
19. Naming Convention for Reports Which 4th grade students are not making adequate growth based upon the Fall 2007 and Fall 2008 Reading MEAP?
20. S.M.A.R.T Goals and Action Plan Begin to develop your S.M.A.R.T goal and action plan, aligning with your school improvement planning
Graphic
Template
Action Steps
Please write your S.M.A.R.T goal at the top of your implementation chart.
21. LISD Data Camp August 25 and 26, 2009
LISD TECH Center
Jennifer DeGrie and Stan Masters
22. Session 2 Essential Questions Using School Data to Clarify and Address the Problem
What data do we have about our instructional program in a low performance area?
What strategies need to be part of our school improvement plan to address high impact causes of low performance?
What professional development do our teachers need to implement the strategies?
23. Session 2 Outcomes Identify additional data needed to clarify the problem
Identify specific content standards that need to be monitored
Identify instructional program and staff development needs
Develop an action plan to engage staff in clarifying the problem
24. School Improvement Process
25. Norms for Our Work Participate actively
Actively listen
Seek application
Press for clarification
Honor time agreements and confidentiality
Keep ‘side bars’ to a minimum and on topic
Take care of adult learning needs
26. Data Roles What roles will each member of your team play in today’s work?
Identify roles
Describe responsibilities
Hold each other accountable
27. Understanding Statewide Assessment Reports
A data warehouse connects databases to each other, usually through the use of a unique student identifier.
With a warehouse, districts can not only answer the questions --
Who are our students?
Who are our teachers?
How effectively is our system aligning our teachers to who we have as students?
What are our results?
What do students know and what are they able to do?
What processes are we using?
How do teachers, students, and parents perceive the learning environment? --
They will also be able to answer the questions that require the intersection of the different databases, such as:
NCLB -- Where are the student learning gaps? Are there student groups that are not continuously improving?
What processes (instructional strategies, curriculum, programs) make a difference for who we have as students?
Is there a connection between student achievement results and students' perceptions of the learning environment?
Ultimately, we want to answer the question represented by the middle, or the four-way intersection:
Which processes work best for who we have as students? Perceptions might even give us the best answer to that question.
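A minimal sketch of that four-way intersection, assuming three hypothetical tables keyed on the unique student identifier (all names and values are made up):

```python
import pandas as pd

# Three hypothetical databases, each keyed on the unique student
# identifier the warehouse uses to connect them.
demographics = pd.DataFrame({"student_id": [101, 102, 103],
                             "group": ["ELL", "GenEd", "ELL"]})
results = pd.DataFrame({"student_id": [101, 102, 103],
                        "reading_proficient": [False, True, True]})
perceptions = pd.DataFrame({"student_id": [101, 102, 103],
                            "feels_supported": [True, False, True]})

# The "intersection" questions require joining across databases.
warehouse = (demographics
             .merge(results, on="student_id")
             .merge(perceptions, on="student_id"))

# e.g., are there student groups that are not improving?
print(warehouse.groupby("group")["reading_proficient"].mean())
```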
29. Keys to Quality Classroom Assessment Clear Purposes
Clear Targets
Good Design & Methods
Sound Communication
Student Involvement in all keys!
30. Purposes of Assessments Assessment for learning
formative
(monitors student progress during instruction)
placement
(given before instruction to gather information on where to start)
diagnostic
(helps find the underlying causes of learning problems)
interim
(monitors student proficiency on learning targets)
Assessment of learning
summative
(the final task at the end of a unit, a course, or a semester)
31. Formative Assessment Techniques Oral Language
Accountable talk, nonverbal cues, value lineups, retellings, think-pair-share, whip around
Questions
Response cards, hand signals, personal response systems, Socratic seminars
Writing
Interactive writing, read-write-pair-share, summary writing, RAFT
Tests
Multiple choice with misconceptions as distracters, short answer with word banks, true-false items with correction for the false items
32. Components of a Summative Authentic Assessment Task What “new” prompt will you use to trigger “old” learning from prior instruction?
What directions will you give to the students completing the task?
What procedures will you use as the teacher administering the task?
What scoring rubric will you use to evaluate the quality of the students’ task?
33. Kinds of Learning Targets Stiggins, Arter, Chappuis, and Chappuis. (2006). Classroom Assessment for Student Learning. Portland, OR: ETS. Knowledge – The facts and concepts we want students to know and understand.
Reasoning – Students use what they know to reason and solve problems.
Skills – Students use their knowledge and reasoning to act skillfully.
Products – Students use their knowledge, reasoning, and skills to create a concrete product.
Dispositions – Students’ attitudes about school and learning.
34. Methods of Assessment Selected response
one answer is correct; sometimes taken from a list
Extended written response
constructed into sentences; criteria given for quality
Performance assessment
observed product of learning; criteria given for quality
Personal communication
interaction with student; uses checklist or criteria
35. Methods of Assessment
36. Seven Strategies of Assessment for Learning Where am I going?
Clear targets
Models of work
Where am I now?
Descriptive Feedback
Student self-assessment/goal setting
How can I close the gap?
Lessons that focus on one target at a time
Teaching self-reflection
Student record-keeping
37. Level V Inquiry - School Variable Analysis Is there a difference in the core curriculum between proficient and non-proficient students?
Is there a difference in teachers or the sequence of teachers between proficient and non-proficient students?
Is there a difference between the time students have been in the building/district?
What is the correlation between classroom/course grades and statewide assessment performance?
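The correlation question can be answered directly once grades and scaled scores sit in one table; a sketch with hypothetical values:

```python
import pandas as pd

# Hypothetical grades and scaled scores for a handful of students.
df = pd.DataFrame({
    "course_grade":      [3.5, 2.0, 3.0, 1.5, 4.0],  # GPA-style grades
    "meap_scaled_score": [520, 480, 505, 470, 540],
})

# Pearson correlation between classroom grades and statewide scores.
print(df["course_grade"].corr(df["meap_scaled_score"]))
```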
38. Level VI Inquiry - Student Variable Analysis Is there a difference in the behavior patterns between proficient and non-proficient students?
Is there a difference in attendance between proficient and non-proficient students?
Is there a difference in perception of school between proficient and non-proficient students?
Is there a difference in participation in extracurricular activities between proficient and non-proficient students?
39. Level VII Inquiry - School Improvement Variables Analysis How effective are the remediation strategies for non-proficient students?
How will the school/district perform on the next round of statewide assessment?
40. Naming Conventions Assessments and Exams
School Year
(e.g., 2008-2009)
Name of Course/Grade
(e.g., US History and Geography or 4th Grade)
Name of Assessment or Exam
(e.g., World War II, EXPLORE, or Dolch Words)
You may also identify the timing of the assessment
(e.g., Beginning, Middle, End or Fall/Spring or Pre/Post)
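A small helper illustrating the convention; the function name and the hyphen separator are assumptions, not part of the convention itself:

```python
# Hypothetical helper that assembles a report name from the parts
# listed above (school year, course/grade, assessment, optional timing).
def report_name(school_year, course_or_grade, assessment, timing=""):
    parts = [school_year, course_or_grade, assessment]
    if timing:
        parts.append(timing)
    return " - ".join(parts)

print(report_name("2008-2009", "4th Grade", "Dolch Words", "Fall"))
# -> 2008-2009 - 4th Grade - Dolch Words - Fall
```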
41. Fall Professional Development student learning summative assessments
end of unit, end of course?
standards-based grading and reporting
teacher access center, standards and competencies?
use and analysis of multiple measures of data
creating intersections, assessment templates?
formative assessment strategies
descriptive feedback, self-monitoring, checking for understanding?
studying student work through collaborative inquiry
protocols, procedures?
42. Fall Professional Development Planner student learning summative assessments
“Classroom Assessment for the 21st Century”
standards-based grading and reporting
“Grading and Reporting for the 21st Century”
use and analysis of multiple measures of data
“Using Classroom Data to Monitor Student Progress”
formative assessment strategies
“Classroom Assessment for the 21st Century”
“Grading and Reporting for the 21st Century”
studying student work through collaborative inquiry
“Examining Student Work to Inform Instruction”
43. Data Camp Follow up professional development 2009-2010 Examining Student Work to Inform Instruction
How do we structure time to examine student work?
How do we reach consensus about what proficient work looks like?
How do we monitor that teams are regularly examining student work and using the data to inform their instruction?
Using Classroom Data to Monitor Student Progress
What is the difference between summative and formative assessments?
What data do schools need to collect to monitor their student progress?
How do we communicate our expectations for how teachers monitor student progress?
44. S.M.A.R.T Goals and Action Plan Finish your S.M.A.R.T goal and action plan, aligning with your school improvement planning
Graphic
Template
Action Steps
Please write your S.M.A.R.T goal at the top of your implementation chart.
45. LISD Data Camp August 25 and 26, 2009
LISD TECH Center
Jennifer DeGrie and Stan Masters