By Elizabeth Tarbutton
Teacher evaluation is a hot topic in education these days, but has anyone stopped to ask what the purpose of it all is? I think most evaluators would say that the purpose is to grow better educators who create meaningful change in schools. To effect that change, evaluators collect a lot of data on students and teachers. I would like to think that these data are commonly used to have a meaningful, actionable impact on student achievement. Unfortunately, many states, districts, and schools lack protocols for how data should be used. As a result, data are often misunderstood and used as an autopsy rather than as a tool for improvement. For three years I served as a data coach while also carrying the responsibilities of classroom teaching, helping my peers figure out what their data meant and how to use them to improve student achievement. If meaningful data protocols were more widely employed, educators would be able to improve their instruction and have a significant impact on student learning.
Subjective data come in many forms during teacher evaluation: teacher observations, informal formative assessments, student surveys, school culture, and more.
In my experience, these data are most useful when protocols for the generation and analysis of these data include the following elements:
- The intent for subjective data collection is clear
- The evidence collected has a purpose that ties back to the intent for data collection
- Instruments used for data collection are intentional and thoughtful (e.g., the use of technology enhances data collection rather than merely being novel)
- There is training and discussion as to what the evidence means for all players
- Time is built in to reflect on data
- Meaningful goals are created from the data
- Action plans are created to enact goals
- Action plans are reflected on and amended, as necessary
Objective data most commonly come in the form of student assessment data. As a data coach, the most overwhelming feedback I received was how meaningful and transformative it was for educators to finally understand what assessment data meant and how they could leverage those data to differentiate instruction in their classrooms. The scary thing about this feedback is that, for years, educators administered assessments but never understood or used the assessment results. Helping educators understand what data mean allows them to use assessments as a tool to improve the classroom experience and the learning of their students.
In 2011 I received student data from the state test on two of my incoming students (we get state assessment data on our students after they start the new school year in a new class). Bryan’s score improved 650 points from the year before, while Austin’s score decreased by 95 points. Bryan went from ‘low unsatisfactory’ to ‘low unsatisfactory’ (his score had been extremely low the previous year), while Austin stayed at ‘mid Advanced’. Yet according to the Colorado Growth Model, Bryan had inadequate growth, while Austin had adequate growth. Perplexed, I looked into why this was the case and learned that the statistics applied to students in the Colorado “Growth” Model are ranking statistics: each student’s gain is ranked against the gains of academic peers with similar score histories, so the model should truly be called the Colorado “Rank” Model. This exemplifies why data analysis needs to be appropriate and meaningful.
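For readers who want to see the mechanics behind this paradox, the sketch below is a deliberately simplified, hypothetical illustration, not the actual Colorado Growth Model (which uses quantile regression). It only shows the core idea: a percentile-based growth measure ranks a student's gain against the gains of academic peers, so a large raw gain can still rank low if peers with similar starting scores gained even more, and a raw decline can rank mid-pack if peers declined too. All peer numbers here are invented for illustration.

```python
# Hypothetical sketch of a percentile-based ("rank") growth measure.
# NOT the real Colorado Growth Model; peer gains below are invented.

def growth_percentile(student_gain, peer_gains):
    """Percent of academic peers whose score gain this student's gain exceeds."""
    below = sum(1 for g in peer_gains if g < student_gain)
    return round(100 * below / len(peer_gains))

# Peers of a student who scored very low the prior year: such students
# often post large rebound gains, so even a 650-point improvement can
# rank at the bottom of this (invented) peer group.
low_scorer_peer_gains = [700, 720, 750, 800, 820, 900, 950, 1000]
print(growth_percentile(650, low_scorer_peer_gains))   # 0th percentile

# Peers of a high-scoring student: gains near the top of the scale are
# small or negative, so a 95-point drop can still land mid-pack.
high_scorer_peer_gains = [-200, -150, -120, -90, -60, -30, 0, 20]
print(growth_percentile(-95, high_scorer_peer_gains))  # 38th percentile
```

Under a rank-based measure, "growth" is relative standing among peers, not absolute improvement, which is exactly why a 650-point gain can be labeled inadequate.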
After successfully coaching educators to interpret and use data to inform their instruction, I have seen test scores increase by as much as 55% in one year. What I have learned is that protocols need to be in place both for creating assessments that generate meaningful data and for reflecting on assessment data to inform instruction. These are the key elements of successful data protocols.
Protocols for Creating Meaningful Assessments should include these elements:
- Assessments should be designed to assess specific student learning
- Evidence of student learning should be mutually determined when creating the assessment
- Grading rubrics should be written so that student mastery is easily identifiable via key elements of performance
- Rubrics should highlight key advances from one level of mastery to the next such that it is easy to identify methods of differentiation to promote student improvement
- Assessments should be timely and administered so that educators and students can act on the results
- Assessments should take minimal time away from classroom instruction and should ultimately enhance it
Protocols for Reflecting on Assessment Data should include these elements:
- Educators and administrators should be trained as to what assessment data mean
- Data should be analyzed/processed in a meaningful, appropriate manner
- Educators should be given time to analyze assessment data using common procedures
- Educators should be given time to collaboratively reflect on assessment data
- Educators should be given time to plan a “response to data action plan” for their students
- Students should be given ownership of their data:
  - Students should be included in analyzing their data
  - Students should be guided in creating, reflecting on, and amending goals based on their assessment data
  - Students should know their resultant learning plan and be given action items to enact it and reach their goals
- Parents should be included in the data conversations:
  - Parents should be informed which assessments their student is given and the purpose of each assessment
  - Parents should receive their student’s data and be trained in what those data mean
  - Parents should be informed of the educational decisions being made about their student as a result of the assessment data
When all players are brought to the table, data are used to identify ways to improve the student learning experience. When data are understandable and meaningful, the mounds of data collected during educator evaluation can drive meaningful change in the education profession.
Elizabeth Tarbutton is a middle school math teacher at Hill Middle School in Denver, CO. She participated in the VIVA CEA Idea Exchange: Ensuring an Effective and Supportive Teacher Licensure and Renewal System in Colorado.