Promising practices of flexibilities, efficiencies, and differentiation for measures of student learning/outcomes
Did You Know?
If a measure of student learning/outcomes (MSL/O) is grounded in the Colorado Academic Standards, you can use it! MSL/O systems and structures evolve over time and can be changed. Legislative requirements for teachers include:
- individual attribution (student results are attributed to one licensed person),
- collective attribution (student results on a measure are attributed to more than one licensed person),
- statewide assessment results, when available, and
- results from the Colorado growth model (CGM), when available.
The last two requirements do not apply to many teachers in Colorado because statewide assessments and CGM results are not available in all subjects and grades.
Find opportunities to capitalize on the great work teachers are already doing to maximize student outcomes. Districts that have incorporated any level of teacher voice overwhelmingly report higher fidelity of implementation and greater buy-in from educators. Unlike teacher evaluations, principal and specialized service professional (SSP) evaluations do NOT have a collective or individual attribution requirement. Another legislative requirement is that the data need to be available at least two weeks prior to the last day of school in order to be used for an MSL/O. WIDA ACCESS achievement results are generally available two weeks prior to the last day of school. The PSAT and SAT are scored on the same scale; therefore, results from the PSAT can be used to set meaningful SAT goals.
Ideas for Measures of Student Learning/Outcomes
Example of utilizing the SPF/DPF to fulfill multiple MSL requirements
A number of districts have found that three of the four requirements for MSLs are met (collective attribution, statewide assessment results, and results from Colorado Growth Model) when they use School and/or District Performance Frameworks (SPF/DPF). Some districts set SPF/DPF targets as a change in overall percent of points earned on the SPF/DPF from year to year while others use the performance data from the SPF/DPF sub-indicators to guide school targets.
Example of using the same assessment for individual and collective measures
A single assessment can be used for both the individual and the collective attribution requirements (e.g., state and/or vendor-based assessments). For example, one district uses Northwest Evaluation Association (NWEA) Math as the school collective measure (as math is indicated as a performance priority in its Unified Improvement Plan (UIP) and all teachers are expected to incorporate math into their curriculum). This district also uses NWEA Math as an individual measure for its math teachers. When this approach is taken, districts should be aware of the potential for "double-dipping," in which a single measure has a disproportionate influence on the overall rating.
Explore local decisions in evaluation
Some districts found that putting more weight on local assessments created stability in the MSL system, using statewide data to inform conversations about local measure results. Assessments already in use at the district, school, or classroom level are valuable for informing instruction and monitoring student progress and can easily serve as MSLs. One district has moved toward student-led conferences in which students set and monitor their own growth goals on local measures, an approach that emphasizes reaching goals students care about.
Example of aligning MSLs to UIP goals
To streamline efforts, many districts merged their Unified Improvement Plan (UIP) process with the development of appropriate measures aligned with school performance priorities. Interim and benchmark measures identified in the UIP were also used as part of their MSL systems.
Example of spacing the MSLs throughout the year
To encourage teacher buy-in and deter too much emphasis on a single measure, some districts aligned MSLs to class units or trimester goals spread throughout the year. One district piloted a process with its middle school language arts and math teachers in which teachers identified what they wanted students to know and be able to do in learning progressions throughout the year. Teachers were heavily invested and ultimately created Student Learning Objectives (SLOs) in the service of learning, not simply for the purpose of evaluation.
Example of using existing Professional Learning Communities to create MSLs
One district has brought role-alike educators together in existing Professional Learning Communities (Professional Development, UIP, etc.), where discussions about the teaching and learning cycle are used to develop MSLs. Administrators and teachers came together to design a process that allows teachers to create and submit an MSL for approval to be used in their evaluations. Administrators reported that these conversations brought a different perspective and allowed them to draw on the knowledge and talents of teachers, relieving administrators of the burden of creating the whole structure themselves. This method also doubles as professional development and frequently yields more rigorous measures, as districts report that teachers tend to set higher expectations for students. Year-long PLCs in which teachers and administrators collaborate on the process can also inform the professional practice portion of teacher evaluations through the level of engagement they reveal.
Example of connecting professional practices to MSLs
During the mid-year conversation, one district discussed progress and performance on MSL goals and connected them to the professional practices observed at that point in the year. One teacher from this district was able to make explicit connections between MSL and professional practice expectations, and the evaluator used the mid-year conversation more efficiently by discussing both sides of the evaluation. In addition, the teacher was highly engaged in the SLO/MSL process, which then informed progress on specific professional practices.
Example of using evaluation data to inform system changes
Many districts use the charts and reports feature in the Colorado Performance Management System to determine whether teacher professional practice ratings and MSL ratings are correlated. This data point is then used to guide discussions with both teachers and evaluators about how these ratings are interrelated. For example, if MSLs are rated high and professional practices are rated low, this may indicate that MSL rating criteria are not rigorous enough or that evaluators are rating professional practices too harshly. This data point can be used to drive the "why" discussion.
Colorado Stories of Student Learning/Outcomes
Measuring Up in Miami-Yoder
Thirty-eight miles east of Colorado Springs lies Miami-Yoder School District, serving approximately 316 students. Here, the staff of the elementary school worked hard to develop differentiated MSLs that are truly a reflection of teacher effectiveness as valued by the instructional staff and administration. Each grade/content area created a team to describe the assessments they felt effectively indicated student achievement and growth. These assessments are unique and valued by each team member as tools to drive instruction and ensure student success. Each teacher uses the School Performance Framework as an indicator of effectiveness, but then created a unique MSL that includes the identified assessments they value, such as Teaching Strategies Gold, DIBELS Next Benchmark Data, NWEA Reading, NWEA Math, and NWEA Language. The biggest challenge the team faced was facilitating a conversation in order to develop a solid indicator of effectiveness. "It takes a dedicated team to dive deeply into the student data and identify areas which they can truly affect and use to form strong instruction. The use of individualized MSLs has given teachers more of a voice, not only in their evaluation, but also in the development of curriculum, scheduling, and interventions," says elementary principal Sheila Hartley.