
District Case Story - Greeley

Greeley 6 school district enrolls 19,623 students across 30 schools.

District Contacts:

  • Nancy Devine, Chief Academic Officer

How do we find the time for UIP? What was our timeline for engaging in UIP?

The district began formal work on the UIP plan in February and submitted the initial plan in the spring as required by state Title guidelines. The revised plan was submitted in January 2011 as required by the new accreditation process. The district participated in a CADI review in 2005, which indicated that District 6 was a district of schools rather than a school district. This was the result of a long history of site-based management and resulted in limited support for schools from the district. The results of the CADI report guided the development of both the 1st and 2nd district strategic plans which, in turn, guide the annual district and school improvement plans. Together, these plans provide a framework for the next year’s critical actions and next steps.

How did we build staff capacity to engage in UIP?

In the summer, the district staff provided a 5-day Leadership Institute where school teams were trained in team building, roles and responsibilities of school-based teams, school performance frameworks (SPFs), root cause analysis, and action planning. School teams were then given 3 additional days to go back and work with their school staff.

We also hired external consultants to assist the school teams. In some cases, the external consultants helped develop the UIP plan. Many of these schools had had SST audits, so they had additional data from those experiences. The consultants worked monthly, or sometimes more often, with school teams, collecting data, doing walkthroughs, and providing other implementation coaching support.

The district provided a lot of data and data analysis for the schools. While the district provided a lot of support for the schools, we didn’t provide nearly enough for some. This support requires a lot of resources. The schools received far more help developing their plans than implementing them.

How did we implement UIP at the district level? At the school level?

Weld District 6 in Greeley tried out the UIP template in the spring of 2010. The district was on Title 1 District Improvement at that time. Just a couple of people were involved initially. We drew strategies from the district strategic plan. This UIP plan was submitted in spring of 2010. The district reworked the plan in Nov/Dec 2010. Again, we drew strategies and action plans from the district strategic plan to develop the annual district and school improvement plans. This was a standard district practice at the time the UIP process was first implemented.

People involved in its development (both initial plan and reworked plan) were director-level staff. School plans were driven by the district plan.

Challenge:

  • Pulling multiple plans together midway through the 5-year cycle of our previous plans (Title plans, strategic plan, etc.). This will be easier in the future. Trying to pull the plans together and mesh federal and state requirements that might differ from the UIP requirements was frustrating, partly because some of the plans were already in place. We have to transition to the UIP becoming the central focus of our planning. The struggle is to determine the relationship between the district strategic plan and the UIP: what’s appropriate for each?

A lot of the actions in the strategic plan are actions that happen at the school level, e.g., RtI, data team structures, training and use of data, etc., all of which had to be implemented at the school level. There was tight alignment between the district plan and what was going to happen at the school level. One of our critical questions was: What support would/could the district provide to support the action required to develop and implement these elements at the school level?

Challenges:

  • Sufficient resources to support the work.
  • Accountability for doing what was in the action plans.
  • Monitoring school actions. The challenge is not so much writing the plan as it is implementation of the plan, i.e., ensuring there are structures and resources in place to implement and monitor the plan both at the district and at the school level.

Accessing time and people to do this is always a challenge. There are not enough people at the district level to provide the kind of support needed to do this. We hired some external consultants to help, but we can’t hold consultants accountable the way we can our district staff. They were very helpful, providing support and expertise for the process. The consultants expanded the internal capacity of the district and schools, helping them to understand the process and the work. It takes more resources than we have now.

How do we find meaning in student performance data?

We used all of the usual suspects, i.e., CSAP, student engagement data, parent perceptions, and a lot of student and demographic data, particularly related to the impact of demographic changes on our work and whether the work is changing in response to those demographic changes. We also used CELA data and internal reading, math, and writing data. We didn’t use TELL data because not enough people responded to the survey for the results to be meaningful to our district.

How do we address root cause analysis?

These processes were not as effective as we would have liked last year. This year, we are going to use a lot more protocols than we did last year, and we will spend a lot more time on root cause. We did this work at the district level and then mandated that it be done at the school level. Schools need to take on the importance of this work. We all need time and expertise. We will spend more time building capacity this year. Staff used some protocols from Preuss’s Root Cause Analysis materials. Root cause analysis takes a lot of time, both to do it and to understand the concept and its value.

How did we work with our local school board? District accountability committee? School accountability committees?

The School Board was provided a lot of information in work sessions, at retreats, and sometimes in open board meetings. The board reviews the UIP at their retreat, but because the UIP is a subset of the district strategic plan, it is not a separate item.

The DAC reviewed the district plan. Principals were to go over their SPFs with their SAC members. SAC was to be involved with the development of schools’ plans. District staff provided ongoing coaching, especially for our highest need schools.

This continuous improvement system (the Colorado growth model, SPFs and DPFs, UIPs and their technical terminology) is so complex that it is not realistic to expect SACs and DACs to really understand it. The terminology and concepts are really difficult to understand. This really is rocket science; that is clear when we try to explain the growth model and the SPF framework to our staff. The DAC reviews the plans, but the terminology and the context behind it are a struggle for most members to really understand.

What advice would we give to other districts?

  • We should not set unrealistic expectations about perfection the 1st time around. There aren’t ways to make it shorter or easier.
  • Data, and having it readily available, is critical. We need a school profile, a document with all of the school’s data, e.g., how have demographics changed in all of the subgroups over the past few years? What are the changes, trends, etc. in student data broken down by subgroups?
  • Consider the follow-up questions about the data, not just the collection of the data. As a district we helped with the data profiles. Then the district and school teams could analyze the trends, ask critical questions, and determine root causes from these data profiles.
  • Don’t underestimate the time and expertise that this work requires. If we really want to do this right, it is going to take time and resources. In 5 years it may begin to look closer to what we’re aiming for. Implementation and monitoring are not part of what happens at the state level.
  • There must be adequate expectations, training, and resources provided by the state, and the state needs to recognize the realities of the work that is required after the plan has been submitted to the state.