
District Case Story - Garfield 16

The Garfield 16 school district enrolls 1,133 students across five schools.

District Contacts:

  • Dr. Ken Haptonstall, Superintendent

Why is it important to do UIP?

What has been the benefit of Unified Improvement Planning (UIP) for the district?

What did UIP add to our existing improvement planning efforts?

How do we find the time for UIP? What was our timeline for engaging in UIP?

The intent was to plan last year, implement this year, and then look at the data each year and revise. We all need to look at improvement planning as an ongoing, cyclical process.

We (the principals and school-level teams) met for about a month, every other week for eight hours at a time, looking through the data. This was in September and October. We probably won't do that in the future, but the reason we did it the first time around was that we wanted to be sure all of the staff understood all of the data, including demographic and perception data.

We carved out a lot of time. We have early release every Wednesday, and at least two of those Wednesdays were devoted to formulating UIP plans at the building level. Then, during our regular leadership team meeting time, we engaged everyone on the administrative UIP planning teams. The process won't continue to take so much time in the future. Last year was the initiation of the planning process. This year, we are looking at the new data, verifying our current plans, and revising them as needed. Since these are two-year plans, we won't need the kind of time we did initially.

How did we build staff capacity to engage in UIP?

We went to CDE trainings in June and August, and again in mid-September and October. The training processes and tools have improved significantly as we have all learned from trial and error.

The additional district training was on root cause analysis, using a book study to learn how to do it. What a root cause is and how it affects progress is pretty complex. Ken's review of other districts' plans helped him see that extra time on this part of the process would pay off in the long run. The staff began the study in September, after one of the CDE trainings. (The book the staff studied was Root Cause Analysis by Paul Preuss.)

How did we implement UIP at the district level? At the school level?

The UIP work was done at the school level first. Then we came together with our administrative team to review all of the school plans. We followed the review with a conversation about which elements of the school plans could be molded into a district-level plan that would move our district forward. We participated in a CDE CADI review at the same time, last fall. The CADI team's input, plus our district team's improvement planning, helped us put together a pretty strong plan for the district.

The schools, through their school improvement plans and the CDE CADI review, set the stage for developing the UIP. Plan development began last fall with the SPF and DPF and continued through the end of January. Plan implementation spans the end of last year and all of this year. The district administrative team includes the five principals, two assistant principals, the Directors of Finance and Technology, one school board member, and three building accountability committee (BAC) members who also represent the DAC.

The keys to the process going well for us:

  • We did quite a few trainings at the district level and went to CDE trainings last fall.
  • As a result of that training, we came to understand the purpose and process of UIP planning, the sources of data, and how to use that data to identify real root causes. These training activities, and their application to our planning, were beneficial for my staff, our DAC, and our board member, and contributed to the success of our UIP planning with staff.

How do we find meaning in student performance data?

Many of our improvement goals in the past were driven solely by achievement data, rather than by looking at other aspects of our schools to create a whole picture of student performance and the reasons for it. We filtered through a lot of data with the school-level teams, and now that they have a more comprehensive picture, I think it is easier for them to cull out the information they need. They know where to go for the information now, and they won't have to wade through everything.

We lacked perception data. As for the TELL data, we don't have enough people participating to get a large enough "n" for it to be of much help. Surveys are also a cost issue for us. We used demographic, program, and achievement data, which we have a lot of. We required ourselves to verify each root cause with at least two or three pieces of data. This helped us sort out the things that weren't real root causes and determine when we needed more data to substantiate what we thought could be our root causes.
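As a rough illustration of the "n" problem (a sketch of ours, not a district or CDE calculation): the 95 percent margin of error on a survey proportion shrinks only with the square root of the respondent count, so a small staff produces very noisy perception data. The respondent counts below are hypothetical.

    import math

    # 95% margin of error for a survey proportion, worst case p = 0.5.
    def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
        return z * math.sqrt(p * (1 - p) / n)

    # Hypothetical respondent counts, for illustration only.
    for n in (25, 100, 400):
        print(f"n = {n}: +/- {margin_of_error(n):.1%}")
    # n = 25 gives roughly +/- 20 points, far too wide to act on;
    # n = 400 narrows it to about +/- 5 points.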

The data balance is important. If you are able to get perception data and use it, balancing across all four of the Bernhardt domains is pretty critical to creating a true picture of the district's achievement. It helps ferret out root causes that may be nestled somewhere beyond the academic data. For example, the challenge may be a demographic problem; while you may not solve the demographic issue itself, there may be something inside that area (a root cause) that can be solved or that is leading to other, more solvable issues. So the balance is an important part of the overall analysis process and makes for more effective plans.
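To make the two rules above concrete (corroborate each candidate root cause with at least two or three pieces of data, and keep the evidence balanced across Bernhardt's four domains of demographic, perception, student learning, and school process data), here is a small illustrative sketch. The candidate causes and evidence tags are invented, not drawn from the district's actual plan.

    from collections import defaultdict

    # Each entry pairs a hypothetical candidate root cause with the
    # Bernhardt domain of one data point supporting it.
    evidence = [
        ("reading interventions vary widely across buildings", "school processes"),
        ("reading interventions vary widely across buildings", "student learning"),
        ("reading interventions vary widely across buildings", "demographic"),
        ("staff turnover disrupts program continuity", "school processes"),
    ]

    support = defaultdict(list)
    for cause, domain in evidence:
        support[cause].append(domain)

    # District rule: keep causes backed by at least two pieces of data;
    # also note whether the evidence spans more than one domain.
    for cause, domains in support.items():
        verified = len(domains) >= 2
        cross_domain = len(set(domains)) >= 2
        print(f"{cause}: verified={verified}, cross-domain={cross_domain}")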

In the past, the superintendent did most of the data collection and analysis, but now collecting and analyzing data is much more of a shared process. This year, as we revise our plan, we will be able to sort through our data for what is relevant and useful.

How do we address root cause analysis?

Root cause analysis is difficult work because we tend to make assumptions at the surface level without getting down to the real issues behind our lack of success. The superintendent suggests that districts look at models of successful systems. Ask, "What are they doing that increases student achievement?" and then look for solutions that match up with the school's or district's identified root causes.

Getting a handle on the whole root cause process is essential. It was helpful that CDE changed the terminology from "priority needs" to "priority challenges"; it makes much more sense.

How did we use SST or CADI reviews in our planning efforts?

While we were meeting to develop our plan, we were also meeting for our CADI review. We were demonstrating our data review process for the CADI team, so for us this was a great opportunity: having both the UIP and CADI processes going on simultaneously gave us an inside and an outside review at the same time. It also gave the work more of a purpose.

The CADI review gave us simultaneous support during our UIP planning. We were pretty clear about where we were and where we needed to go, so when we wrote our implementation plan, it was pretty straightforward in terms of what the implementation benchmarks needed to look like and the actions we needed to take.

Even if districts don't have an SST or CADI visit scheduled, they might consider using the CADI and SST rubrics as part of their UIP planning process and tools. The grant that follows the SST or CADI review is also very helpful.

How do we monitor the progress of the implementation of our UIP?

Progress monitoring is built into the building goals and implementation process in our UIPs. So when the district leadership team meets monthly, they review and monitor the process as well as the plan itself. Administrators share their progress and issues with each other from school to school through this meeting.

How do we engage in UIP when we are a small district?

How did we work with our local school board? District accountability committee? School accountability committees?

One person from each of the building accountability committees also sits on the DAC, which is good. It encourages a lot of communication within and across the system; however, the time commitment it asks of community stakeholders can be overwhelming.

The DAC met in December and January for plan reviews and revisions and, finally, DAC approval. The DAC presented the plan to the school board including the planning process and resulting plan. The school board approval followed the DAC presentation.

It was pretty much smooth sailing. We didn't overwhelm the school board or the DAC with all of the data. We did all of the data review work ourselves and then streamlined the findings for the DAC and school board. We wanted to make it easier for them to look at the trends and the root causes, and to understand how we determined those, so that they could move forward with us to determine major strategies and action plans.

How do we make the school and district plans fit together?

What advice would we give to other districts?

  1. Go slow to go fast.
  2. Make sure you have the processes in place.
  3. This year's plan may not be perfect, so remember it is a dynamic plan, not something static. There is flexibility to get better and plan more thoroughly in the future.
  4. Encourage CDE to keep the trainings coming: initial training as well as refresher and retraining sessions. Face-to-face works much better than web training, and the scaffolding of the learning process needs to continue. There are 40 new superintendents across the state this year.
