District Case Story - Elizabeth C-1

Elizabeth School District C-1 enrolls 2,656 students across its eight schools.

District Contacts:

  • Douglas Bissonette - Superintendent
  • Andrea Duran - Director of Learning Services
  • Jim Wilson - Running Creek Elementary Principal
  • Steve Thiessen - Frontier High School Director

Why is it important to do UIP?

Why has Unified Improvement Planning (UIP) been helpful for the district?

What did UIP add to our existing improvement planning efforts?

How did we build staff capacity to engage in UIP?

The year before the UIP process began, we trained our district and school leadership staff in the data-driven dialogue process. We then began training teachers in each building on how to access and use data and make data-driven decisions. Next, we provided training to school teams, our DAC, and our Board of Education in the data-driven dialogue process. We also trained them on the UIP process and template when it became a requirement for staff to complete. CTLT trainers and materials were part of the training resources used during the 2010-2011 school year.

Our training had to be different for each stakeholder group. For some, particularly community members, it had to move more slowly, scaffolding information that other groups might already have had in order to build capacity. Examples include training around questions such as: “What is the data?” “How do you look at and analyze data?” “How do you apply what you learn from the data to determine trends and challenges and to develop strategies and action plans?”

We used the CTLT tools, templates, and processes. Without these, we would have really struggled.

We believe that our training in the UIP template and process, and the time we spent working with the data-driven dialogue process, made the entire UIP process effective. Without this preparatory work and training, completing our district and school UIPs would have been much more difficult.

What challenges did we face?


  • Training, especially for our SAC, DAC and BOE: to understand the UIP template and process; to avoid jumping to conclusions too quickly, particularly regarding root causes; to analyze data; and to get to true root causes. This has been a significant mind shift for many of the staff and community members involved in the process.
  • Another challenge and shift in thinking was to analyze multiple sources of data thoughtfully and carefully rather than using just one source of data as support and verification of our trends, challenges and root causes.
  • Follow-through – i.e., providing time to continually monitor progress on our improvement plan; to stay the course with regard to the monitoring process; and to ensure that improvement planning is an ongoing process and conversation. Current district staff thinking is to make progress monitoring one of our action plans with quarterly monitoring as a minimum expectation.
  • It takes time to build capacity. As much as the leaders may understand and enjoy the process, it still takes time to get everyone on board and we need to be patient.

How did we implement UIP at the district level? At the school level?

We established some timelines at the school level for the development of their plans using the UIP process. The schools created drafts of their plans. These plans were submitted to the district leadership team, called Administrative Cabinet, or “Ad Cab”. This group reviewed the drafts, discussed them, provided feedback and recommendations and returned them to the schools for revisions.

The district leadership team used the school draft plans to build the district plan. Targets and goals that the schools identified in their plans were used to create the district targets and goals. This was a bottom-up approach that helped all levels see the alignment and common themes in school and district improvement planning.

How do we find meaning in student performance data?

We used many forms of data. Each level of data is important, depending upon its purpose and use. As primary, or 1st level, data, we use the SPF data, including our academic progress and growth data, our dropout data, our graduation rate data, and disaggregated group data. We ask ourselves, “Are we growing all of our students?”

As 2nd level data, we use other CSAP results found in other state reports. Since this assessment is given only once per year, district staff break the data down for buildings, teachers, and parents. Each stakeholder group gets appropriate access to and copies of the data; for example, parents receive their own student’s CSAP data. In addition to CSAP, we subscribe to the NWEA MAP assessment, which we give three times a year in grades K-5 and twice a year in grades 6-10. This gives us the ability to monitor progress during the year.

Our 3rd level data focuses on formative assessment practices at the classroom level. Are teachers using this data to inform their instruction, e.g., to re-teach or to refer students to RtI? Are teachers using data to inform instruction at the individual student level, every day, all day?

4th level data includes perception data from teachers, parents, students, etc. We collect this information at every level through various surveys to stakeholders and feedback from our SACs and DAC.

It is hard to manage so much data. We need to be thoughtful and structured in planning how data are collected, analyzed, and used for various purposes. This helps make the process doable, not just once a year but in increments.

Best advice: Be thoughtful about what data you use, how you train for its use, and how quickly you set expectations for staff performance throughout the UIP process. Scaffold the work and your expectations for results, particularly regarding understanding of the process and tools. The goal is to build staff capacity while being careful not to overwhelm staff and shut them down regarding the process and its value to improving classroom practice and student performance.

How do we monitor the progress of the implementation of our UIP?

We work to develop a culture of data analysis and root-cause determination to enhance our understanding when the same results keep happening. We monitor for what is working as well as what is not. Successful strategies and structures in one area may be implemented beneficially in areas where we are less proficient or where strategies are not working as well.

The way we have built the UIP planning steps in our district, from bottom-up, really worked for us.

How did we work with our local school board? District accountability committee? School accountability committees?

SACs, DAC and the school board played a direct role in the planning process. SACs had an integral part in building the school plans. The SAC members brought the plans to the DAC after we trained the DAC members in the UIP template and planning process. Our bottom-up approach enabled the DAC members and the school board to see the alignment between the school and district plans.

How do we engage in improvement planning when we are already high performing?

What advice would we give to other districts?

  1. Take your time and be intentional.
  2. Support the process and have data easily and readily accessible to teachers and stakeholders.
  3. Stick to the data-driven dialogue protocols.
  4. Keep digging on the root causes and don’t allow blame and pointing fingers to derail the process.
  5. Realize you can only set action plans based on what you can control.

Time is our biggest challenge. How do we build in sufficient time to complete a thorough planning process? We can’t shortchange the process, and we can’t force it. We have to take the time to do justice to each part of the planning process. Staff will lose motivation if we try to push ahead when we aren’t ready to take the next step.

What are our next steps?

We will continue to refine the process we used last year by building capacity with teachers, SACs, DAC, and our Board of Education. Our next steps include revisiting our UIPs from last year, analyzing our targets, and adjusting the plan where needed.