Hybrid Learning Guide: Self Evaluation Rubric - Domain III

Self Evaluation Rubric

Domain III: Sustainability -- 1. Self-Assessment

In order to effectively manage progress toward sustainability, programs need clear success indicators and goals for achieving success. Ideally, these goals are built on shared lead and lag measures and on unbiased data collection and analysis.

Self Assessment Indicator: III.1 - Self-Assessment

1: Disagree

2: Slightly Disagree

3: Slightly Agree

4: Agree

**III.1.a. Our school/organization has a strong understanding of lead and lag measures and how they work in relation to building reliable programs.

1

2

3

4

**III.1.b. Our school/organization is confident that we’ve established the correct qualitative metrics (e.g. stakeholder satisfaction or opinion) to measure our Hybrid Learning Program’s success.

1

2

3

4

**III.1.c. Our school/organization is confident that we’ve established the correct quantitative metrics (e.g. content mastery, active engagement, etc.) to measure our Hybrid Learning Program’s success.

1

2

3

4

**III.1.d. Our school/organization has an effective process, using analog and digital tools in a variety of formats (e.g. surveys, focus groups, interviews, benchmarking), to collect feedback on our Hybrid Learning Program from all stakeholders.

1

2

3

4

**III.1.e. Our school/organization performs usability audits of our online content to ensure consistent and intuitive design and alignment with our non-digital resources.

1

2

3

4

*III.1.f. Our school/organization frequently uses unbiased third parties to perform design audits and to conduct qualitative information sessions and data analysis.

1

2

3

4

*Site-Based

**Reference/Resources: CDE DARE Resource, CDE Program Evaluation

Additional Resources and Exemplars

Establishing good measures and collecting evidence in a variety of reliable ways are key to gathering the right information to gauge program success.

Useful Tools:

Potential Reads:

  • A Handbook for High Reliability Schools by Marzano, Warrick, and Simms.

Domain III: Sustainability -- 2. Refinement

Self-assessment and data analysis show where programs are successful and where improvements need to be made. Program refinement takes the form of augmenting or eliminating metrics that missed the mark and consciously continuing metrics determined to be promising points for programmatic success.

Self Assessment Indicator: III.2 - Refinement

1: Disagree

2: Slightly Disagree

3: Slightly Agree

4: Agree

**III.2.a. Our Transition Team uses analysis of this self-assessment, along with data from our identified qualitative and quantitative metrics, as part of decision making and action planning.

1

2

3

4

**III.2.b. Our school/organization articulates clear goals [e.g. SMART (Specific, Measurable, Attainable, Relevant, Time-based)] to meet program objectives.

1

2

3

4

**III.2.c. Our decisions about professional development needs are informed by our self-assessments.

1

2

3

4

**III.2.d. Our decisions about software and digital tool selection are informed by our self-assessments.

1

2

3

4

**III.2.e. Our Transition Team has the ability to modify our program goals, metrics, and assessment methods as objectives are met, as time frames expire, or as new knowledge or situations demand programmatic changes.

1

2

3

4

*III.2.f. Our school/organization readministers this self-assessment regularly to monitor changes in key self-assessment indicators.

1

2

3

4

*Site-Based

**Reference/Resources: CDE DARE Resource, CDE Program Evaluation

Additional Resources and Exemplars

Once the data have been collected, this section focuses on analysis and action: what to do with the data and how to set up a process of continuous improvement.

Useful Tools:

Potential Reads: