Each month, AEI Program Coordinators monitor data entered into LACES for compliance with reporting requirements and progress toward performance targets, and to identify areas of strength and areas for growth.
AEI uses a combination of views, reports, and searches in order to complete this monthly data review:
Custom View - Student List
AEI creates a custom Student List view to monitor several key data elements monthly. Instructions for creating a custom student view in LACES can be found on pages 11-14 of the LACES user manual.
AEI’s custom student list view includes the following fields:
- Creation Date
- Student ID
- Last Name
- First Name
- Middle Name
- Date of Birth
- Intake Date
- Age at Intake
- Underage Participation Allowed
- Last Enroll Date
- Secondary Program
- Funding Stream
- Entry Level
- Subject Area
- Assess Status in Subject Area
- Current Level
- Instructional Hours Since Last Assessed in Subject Area
- Current FY Instructional Hours
- Last Assessed Date
- Last Hours Date
- Days Since Last Hours Date
- Overall Status
- Cultural Barriers (Barrier to employment)
- Disabled (Barrier to employment)
- Displaced Homemaker (Barrier to employment)
- Economic Disadvantage (Barrier to employment)
- Ex Offender (Barrier to employment)
- Exiting TANF within Two Years (Barrier to employment)
- Foster Care Youth (Barrier to employment)
- Homeless (Barrier to employment)
- Long Term Unemployed (Barrier to employment)
- Low Literacy Levels (Barrier to employment)
- Migrant Farmworker (Barrier to employment)
- Seasonal Farmworker (Barrier to employment)
- Single Parent or Guardian (Barrier to employment)
Note: AEI removes the “current fiscal year” search filter on the student list so that all learner records in the system are visible when checking for duplicates.
AEI reviews the data above, using filters in LACES or in the exported Excel document generated by the custom student list view, to check for the following:
- Duplicate learner records – AEI applies a duplicates search on the Last Name and Date of Birth fields to identify potential duplicate records and flag these for program follow up.
- Count of total learner records – AEI runs a count of all learner records in LACES and checks with programs that the number is reasonable.
- NRS reportable records – AEI runs a count of NRS reportable records by filtering out any records that do not have WIOA Title II reported as the Funding Stream, as well as any records that are not current FY learners (i.e., records with no current FY intake date and no current FY instructional hours).
- NRS participant records – AEI runs a count of NRS participant records by filtering out:
- Any records without WIOA Title II selected as the funding stream.
- Any records where the record is not a current FY learner (see above).
- Any records where the Age at Intake value is less than 16. (These individuals are not eligible to enroll in AEFLA/IELCE programming).
- Any records where the Age at Intake value is less than 16 and the underage participation allowed value equals FALSE. (These individuals are not eligible to enroll in AEFLA/IELCE programming).
- Any records where the Entry Level equals Level Not Defined. (All NRS Participants must have a pre-test of record in the current fiscal year).
- Any records where the Entry Level equals Completed Adv ESL (ESL L6). (This is not a valid entry EFL; these learners must be re-pretested on an ABE assessment).
- Any records where the Current FY Instructional Hours value is less than 12.
- NRS participants by Entry EFL – AEI runs counts on the list of NRS participants using the Entry Level field to determine the number of ABE (ABE Levels 1-4), ASE (ABE Levels 5 & 6) and ESL (ESL Levels 1-6) participants served. We compare these to the targets programs set in their continuation applications and follow up with local programs if the totals are significantly under projected targets.
- Corrections Ed Participants – AEI runs counts on the list of NRS participants using the Program field to determine the number of corrections education participants served. We compare these to the targets programs set in their continuation applications and follow up with local programs if the totals are significantly under projected targets.
- IELCE Participants – AEI runs counts on the list of NRS participants using the Secondary Program field to determine the number of Integrated English Literacy and Civics Education grant participants served. We compare these to the targets programs set in their continuation applications and follow up with local programs if the totals are significantly under projected targets.
- Typographical date errors – AEI filters the Age at Intake field for negative values and ages well below or above the general range of AE learners served; for any anomalous ages, AEI reviews the Date of Birth and Intake Date fields for typos. AEI also reviews the Last Assessed Date field for dates outside the current FY that may indicate typos.
- Percent of SSN reported and SSN typos – AEI reviews the SSNs reported to ensure they have 9 digits, flagging any records with fewer than 9 digits for local program follow up. We also look at the percent of participants for whom an SSN is reported, as this is the only match criterion for the Employment, Wage, and Postsecondary Entrance performance measures. Where the percent of SSNs reported is considerably low, AEI follows up with local programs on collection strategies.
- Underage Participation Allowed Documentation – For records where the Underage Participation Allowed value equals TRUE, AEI reviews the documents row of the student data tab in each record to confirm that the required underage participation documentation has been uploaded into LACES. If no documentation has been uploaded, AEI follows up with local programs.
- NRS Exit Status – AEI reviews the Days Since Last Hours Date, filtering out any records where the value is less than 90. Of the resulting cohort, we filter out any records where the Overall Status field equals Left. The remaining list shows those learners that need to be dropped/exited from enrollments in LACES per the NRS exit definition of a 90-day attendance gap. AEI flags these for local program review.
- Barriers to Employment – AEI checks the percent of NRS Participants who have one or more employment barriers listed in LACES for the current FY. Where no barriers have been reported on any records, or the percentage seems low, AEI follows up with local programs to review intake procedures for collecting this data.
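The duplicate check and NRS participant filter above can be sketched in code against rows exported from the custom student list view. This is an illustrative sketch, not LACES functionality: the field names mirror the custom view's columns, while the fiscal-year start date and the export structure are assumptions.

```python
from collections import Counter

FY_START = "2024-07-01"  # hypothetical fiscal year start; ISO-formatted dates assumed

def potential_duplicates(rows):
    """Flag records sharing a Last Name + Date of Birth for program follow up."""
    counts = Counter((r["Last Name"], r["Date of Birth"]) for r in rows)
    return [r for r in rows if counts[(r["Last Name"], r["Date of Birth"])] > 1]

def is_current_fy_learner(row):
    # A current FY learner has a current FY intake date or current FY hours.
    return row["Intake Date"] >= FY_START or row["Current FY Instructional Hours"] > 0

def is_nrs_participant(row):
    return (
        row["Funding Stream"] == "WIOA Title II"
        and is_current_fy_learner(row)
        # The age check below also covers the Underage Participation Allowed
        # rule, since all under-16 records are filtered out.
        and row["Age at Intake"] >= 16
        and row["Entry Level"] not in ("Level Not Defined", "Completed Adv ESL (ESL L6)")
        and row["Current FY Instructional Hours"] >= 12
    )
```

The counts by Entry EFL, corrections education program, and IELCE secondary program described above can then be taken over the filtered participant list.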
On the reports screen from the Student List, AEI uses several reports to check on attendance and assessment reporting. Note: We filter the student screen for NRS participants prior to running these reports so that only participants are included in the reports.
- Student: All Hours with Hours Types between Date Range – AEI uses this report to check the following:
- We review the Date column for the prior month period to check that attendance was reported weekly, comparing against class schedules provided in continuation applications as needed.
- We review the Hours Type column over the prior month period for any instruction not reported with a subject area. Where a subject area is missing, we flag these records for local program follow up.
- We review the Hours Type column over the prior month period for distance learning instruction. Where distance learning hours are reported by a program that has not been approved to use a distance learning curriculum, or where an approved program has reported no distance learning hours, we follow up with local programs.
- We review the Hours Present column for attendance reported over the prior month for any single-day (non-total) values greater than 8, and we spot check that multiple/duplicate attendance records have not been reported on the same date. We flag any such records for local program follow up.
- Student: Current Year Pre and Post Assessments – AEI uses this report to check the following:
- We use the # of students field in the report as the numerator in our post-test rate calculations. The denominator is determined by the count of NRS participants (see above). Where the post-test rate is below the 70% target, we follow up with local programs.
- We use the Y/N column in the report to determine the percent of learners achieving a Measurable Skill Gain through pre- and post-testing. Learners with a Y value are included in the numerator. The denominator is determined by the count of NRS participants (see above). Where the percentage is lower than the statewide target of 34%, we follow up with local programs.
- We use the scale score change column to review learner records where a level gain was not achieved for the percent of those records where scale score increased, remained the same, and decreased. Any concerning or promising patterns are flagged for local program follow up.
- We use the Post-Test Date column to run a count of learners post-tested over the prior month period and review this number with local programs to ensure all post-tests administered were reported in the system accurately. Per the state assessment policy, pre- and post-test data must be reported into LACES no later than 2 weeks after test administration.
- We spot check scale scores in this report for out-of-range scores and follow up with local programs where any concerning patterns are evident.
- Assessments: Hours Between Assessments – AEI uses this report to check the following:
- We filter the Hours Bet. column for any values less than 40 and flag these for local program follow up, reminding programs that an Early Post-Test Exception form must be on file locally (and may be uploaded to the student documents in LACES) for any post-test administered before the learner accumulated 40 instructional hours.
- We compare the Start Form, Start Subtest, and Start Level columns to the Match Form, Match Subtest, and Match Level columns to identify any learners pre- and post-tested on the same assessment form. We then use the Start Assess. and Match Assess. date columns to filter the resulting list for same-form assessments administered less than 6 months apart. Local programs are notified of flagged records for review and reminded of vendor requirements to alternate test forms.
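The two rate checks above reduce to simple ratios. A minimal sketch, assuming the counts are taken from the Current Year Pre and Post Assessments report and the NRS participant count described earlier (function names are illustrative):

```python
def post_test_rate(n_post_tested, n_participants):
    # Numerator: the report's "# of students" field; denominator: NRS participant count.
    return n_post_tested / n_participants

def msg_rate(n_gains, n_participants):
    # Numerator: learners with a Y value in the report's Y/N column.
    return n_gains / n_participants

def follow_up_flags(n_post_tested, n_gains, n_participants):
    """Return the rate checks that fall below their targets, triggering follow up."""
    flags = []
    if post_test_rate(n_post_tested, n_participants) < 0.70:
        flags.append("post-test rate below 70%")
    if msg_rate(n_gains, n_participants) < 0.34:
        flags.append("MSG rate below statewide 34% target")
    return flags
```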
AEI uses the student alerts widget on the dashboard to identify learners whom LACES tracks as post-test eligible, meaning they have accumulated at least 40 instructional hours since their last assessment date (or the beginning of the program year, whichever occurred most recently). Instructions for adding widgets to your dashboard can be found starting on page 108 of the LACES user guide. In the student alerts widget, we click the number listed in the Students Eligible for post-testing row to see a drilldown of the student records included in this count. We provide the resulting list to local programs for follow up to determine whether those learners accumulated enough hours in their subject area to post-test.
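The eligibility rule the widget applies can be sketched as a simple threshold over the hours-since-assessment field in the custom view. This is a sketch assuming the export described above; the 40-hour threshold comes from this procedure.

```python
POST_TEST_HOURS = 40  # hours since last assessment (or program year start)

def post_test_eligible(row):
    return row["Instructional Hours Since Last Assessed in Subject Area"] >= POST_TEST_HOURS

def eligible_drilldown(rows):
    """Student IDs to share with local programs for follow up."""
    return [r["Student ID"] for r in rows if post_test_eligible(r)]
```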
- Using the custom student list view above, AEI identifies which learners have been marked as Integrated Education and Training Program (IETP) participants in LACES. We add this search criterion by clicking Add Search, selecting IETP from the dropdown menu, ensuring the = sign is listed in the operator box drop-down menu, and then clicking Apply. If a program has IETP participants, we follow up to ensure those learners were marked correctly as IETP and that the IET program meets the WIOA definition and requirements.
- Using the custom student list view above, we identify assessment records that have been marked as a pre-test through the Subject Area Override option in LACES. We add this search criterion by clicking Add Search, selecting Override Subject Area from the dropdown menu, ensuring the = sign is listed in the operator box drop-down menu, entering a value of 1 next to the operator, and then clicking Apply. If a program has a high percentage of learner records where the pretest override was employed, we review the assessment records in this search to ensure the pretest override is being used only when appropriate and that a comment has been added to the learner records explaining why the override was necessary.
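The "high percentage" judgment in the override review can be approximated as a rate over the search results. A sketch with an assumed review threshold; the 10% cutoff is illustrative, not policy.

```python
REVIEW_THRESHOLD = 0.10  # hypothetical cutoff for a "high percentage" of overrides

def override_rate(rows):
    """Share of learner records where the Subject Area Override equals 1."""
    if not rows:
        return 0.0
    return sum(1 for r in rows if r.get("Override Subject Area") == 1) / len(rows)

def needs_override_review(rows):
    return override_rate(rows) > REVIEW_THRESHOLD
```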