In a recent news report on WHAV and in a presentation to the Haverhill School Committee on October 26th, Superintendent Scully reported that more than half of the dropouts Haverhill reported to the Department of Elementary and Secondary Education (DESE) for the 2015-2016 school year were in error.
This post summarizes what we know at present and will be updated as new information becomes available.
What the Superintendent reported
The Superintendent provided a single sheet with a pie chart and a few statistics. It identifies 107 reported dropouts for FY2016 [which corresponds to the state statistics for the Haverhill school district]. It suggests that 59 of these are “errors,” leaving 43 actual dropouts. It identifies the reported dropout rate of 5.9% [which DESE reports as the Haverhill district dropout rate for 2015-2016] and lists 2.36% as an “actual dropout rate.” The pie chart contains the following information, identified as “coding errors”:
- 22% Transferred to in-state private schools (such as Phillips Andover)
- 30% Transferred out of state (but were recorded as dropouts)
- 40% Transferred out of state (but were recorded as transfers within MA and never reported by another MA school)
- 8% Our alternative school (should not have been included)
The presentation did not include details about how the information was collected.
Inconsistencies and questions about findings presented
There are some apparent inconsistencies in the data on the sheet provided by the Superintendent. First, the subtraction: 107 reported dropouts minus 59 “errors” is 48, not 43, “actual dropouts.” Second, the treatment of the alternative school is confused: the pie chart counts its students among the coding errors but indicates they “should not have been included.” This is odd because the alternative school should be included in the district figures (as opposed to the Haverhill High School figures). Third, there is no evidence that the denominator was adjusted appropriately when the dropout rate was calculated. It is hard to know what to make of these figures until they are precisely defined. My calculations suggest that, even accepting all the cases reported as coding errors, the “actual dropout rate” would be about 2.7% rather than the 2.36% reported. So questions remain about the reliability of the Superintendent’s reported numbers and calculations.
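The arithmetic can be checked directly. The sheet does not give the enrollment denominator, so the figure below is inferred from the reported 5.9% rate; that inference is my assumption, not a DESE-published number:

```python
# Sanity-check the figures on the Superintendent's one-page sheet.
reported_dropouts = 107
reported_errors = 59
reported_rate = 0.059          # the 5.9% district rate for 2015-2016

# Denominator inferred from the reported figures (an assumption,
# not a DESE-published enrollment count): roughly 1,814 students.
enrollment = reported_dropouts / reported_rate

actual_dropouts = reported_dropouts - reported_errors
print(actual_dropouts)         # 48, not the 43 shown on the sheet

# With the denominator unchanged the corrected rate is about 2.65%;
# removing the 59 error cases from the denominator as well gives
# about 2.74%. Either way, it is above the 2.36% on the sheet.
print(round(actual_dropouts / enrollment * 100, 2))
print(round(actual_dropouts / (enrollment - reported_errors) * 100, 2))
```

Depending on how the denominator is handled, the corrected rate works out to roughly 2.6-2.7%, which is why the 2.36% figure on the sheet does not add up.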
Followers of this website and blog will know that there are many variations of dropout rates. Although not specified, the Superintendent’s numbers appear to apply to the Haverhill school district, grades 9-12, for FY 2016 (the 2015-2016 school year). Although the Superintendent’s comments suggested an ongoing problem spanning more than one year, the data provided appear to apply to just the 2015-2016 school year. Consequently, we have been given no evidence of a problem before 2016.
The majority of the reported “coding errors” (70%) come from those identified as “transferred out of state.” But we have no information about how these errors were determined. How were transfers out of state documented? How was enrollment elsewhere established (documentation of records, calls, visits to homes, etc.)? Are these procedures consistent with what DESE would accept as evidence of a transfer, or something less?
It is clear, however, that Haverhill has not been following Massachusetts Department of Education Dropout Reporting Guidelines that require attempts to contact the parent or guardian by certified mail and by a home visit:
Investigating and Recording Extended Absences: A school may not remove a non-attending student from the enrollment without evidence that the student does not intend to return to school. Each district/school must have a procedure for investigating extended absences and must document reasonable efforts to locate the student and determine the reason for not attending. The procedure should include attempts to contact the parents/guardian by phone, through certified mail, and by a home visit.
Then, for grades 6 to 12, if “the student has transferred to another school (may be demonstrated through a transcript request from the receiving school or documentation of notice of transfer from the parent or guardian),” the student should be reported as a transfer. Also: “If the student has moved to another city/town or state and as a result is no longer attending school in your district and there is no indication whether the student has enrolled in school elsewhere, then the student should be reported as a dropout and any subsequent change reported to the Department.”
It is also worth noting that DESE tracks students across schools with a standardized ID number and identifies errors in district reporting. When a student shows up at another school, public or private, DESE will reclassify that student as a transfer. It is not clear how many of Haverhill’s reporting errors may already have been corrected in DESE statistics.
What can we conclude at this point?
It appears the Haverhill school district was not following these guidelines, at least in FY 2016. Nor does it seem to have had procedures in place to ensure accurate reporting, such as those employed by other cities like Boston (see the last page of this document).
In making comparisons of Haverhill with other districts, it is important that data be collected and processed in a standardized way, as established by DESE. It is normal for some students whose status is unknown to be classified as dropouts when transcript requests or parent statements are not available. This is an inherent limitation of the DESE dropout data reporting system, not an error. Since it applies to all school systems, comparisons are still meaningful.
The methods of the Superintendent’s investigation of the status of FY 2016 dropouts are unclear to the public at this point. No documentation of methods has been provided. We do not know, for example, how many of those identified as errors have been corrected based on documentation received from parents or transcript requests that would meet DESE criteria for reclassification of the student and revision to the published dropout rates.
This blog has previously noted the inconsistent pattern in Haverhill’s 9th grade dropout rate for 2016 (see the chart on annual dropout rate by class and cohort in the May 15 post to this blog). Given the extraordinarily high recent 9th grade dropout rates and the Superintendent’s report, it is very likely that dropouts were over-reported for 2016, especially for the 9th grade, though perhaps not by as much as the Superintendent suggests. This conclusion cannot be relied upon until the evidence is clarified, and other years may be less affected.
Going forward: next steps
Overall, until we have evidence that reporting irregularities extend into prior years at similar rates, and until we better understand how transfers were investigated and documented in the Superintendent’s recent review, there is little reason to believe that Haverhill dropouts are nearly as low as suggested by the Superintendent’s one-page sheet with pie chart.
We will look for the Superintendent to provide:
- Documentation of just how “errors” were identified, and what evidence supports the conclusion that the cases identified are transfers.
- A report on whether DESE has found such evidence satisfactory to justify adjustments to reported dropout rates, and to what extent DESE will adjust its statistics based on the recent findings.
- Copies of communications from DESE on any revised dropout rates, so the school committee and the public will have a solid foundation for future policy.
- Copies of the revised procedures put in place to ensure accurate reporting in the future.
Implications for Haverhill school leadership
If the reporting problems are confirmed with fuller evidence, they would suggest:
- The importance of ensuring good data for public decision making
- The positive role of heightened public interest in school performance as a stimulus for ensuring accountability
- The possibility that the YES program was more successful in reducing dropouts than previously reported.
- The need to devote management attention and resources to following up with students who do not return to school.
Implications for BenchmarkHaverhillSchools.com data and reporting
This website seeks to present data accepted and reported by DESE as the best means of ensuring comparable data for diverse cities. Of course, this depends on each school district reporting data consistent with DESE definitions and procedures. When errors are identified, corrections also must meet DESE standards for acceptance of the revision. We will revise figures when corrected and accepted by DESE. At that point, if there are still outstanding issues with the data, these will be noted.
For now, we expect revisions to recent data for 2016, and perhaps earlier years. We hope for higher quality reporting from Haverhill in the future. At this point, however, given the limitations and questions about the newly presented information (which so far applies to one year and has not been verified as consistent with DESE reporting) we should not dismiss previous evidence that Haverhill has higher dropout rates than would be expected for a city with its income and demographic characteristics.
The results reported on this site are largely based on four-year cohort adjusted graduation and dropout rates. We expect these results through 2016 will be minimally affected by the coding problems identified to date by the Superintendent. While the one-year dropout rates reported for 9th grade have been particularly high (9 percent in 2016), the cohort rates are based on four years of data and, for the 2016 graduating class, include 2016 data only for the senior year, when transfers to other schools are less likely. The 2016 graduating class had a DESE-reported dropout rate of only 5 percent when it was in 9th grade, and for the class of 2017 it was only 3.5 percent (see the May 15 Benchmark Blog post bar chart by cohort and grade). This suggests that the coding problem did not much affect these classes at the critical time. We hope any corrections will be in place for the classes of 2018 and 2019, for which aberrantly high 9th grade dropout rates were reported. This will enable this website to continue reporting trends in graduation and dropout rates along with other measures of school spending and performance.
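To illustrate why a single bad 9th grade year matters more for some cohorts than others, a four-year cohort dropout rate can be approximated by compounding the annual grade-level rates. This is a simplified sketch, not DESE’s actual adjusted-cohort methodology, and the annual rates below are hypothetical placeholders chosen only for illustration:

```python
# Illustrative sketch: how annual grade-level dropout rates compound
# into a four-year cohort rate. Rates are hypothetical placeholders,
# not DESE-published figures.

def cohort_dropout_rate(annual_rates):
    """Share of a cohort lost over four years, compounding annual rates."""
    surviving = 1.0
    for r in annual_rates:
        surviving *= 1 - r
    return 1 - surviving

# A class whose 9th grade year falls in 2015-16 (e.g., the class of
# 2019) absorbs an overstated 9th grade rate directly. Compare an
# inflated 9 percent 9th grade rate with a hypothetical corrected 3
# percent, holding the later (placeholder) years fixed:
overstated = cohort_dropout_rate([0.09, 0.02, 0.02, 0.01])
corrected  = cohort_dropout_rate([0.03, 0.02, 0.02, 0.01])
print(round(overstated * 100, 1))  # about 13.5 percent
print(round(corrected * 100, 1))   # about 7.8 percent
```

The gap between the two results shows why an error concentrated in the 9th grade of 2015-16 would distort the cohort figures for the classes of 2018 and 2019 far more than those for the classes of 2016 and 2017, whose 9th grade years predate the problem.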