Benchmark Blog

What to Make of Haverhill Dropout Reporting Irregularities

In a recent news report on WHAV and in a presentation to the Haverhill School Committee on October 26th, Superintendent Scully reported that more than half of the dropouts reported by Haverhill to the Department of Elementary and Secondary Education (DESE) for the 2015-2016 school year were in error.

This post summarizes what we know at present and will be updated as new information becomes available.


What the Superintendent reported

The Superintendent provided a single sheet with a pie chart and a few statistics. It identifies 107 reported dropouts for FY2016 [which correspond to the state statistics for the Haverhill school district]. It suggests that 59 of these are “errors,” leaving 43 actual dropouts. It identifies the reported dropout rate of 5.9% [which is reported by DESE as the Haverhill district dropout rate for 2015-2016] and lists 2.36% as an “actual dropout rate.” The pie chart contains the following information, identified as “coding errors”:

  • 22% Transferred in state private (such as Phillips Andover)
  • 30% Transferred out-of-state (but were recorded as dropouts)
  • 40% Transferred out-of-state (but were recorded as transfer in MA and never reported by another MA school)
  • 8% Our alternative school (should not have been included)

The presentation did not include details about how the information was collected.

Inconsistencies and questions about findings presented

There are some apparent inconsistencies in the data on the sheet provided by the Superintendent. First, the subtraction does not work: 107 reported dropouts minus 59 “errors” is 48, not the 43 listed as “actual dropouts.” Second, the treatment of the alternative school is confused: the pie chart counts it among the coding errors but says it “should not have been included.” This is odd because the alternative school should be included in the district figures (as opposed to those for Haverhill High School alone). Third, there is no evidence that the denominator was adjusted appropriately when the dropout rate was calculated. It is hard to know what to make of these figures until they are precisely defined. My calculations suggest that, even accepting all the cases reported as coding errors, the “actual dropout rate” would be 2.7% rather than the 2.36% reported. So questions remain about the reliability of the Superintendent’s reported numbers and calculations.
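The arithmetic can be checked directly. A quick sketch (the enrollment figure is inferred from the reported 5.9% rate, not taken from the sheet):

```python
# Check the figures on the Superintendent's sheet.
# Enrollment is inferred from the reported rate, not stated on the sheet.
reported_dropouts = 107
reported_errors = 59
reported_rate = 0.059  # 5.9% district annual dropout rate, FY2016

enrollment = reported_dropouts / reported_rate          # implies roughly 1,814 students
actual_dropouts = reported_dropouts - reported_errors   # 48, not the 43 on the sheet
actual_rate = actual_dropouts / enrollment

print(round(enrollment))        # 1814
print(actual_dropouts)          # 48
print(f"{actual_rate:.2%}")     # 2.65% -- about 2.7%, not the 2.36% reported
# Note: 43 / 1814 does come to roughly 2.37%, which suggests the sheet's
# 2.36% rests on the faulty subtraction.
```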

Followers of this website and blog will know that there are many variations of dropout rates. Although not specified, the Superintendent’s numbers appear to apply to the Haverhill school district, grades 9-12, for FY 2016 (the 2015-2016 school year). Although the Superintendent’s comments suggested an ongoing problem spanning more than one year, the data provided appear to apply to just the 2015-2016 school year. Consequently, we have been provided no evidence of a problem before 2016.

The majority of the reported “coding errors” (70%) come from those identified as “transferred out of state.” But we have no information on just what was done to determine errors. How were transfers out of state documented? How was enrollment elsewhere established (documentation of records, calls, visits to homes, etc.)? Are these procedures consistent with what DESE would accept as evidence of a transfer, or something less?

It is clear, however, that Haverhill has not been following DESE’s dropout reporting guidelines, which require attempts to contact the parent or guardian by certified mail and by a home visit:

Investigating and Recording Extended Absences: A school may not remove a non-attending student from the enrollment without evidence that the student does not intend to return to school. Each district/school must have a procedure for investigating extended absences and must document reasonable efforts to locate the student and determine the reason for not attending. The procedure should include attempts to contact the parents/guardian by phone, through certified mail, and by a home visit.

Then, for grades 6 to 12, “if the student has transferred to another school (may be demonstrated through a transcript request from the receiving school or documentation of notice of transfer from the parent or guardian) then the student should be reported as a transfer.” Also: “If the student has moved to another city/town or state and as a result is no longer attending school in your district and there is no indication whether the student has enrolled in school elsewhere, then the student should be reported as a dropout and any subsequent change reported to the Department.”
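The reporting rule in the guidelines quoted above can be sketched as a simple decision function. This is a paraphrase of the guidelines, not DESE’s actual coding logic:

```python
# Sketch of the DESE reporting rule for a grades 6-12 student who has left
# and is no longer attending school in the district (a paraphrase of the
# quoted guidelines, not actual DESE software).
def report_status(transcript_requested: bool,
                  parent_notice_of_transfer: bool) -> str:
    """Classify a non-returning student as a transfer or a dropout."""
    if transcript_requested or parent_notice_of_transfer:
        # A transfer may be demonstrated through a transcript request from
        # the receiving school or documented notice from the parent/guardian.
        return "transfer"
    # Moved away with no indication of enrollment elsewhere: report as a
    # dropout, then report any subsequent change to the Department.
    return "dropout"

print(report_status(False, False))  # dropout
print(report_status(True, False))   # transfer
```

The key point is the default: absent documentation meeting DESE standards, the student is a dropout until evidence of a transfer arrives.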

It is also worth noting that DESE tracks students across schools with a standardized ID number and identifies errors in district reporting. When a student shows up at another school, public or private, DESE will reclassify the student as a transfer. It is not clear how many of Haverhill’s reporting errors may have already been corrected in DESE statistics.

What can we conclude at this point?

It appears the Haverhill school district was not following these guidelines, at least in FY 2016. Nor does it seem to have had in place procedures to ensure accurate reporting, such as those employed by other cities like Boston (see the last page of this document).

In making comparisons of Haverhill with other districts, it is important that data be collected and processed in a standardized way, as established by DESE. It is normal for some students whose status is unknown to be classified as dropouts when transcript requests or parent statements are not available. This is an inherent limitation of the DESE dropout data reporting system, not an error. Since it applies to all school systems, comparisons are still meaningful.

The methods of the Superintendent’s investigation of the status of FY 2016 dropouts are unclear to the public at this point. No documentation of methods has been provided. We do not know, for example, how many of those identified as errors have been corrected based on documentation received from parents or transcript requests that would meet DESE criteria for reclassification of the student and revision of the published dropout rates.

This blog has previously noted the inconsistent pattern in Haverhill’s 9th grade dropout rate for 2016 (see the chart on annual dropout rate by class and cohort in the May 15 post to this blog). Given the extraordinarily high recent 9th grade dropout rates, it seems likely that the 2016 rate is overstated, though perhaps not by nearly as much as the Superintendent reported. From what we know at present, it is very likely that there was over-reporting of dropouts for 2016, especially for the 9th grade. However, this conclusion cannot be relied upon until the evidence is clarified, and other years may be less affected.

Going forward: next steps

Overall, until we have evidence that reporting irregularities extend into prior years at similar rates, and until we better understand how transfers were investigated and documented in the Superintendent’s recent review, there is little reason to believe that Haverhill dropouts are nearly as low as suggested by the Superintendent’s one-page sheet with pie chart.

We will look for the Superintendent to provide:

  • Documentation of just how errors were identified and what evidence supports the conclusion that the cases identified are transfers.
  • A report on whether DESE has found such evidence satisfactory to justify adjustment to reported dropout rates, and to what extent DESE will adjust its statistics based on the recent findings.
  • Copies of communications from DESE on any revised dropout rates, so the school committee and the public will have a solid foundation for future policy.
  • Copies of the revised procedures put in place to ensure accurate reporting in the future.

Implications for Haverhill school leadership

If the reporting problems are confirmed with fuller evidence, they would suggest:

  • The importance of ensuring good data for public decision making
  • The positive role of heightened public interest in school performance as a stimulus for ensuring accountability
  • The possibility that the YES (Youth Engaged for Success) program was more successful in reducing dropouts than previously reported.
  • The need to devote management attention and resources to following up with students who do not return to school.

Implications for BenchmarkHaverhillSchools.com data and reporting

This website seeks to present data accepted and reported by DESE as the best means of ensuring comparable data for diverse cities. Of course, this depends on each school district reporting data consistent with DESE definitions and procedures. When errors are identified, corrections also must meet DESE standards for acceptance of the revision. We will revise figures when corrected and accepted by DESE. At that point, if there are still outstanding issues with the data, these will be noted.

For now, we expect revisions to recent data for 2016, and perhaps earlier years. We hope for higher quality reporting from Haverhill in the future. At this point, however, given the limitations and questions about the newly presented information (which so far applies to one year and has not been verified as consistent with DESE reporting) we should not dismiss previous evidence that Haverhill has higher dropout rates than would be expected for a city with its income and demographic characteristics.

The results reported on this site are largely based on four-year cohort adjusted graduation and dropout rates. We expect these results through 2016 will be minimally affected by the coding problems identified to date by the Superintendent. While one-year dropout rates reported for 9th grade have been particularly high (9 percent in 2016), the cohort rates are based on four years of data, and for the 2016 graduating class include 2016 data only for the senior year, when transfers to other schools are less likely. The 2016 graduating class had a DESE-reported dropout rate of only 5 percent when it was in 9th grade, and for the class of 2017 it was only 3.5 percent (see the May 15 Benchmark Blog post bar chart by cohort and grade). This suggests that the coding problem did not much affect these classes at the critical time. We hope any corrections will be in place for the classes of 2018 and 2019, for which aberrantly high 9th grade dropout rates were reported. This will enable this website to continue reporting trends in graduation and dropout rates along with other measures of school spending and performance.

Patterns for Improving Graduation Rates: Lessons from Ten Years of Experience in Gateway Cities

Haverhill’s relative standing in school performance among Gateway Cities has been slipping. But what do we know about the cities that are doing better? If Haverhill wants to improve its graduation rates it can look for examples in other cities that have recently improved their graduation rates.

Gateway Cities with Graduation Rates Similar to Haverhill in 2006

As reported in data from the Department of Elementary and Secondary Education (DESE) for 2006, 11 of the 26 Massachusetts Gateway Cities had graduation rates (four-year cohort adjusted) within 5 percentage points of Haverhill’s 77.0 percent. By focusing on these particular Gateway Cities we exclude cities such as Quincy, which already had a much higher graduation rate in 2006, and Lawrence, which started with a much lower rate and showed substantial improvement under receivership. Despite a similar start, the experience of these 11 cities diverged over the ensuing decade. By 2016, six of these cities had improved their graduation rate by more than 10 percentage points, four had improved rates by 5 to 10 percentage points, and one (besides Haverhill) had improved less than 5 percentage points. In this post I look at DESE data to see what distinguishes the top performers from those at the bottom.

For this report I have divided these 11 cities into three groups based on improvement from 2006 to 2016. The most improved group, which includes Attleboro, Pittsfield, Revere, Salem, Taunton, and Worcester, started with an average graduation rate in 2006 of 75.2 percent. The middle group includes Brockton, Fitchburg, Leominster, Lynn, Malden, and Westfield, with an average graduation rate in 2006 of 76.9 percent. The lowest group includes just one city, Lowell, with a graduation rate in 2006 of 79.0 percent.

Overall Trends

In the graph below, we can see Haverhill’s graduation rate ended the decade where it started while all of the other groups improved, with the highest performers averaging an improvement of 12 percentage points over ten years, starting 2 points below Haverhill and finishing 10 points above. Clearly improvement is possible for Gateway Cities with graduation rates similar to Haverhill.

Low-Income Students

For low-income students, we see substantially greater improvement in graduation rates in this period, when federal and state programs were targeted to low-income students. Haverhill’s rate for low-income students improved by 6 percentage points, but the most-improved group improved their graduation rates for low-income students by 20 percentage points. It should be noted that the number of low-income students in Haverhill’s high school cohort increased by 122 percent in this period, no doubt putting a strain on the schools to address the needs of this group. But the number increased in other cities as well, and even at the end of the period, the percent of students classified as low income in Haverhill (60%) was not greater than the percentage in the most-improved cities (75%).

Hispanic Students

The most improved among the Gateway Cities whose 2006 graduation rates were similar to Haverhill showed marked improvement in graduation rates for Hispanic students – 22 percentage points, from 67% to 89%. This group also showed a 103% increase in Hispanic enrollment in this period. This suggests that success with the growing number of Hispanic students is an important part of the overall improvement among the top performers. This contrasts with Haverhill, which showed a drop in graduation rates for Hispanic students to 61% in 2016, down from 71% in 2006.

Student/Teacher Ratios

Another measure of resources is the student/teacher ratio. Here we need to shift our thinking a bit, as the higher the student/teacher ratio, the lower the resources per student. We do not see dramatic differences among the graduation-improvement groups. In all three groups, the number of students per teacher increased somewhat, but the increase was smaller for the most improved (an additional 0.5 student per teacher) compared with the middle and least improved groups (with more than 1 additional student per teacher). Haverhill actually decreased the number of students per teacher and ended with only 0.4 more students per teacher than the average for the most improved. This suggests that Haverhill’s under-performance on graduation rates from 2006 to 2016 is not attributable to its somewhat higher student/teacher ratios.

Teacher Salaries

The chart below suggests that higher teacher salaries have not been the driver of graduation rate improvements. Among Gateway Cities with graduation rates similar to Haverhill’s in 2006, the most improved group ended in 2015 with teacher salaries lower than the middle-improved group. Haverhill, however, became an outlier in teacher salaries in this period, starting in the middle and ending well below the average of the other groups. So, while salary levels do not explain the variation in improvement for these cities as a whole, we cannot rule out markedly lower salaries as a possible barrier to improvement for Haverhill.

Per Pupil Spending

The most improved group also supported their schools with greater increases in per-pupil spending. Haverhill started the period somewhat below the others and by 2015 was spending substantially less per pupil.

Summing Up

The chart below shows the changes in key measures by improvement group.

So what have we learned from this look at improvements in graduation rates over ten years? Among the 11 Gateway Cities starting in 2006 with graduation rates similar to Haverhill’s, the most improved districts:

  • Were able to improve graduation rates by 12 percentage points overall
  • Showed even greater improvements for low-income (20 percentage points) and Hispanic (22 percentage points) students than other students
  • Had notably greater increases in per pupil spending, exceeding the middle group by 11 percentage points over 10 years
  • Increased average teacher salaries only slightly more than others – by 2% over 10 years
  • Allowed student/teacher ratios to increase slightly less than the other groups

In contrast, Haverhill:

  • Improved graduation rates by less than 1 percentage point
  • Showed a 6 percentage point improvement in graduation rates for low-income students, but saw a 10 percentage point drop in graduation rates for Hispanic students
  • Started with per pupil spending 4% below the most improved group and slipped to 13% below this group in per pupil spending by 2016
  • Increased average teacher salaries substantially less than the most-improved, ending the period 9 percent below the most improved group and 11 percent below the other cities in our analysis
  • Reduced its student/teacher ratio slightly

Conclusion

What Haverhill has been doing has not been working to increase graduation rates as other Massachusetts cities have done. The results for the most-improved of the Gateway cities with 2006 graduation rates similar to Haverhill show that Haverhill has missed an opportunity for school improvement. That opportunity need not be missed going forward. We can learn from the experiences of others.

The results of the past decade suggest a possible path to improvement: focus less on student/teacher ratios and more on providing adequate resources (as reflected in per-pupil spending) and find ways to better meet the needs of low-income and Hispanic students. This may mean investing in more supporting resources to help our teachers better serve a changing student population. Other cities have shown how this can be done. Adapting their methods to Haverhill’s particular situation can be expected to produce meaningful improvements in graduation rates and greatly benefit our city for this and the next generation.

Three Ways to Look at Haverhill School Dropouts

Does Haverhill have a dropout problem? If so, how bad is it? And what can be done about it? Statistics on dropout rates from the Massachusetts Department of Elementary and Secondary Education (DESE) can help us answer these questions.

At the Reach Higher community forum at Hunking School on April 26, 2017, I presented statistics that show Haverhill’s irregular and persistently high dropout rates. I presented dropout rates calculated with the cohort (or longitudinal) method, showing the percentage of dropouts among a class or cohort of students over the four-year period from 9th to 12th grade. Some have asked why I used that particular measure rather than annual dropout rates or other measures. In this blog post I review the evidence on Haverhill dropouts from three types of measures. I also note the implications for addressing Haverhill’s dropout issue in the 2017-2018 school budget.

Cohort Based Measures Show Haverhill’s Irregularly High Dropout Rates

Cohort based measures show what percentage of a class cohort starting at grade 9 have graduated or dropped out four years later. The adjusted cohort formula adjusts for transfers, so schools are not held accountable for students they did not serve from 9th grade on. Adjusted cohort graduation and dropout rates have been deemed more accurate than other calculations in assessing student results over time, and since 2011 the federal government has mandated that states calculate and report cohort rates to support comparisons across states. For Massachusetts schools and districts, these rates are presented on the DESE website graduation rate page. The user can select rates by district or school, by year, adjusted or unadjusted (for transfers), for more than 10 student groupings.
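As a rough sketch of how the adjusted cohort calculation works (the numbers and variable names below are illustrative; DESE’s actual adjustment rules have more detail):

```python
# Sketch of a four-year adjusted cohort dropout rate (illustrative only).
def adjusted_cohort_dropout_rate(first_time_9th, transfers_in, transfers_out, dropouts):
    # The adjusted cohort counts entering 9th graders plus students who
    # transfer in, minus students who transfer out, so schools are not held
    # accountable for students they did not serve through senior year.
    cohort = first_time_9th + transfers_in - transfers_out
    return dropouts / cohort

# Hypothetical class: 500 entering 9th graders, 40 transfer in, 60 transfer out,
# 55 dropouts over the four years.
rate = adjusted_cohort_dropout_rate(500, 40, 60, dropouts=55)
print(f"{rate:.1%}")  # 11.5% -- in the neighborhood of Haverhill's 2016 rate
```

Miscoding a transfer as a dropout inflates the numerator of this rate without shrinking the denominator, which is why the classification questions above matter so much.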

Over the ten years from 2006 to 2016, Lawrence reduced its four-year cohort adjusted dropout rate from 35.8 percent to 10.8 percent, while Salem reduced its rate from 12.5 to 4.6 percent. In the same period, Haverhill’s dropout rate improved in some periods but slid back in others to end up in 2016 at 11.3 percent, not much different from where it started (11.5 percent). Haverhill’s lack of net progress on the overall measure, and other data presented on the DESE website, show that Haverhill has been much less successful than other Massachusetts cities in preventing dropouts over the four-year high school period.

Main point: Cohort based measures provide the best way to assess how well schools prevent dropouts over time and get students through to graduation. However, one has to wait until a cohort completes its senior year for the rate to be reported. Looking at a cohort rate is something like looking through a telescope at light from a star that was emitted years ago. We can turn to some other measures of dropouts to get more current, if less complete, information on student dropouts.

Annual Dropout Measures Show Slow Decline in Dropout Rates, Remaining More Than Twice State Average

Annual dropout rates indicate how many students drop out anytime during one school year and do not return by October of the next year. Annual rates are reported by district and school, year, and student group in the DESE website Dropout Report. For 2015-16, Haverhill’s district annual dropout rate for students in grades 9 to 12 was 5.9 percent, compared with the statewide rate of 1.9 percent. Among the 301 reporting school districts Haverhill ranks among the worst – 289th out of 301. Of the 26 Gateway Cities, only New Bedford and Chelsea (cities with much lower income levels) had a higher annual dropout rate in 2015-2016. The dropout rate for Haverhill High School alone is 4.4 percent; this rate for HHS is lower than the district rate because it does not include the Haverhill district’s alternative and TEACH schools.

Haverhill’s annual dropout rate has come down from the higher levels of 2008 and 2011, but it remains more than twice the state average (see graph below).

With annual measures (as opposed to cohort measures), students who drop out are removed from the analysis in subsequent years. So a class that lost most of its dropouts in the ninth grade may show low dropout rates in the following three high school years. Because these rates are affected by the timing of dropouts, annual measures can do a poor job of representing overall student success.
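A hypothetical example (numbers invented for illustration) makes the timing effect concrete:

```python
# Hypothetical class of 500 that loses most of its dropouts in 9th grade.
class_size = 500
dropouts_by_grade = {9: 45, 10: 5, 11: 5, 12: 5}

enrolled = class_size
for grade in sorted(dropouts_by_grade):
    d = dropouts_by_grade[grade]
    print(grade, f"{d / enrolled:.1%}")  # annual rate for that grade
    enrolled -= d  # dropouts are removed from later years' denominators

# Grades 10-12 show roughly 1% annual rates even though the class lost
# 60 of 500 students overall -- a 12% cohort dropout rate.
print(f"{sum(dropouts_by_grade.values()) / class_size:.1%}")  # 12.0%
```

A district could thus report reassuringly low annual rates in grades 10-12 while losing one student in eight over the full high school career, which is why the cohort measure is preferred for judging overall performance.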

Main point: Haverhill’s dropout rates are higher than almost all of the other Massachusetts school districts. Single-year dropout rates do not well represent the experience of a class over time and are unreliable indicators of overall school and district performance.

Annual Measures by Graduating Class Show Temporary Progress, Then Slipping Back

Annual dropout rates are also reported by grade. With data from successive years, this enables us to piece together the experience of a group through the most recent available year (see graph). Annual dropout rates vary greatly from class to class and from year to year within a class. A particular class may experience a high dropout rate one year and a low dropout rate the next (after the most-at-risk students have dropped out).

We see the Haverhill Class of 2017 is on track toward a lower cohort dropout rate than other recent graduating classes. Congratulations, HHS Class of 2017! During its critical middle-school-to-high-school transition period, this class benefitted from Haverhill’s participation in Youth Engaged for Success (YES), a federally funded program aimed at keeping kids in school, which was awarded for five years, from 2010 to 2015. However, this program ended when the grant funding ran out, and we see a spike upward in the dropout rate for ninth graders who have transitioned to high school since the program ended.

Main point: Looking at annual dropout rate by graduating class for the past several years, we see evidence of lower dropout experiences for the HHS class of 2017, which benefitted from the YES program during its transition and early high school years. But these advances do not appear to be sustained for subsequent classes.

Overall Conclusion

Each of the three types of dropout measures sheds some light on Haverhill’s dropout problem. To answer the questions posed at the beginning of this blog post:

Do we have a problem? Dropout rates have been declining nationally, across the state, and in Gateway cities. Haverhill has participated in this decline, and we expect cohort statistics will show improvement when they are posted for the class of 2017. However, Haverhill dropout rates remain more than double the state average. By nearly all measures, Haverhill dropout rates are high relative to other Massachusetts cities with comparable income and population characteristics. Haverhill’s dropout rates have been inconsistent over time. Annual results by class suggest a lower dropout rate for this year’s graduating class, but show a spike in 9th grade dropouts in each of the past two years. This suggests a continuing problem, particularly in the middle-school/high-school transition period.

How bad is it? Haverhill ranks near the very bottom of Massachusetts school districts – worse than 289 of 301 Massachusetts school districts. That the improvement observed in this year’s graduating class is not evident in data from the current 9th grade class does not bode well for future reports.

What can we do about it? We do not have to look far to see how to do better. Lawrence and Salem provide examples of cities that have significantly reduced dropouts and improved graduation rates. And right here in Haverhill we have seen ups and downs that may be related to changes in dropout prevention efforts at Haverhill High School, through the YES program and other efforts. One look at the literature shows how difficult it can be to address the many complex, interrelated issues that affect dropouts. But there are many resources to work with. See, just for example, this School-Level Approach to Dropout Prevention, the APEX program in New Hampshire, this from Washington State, and the dropout prevention resource centers at Clemson University and Johns Hopkins.

It will be a real challenge to find the best, most effective, most affordable methods that will work in Haverhill. As we consider the 2017-18 school budget, the school committee should consider earmarking specific support for an evidence-based program to reduce Haverhill’s dropout rate in a cost-effective way. By making such a commitment and sustaining it over time, we can give all our students a better shot at a successful life with a diploma in hand.