“Haverhill Plans for the Student Opportunity Act” Workshop Summary

The Student Opportunity Act (SOA), passed in 2019, will provide additional funds for Massachusetts cities that submit a three-year, evidence-based plan to reduce achievement gaps among student groups.

On February 29, 2020, the Haverhill Education Coalition held a workshop, “Haverhill Plans for the Student Opportunity Act.” The workshop for city, school, and community leaders considered the financing provided by the SOA and identified evidence-based interventions that could be used to help close achievement gaps among Haverhill students.

The workshop was not intended to advocate for any one position or approach, but rather to help make city leaders aware of educational reforms that have been shown by research to produce good outcomes for students.

We invited all Haverhill School Committee members and City Councilors. Attending were School Committee members Mayor Fiorentini, Gail Sullivan, Rich Rosa, and Paul Magliocchetti, and City Councilors Melinda Barrett, John Michitson, Joe Bevilacqua, Colin LePage, and Tim Jordan. Superintendent Margaret Marotta, Assistant Superintendent Michael Pfifferling, and Tilton Upper and Lower Principal Bonnie Antkowiak were speakers at the workshop.

The workshop ran for nearly three hours on Saturday morning. This post contains a summary of the main points. I have also posted here the HEC SOA Workshop Slides and the HEC SOA Workshop Resources Document distributed to participants.

SOA authorizes more funds for Haverhill schools

This mostly comes as additional state funding targeted to districts with many low-income students. We expect $6.9 million more in Chapter 70 aid for FY21 (the 2020-21 school year) and smaller additional increases each year over the following six years. In total, the expected funds are enough to increase real school resources per student. This funding depends on money being appropriated by the state legislature.

Haverhill has gaps to fill

Where are Haverhill’s Gaps? Here are some indications of where Haverhill could close gaps among student groups:

  • Among 2017-18 graduates of Haverhill High School, by race/ethnicity, 56 percent of white students went on to attend 4-year colleges, but only 16 percent of Latino students did.
  • The percentage of Haverhill’s third grade students meeting or exceeding expectations on MCAS ELA (reading) tests for economically disadvantaged and Latino students is only two-thirds that of non-economically-disadvantaged and white students.
  • In 2018-19, prior to the recent Right-Size Plan, resources, particularly the number of teachers per 100 students, varied widely among Haverhill schools.
  • Funding from the SOA could be used to make sure opportunities are more equally available among schools and student groups.

Haverhill can build on previous successes at Tilton and other schools

The experience at Tilton is particularly instructive. Tilton has seen improved performance measures, including attendance, test scores, and state percentile ranking. Principal Bonnie Antkowiak reported on the efforts at the school that made this possible, including creating a leadership team, coaching support in the classroom, data and conversations about data, and extended-day tutoring. All of these require time and/or resources. SOA funds could be used to replicate recent local successes in other Haverhill schools.

Maximizing Teacher Impacts

A straightforward way to affect student outcomes is to add teachers to reduce class sizes. Research shows that the largest effects of class-size reduction on student achievement are found in the lower grades, for students from less advantaged families, and for less well prepared teachers. We note that the Tilton success involved having more teachers relative to the number of students.

For many years research failed to show much measurable benefit from teacher professional development (often conferences and courses). Recent studies, however, have shown that instructional coaching works to improve both teacher skills in the classroom and student achievement. Note that coaching was part of the transformation intervention at Tilton Elementary School.

Enhancing the Educational Experience

The workshop looked at ways to change the content of education. Participants watched two short videos that are linked here. One shows E.D. Hirsch, who argues that students’ reading comprehension improves when they have more contextual knowledge with which to understand the text. Providing all students exposure to a defined set of core knowledge allows teachers in upper grades to build on a foundation shared by all students.

A second video on Expeditionary Learning showed an example of a school that adopted a project-based approach to developing students’ collaborative problem solving skills. The approach is designed to better prepare students for college and careers by focusing on skills that won’t be replaced by computers.

Extending learning time is another way to enrich student learning. A 2016 study of extended learning in 46 Boston public schools reported favorable results for ELA and math, especially for black and Hispanic students. Haverhill’s Mayor noted a related article in the Atlantic. The Superintendent noted the many types of extended learning already in place in Haverhill schools.

Summary of Evidence

This workshop was not a comprehensive review of education research, but with our targeted effort we did find evidence to support several types of initiatives that could be consistent with the Student Opportunity Act goals and could help address achievement gaps within Haverhill Public Schools. We note:

  • Evidence reviewed at the workshop supports:
    • Local multifaceted transformation with added resources
    • More teachers (smaller class sizes)
    • Instructional coaching over other forms of professional development
    • Expanded learning time for minority and economically disadvantaged students
    • More diverse staff that matches student race/ethnicity
  • Evidence may support:
    • Project-based learning
    • Content-rich curriculum

Next Steps

Haverhill Public Schools Superintendent Dr. Margaret Marotta plans to obtain input from teachers and the community and to present Haverhill’s three-year plan to the School Committee before the April 1 due date, as required by the Student Opportunity Act.

Haverhill school spending still in the basement despite some good efforts to move up

Realtors, city businesses, and others concerned with Haverhill’s reputation will need to wait at least another year to see if Haverhill will move up from the bottom of rankings in per-student school spending.

Recently (December 5, 2019) the Massachusetts Department of Elementary and Secondary Education (DESE) posted its annual measures of net school spending for FY 2018. In these latest DESE statistics Haverhill appears to be stuck in the bottom 10 percent of Massachusetts districts in per-student spending, moving up only one position from #303 to #302 among 322 school districts and remaining dead last in per-student spending among the 26 Gateway Cities.

Haverhill increased its overall school budget for 2017-18 by 7% over the prior year, while enrollment increased less than 1 percent. However, Haverhill per-student spending was up only 4 percent and remained below 82 percent of the state average. (See chart below.) So what happened – or didn’t happen?

A closer look at the DESE data shows that spending under the School Committee’s budget was in fact up 7 percent. Spending on administration, instruction, and maintenance all rose more than 7 percent, and spending in the category of “athletics, student activities, and security” was up more than 15 percent. However, “net school spending” as defined by DESE includes expenditures from the City budget in addition to the School Committee’s budget. The City portion covers such things as employee and retiree benefits. For FY 2018, school spending on the city side went down 4.4 percent.
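To see how the two budget streams combine, here is a minimal sketch of the blending arithmetic. The dollar figures and enrollment below are hypothetical placeholders (the actual amounts are in the DESE compliance reports); only the percentage changes come from the data above.

```python
# A minimal sketch with hypothetical figures (not DESE data) of how a 7%
# School Committee increase and a 4.4% city-side decrease blend into roughly
# 4% per-student growth in net school spending.

committee_fy17 = 80.0e6   # hypothetical School Committee budget, FY 2017
city_fy17 = 20.0e6        # hypothetical city-side portion (benefits, insurance)
students_fy17 = 8000      # hypothetical enrollment

committee_fy18 = committee_fy17 * 1.07    # School Committee budget up 7%
city_fy18 = city_fy17 * (1 - 0.044)       # city portion down 4.4%
students_fy18 = students_fy17 * 1.008     # enrollment up just under 1%

nss_fy17 = committee_fy17 + city_fy17     # "net school spending" combines both
nss_fy18 = committee_fy18 + city_fy18

change = (nss_fy18 / students_fy18) / (nss_fy17 / students_fy17) - 1
print(f"Per-student net school spending change: {change:+.1%}")  # about +3.9%
```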

The reduction in the city portion of Haverhill school spending is attributable to employee and retiree insurance in the DESE Net School Spending Compliance reports for FYs 2017 and 2018. The total of these two lines was $1.26 million less in FY 2018 than 2017. This reduction reflects the city’s switching to the Group Insurance Commission (GIC) to obtain lower cost health insurance coverage from the state’s “largest single purchaser of health insurance”. Haverhill was the fourth Gateway city to join GIC’s health insurance program.

So, while the 7 percent increase in the FY 2018 School Budget did result in 7 percent more spending on instruction, overall spending per student (including the city portion) went up less than the state average, and not enough more than other Gateway Cities to move Haverhill up in the per-student spending rankings.

School budget increases for FY 2019 and FY 2020 were each less than 7 percent but more than inflation and enrollment growth. Perhaps that will be enough to affect Haverhill’s rank in DESE statistics for those years, but with tax revenues up, wage growth exceeding overall inflation, and the prospect of additional state funds, other cities will be increasing their school spending too.

As we enter 2020, Haverhill faces choices about the uses of additional Chapter 70 funding from the Student Opportunity Act (SOA). That Act will provide both additional funds and new accountability requirements for Haverhill and other Gateway Cities. This may help reverse the recent downward slide in Gateway City school spending relative to the state average (as can be seen in the chart).

Getting a better deal on health insurance helps Haverhill keep school costs a bit lower. However, if Haverhill is ever to get out of the basement in per-student spending rankings, leaders will need to keep pressing for more adequate funding and to make smart choices about the use of SOA funding.

Note on DESE data: Per-student spending is one of the most prominent measures reported by the Massachusetts Department of Elementary and Secondary Education (DESE). This past month (December) DESE released its data on per-pupil spending for fiscal 2018 (the 2017-18 school year). DESE data, particularly for spending, lag behind due to reporting delays. It is like looking in a rear-view mirror – where we have been, not where we are going. But the DESE data provide the best way to compare Haverhill school metrics with those of other Massachusetts cities.

Coming Out of the Dark on Dropouts

A feeling of relief about recently reported Haverhill dropout numbers needs to be tempered with continuing concerns about how those numbers are reported. Recently released data on dropouts may reveal less about students leaving school than they do about shortcomings of the Haverhill school administration in reporting and using school performance data. Following the recent misreporting of data on dropouts, Haverhill Public Schools will need to rebuild trust with the School Committee and the public and work toward building its capability to use accountability data to drive improvements in school performance.

A recent (February 26, 2018) press release from the Department of Elementary and Secondary Education (DESE) highlights a decline in dropout rates over the past five years in five urban school districts, including Haverhill. However, with the Haverhill Superintendent himself invalidating past dropout figures in his report last October, it is hard to know what to make of the recent data. The chart below shows Haverhill’s dropout rate as reported by DESE based on district-submitted data. Starting in 2009 Haverhill’s reported dropout rates ran well above the trend of other cities; only in 2017 did reported rates return to more typical levels by means of one dramatic drop (for the most recent reported school year 2016-17).


It is important to recognize that this chart may not accurately represent the students who actually dropped out. Rather it represents what the district reported to DESE. The Superintendent has stated that data on dropouts and transfers were incorrectly reported to DESE for some unspecified number of years. He publicly provided revised data from one year (2015-16) indicating that more than half the students that were reported as dropouts (59 students for that school year) should have been classified as transfers.  (This was discussed in a Benchmark Blog post of October 29, 2017.) We have no data from earlier years to substantiate just when a meaningful decrease in dropouts may have occurred. This information is lost to history. The decline in reported dropouts for the 2016-17 school year is based on data Haverhill submitted in the fall of 2017 after revising its procedures for identifying transfers. The sudden decline in the reported dropout rate is thus a result of the change in reporting procedures, made at least in part to address long-standing errors in Haverhill reporting, not of a sudden change in the number of students dropping out.

For eight years Haverhill was an outlier with its high dropout numbers. If these figures were not correct, it seems that district leadership was flying blind for years on the dropout situation. Now we are asked to trust that the dropout rate really did decline. We are told that more than 50 students annually were reported as dropouts who had actually transferred to other schools, mostly out of state and outside the DESE student tracking system. The district is now reporting more students as transfers and fewer as dropouts. If previously too few were identified as transfers, one might reasonably ask whether too many are now being classified as transfers – that is, labeled transfers without proper documentation that they have reenrolled elsewhere.

DESE expects school districts to identify transfers when they receive a transcript request from another school or communication from a parent. Non-returning students not identified as transfers are to be reported to DESE as dropouts. While DESE is able to identify in-state transfers to public schools through its tracking system, it must rely on the district to accurately report out-of-state transfers and to count only those for whom reenrollment elsewhere can be appropriately documented. See DESE Dropout Reporting Guidelines.

A remaining problem for Haverhill is a lack of trust in the capabilities of the current school administration to accurately report and effectively use school performance data. Whether dropout reporting problems arose due to inattention, lack of effort, or just poor communications and lax oversight of reporting, Haverhill citizens do not now know what to believe. The dropout rate, a useful performance indicator in many cities, has been reduced to an unanswered question in Haverhill. We do not know to what extent good work in the schools went unrecognized and what opportunities were lost to better understand student needs and craft initiatives to address them. That situation needs to be avoided going forward. In today’s performance-oriented world of data-driven management, urban schools will fall behind if they are not capable of effectively using the best evidence to guide improvement.

Trust and transparency are important if data are to be used to improve Haverhill schools. A new superintendent will need to reestablish trust that Haverhill is accurately reporting performance statistics. That will require careful review of procedures, close supervision, and direct administrative oversight of data submission to ensure that Haverhill is neither over-reporting nor under-reporting dropouts. A new superintendent deserves to start with a verified baseline of key performance measures. He or she will need to ensure that DESE guidelines are being followed with clear and transparent procedures for distinguishing between transfers and dropouts. The School Committee should expect a candid report on procedures, past and present, to ensure they are now in good order to support accurate reporting.

There is much work ahead if we are to achieve the School Committee’s goal to “Make the district more data driven and performance oriented.” Accurate and openly reported data practices would be a good start. Using the data effectively to guide improvements would be a next step.

Bright Spots Amid Wide Variation in Haverhill 2017 MCAS Results

Haverhill MCAS results overall.

The MCAS student testing results for 2017 released this fall by the Department of Elementary and Secondary Education (DESE) for Haverhill show some areas of distinct accomplishments and some areas of possible concern.  Overall, Haverhill student performance is comparable to benchmarks for Gateway Cities but below benchmarks for the state, for similar-income Gateway Cities, and for cities identified by DESE as having similar students. (See the 2017 MCAS page on this site for details). In measures of student improvement (student growth percentiles, SGPs), particular grades in some Haverhill middle schools stand out as positive outliers. Positive results were also reported for economically disadvantaged students; for these students across all grades as a group, Haverhill performed above the benchmarks on SGPs for both English Language Arts (ELA) and Mathematics compared with the state, Gateway Cities, similar-income Gateway Cities, and those identified by DESE as comparable.

Haverhill SGPs Show Student Improvement

Student Growth Percentile (SGP) scores represent improvement of individual students relative to other students statewide who scored similarly on previous MCAS tests. Each student is scored relative to his/her previous level of proficiency and ranked relative to the improvement of other students starting at the same level. At the class, grade, or school level, the median SGP provides a measure of improvement that can better reflect the contribution of the school and its teachers to student academic growth in a year. DESE considers the “normal range” of these scores to be between 40 and 60, which means that the median student SGP score for a group of students will most often fall between the 40th and 60th percentile. Above this range are positive outliers; below it are negative outliers.
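As a concrete illustration, here is a minimal sketch (with made-up scores, not actual Haverhill data) of how a group’s median SGP maps onto these categories.

```python
# A minimal sketch (hypothetical scores, not actual Haverhill data) of how a
# group's median SGP is read against DESE's "normal range" of 40 to 60.
# Each student's SGP is a percentile (1-99) ranking that student's growth
# against students statewide who started at the same prior MCAS level.
from statistics import median

def classify_median_sgp(sgps):
    """Classify a group of student growth percentiles by its median."""
    m = median(sgps)
    if m > 60:
        return f"median SGP {m}: positive outlier (above the normal range)"
    if m < 40:
        return f"median SGP {m}: negative outlier (below the normal range)"
    return f"median SGP {m}: within the normal 40-60 range"

print(classify_median_sgp([72, 65, 58, 80, 69]))  # positive outlier
print(classify_median_sgp([45, 52, 38, 60, 47]))  # within normal range
print(classify_median_sgp([18, 25, 12, 30, 22]))  # negative outlier
```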

The pattern of MCAS student growth percentiles by grade and school is shown in the chart below. The scatter plot of ELA and Math scores shows the wide variation across Haverhill grades and schools (each dot represents the average SGP for both ELA and Math for a single grade in a particular school, combining the results for all students at that grade level in the school). The upward trend of the dots from left to right indicates a positive correlation between ELA and Math SGPs. This means that, in Haverhill, if a particular grade in a school shows high student growth in English Language Arts, it is more likely to show high student growth in Math.

The chart below is based on the same information as the one above. It identifies the grade and school combinations that achieved different levels of student growth (as measured by 2017 MCAS SGPs). Particularly notable, in the upper right hand cell, are instances in which the median student growth percentiles were above 60 for both ELA and Math. This positive outlier status was achieved by Consentino School for grades 4, 6, and 8, Nettle School for grade 7, and Whittier School for grade 7. SGPs for Nettle School grade 6 in Math and Whittier School grade 8 in ELA were also notably above the normal range.

At the other end of the widely varying results was Hunking School grade 6, which was below the normal range for growth in both ELA and Math. In the yellow cells, the bottom row shows seven Haverhill school-grades that reported median SGPs below the 40th percentile in Math, and the left column shows four that reported median SGPs below the 40th percentile in ELA. Hunking School grade 6 reported below-normal median student growth scores for both ELA and Math. Its median SGP of 18 in ELA (reported on the 2017 MCAS page) indicates that more than 50 percent of Hunking 6th graders had growth in the bottom 20 percent of similar students statewide. Such a low outlier deserves attention to determine the source of the problem, whether related to teaching, test administration, or something else.

Considering all grades, Haverhill performed above the benchmarks in student growth for economically disadvantaged students in both ELA and math. See more details on 2017 MCAS page. Amid mixed performance in other grades and schools, the positive outliers may provide models to be emulated. The community and the School Committee should examine these accomplishments, seek to understand the factors producing good results, and work to replicate effective practices across all grades, schools, and student groups.

What to Make of Haverhill Dropout Reporting Irregularities

In a recent news report on WHAV and in a presentation to the Haverhill School Committee on October 26th, Superintendent Scully reported that more than half of the dropouts reported by Haverhill to the Department of Elementary and Secondary Education (DESE) for the 2015-2016 school year were in error.

This post provides a summary of what we know at present. This post will be updated as new information becomes available.


What the Superintendent reported

The Superintendent provided a single sheet with a pie chart and a few statistics. It identified 107 reported dropouts for FY 2016 [which corresponds to the state statistics for the Haverhill school district]. It suggests 59 of these are “errors,” leaving 43 actual dropouts. It identifies the reported dropout rate of 5.9% [which is reported by DESE as the Haverhill district dropout rate for 2015-2016] and lists 2.36% as an “actual dropout rate.” The pie chart contains the following information identified as “coding errors”:

  • 22% Transferred in state private (such as Phillips Andover)
  • 30% Transferred out-of-state (but were recorded as dropouts)
  • 40% Transferred out-of-state (but were recorded as transfer in MA and never reported by another MA school)
  • 8% Our alternative school (should not have been included)

The presentation did not include details about how the information was collected.

Inconsistencies and questions about findings presented

There are some apparent inconsistencies in the data on the sheet provided by the Superintendent. First, the subtraction: 107 reported dropouts minus 59 “errors” is 48, not 43, “actual dropouts.” Second, the treatment of the alternative school is confused: the pie chart suggests it is part of the coding errors but indicates it “should not have been included.” This is odd because the alternative school should be included in the district figures (as opposed to those for Haverhill High School alone). Third, there is no evidence that the denominator was adjusted appropriately when the dropout rate was calculated. It is hard to know what to make of these figures until they are precisely defined. My calculations suggest that, even accepting all the cases reported as coding errors, the “actual dropout rate” would be about 2.7% rather than the 2.36% reported. So questions remain about the reliability of the Superintendent’s reported numbers and calculations.
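A quick back-of-the-envelope check makes the inconsistency concrete. In this minimal sketch, the enrollment figure is backed out from the reported rate; it is an estimate, not a number from the presentation.

```python
# A minimal sketch re-checking the sheet's arithmetic. The enrollment is
# inferred from the reported figures, not taken from the presentation.

reported_dropouts = 107
reported_rate = 0.059                 # the 5.9% district rate on the sheet
claimed_errors = 59

implied_enrollment = reported_dropouts / reported_rate        # about 1,814
corrected_dropouts = reported_dropouts - claimed_errors       # 48, not 43
corrected_rate = corrected_dropouts / implied_enrollment

print(f"Implied enrollment: {implied_enrollment:.0f}")        # 1814
print(f"Corrected dropouts: {corrected_dropouts}")            # 48
print(f"Corrected rate: {corrected_rate:.2%}")                # 2.65%, not 2.36%
```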

Followers of this website and blog will know that there are many variations of dropout rates. Although not specified, the Superintendent’s numbers appear to apply to the Haverhill school district grades 9-12 for FY 2016 (the 2015-2016 school year). Although the Superintendent’s comments suggested an ongoing problem for more than one year, the data provided appear to apply to just the 2015-2016 school year (FY 2016). Consequently we have been provided no evidence of a problem before 2016.

The majority of the reported “coding errors” (70%) come from those identified as “transferred out of state.” But we do not have any information on just what was done to determine errors. How were transfers out of state documented? How was enrollment elsewhere established (documentation of records, calls, visits to homes, etc.)? Are these procedures consistent with what DESE would accept as evidence of a transfer, or something less?

It is clear, however, that Haverhill has not been following Massachusetts Department of Education Dropout Reporting Guidelines that require attempts to contact the parent or guardian by certified mail and by a home visit:

Investigating and Recording Extended Absences: A school may not remove a non-attending student from the enrollment without evidence that the student does not intend to return to school. Each district/school must have a procedure for investigating extended absences and must document reasonable efforts to locate the student and determine the reason for not attending. The procedure should include attempts to contact the parents/guardian by phone, through certified mail, and by a home visit.

Then for grades 6 to 12, “if the student has transferred to another school (may be demonstrated through a transcript request from the receiving school or documentation of notice of transfer from the parent or guardian) then the student should be reported as a transfer.” Also, “If the student has moved to another city/town or state and as a result is no longer attending school in your district and there is no indication whether the student has enrolled in school elsewhere, then the student should be reported as a dropout and any subsequent change reported to the Department.”

It is also worth noting that DESE tracks students across schools with a standardized ID number and identifies errors in district reporting. When a student shows up at another school, public or private, DESE will reclassify the student as a transfer. It is not clear how many of Haverhill’s reporting errors may have already been corrected in DESE statistics.

What can we conclude at this point?

It appears the Haverhill school district was not following these guidelines, at least in FY 2016. Nor does it seem to have had in place procedures to ensure accurate reporting, such as those employed by other cities like Boston (see last page of this document).

In making comparisons of Haverhill with other districts, it is important that data be collected and processed in a standardized way, as established by DESE. It is normal for some students whose status is unknown to be classified as dropouts when transcript requests or parent statements are not available. This is an inherent limitation of the DESE dropout data reporting system, not an error. Since it applies to all school systems, comparisons are still meaningful.

The methods of the Superintendent’s investigation of the status of FY 2016 dropouts are unclear to the public at this point. No documentation of methods has been provided. We do not know, for example, how many of those identified as errors have been corrected based on documentation received from parents or transcript requests that would meet DESE criteria for reclassification of the student and revision to the published dropout rates.

This blog has previously noted the inconsistent pattern observed in Haverhill’s 9th grade dropout rate for 2016 (see the chart on annual dropout rate by class and cohort in the May 15 post to this blog). Based on the extraordinarily high recent 9th grade dropout rates, it seems likely that this rate is overstated, though perhaps not by nearly as much as reported by the Superintendent. From what we know at present, based on the Superintendent’s report and the unusually high dropouts for 9th graders in 2016, it is very likely that there was over-reporting of dropouts for 2016, especially for the 9th grade. However, this conclusion cannot be relied upon until the evidence is clarified, and other years may be less affected.

Going forward: next steps

Overall, until we have evidence that reporting irregularities extend into prior years at similar rates, and until we better understand how transfers were investigated and documented in the Superintendent’s recent review, there is little reason to believe that Haverhill dropouts are nearly as low as suggested by the Superintendent’s one-page sheet with pie chart.

We will look for the Superintendent to provide:

  • Documentation of just how “errors were identified” and what evidence supports the conclusion that cases identified are transfers.
  • A report on whether DESE has found such evidence satisfactory to justify adjustment of reported dropout rates and to what extent DESE will adjust its statistics based on recent findings.
  • Copies of communications from DESE on any revised dropout rates, so the school committee and the public will have a solid foundation for future policy.
  • Copies of the revised procedures put in place to ensure accurate reporting in the future.

Implications for Haverhill school leadership

If the reporting problems are confirmed with fuller evidence, they would suggest:

  • The importance of ensuring good data for public decision making
  • The positive role of heightened public interest in school performance as a stimulus for ensuring accountability
  • The possibility that the YES program was more successful in reducing dropouts than previously reported.
  • The need to devote management attention and resources to following up with students who do not return to school.

Implications for BenchmarkHaverhillSchools.com data and reporting

This website seeks to present data accepted and reported by DESE as the best means of ensuring comparable data for diverse cities. Of course, this depends on each school district reporting data consistent with DESE definitions and procedures. When errors are identified, corrections also must meet DESE standards for acceptance of the revision. We will revise figures when corrected and accepted by DESE. At that point, if there are still outstanding issues with the data, these will be noted.

For now, we expect revisions to recent data for 2016, and perhaps earlier years. We hope for higher quality reporting from Haverhill in the future. At this point, however, given the limitations and questions about the newly presented information (which so far applies to one year and has not been verified as consistent with DESE reporting) we should not dismiss previous evidence that Haverhill has higher dropout rates than would be expected for a city with its income and demographic characteristics.

The results reported on this site are largely based on four-year cohort-adjusted graduation and dropout rates. We expect these results through 2016 will be minimally affected by the coding problems identified to date by the Superintendent. While one-year dropout rates reported for 9th grade have been particularly high (9 percent in 2016), the cohort rates are based on four years of data and, for the 2016 graduating class, include 2016 data for only the senior year, when transfers to other schools are less likely. The 2016 graduating class had a DESE-reported dropout rate of only 5% when it was in 9th grade, and for the class of 2017 it was only 3.5 percent (see the May 15 Benchmark Blog post bar chart by cohort and grade). This suggests that the coding problem did not much affect these classes at the critical time. We hope any corrections will be in place for the classes of 2018 and 2019, for which aberrantly high 9th grade dropout rates were reported. That will enable this website to continue reporting trends in graduation and dropout rates along with other measures of school spending and performance.

Patterns for Improving Graduation Rates: Lessons from Ten Years of Experience in Gateway Cities

Haverhill’s relative standing in school performance among Gateway Cities has been slipping. But what do we know about the cities that are doing better? If Haverhill wants to improve its graduation rates it can look for examples in other cities that have recently improved their graduation rates.

Gateway Cities with Graduation Rates Similar to Haverhill in 2006

As reported in data from the Department of Elementary and Secondary Education (DESE) for 2006, 11 of the 26 Massachusetts Gateway Cities had graduation rates (four-year cohort adjusted) within 5 percentage points of Haverhill’s 77.0 percent. By focusing on these particular Gateway Cities we exclude cities such as Quincy, which already had a much higher graduation rate in 2006, and Lawrence, which started with a much lower rate and showed substantial improvement under receivership. Despite similar starts, the experiences of these 11 cities diverged over the ensuing decade. By 2016, six of these cities had improved their graduation rates by more than 10 percentage points, four had improved by 5 to 10 percentage points, and one (besides Haverhill) had improved by less than 5 percentage points. In this post I look at DESE data to see what distinguishes the top performers from those at the bottom.

For this report I have divided these 11 cities into three groups based on improvement from 2006 to 2016. The most improved group, which includes Attleboro, Pittsfield, Revere, Salem, Taunton, and Worcester, started with an average graduation rate in 2006 of 75.2 percent. The middle group includes Brockton, Fitchburg, Leominster, Lynn, Malden, and Westfield, with an average graduation rate in 2006 of 76.9 percent. The lowest group includes just one city, Lowell, with a graduation rate in 2006 of 79.0 percent.

Overall Trends

In the graph below, we can see that Haverhill’s graduation rate ended the decade where it started while all of the other groups improved, with the highest performers averaging an improvement of 12 percentage points over ten years, starting 2 points below Haverhill and finishing 10 points above. Clearly, improvement is possible for Gateway Cities with graduation rates similar to Haverhill’s.

Low-Income Students

For low-income students, we see substantially greater improvement in graduation rates in this period, when federal and state programs were targeted to low-income students. Haverhill’s rate for low-income students improved by 6 percentage points, but the most-improved group improved their graduation rates for low-income students by 20 percentage points. It should be noted that the number of low-income students in Haverhill’s high school cohort increased by 122 percent in this period, no doubt straining the schools’ capacity to address the needs of this group. But the number increased in other cities as well, and even at the end of the period, the percentage of students classified as low income in Haverhill (60%) was not greater than the percentage in the most-improved cities (75%).

Hispanic Students

The most improved among the Gateway Cities whose 2006 graduation rates were similar to Haverhill’s showed marked improvement in graduation rates for Hispanic students – 22 percentage points, from 67% to 89%. This group also showed a 103% increase in Hispanic enrollment in this period. This suggests that success with the growing number of Hispanic students is an important part of the overall improvement among the top performers. This contrasts with Haverhill, which showed a drop in graduation rates for Hispanic students to 61% in 2016, down from 71% in 2006.

Student/Teacher Ratios

Another measure of resources is the student/teacher ratio. Here we need to shift our thinking a bit: the higher the student/teacher ratio, the lower the resources per student. We do not see dramatic differences among the graduation-improvement groups. In all three groups, the number of students per teacher increased somewhat, but the increase was smaller for the most improved (an additional 0.5 student per teacher) than for the middle and least improved groups (more than 1 additional student per teacher). Haverhill actually decreased the number of students per teacher and ended with only 0.4 more students per teacher than the average for the most improved. This suggests that Haverhill’s under-performance on graduation rates from 2006 to 2016 is not attributable to its somewhat higher student/teacher ratios.

Teacher Salaries

The chart below suggests that higher teacher salaries have not been the driver of graduation rate improvements. Among Gateway Cities with graduation rates similar to Haverhill’s in 2006, the most improved group ended in 2015 with teacher salaries lower than the middle group’s. Haverhill, however, became an outlier in teacher salaries in this period, starting in the middle and ending well below the average of the other groups. So, while salary levels do not explain the variation in improvement for these cities as a whole, we cannot rule out markedly lower salaries as a possible barrier to improvement for Haverhill.

Per Pupil Spending

The most improved group also supported their schools with greater increases in per-pupil spending. Haverhill started the period somewhat below the others and by 2015 was spending substantially less per pupil.

Summing Up

The chart below shows the changes in key measures by improvement group.

So what have we learned from this look at improvements in graduation rates over ten years? Among the 11 Gateway Cities starting in 2006 with graduation rates similar to Haverhill’s, the most improved districts:

  • Were able to improve graduation rates by 12 percentage points overall
  • Showed even greater improvements for low-income (20 percentage points) and Hispanic (22 percentage points) students than other students
  • Had notably greater increases in per pupil spending, exceeding the middle group by 11 percentage points over 10 years
  • Increased average teacher salaries only slightly more than others – by 2% over 10 years
  • Allowed student/teacher ratios to increase slightly less than the other groups

In contrast, Haverhill:

  • Improved graduation rates by less than 1 percentage point
  • Showed a 6 percentage point improvement in graduation rates for low-income students, but saw a 10 percentage point drop in graduation rates for Hispanic students
  • Started with per pupil spending 4% below the most improved group and slipped to 13% below this group in per pupil spending by 2016
  • Increased average teacher salaries substantially less than the most-improved, ending the period 9 percent below the most improved group and 11 percent below the other cities in our analysis
  • Reduced its student/teacher ratio slightly

Conclusion

What Haverhill has been doing has not been working to increase graduation rates as other Massachusetts cities have done. The results for the most-improved of the Gateway cities with 2006 graduation rates similar to Haverhill show that Haverhill has missed an opportunity for school improvement. That opportunity need not be missed going forward. We can learn from the experiences of others.

The results of the past decade suggest a possible path to improvement: focus less on student/teacher ratios and more on providing adequate resources (as reflected in per-pupil spending) and find ways to better meet the needs of low-income and Hispanic students. This may mean investing in more supporting resources to help our teachers better serve a changing student population. Other cities have shown how this can be done. Adapting their methods to Haverhill’s particular situation can be expected to produce meaningful improvements in graduation rates and greatly benefit our city for this and the next generation.

Three Ways to Look at Haverhill School Dropouts

Does Haverhill have a dropout problem? If so, how bad is it? And what can be done about it? Statistics on dropout rates from the Massachusetts Department of Elementary and Secondary Education (DESE) can help us answer these questions.

At the Reach Higher community forum at Hunking School on April 26, 2017, I presented statistics that show Haverhill’s irregular and persistently high dropout rates. I presented dropout rates calculated with the cohort (or longitudinal) method, showing the percentage of dropouts among a class or cohort of students over the four-year period from 9th to 12th grade. Some have asked why I used that particular measure rather than annual dropout rates or other measures. In this blog post I review the evidence on Haverhill dropouts from three types of measures. I also note the implications for addressing Haverhill’s dropout issue in the 2017-2018 school budget.

Cohort Based Measures Show Haverhill’s Irregularly High Dropout Rates

Cohort-based measures show what percentage of a class cohort starting at grade 9 has graduated or dropped out four years later. The adjusted cohort formula adjusts for transfers, so schools are not held accountable for students they did not serve from 9th grade on. Adjusted cohort graduation and dropout rates have been deemed more accurate than other calculations in their ability to assess student results over time, and since 2011 the federal government has mandated that states calculate and report cohort rates to support comparisons across states. For Massachusetts schools and districts, these rates are presented on the DESE website graduation rate page. The user can select rates by district or school, by year, adjusted or unadjusted (for transfers), for more than 10 student groupings.
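For readers who want the mechanics, here is a minimal sketch of the adjusted cohort calculation, simplified from the federal definition and using hypothetical numbers rather than actual Haverhill counts.

```python
# A minimal sketch (simplified from the federal definition, hypothetical
# numbers) of a four-year adjusted cohort dropout rate: start with first-time
# 9th graders, add documented transfers in, remove documented transfers out.

def adjusted_cohort_dropout_rate(first_time_9th, transfers_in, transfers_out, dropouts):
    cohort = first_time_9th + transfers_in - transfers_out
    return dropouts / cohort

# Hypothetical class of roughly 700 students, not actual Haverhill data
rate = adjusted_cohort_dropout_rate(first_time_9th=700, transfers_in=60,
                                    transfers_out=80, dropouts=77)
print(f"Four-year adjusted cohort dropout rate: {rate:.1%}")  # 11.3%
```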

Over the ten years from 2006 to 2016, Lawrence reduced its four-year cohort adjusted dropout rate from 35.8 percent to 10.8 percent, while Salem reduced its rate from 12.5 to 4.6 percent. In the same period Haverhill’s dropout rate improved in some stretches but slid back in others, ending up in 2016 at 11.3 percent, not much different from where it started (11.5 percent). Haverhill’s lack of net progress on this overall measure, along with other data presented on the DESE website, shows that Haverhill has been much less successful than other Massachusetts cities in preventing dropouts over the four-year high school period.

Main point: Cohort based measures provide the best way to assess how well schools prevent dropouts over time and get students through to graduation. However, one has to wait until a cohort completes its senior year for the rate to be reported. Looking at a cohort rate is something like looking through a telescope at light from a star that was emitted years ago. We can turn to some other measures of dropouts to get more current, if less complete, information on student dropouts.

Annual Dropout Measures Show Slow Decline in Dropout Rates, Remaining More Than Twice State Average

Annual dropout rates indicate how many students drop out at any time during one school year and do not return by October of the next year. Annual rates are reported by district and school, year, and student group in the DESE website Dropout Report. For 2015-16, Haverhill’s district annual dropout rate for students in grades 9 to 12 was 5.9 percent, compared with the statewide rate of 1.9 percent. Among the 301 reporting school districts Haverhill ranks among the worst – 289 out of 301. Of the 26 Gateway Cities, only New Bedford and Chelsea (cities with much lower income levels) had higher annual dropout rates in 2015-2016. The annual dropout rate for Haverhill High School alone is 4.4 percent; this rate for HHS is lower than the district rate because it does not include the Haverhill district’s alternative and TEACH schools.

Haverhill’s annual dropout rate has come down from the higher levels of 2008 and 2011, but it remains more than twice the state average. (See graph below.)

With annual measures (as opposed to cohort measures), students who drop out are removed from the analysis in subsequent years. So a particular class that lost most of its dropouts in the ninth grade may show low dropout rates in the following three high school years. Because these rates are affected by the timing of dropouts, annual measures can do a poor job of representing overall student success.
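A small sketch with hypothetical rates shows the effect: a class that loses heavily in 9th grade can post low annual rates afterward even though its cumulative loss is large.

```python
# A minimal sketch (hypothetical rates) of why annual measures can mask a
# class's cumulative losses: dropouts leave the denominator in later years.

cohort = 1000
annual_rates = {9: 0.09, 10: 0.02, 11: 0.02, 12: 0.02}  # hypothetical

enrolled = cohort
total_dropouts = 0
for grade, rate in annual_rates.items():
    lost = round(enrolled * rate)
    total_dropouts += lost
    enrolled -= lost
    print(f"Grade {grade}: annual rate {rate:.0%}, {lost} dropouts")

# Despite 2% annual rates after 9th grade, the class loses over 14% in total.
print(f"Cumulative dropout share of original cohort: {total_dropouts / cohort:.1%}")
```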

Main point: Haverhill’s dropout rates are higher than almost all of the other Massachusetts school districts. Single-year dropout rates do not well represent the experience of a class over time and are unreliable indicators of overall school and district performance.

Annual Measures by Graduating Class Show Temporary Progress, Then Slipping Back

Annual dropout rates are also reported by grade. With data from successive years, this enables us to piece together the experience of a group through the most recent available year (see graph). Annual dropout rates vary greatly from class to class and from year to year within a class. A particular class may experience a high dropout rate one year and a low rate the next (after the most-at-risk students have dropped out).

We see that the Haverhill Class of 2017 is on track toward a lower cohort dropout rate than other recent graduating classes. Congratulations, HHS Class of 2017! During its critical middle-school-to-high-school transition period, this class benefitted from Haverhill’s participation in the grant-funded Youth Engaged for Success (YES) program, a federally funded effort aimed at keeping kids in school, which was awarded for five years from 2010 to 2015. However, this program ended when the grant funding ran out, and we see a spike upward in the dropout rate for ninth graders who have transitioned to high school since the program ended.

Main point: Looking at annual dropout rate by graduating class for the past several years, we see evidence of lower dropout experiences for the HHS class of 2017, which benefitted from the YES program during its transition and early high school years. But these advances do not appear to be sustained for subsequent classes.

Overall Conclusion

Each of the three types of dropout measures sheds some light on Haverhill’s dropout problem. To answer the questions posed at the beginning of this blog post:

Do we have a problem? Dropout rates have been declining nationally, across the state, and in Gateway Cities. Haverhill has participated in this decline, and we expect cohort statistics will show improvement when they are posted for the class of 2017. However, Haverhill dropout rates remain more than double the state average. By nearly all measures, Haverhill dropout rates are high relative to other Massachusetts cities with comparable income and population characteristics. Haverhill’s dropout rates have also been inconsistent over time. Annual results by class suggest a lower dropout rate for this year’s graduating class but show a spike in 9th grade dropouts in each of the past two years. This suggests a continuing problem, particularly in the middle-school/high-school transition period.

How bad is it? Haverhill ranks near the very bottom of Massachusetts school districts – worse than 289 of 301 Massachusetts school districts. That the improvement observed in this year’s graduating class is not evident in data from the current 9th grade class does not bode well for future reports.

What can we do about it? We do not have to look far to see how to do better. Lawrence and Salem provide examples of cities that have significantly reduced dropouts and improved graduation rates. And right here in Haverhill we have seen ups and downs that may be related to changes in dropout prevention efforts at Haverhill High School, through the YES program and other efforts. One look at the literature shows how difficult it can be to address the many complex, interrelated issues that affect dropouts. But there are many resources to work with. See, just for example, this School-Level Approach to Dropout Prevention, the APEX program in New Hampshire, this from Washington State, and the dropout prevention resource centers at Clemson University and Johns Hopkins.

It will be a real challenge to find the best, most effective, most affordable methods that will work in Haverhill. As we consider the 2017-18 school budget, the school committee should consider earmarking specific support for an evidence-based program to reduce Haverhill’s dropout rate in a cost-effective way. By making such a commitment and sustaining it over time, we can give all our students a better shot at a successful life with a diploma in hand.