TESTIMONY OF JOHN RAVALLY, Ed.D.
JOINT COMMITTEE ON PUBLIC SCHOOLS
TUESDAY, DECEMBER 4, 2018
Thank you, Senator Rice, Assemblywoman Jasey, and members of the Joint Committee for the opportunity to share my thoughts on QSAC today. My name is John Ravally, and I have served as a public school superintendent in New Jersey for over 16 years. For the past three years, I have been Superintendent of the Franklin Township Schools, a K-12 district in Somerset County. As a superintendent, I have been through QSAC more than a half dozen times. In my current district, I dealt successfully with the challenges of two “Focus” schools, defined by QSAC as “schools that have room for improvement in areas that are specific to the schools.”
The intent of QSAC, to ensure student success by providing a high level of accountability for schools and districts, is laudable. The execution is, in many cases, flawed. The following are my observations.
Student Growth and Effective Accountability
On the QSAC home page, the NJDOE states the following: “The system shifts the monitoring and evaluation focus from compliance to assistance, capacity-building and improvement.” In my view, student achievement data are currently presented in a way that focuses more on compliance and less on “assistance and improvement.”
Right now we assess improvement mainly by weighing student achievement against benchmark scores. I believe it would be more effective to place less emphasis on these benchmarks and greater emphasis on growth in student achievement. Evidence that a district is moving student achievement in a positive direction is a much stronger indicator of educational progress than simple benchmark comparisons.
Reporting and public accountability are important for every school and district. I believe the process of publicly reporting achievement data and district improvement efforts (such as aligning budgets to areas in need of improvement) can and does serve as an excellent accountability system. If the public trusts the district leaders and believes in the ways those leaders plan to create or invest in programs for improvement, the public will largely accept the recommendations and processes of the district. Conversely, if the public is dissatisfied with the ways that achievement is reported or the direction the leadership is taking to solve achievement issues, it can make those feelings known at BOE meetings and through district communication channels. If the public is truly dissatisfied, it can vote to replace BOE members and/or make public judgments about the superintendent’s performance. This system, to me, is the best public accountability mechanism, and it works much better than comparing districts to each other and relying on “rankings” based predominantly on questionable state assessment systems.
Chronic absenteeism is the performance indicator chosen by the NJDOE to meet the requirements of the federal Every Student Succeeds Act (ESSA), but the use of chronic absenteeism as a measure of school success also raises some concerns. Though studies have linked achievement to attendance, those same studies also indicate that for particular subgroup populations the rate of absenteeism is inherently higher. This suggests that schools with students from those populations have greater challenges in overcoming the absenteeism problem.
Timelines for Monitoring and Corrective Action
If monitoring is to be truly effective, the DOE and the districts need to have the time and resources to do it properly. Monitoring every district every three years wastes resources that might be better spent on helping struggling schools and districts. Timelines more consistent with the older versions of district monitoring would seem to make sense, especially for districts that have been successfully designated as “high performing” according to QSAC criteria.
Monitoring “high performing” districts every three years is burdensome to those districts, especially for the personnel involved in the monitoring process. Once a district has been designated “high performing,” it should be exempt from QSAC monitoring for the next cycle, meaning that the district would be monitored again six full years after earning that designation.
Improvement plans, for those districts required to write them, should be reviewed annually. Once a district meets the goals of its plan, it should be moved to “high performing” status and monitored again during the next QSAC cycle.
Clear Goals for Struggling Schools
When my current district had two Focus schools, the challenge was to obtain clearly defined, approved goals that would afford the district the opportunity to exit the corrective action plans. The “goal line” seemed to move from year to year, making it very difficult for the schools to gain direction. Schools under corrective action plans need the NJDOE to provide a clear and unequivocal understanding of what it will take to move those schools forward.
The Department’s Role
The NJDOE, through the County Office, should be responsible for overall monitoring, as well as for monitoring improvement plans. When the NJDOE template used for creating district improvement plans itself invites unnecessary duplication of work already being done, the NJDOE, through its County Office officials, needs to give that district the flexibility to use a “made to fit” improvement plan rather than a “one size fits all” approach. The one-size-fits-all approach is not necessarily effective, especially in the most challenging circumstances. The current template demands extensive detail and a mélange of action steps, when a more focused approach specific to a particular district’s needs would provide greater clarity and a shorter path to reaching improvement goals.
Closure or Consolidation for Struggling Schools?
Closing or consolidating consistently struggling schools, or offering parents school choice, may seem like viable options, but I don’t see the benefit of consolidation, closure or choice. Those strategies could result in masking achievement issues rather than facing them head on. Concentrating the most effective resources, personnel and strategies in those schools seems like a more direct and less disruptive option. To that end, we might consider building incentives for administrators in high performing districts to partner with their counterparts in struggling districts so that all parties involved receive a benefit.
Garden State Coalition of Schools