Identifying CA’s lowest performing schools proves difficult

(Calif.) The state’s board of education decided Wednesday to put off identifying California’s lowest performing schools for federal accountability purposes for at least another year.

What the board did do, with only one dissenting vote, was establish a framework for evaluating performance that includes test scores, graduation rates and disciplinary outcomes. Missing from that matrix, however, are two other key areas the state has made a priority but that will not have data for at least another year: a college and career readiness measure and one for chronic absenteeism.

Facing a September deadline to submit a plan to the U.S. Department of Education explaining how the state will comply with the Every Student Succeeds Act, the California State Board of Education essentially outlined its accountability plan even though key elements won’t be ready to be applied until next year.

“While we have had this single objective of creating a single coherent and integrated accountability system, because of the different timelines, this is not going to be a single system,” said Pat Rucker, originally appointed to the board in 2011. “It is not going to create a great deal of coherence.”

The challenge facing the board in fulfilling its required ESSA plan is that some components do not mesh neatly with the state’s existing accountability system, which is directed by the Local Control Accountability Plans. Under the LCAPs, which have been evolving since their adoption in 2013, districts, not schools, are the primary focus of attention, measured against a list of eight state educational priorities.

ESSA, which President Barack Obama signed into law in 2015, targets schools and calls for each state to establish criteria for identifying the lowest-performing 5 percent of sites that receive Title I funding.

The distinction is not lost on California officials, who endured a significant amount of criticism after Congress adopted the No Child Left Behind Act in 2002 and created two systems for measuring schools in the state. Four years after NCLB became law, with Adequate Yearly Progress as the federal centerpiece for measuring school success, state lawmakers had established the Academic Performance Index as the means for evaluating schools in California.

Although both systems relied solely on test scores, the federal AYP set minimum targets that student subgroups had to meet, while the API emphasized growth from one year to the next.

After toiling for more than four years to build a new accountability system based on the LCAPs and at least five measures besides test scores, the board is loath to make any major adjustment just to conform to the federal planning document, especially since ESSA itself stripped the U.S. Secretary of Education of virtually all authority to arbitrate school performance.

That said, the board clearly doesn’t want to create a new point of confusion among parents and taxpayers. Board members are also keenly aware of the oddly aggressive approach the Trump administration has taken in reviewing ESSA plans from other states.

Thus, as Mike Kirst, president of the state board, has said, the California plan needs to be thought of as a grant application in which only the questions asked are answered, and only in the broadest of terms.

Submitting an implementation plan is one of the few mandates on states that Congress imposed when writing ESSA. After decades of a top-down system that gave federal regulators the driving influence over classroom instruction, the architects of ESSA purposefully gave back to the states the authority to design their own accountability systems and decide how to intervene when schools didn’t measure up.

Initially, it looked like California’s LCAP system would align perfectly with the new national law, and indeed it generally does. But joining the district-based state system with school-based federal requirements is proving complicated.

The board asked staff earlier this year to look at options for identifying the lowest-performing districts and then searching for low-performing schools within that pool. The problem, some critics point out, is that some high-performing districts contain some very low-performing schools.

“I share a lot of the concerns from stakeholders that focusing only on LEAs might leave out some very high-need schools,” said Ting Sun, appointed to the board in 2015, who suggested delaying the actual identification of schools.

Feliza Ortiz-Licon, also appointed in 2015, disagreed.

“Even if we try to delay the conversation because there’s going to be additional indicators and more data, we’re still going to wrestle with the same things we have for the last two years,” she said. “We can delay the conversation to wait for additional information, or we can really try to target and have a very intentional conversation about the areas where there’s still misalignment, confusion and vagueness about what our capacity is, and what we have to offer.”