Each year when AzMerit scores are published, charter proponents celebrate the sector’s higher scores relative to district schools. While some charter schools excel, the overall evidence does not support the claim that charter schools perform better than district schools in an apples-to-apples comparison.
Jake Logan, President & CEO of the Arizona Charter Schools Association, along with Nina Rees, President & CEO of the National Alliance for Public Charter Schools, recently highlighted the achievement of charters in Arizona in an opinion piece (link) in the Arizona Republic.
The Grand Canyon Institute, a public policy think tank, has published several policy papers on the financial practices of charter schools and their regulatory oversight. Its research has focused attention on how charters spend their taxpayer dollars, including how much is being invested in their classrooms.
In Arizona, district enrollment is flat, while charters have captured the enrollment growth, a huge success for the sector. At nearly 20 percent of Arizona’s total public school population, a greater percentage of students enroll in charter schools here than in any other state.
That said, the actual achievement data isn’t as good as presented. To claim that charter students perform better than district students requires a student-level comparison. Comparing aggregate scores isn’t valid.
Yes, charter students in aggregate perform better than district students on the AzMerit, but aggregate data can be controlled for only one variable at a time, e.g., ethnicity. A valid comparison requires that multiple differences between students be controlled for. For example, if, as appears to be the case, Hispanic students in charter schools are more likely to come from families with higher incomes and are less likely to be English Language Learners, then, of course, they would be expected to score higher than their district peers.
A better method is to look at the outcomes for district and charter students from a similar urban area with a set of demographic similarities and compare prior individual records of achievement on standardized tests. This allows for a virtually equivalent comparison of charter and district schools.
Logan and Rees appear to endorse this approach as they cited Stanford University’s Center for Research on Educational Outcomes (CREDO) in their piece. CREDO is affiliated with Stanford’s free market Hoover Institution and uses this approach in its academic research to examine student performance in charter schools versus traditional public schools.
Logan and Rees tout CREDO’s overall findings on urban charters, but a closer look is warranted at how Arizona’s urban charters (link) in Phoenix, Mesa and Tucson performed.
The results: Phoenix and Mesa charter students perform statistically significantly WORSE than their district school counterparts.
In Tucson, they perform modestly better.
Prior CREDO studies have likewise found charter students underperforming district students in Arizona, even as they have found areas of the country where charters are far more successful.
To use student-level data, CREDO needs to protect students and their schools, so CREDO does not disclose the performance of individual charter schools whose student data is used in its analysis. But it does provide the percentage of schools doing worse, about the same, and better than district schools. The report also provides more detailed breakdowns for overall performance in Math and Reading across urban areas.
The Arizona Department of Education does provide grades for schools, but, unfortunately, many of its measures of success correlate with socio-economic status. So while charter schools receive grades, those grades are not as precise as CREDO’s analysis to the degree they rely on standardized tests.
The performance of the district and charter school sectors varies, sometimes significantly, when student demographics are appropriately taken into account.
Below are two graphs taken from the CREDO report on urban charters cited by Logan and Rees. The charts have been edited to show the overall results of the study across all urban areas studied compared to the three urban areas in Arizona included: Mesa, Phoenix and Tucson.
In Math, a substantial portion of charters in Mesa underperform, with a smaller portion underperforming in Phoenix. Tucson, while not meeting the overall results of the study across 41 regions, does show modest improvement over district schools. When these same results were analyzed at the student level instead of by school—in both cases controlling for demographics—charter students in Mesa and Phoenix were in the bottom six of the 41 urban areas studied in the report. Tucson was closer to the median.
In Reading, by school, controlling for demographics, a similar pattern emerges—though Tucson does less well than it did in Math. When these same results were looked at by student instead of by school—in both cases controlling for demographics—charter schools in Mesa and Phoenix were again in the bottom six of 41 urban areas studied in the report. Tucson was below the median. CREDO used six years of student data through 2011-2012 in its analysis.
Despite CREDO’s Arizona results, some education researchers consider CREDO to have a pro-charter methodological bias. For technical details on that and CREDO’s response explore this document and its links.
But wait—what about NAEP tests?
The best response charter advocates provide is Arizona’s remarkable improvement in National Assessment of Educational Progress (NAEP) results compared to the country as a whole and, specifically, the charter subset within Arizona. Charter advocate Matt Ladner, writing on Jay Greene’s blog, shows both in the aggregate and in following the same students as they move from fourth to eighth grade that Arizona leads the nation in gains, with the charter subset doing best, as seen in Ladner’s graphic above.
However, keep two things in mind. First, NAEP results are based on a random sample, and smaller samples carry a larger standard error. The charter subset is about 20 percent of the total Arizona group, so its error is larger, though that error could run in either direction. Second, during this time, some of the key players such as Great Hearts and BASIS have been expanding. These schools tend to pull high-achieving students from nearby district schools as well as from other charters, or they tap into suburban perimeter growth areas with single-family homes, not apartments.
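As a rough illustration of the first point, the standard error of a sample mean shrinks with the square root of the sample size under simple random sampling. The figures below are hypothetical, chosen only to show the scaling, not actual NAEP sample sizes:

```python
import math

def standard_error(sigma, n):
    # Standard error of a sample mean under simple random sampling:
    # sigma divided by the square root of the sample size.
    return sigma / math.sqrt(n)

# Hypothetical illustration only: suppose a statewide sample of 2,000
# students and a charter subset of 20 percent (400 students), both with
# a score standard deviation of 35 NAEP points.
se_state = standard_error(35, 2000)
se_charter = standard_error(35, 400)

# The smaller charter subset carries a standard error larger by a
# factor of sqrt(2000 / 400) = sqrt(5), roughly 2.24.
ratio = se_charter / se_state
```

The point is simply that a subgroup one-fifth the size of the full sample has a margin of error more than twice as wide, so its gains are measured less precisely in either direction.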
NAEP strives to test a representative sample of Arizona’s demographics based on race/ethnicity, location of the school, and median income of the school’s neighborhood. District students in the sample must balance the degree to which charter schools attract students that represent socio-economic demographics more likely to succeed in school, such as higher income families. However, charters make representative sampling more challenging because they aren’t typical neighborhood schools. As a result, the NAEP student sample could be skewed in a direction that would upwardly bias test scores.[1]
Jim Hall of Arizonans for Charter School Accountability has done an analysis of school locations for the four large charter players that added 40,000 students in metro Phoenix from 2008 to 2018: BASIS, Great Hearts, Legacy, and American Leadership Academy. He plotted the locations of their new schools and noted that only 4 of 49 new schools were located in low-income areas.
Consequently, a reasonable interpretation of the strong charter performance on NAEP is that it may in part reflect demographic skimming: stronger-performing students migrate to charter schools, resulting in a smaller portion of these students attending schools in the district portion of the NAEP sample. In that case, the NAEP graphic underrepresents district growth performance, and/or the growth of charters in more well-off socio-economic neighborhoods has led to a skewed sample (see illustration in footnote 1).
Read about NAEP sampling here and here.
Touting charter sector success over districts will surely occur again, but as noted here, that success is more likely due to student demographics than strong evidence of what’s going on in the classroom. The NAEP data is certainly positive about overall gains in academic achievement by students in Arizona. But context is important when attributing praise.
This blog highlights the need for improved measures that go beyond test scores and zip code alone to assess academic achievement. School choice should be driven by accurate information. Arizona should continue to strive to improve the information available to families and taxpayers about academic performance, so that schools are more accurately evaluated based on what’s going on within them than on what zip code they are located in.
[1] If, within an area, families with higher parental education levels, higher income levels, or another factor that correlates with school success (like family and community support due to religious membership) are more likely to enroll in the charter school, then it’s more likely that if such a school is included in the NAEP sample, it will upwardly bias results, as NAEP only controls for three variables: location, ethnicity, and income of the school’s neighborhood.
Dave Wells, Ph.D., Research Director, dwells@azgci.org, (602) 595-1025, ext. 2
The Grand Canyon Institute, a 501(c)(3) nonprofit organization, is a centrist think tank led by a bipartisan group of former state lawmakers, economists, community leaders and academicians. The Grand Canyon Institute serves as an independent voice reflecting a pragmatic approach to addressing economic, fiscal, budgetary and taxation issues confronting Arizona.
Grand Canyon Institute
P.O. Box 1008
Phoenix, Arizona 85001-1008
GrandCanyonInstitute.org