Great Schools Now Blog

Five reasons I'm not taking Professor Orfield's "research" seriously

[Photo: North Lawndale College Prep, October 2013]

This October, Professor Myron Orfield from the Institute on Metropolitan Opportunity at the University of Minnesota released a report on Chicago charter schools. This report has received substantial media attention, despite major flaws in the data analysis and arguments. As a data analyst[1], I recognize the importance of accurate and credible studies on the public education sector and believe they can contribute to meaningful conversations that have the potential to improve schools for children. Unfortunately, Orfield’s work does not fall into this category and instead distracts from serious discussions of these issues. Here’s why:

1. Patently false data: data in both the study’s “School Characteristics” and “School Performance” tables simply do not align with the data publicly available on the CPS data page.

  • In his most egregious error, Orfield claims the average high school graduation rate at district-run traditional schools is 84%, when the true figure is 63%.
  • In an interview on WTTW, he argued that his graduation numbers differ because he controls for student demographics. However, the “simple statistics” tables contain inaccurate performance data, and those same tables are the basis for the regression analyses that “control for student demographics.” When the data underlying a regression are wrong, the regression results are invalid as well.

2. Unclear source data: not once does Orfield mention which test the reported pass and growth rates come from.

  • In 2012-2013, CPS elementary students took both the Illinois Standards Achievement Test (ISAT) – a state-wide test that is now being phased out – and the NWEA MAP test, the Common Core-aligned assessment replacing the ISAT. Neither test name appears anywhere in the paper.
  • The rates appear to be ISAT data, but Orfield also cites a recent Sun-Times article about reading growth that was reporting on NWEA scores. The failure to distinguish, or even name, the two tests does not reflect well on the credibility of the report.

3. Lack of transparency: Orfield fails to report key methodological factors, leaving data-minded readers with the following questions:

  • Were all averages weighted by the number of students in the test cohort? (A short sketch at the end of this section illustrates why weighting matters.)
  • Were graduation rates calculated using the original 9th-grade cohort or the adjusted cohort?
  • Were alternative schools (which serve special populations, like drop-out recovery students) included in the dataset?[2]
  • Were contract schools included as traditional schools?[3]
  • What definition of selective was used?[4]
  • Were selective magnets separated from non-selective magnets?

We at INCS are transparent about our methods and include a methods and data definitions section on our website. We would expect a scholarly publication to meet the same standard.
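
To illustrate why the first question above matters, here is a minimal sketch in Python with purely hypothetical numbers (the school names, rates, and cohort sizes are invented and are not drawn from CPS data): an unweighted mean of school-level rates can differ sharply from the rate for the students those schools actually enroll.

```python
# Hypothetical illustration: why weighting school-level rates by cohort size matters.
# All figures below are made up for the example.
schools = [
    {"name": "School A", "grad_rate": 0.90, "cohort": 100},  # small school, high rate
    {"name": "School B", "grad_rate": 0.55, "cohort": 900},  # large school, lower rate
]

# Unweighted average: a simple mean of the school-level rates.
unweighted = sum(s["grad_rate"] for s in schools) / len(schools)

# Weighted average: each school counts in proportion to the students in its cohort.
total_students = sum(s["cohort"] for s in schools)
weighted = sum(s["grad_rate"] * s["cohort"] for s in schools) / total_students

print(f"Unweighted average: {unweighted:.1%}")  # 72.5%, flatters the small school
print(f"Weighted average:   {weighted:.1%}")    # 58.5%, reflects the actual students
```

Without knowing which of these calculations Orfield performed, the averages in his tables cannot be reconciled with the publicly available CPS figures.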

4. Misrepresentation of past work and omission of high-quality research

  • RAND study:
    • Orfield glosses over the positive outcomes that charter high schools were found to produce in this RAND study. These results are worth spelling out: charter attendance was associated with a 7-percentage-point advantage in the probability of graduating from high school and an 11-percentage-point advantage in the probability of enrolling in college.
    • He discounts the positive charter results by noting they were limited to students who attended both charter middle and high schools. But that pattern is actually further evidence of the impact of the charter school environment and instruction, rather than of the prior characteristics of the students themselves. Similarly, the finding that charter schools do not raise achievement as much in their first year of operation but improve academic performance thereafter is further evidence of the impact of the school itself.
  • CREDO study:
    • Orfield again glosses over the positive results for charters demonstrated by the CREDO study, saying only that “results were more positive for charters, with most comparisons showing charter students out-performing their traditional school peers.” It would have been helpful for readers to know that CREDO found: “the typical student in Illinois charter schools gains more learning in a year than his TPS (Traditional Public School) counterparts, amounting to about two weeks of additional gains in reading and about a month in math” (CREDO, 35).
    • Orfield criticizes the methods of CREDO, claiming that while they controlled for many demographic factors, “the list of matching variables does not include anything that reliably captures parental engagement.” However, Orfield’s own work also fails to capture this difficult-to-determine factor.
  •  Mathematica study:
    • This study, published in January 2014, was omitted from Orfield’s review.
    • The study corroborates key findings from RAND – the researchers found that enrolling in a charter high school in Chicago increased the probability of graduating high school and enrolling in college. Students were 7 to 11 percentage points more likely to graduate from high school and 11 percentage points more likely to enroll in college. Both results were statistically significant.

5. Simple errors: the paper is riddled with data and writing errors unbefitting an academic research paper.

  • Errors and inconsistencies in data reporting: for example, in the “School Characteristics” table, the Suspension Rate is reported as a whole number of students, whereas the Expulsion Rate is reported as a rate. Orfield also repeatedly claims that his study uses data from the most recent year available (2012-2013), when in fact graduation rates, ACT scores, and PSAE scores for the 2013-2014 school year were released over the summer.
  • Errors in data analysis: for example, attendance rate should not be used as a control variable in the regression analysis; it is an outcome that charters work hard to improve. In fact, under the new School Quality Rating Policy (SQRP) from CPS, attendance counts for 20% of an elementary school’s rating and 10% of a high school’s. Attendance rates should therefore not be treated like race or income characteristics, which schools cannot control (see the sketch after this list).
  • Errors in basic education terminology: for example, the author incorrectly refers to students with an IEP as “the percentage in independent educational programs,” when IEP in fact stands for individualized education program.
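
To make the control-variable point concrete, here is a minimal sketch in Python using simulated data (assuming numpy and statsmodels are available; the variable names, coefficients, and effect sizes are invented for illustration and are not Orfield’s model or CPS data). When the school itself improves attendance, and attendance in turn lifts scores, adding attendance as a “control” soaks up part of the school effect and shrinks the estimated coefficient on the school indicator.

```python
# Simulated illustration only: all names, coefficients, and effect sizes are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
charter = rng.integers(0, 2, size=n)                  # 1 = charter, 0 = traditional
attendance = 90 + 3 * charter + rng.normal(0, 2, n)   # the school itself lifts attendance
score = 50 + 2 * charter + 1.0 * (attendance - 90) + rng.normal(0, 5, n)

# Model 1: no attendance "control" -> the charter coefficient captures the
# full effect of the school (about 5 points in this simulation).
m1 = sm.OLS(score, sm.add_constant(charter.astype(float))).fit()

# Model 2: attendance added as a control -> the charter coefficient shrinks
# toward the direct effect only (about 2 points), because attendance absorbs the rest.
m2 = sm.OLS(score, sm.add_constant(np.column_stack([charter, attendance]))).fit()

print("charter effect, no control:            ", m1.params[1])
print("charter effect, attendance controlled: ", m2.params[1])
```

This is the familiar problem of conditioning on a mediator: the second model answers a different question than “what is the overall effect of attending this type of school?”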

Orfield’s real motive is clear:

  • Orfield has admitted his work was funded by the Chicago Teachers Union (CTU).
  • His conclusion section is a restatement of anti-charter bills proposed in Springfield, rather than a thoughtful consideration of the research.
  • His biased language throughout is more fitting of an opinion piece than a scholarly paper.
    • As an example, “This study adds another piece to the pile of research that implies that students in fact perform at lower levels at charters than traditionals.”

The truth about charter schools:

  • Charter schools are preparing students to make it to and through college: college enrollment and persistence rates are higher at charter schools than at other open-enrollment schools.
  • Charter schools are leading open-enrollment public schools in ACT attainment: 12 of the top 12 open-enrollment high schools in the city are charters. In fact, 70% of charter schools outperformed the city average for other open-enrollment schools on the most recent ACT. Charters have raised the bar for underserved students.
  • Charter schools are providing high quality public school options across the city, but there are thousands of families who still have no high quality option for their children. In 2013, there were nearly 30,000 more applications for charter schools than seats available.

 

[1] Written by Sophie Wharton, a data analyst at the Illinois Network of Charter Schools and former public school teacher.

[2] In our performance data, we exclude these schools (which include alternative schools, Achievement Academies, and special education schools) from both the charter and district averages because they serve a differently situated population. If data from the 18 Youth Connection Charter School campuses – a network of alternative charter schools – were included in charter performance averages in this report, the district alternative schools should also have been included in the traditional averages.

[3] In some CPS datasets, contract schools are listed in the same section as charter schools, but they are not charters and should be included with traditional, open-enrollment schools.

[4] At INCS, we consider a school selective if it has selective practices for at least 50% of its seats.