Getting It Wrong On School Choice
Guest column by George Mitchell.
During the last year, three different reports have claimed to compare the academic achievement of students in the Milwaukee Public Schools with students in the Milwaukee Parental Choice Program.
Two conclude, erroneously, that MPS students outperform students in the choice program.
The third reaches far different conclusions.
Two of the three, from Wisconsin’s Department of Public Instruction (DPI) and the Milwaukee-based Public Policy Forum (PPF), used deeply flawed methods to conclude that MPS students outperform those in the choice program. Page-one stories in the Journal Sentinel validated these erroneous reports. The paper compounded the errors by wrongly suggesting that the DPI and PPF data allow individual schools to be evaluated.
The third report comes from the School Choice Demonstration Project (SCDP) at the University of Arkansas and is based on rigorous methods. Its reports, including several issued today, draw starkly different conclusions from those advanced by DPI, PPF, and Journal Sentinel news stories.
Responding to the widespread attention generated by the DPI and PPF reports, the experts at the University of Arkansas refute the validity of those reports and demonstrate why they provide a basis neither for comparing MPS and Milwaukee’s school choice programs nor for evaluating individual schools.
SCDP researchers constitute the nation’s most knowledgeable scholars in the evaluation of school choice programs. The SCDP director, Patrick Wolf, and his team have more experience than any other researchers in navigating and surmounting the challenges associated with studying school choice programs. Their extensive scholarship has been published in multiple academic journals. The respected periodical Education Week recently published an extensive list of high quality, peer-reviewed research on school choice issues. SCDP scholars were responsible for, or are cited in, much of this research.
In short, the SCDP researchers know what they are talking about. Their credentials far exceed those of the authors of the DPI and PPF reports. For reasons that defy explanation, the Journal Sentinel newsroom has effectively given the benefit of the doubt to DPI and PPF. Even though the newspaper is well aware of the methodological issues and pitfalls, it has failed to highlight the serious flaws in both reports.
For example, relying on the DPI report, the Journal Sentinel said, “Students in Milwaukee's school choice program performed worse than or about the same as students in Milwaukee Public Schools in math and reading on the latest statewide test, according to results released Tuesday that provided the first apples-to-apples achievement comparison between public and individual voucher schools [emphasis added].” The paper published this story while in possession of SCDP reports that reached different conclusions and highlighted flaws in the kind of “comparison” DPI presented.
Reporting on the more recent PPF study, the Journal Sentinel said, “Overall, voucher students did not score higher than MPS students in either reading or math.” This reflected language in the PPF study itself, namely, “On the aggregate, a smaller percentage of voucher students earned proficient scores in reading and math than did Milwaukee Public Schools (MPS) students…At no grade level did voucher students, on the whole, out-perform MPS students in either reading or math.”
Thus, news reports about the DPI and PPF reports were unambiguous: MPS students outperformed students in the MPCP. As a consequence, that message appeared in news reports around the country. In the shorthand of journalism, the consistent theme is that “state studies” in Wisconsin showed public school students outperforming those in Milwaukee’s “voucher program.”
In its report on the PPF study, the Journal Sentinel gave the last word to an opponent of the Milwaukee choice program: “Critics…argue [that] students in the voucher program haven't shown better overall results than their peers in MPS. Bob Peterson, president of the Milwaukee Teachers' Education Association, points to voucher students' failure to top MPS students' reading and math scores.” There was no mention in this story of SCDP research, nor was there an effort to contact SCDP for comment.
In a report issued today, the SCDP director, Professor Patrick Wolf, specifically warns against relying on “snapshot comparisons” based on one year of data. He cites the 2011 DPI report and the 2012 PPF report as examples of how not to interpret data.
The SCDP’s cautions against misinterpreting such data go back more than a year. Yet DPI, the PPF, and the Journal Sentinel have directly ignored those warnings.
The shortcomings in the PPF report are particularly stunning. The report’s author served on an advisory team to the SCDP research project. Despite her unique exposure to deliberations on research methods and pitfalls, she has authored a report that is a textbook collection of errors.
As for the Journal Sentinel, the SCDP effectively refutes its claim that the 2011 DPI study was an “apples-to-apples comparison.” The paper’s uncritical validation of the recent PPF comparison underscores a failure to report carefully and use standards with which the paper should be completely familiar. Indeed, more than fifteen years ago a Milwaukee Journal editorial called for the application of “peer reviewed” research techniques to sort through and clarify an early controversy about school choice results. The more recent reporting in the Journal Sentinel consigns that earlier call for high standards to the dustbin.
As for conclusions based on sound social science, the newly released SCDP analyses flatly contradict the DPI and PPF conclusions. For example:
“When similar MPCP and MPS students are matched and tracked over four years, the achievement growth of MPCP students compared to MPS students is higher in reading [and] similar in math.”
“Enrolling in a private high school through MPCP increases the likelihood of a student graduating from high school, enrolling in a four-year college, and persisting in college by 4-7 percentage points.”
These and many other positive conclusions regarding the MPCP reflect the use by SCDP scholars of a rigorous comparison of similar students, something that the one-year DPI and PPF snapshots don’t provide.
It remains to be seen if the Public Policy Forum will correct the record. That would seem unlikely, given that its lead researcher put on blinders and effectively disregarded some of the nation’s leading education scholars.
And DPI? Don’t hold your breath. Its head says the expansion of school choice is “immoral.”
What about the Journal Sentinel? Responding to today’s release of SCDP reports, it is unrepentant. It says the new information “casts the program in a slightly more favorable light.” The words “slightly more” substitute for the more accurate “significantly different.”
Instead of explaining the SCDP’s use of a more rigorous method than those used by DPI or PPF, the paper reports that SCDP used a “smaller” sample that did not include “all” students. The “smaller” vs. “all” phrasing will mislead readers. The correct description of the SCDP sample would be “better” and “more rigorous.” Eschewing plain English, the paper instead says the SCDP used a “complex statistical methodology based on growth models.” To cap things off, the paper quotes the president of the Milwaukee teachers’ union as being critical of SCDP methods. His credentials for disputing the SCDP work are nil.
The paper makes no mention of SCDP’s explanation of the flaws in the DPI and PPF data and methods. The SCDP analysis and refutation of those studies fundamentally undercut the paper’s earlier validation of the DPI and PPF reports.
George Stanley, the Journal Sentinel’s Managing Editor, is fond of saying the paper’s only job is to “go where the facts take us.” Unless, that is, it means admitting a mistake and setting the record straight.