By Katie Floyd and Cassie Davis. (Supervisor/editor: Paul von Hippel.)


School voucher programs, also known as opportunity scholarships, are scholarship programs – frequently government funded, but sometimes funded by private philanthropy – that pay for students to attend private schools of their choice. Many private school voucher programs have been launched around the world with goals that include improving measures of student performance, such as test scores.

Several studies have estimated the effects of different voucher programs on student test scores. On this page we summarize the most recent and complete review of the most rigorous studies. 

The Arkansas School Voucher Meta-Analysis

In May of 2016, researchers at the University of Arkansas published “The Participant Effects of Private School Vouchers Across the Globe: A Meta-Analytic and Systematic Review” (Shakeel et al., 2016). This study summarizes evidence from all randomized controlled trials (RCTs) of the effects of private school vouchers on student reading and math scores. In a voucher RCT, vouchers are assigned through a random lottery, so that lottery winners who receive vouchers differ only in random ways from those who are denied vouchers. Random assignment is especially valuable in evaluating private school choice programs since, without random assignment, families who choose private schools would be “widely expected to be different from other families in unmeasurable ways that subsequently affect student achievement levels and gains" (Shakeel et al., 2016).

Summary of Meta-Analysis Findings

The University of Arkansas meta-analysis summarized the results of all RCTs that estimated the effect on test scores of receiving a voucher and actually using it to attend private school. The meta-analysis summarizes those results with "forest plots," two of which are shown below. For each study, the forest plots display a point representing the best estimate of the effect; a confidence interval (whiskers) representing a range of uncertainty about the effect; and a box representing the weight given to the study, determined primarily by its sample size. Any confidence interval that crosses zero signals that the estimated effect is not statistically significant and that the true effect could be zero. The diamonds represent the average effect across all US studies, all foreign studies, and all studies together (Shakeel et al., 2016).
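The weighting and pooling logic behind a forest plot can be sketched with a toy fixed-effect, inverse-variance meta-analysis. The numbers below are purely illustrative and are not drawn from any of the studies discussed here:

```python
# Toy fixed-effect meta-analysis. Each study contributes an effect
# estimate and a standard error; weights are inverse variances, so
# larger, more precise studies count for more (the bigger "boxes"
# in a forest plot).
effects = [0.15, -0.08, 0.40]   # hypothetical effect sizes (SD units)
ses     = [0.05,  0.04, 0.20]   # hypothetical standard errors

weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

# 95% confidence interval: if it crosses zero, the pooled effect
# is not statistically significant (the "diamond" touches zero).
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(pooled, 3), tuple(round(x, 3) for x in ci))
```

Note how the imprecise third study (effect 0.40, but a wide confidence interval) barely moves the pooled estimate, because its weight is small.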


[Forest plot: effects of voucher use on reading scores]

[Forest plot: effects of voucher use on math scores]

Note. Results shown are the effects of "treatment on the treated" (TOT), or the effects on students who used their vouchers to attend private school. The effect of simply receiving a voucher, whether it is used or not, is estimated elsewhere in the meta-analysis.

Interpreting the Meta-Analysis

The results in these forest plots can be interpreted in several ways. We review two different interpretations.

The Friedman Foundation Interpretation

In a booklet published by the Friedman Foundation, which advocates for vouchers, Greg Forster (2016) reviewed a similar though not identical body of 18 RCT studies of 8 programs in 7 US cities and 1 US state (Louisiana). Forster used a "vote counting" system that classified each study as having "any negative effect" or "any positive effect" on "all students" or "some students." To be classified as positive or negative in Forster's system, an effect must be statistically significant; studies that found no significant effects are classified as showing "no visible effects."

Out of the 18 studies he reviewed, Forster reported that 14 found positive effects on at least some students; more specifically, 6 studies found positive effects on all students and 8 found positive effects on some student subgroups. Forster reported that 2 studies (in Toledo and New York) found no visible effects, and 2 studies of Louisiana's voucher program found negative effects. "This body of evidence," Forster concludes, "shows that school choice benefits students."

The Authors' Interpretation

The University of Arkansas researchers criticized Forster's vote-counting approach (Shakeel et al., 2016). Meta-analysts commonly criticize vote counting because it does not distinguish between large and small effects, does not distinguish between large and small studies, and ignores the precision of estimates (Hedges, Laine, & Greenwald, 1994). Vote counts like Forster's have also been criticized for counting multiple studies of the same program as independent results (Krueger, 2003). In addition, Forster's vote-counting approach counts a study as positive (or negative) if just one of its results, even from a minor part of the analysis, produced a statistically significant estimate (Shakeel et al., 2016).
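The core of this critique can be made concrete with a toy example (hypothetical studies and numbers, invented for illustration): vote counting and size-weighted averaging can reach opposite conclusions from the same evidence.

```python
# Toy illustration of the vote-counting critique: two small studies
# with positive effects and one large study with a negative effect.
studies = [
    {"effect": 0.20, "n": 100},    # small positive study
    {"effect": 0.25, "n": 80},     # small positive study
    {"effect": -0.10, "n": 2000},  # large negative study
]

# Vote counting: 2 positive votes vs. 1 negative -> "positive on balance".
votes = sum(1 if s["effect"] > 0 else -1 for s in studies)

# Sample-size-weighted average (a crude stand-in for meta-analytic
# weighting): the large study dominates, and the average is negative.
avg = sum(s["effect"] * s["n"] for s in studies) / sum(s["n"] for s in studies)

print(votes, round(avg, 3))
```

Here the vote count is net positive even though the weighted average effect is negative, which is exactly the distortion meta-analysts object to.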

The University of Arkansas researchers used more conventional meta-analytic methods to select one reading and math estimate from each program and obtain the average effect of school vouchers. In reading, the authors "find null effects in the US and large positive effects...outside of the US, primarily driven by the PACES program" in Colombia. Similarly, in math, they find that "with the inclusion of Louisiana, the...overall effect for US programs is null, but the overall effect for non-US programs is higher" (Shakeel et al., 2016). These summaries refer to the effects of treatment on the treated (TOT) – that is, the effects on the students who were not just offered vouchers but used them to attend private schools.

Heterogeneity and Outliers

The results of the meta-analysis were very heterogeneous. That is, the results varied substantially from one program to another, making it difficult to predict from past programs whether the results of the next program would be positive or negative. The large negative effects of the Louisiana program, for example, came as a shock to voucher advocates, while the large positive effects of the programs in Milwaukee and Colombia have been awkward for voucher opponents. 

Below we discuss how different authors have tried to explain the largest positive and negative estimates.


Colombia

Colombia's school voucher program, the Programa de Ampliacion de Cobertura de la Educacion Secundaria (PACES), assigned vouchers through a lottery and has served over 125,000 students. The vouchers covered more than half the cost of private secondary school, with the remaining cost paid by students' households. It has been speculated that the exceptionally large positive effects of the Colombian program are due to Colombia having larger differences between public and private schools than is typical in the US. In addition, the PACES program provided individual student incentives for performance, so the effect of the program reflects more than school choice alone (Shakeel et al., 2016).


Louisiana

One possible explanation for the negative effects of Louisiana's voucher program is adverse selection of schools into the program. That is, the schools that chose to join the program may have been relatively poor performers.

It is not clear why adverse selection might have occurred. One possibility is that the schools that enrolled would otherwise have had difficulty retaining students and needed the cash flow from vouchers (Walters, 2016). Another possibility, favored by school choice advocates, is that some schools did not participate because of the burden of regulation (Forster, 2016). Under the Louisiana program, participating schools had to administer state testing and were subject to inspections by public officials.


Milwaukee

Milwaukee's program is important because it was the first publicly funded voucher program in the United States, established in 1990 (Rouse, 1997). In interpreting the Milwaukee results, it is important to note that the Milwaukee study was relatively small, with a few hundred students; as a result, the program's effect is highly uncertain and could be much smaller or larger than the point estimate. The forest plot indicates this by showing Milwaukee with a wide confidence interval. Because of this, Milwaukee receives relatively little weight in the US average (Shakeel et al., 2016).

It is also important to know that the Milwaukee data have been analyzed by different researchers with somewhat different results. One study showed a positive effect on both math and reading scores (Greene & Du, 1999), but another found a positive effect on math and no effect on reading (Rouse, 1997). These discrepant results add to the uncertainty about the effect of the Milwaukee program.


References

Angrist, Joshua, Eric Bettinger, Erik Bloom, Elizabeth King, and Michael Kremer. "Vouchers for Private Schooling in Colombia: Evidence from a Randomized Natural Experiment." The American Economic Review 92, no. 5 (2002): 1535-1558.

Forster, Greg. A Win-Win Solution: The Empirical Evidence on School Choice. 4th ed. Friedman Foundation, 2016.

Greene, Jay P, and Jiangtao Du. (1999). "Effectiveness of School Choice: The Milwaukee Experiment." Education and Urban Society 31(2): 190-213.

Hedges, L. V., Laine, R. D., & Greenwald, R. (1994). An exchange: Part I: Does money matter? A meta-analysis of studies of the effects of differential school inputs on student outcomes. Educational researcher, 23(3), 5-14.

Krueger, A. B. (2003). Economic considerations and class size. The Economic Journal, 113(485), F34-F63.

Kattan, Raja Bentaouet, Felipe Barrera, Amy Walter, and Bibiana Taboad. The Quality of Education in Colombia: An Analysis and Options for a Policy Agenda. Report no. 43906-CO. Latin America and the Caribbean Regional Office, Human Development Sector Management Unit.

Mills, Jonathan N., and Patrick J. Wolf. "The effects of the Louisiana Scholarship Program on student achievement after two years." Available at SSRN 2738805 (2016).

Morgan, Claire, Anthony Petrosino, and Trevor Fronius. "The impact of school vouchers in developing countries: A systematic review." International Journal of Educational Research 72 (2015): 70-79.

Rouse, Cecilia Elena. Private school vouchers and student achievement: An evaluation of the Milwaukee Parental Choice Program. No. w5964. National Bureau of Economic Research, 1997.

Shakeel, M. Danish, Kaitlin P. Anderson, and Patrick J. Wolf (2016). The Participant Effects of Private School Vouchers across the Globe: A Meta-Analytic and Systematic Review. The University of Arkansas, Department of Education Reform (EDRE).

Walters, Christopher. "School Vouchers and Student Achievement: First-Year Evidence from the Louisiana Scholarship Program." In 2016 Fall Conference: The Role of Research in Making Government More Effective. Appam, 2016.

