Introduction
Students who receive K-12 school vouchers are scoring above the national average, according to the most recent Arkansas Education Freedom Accounts Program Annual Report, but what those assessment results mean for student achievement in private and homeschool settings remains unclear.
In September 2025, the Department of Education Reform at the University of Arkansas, in collaboration with the Arkansas Department of Education (ADE), published the annual report — as required by law — detailing both the fiscal impact and academic achievement outcomes of the Education Freedom Accounts (EFA) program, the state’s publicly funded K-12 school voucher program. A previous AACF blog post explores how the program has created significant new spending obligations for the state — far exceeding budget estimates — and creates a financial burden for public schools.
The assessment results provided in the 2024-25 Arkansas EFA Program Annual Report also warrant further exploration. According to the report, voucher participants scored above the national average, ranking at the 57th percentile in math and the 59th percentile in English Language Arts (ELA). What these scores mean in terms of student educational outcomes and the effectiveness of the voucher program, however, is difficult to assess. The assessment results section of the report could be strengthened by the recommendations discussed below, which would ultimately help policymakers better evaluate the voucher program.
1. Ensure Assessment Results Control for Family Income
The 2024-25 Arkansas EFA Program Annual Report highlights above-average assessment scores among students participating in the voucher program, using a model that compares Arkansas students’ results to national averages. Based on this sample, voucher students outperformed national averages. However, the results were not controlled for family income, even though socioeconomic status is well documented as a strong predictor of standardized test performance. For example, researchers with the National Institutes of Health found that “a $1,000 increase in annual income increases young children’s achievement by 5-6% of a standard deviation.” Other research examining the effects of wealth on long-term educational outcomes has identified specific mechanisms through which wealth creates disparities, including higher parental expectations and greater investments of time and money in children’s education and development. Studies such as these do not mean that a student from an economically disadvantaged household is unable to perform well academically. They merely provide evidence that many factors beyond the school setting influence academic achievement, which is why controlling for family income would allow for a more informative analysis of voucher participants’ assessment results.
To be clear, the report does not explicitly state that voucher students’ assessment averages outpaced national averages because of the private schools they attend or the homeschool providers with which they are enrolled. However, if state leaders want a better understanding of the true impact of the voucher program on student achievement, it would be important to control for family income.
2. Limit the Number of Allowable Student Progress Assessments
The Arkansas Teaching, Learning & Assessment System (ATLAS) was created in 2023 to replace the ACT Aspire as Arkansas’s annual summative assessment of progress for students enrolled in public schools. As the name implies, the ATLAS encompasses a range of assessments intended to measure student progress, and it is now a critical component of the state’s public school accountability system (p.17). In fact, the letter grade ratings that public school districts now receive annually are based on several factors tied to ATLAS results, such as the percentage of students meeting or exceeding proficiency on ATLAS summative exams in ELA, math, and science and the percentage of students meeting or exceeding individual growth targets on the ATLAS in ELA, math, and science.
Although the accountability system for public schools relies heavily on ATLAS results, only 2.1% of voucher recipients took an ATLAS summative exam. This is because the LEARNS Act does not require schools accepting vouchers to administer any particular assessment, as long as the assessment used is a “state board-approved nationally recognized norm-referenced test or a statewide assessment, which measures, at a minimum, literacy and math.” The EFA Program Annual Report highlights why this is problematic. Instead of the ATLAS, voucher recipients used 17 other summative exams, which makes one-to-one comparisons of student achievement, even among voucher students, challenging at best.
Lawmakers approved a $71.4 million, seven-year contract with Cambium Assessment, a Washington, D.C.-based company, to develop and maintain the ATLAS (i.e., approximately $10.2 million annually over the seven-year period). The ACT Aspire also cost around $10 million per year; however, unlike with the previous exam, the Cambium contract allows ADE to choose test questions that align with Arkansas standards. Given the resources the state has invested in the development and maintenance of the ATLAS for the public education system, it would seem reasonable to require participating voucher schools and providers to also administer the ATLAS. After all, these educational settings are now also supported, in part, by state funding.
Among voucher students who took the ATLAS, 48% achieved an overall performance of Level 3 (Proficient) or higher. Amending state law in the future to require the ATLAS across voucher participants would allow the state to conduct a more direct comparison of student achievement between public school students and students receiving vouchers. Whether such an amendment would be considered remains to be seen. However, it is encouraging that the 2024-25 report notes that, “in future years, University of Arkansas researchers will collaborate with ADE to directly equate ATLAS assessment results and the other nationally normed assessments taken by EFA students.”
3. Disaggregate Assessment Results for “Switchers”
The 2024-25 EFA Program Annual Report refers to voucher recipients who would have otherwise enrolled in public school as “switchers.” The report focuses on switchers in the fiscal impact analysis to argue that switchers generate potential savings for the state (a claim AACF counters) and reduce the program’s net cost. However, the report does not separate the assessment results of voucher recipients who have never attended public schools from those of the roughly 19% who have switched from public schools to a private or home school setting over the first two years of the program.
Why is this important to track? In several other states with large-scale K-12 voucher programs, students switching enrollment from public school to private school have experienced academic declines. In Louisiana, on average, a switcher who had previously scored at the 50th percentile in math saw that score decline to the 34th percentile after just one year of being enrolled in private school. In Indiana, switchers also experienced declines, with average math scores dropping from the 50th to the 44th percentile. Additional research in Louisiana found negative math score impacts as large as -0.4 standard deviations. For context, current estimates of COVID-19-related learning loss are around -0.25 standard deviations.
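To see how the percentile drops and the standard-deviation figures cited above relate, note that a percentile change can be converted to an effect size in standard-deviation units, assuming test scores are roughly normally distributed (a simplification; real score scales vary by test). Under that assumption, Louisiana’s drop from the 50th to the 34th percentile works out to roughly -0.41 standard deviations, consistent with the -0.4 estimate above. A quick sketch of the conversion using Python’s standard library:

```python
from statistics import NormalDist

def percentile_shift_in_sd(start_pct: float, end_pct: float) -> float:
    """Convert a change in percentile rank to standard-deviation units,
    assuming scores follow a normal distribution."""
    nd = NormalDist()  # standard normal: mean 0, standard deviation 1
    return nd.inv_cdf(end_pct / 100) - nd.inv_cdf(start_pct / 100)

# Louisiana switchers: 50th -> 34th percentile in math
print(round(percentile_shift_in_sd(50, 34), 2))  # -0.41

# Indiana switchers: 50th -> 44th percentile in math
print(round(percentile_shift_in_sd(50, 44), 2))  # -0.15
```

This back-of-the-envelope calculation is illustrative only; the published studies estimate effect sizes with formal statistical models, not a simple normal approximation.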
Given these concerning trends from other voucher states, it would be instructive for future EFA Program Annual Reports to disaggregate assessment results, separating EFA recipients who have never been enrolled in public schools from those who switched from public schools to private schools or homeschooling. Doing so would allow the state to assess whether Arkansas voucher participants who are also “switchers” are following the alarming trends seen in other voucher states, where students who switched enrollment from public schools to a participating voucher program often experienced academic losses.
Conclusion
While voucher participants’ average assessment scores exceed the national average — an outcome worthy of note — major caveats remain before broad claims about the program’s effectiveness can be made. First, including household income data and controlling for that variable in future reports would allow for deeper analysis of voucher recipients’ academic performance and more meaningful comparisons to national benchmarks.
Second, the use of 17 different annual student progress assessments instead of the ATLAS creates parallel accountability systems, in which public schools are evaluated under one set of standards while private schools and homeschool programs are evaluated under numerous others. Amending state law to require voucher recipients to take the ATLAS would help determine whether they are meeting academic standards established by ADE.
Finally, evidence from other states should serve as a cautionary signal. Numerous studies show that students who switch from public to private schools through voucher programs often experience academic setbacks. The voucher report does not disaggregate results between the voucher recipients who have never attended public schools and the roughly 19% who have switched enrollment from public schools to a private or home school setting over the last two years. If future reports can include this distinction, it would allow leaders to assess whether Arkansas is following the same troubling patterns seen elsewhere.
As the cost of the voucher program continues to increase, and with a legislative fiscal session ahead, the state would be well served to collect more comprehensive program data and evaluate the program more rigorously. Doing so would help lawmakers decide how much to invest in the voucher program going forward, and understand what that investment costs students, regardless of the school they attend.
