
Are ACT scores an accurate indicator of college success?


A new study has called into question the validity of the ACT college entrance exam as a reliable indicator of how well students will perform in their college careers.

The study, published by the National Bureau of Economic Research, found that two of the four components of the ACT exam--English and math--"are highly predictive of positive college outcomes."

But the other two subtests--science and reading--"provide little or no additional predictive power," the authors wrote. As a result of their findings, they advised that "focusing solely on the English and mathematics test scores greatly enhances the predictive validity of the ACT exam."

The research calls into question the methods university admissions officers use to evaluate applicants to their institutions. ACT scores are commonly evaluated as a composite, without regard to the individual components: even though admissions officers receive scores for all four subtests, most choose not to weigh the subjects separately.
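For context, the composite that admissions officers see is conventionally the average of the four subtest scores, rounded to the nearest whole number. The snippet below is a minimal illustration using a hypothetical student's scores, not data from the study; it shows how strong English and math results can be diluted by weaker reading and science scores:

```python
# Illustrative only: the ACT composite is conventionally the average of the
# four subtest scores, rounded to the nearest whole number.
scores = {"english": 28, "math": 30, "reading": 22, "science": 23}  # hypothetical student

composite = round(sum(scores.values()) / len(scores))
english_math_avg = (scores["english"] + scores["math"]) / 2

print(composite)         # 26 -- the weaker reading/science scores pull the composite down
print(english_math_avg)  # 29.0 -- the two subtests the study found predictive
```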

The authors of the study contend that lumping two unreliable indicators in with two valid predictors of college success pollutes the ACT's role in the admissions process, casting doubt on the merits of an upstart entrance exam that has been gaining market share and has solidified its place as a serious rival to the more entrenched SAT.

The research evaluated data collected by the Ohio Board of Regents since 1999, correlating students' ACT scores with their records at four-year public colleges of varying quality in that state. English and math scores proved better indicators than reading and science not only of success measures such as college GPA and dropout rate; they also correlated significantly with high school GPA.

Validity of ACT exam at issue

"This provides further evidence that the reading and science tests have very little predictive merit," the researchers wrote.

"By introducing noise that obscures the predictive validity of the ACT exam, the reading and science tests cause students to be inefficiently matched to schools--admitted to schools that may be too demanding--or too easy--for their levels of ability."

The authors of the study are Eric Bettinger, an associate professor at Stanford University's School of Education; Brent Evans, a doctoral student in higher education at Stanford; and Devin Pope, an assistant professor of behavioral science at the University of Chicago's Booth School of Business.

The nonprofit ACT has several criticisms of the study. In an e-mail, Jon Erickson, interim president of ACT's Education Division, said that the test has broader application than just college admissions, noting that the exam is commonly used for course placement, counseling and other purposes.

ACT defends scoring methodology

Erickson also defended the composite scoring method, arguing that "all four subject areas are important in college," and that composite prediction models are more accurate than the evaluation of individual scores.

"[W]e believe the composite score represents the best overall picture of the student and perhaps is most easily accessible and useable by institutions. In essence, our tests are designed to predict general education course performance in math, reading, English and science. We use the ACT composite to best represent overall college readiness," he said. "We believe an academic achievement test should weigh evenly the four essential subject areas."

Though ACT continues to champion equal weighting of the four subtests, Erickson said that the organization works with hundreds of schools each year to help them develop customized predictive models that can weigh individual subject scores separately, along with other factors.
