Commentary

The Secret About Student Testing


By Julian Betts, senior fellow, Public Policy Institute of California
This opinion article appeared in the San Diego Union Tribune on July 19, 2000

Teachers, parents, and politicians have begun poring over the just-released results from the state student achievement test. The test in question is the Stanford 9, an “off-the-shelf” test that California and other states buy from a private publisher. More recently, California has started to augment the Stanford 9 with a small set of questions meant to link the test more strongly to the state’s new content standards. But California still relies on the Stanford 9 for its annual ranking of schools.

The state testing program has a dark little secret, well known to the education establishment but only partly understood by the public. Amazingly, the state has given exactly the same version of the Stanford 9 test to students in each of the last three years.

This decision threatens to undermine California’s ambitious new school accountability program for two reasons. First, even if underlying achievement does not improve, student performance on the test is virtually guaranteed to improve simply because students – and their teachers – will soon memorize the questions.

A principal at one school told me that a student at the school had said, “The test went great – we remembered lots of questions from last year!”

Clearly, any claim that the Stanford 9 test can accurately gauge improvement in student learning has been thrown into serious doubt. Further, the state’s entire system of school accountability, with financial carrots and sticks for schools that succeed or fail to improve, currently hinges upon the Stanford 9. The problem has therefore assumed critical importance.

A second difficulty stems from the fact that the test publisher sells other forms of the Stanford 9 test to districts in other states. It now appears that some districts in California are “gaming” the system by obtaining sample questions from outside sources and sharing them with teachers. This practice hardly seems fair to students who will lag behind simply because their districts have not taken similar steps to “prep” students.

To preserve the integrity of the state test as a fair measure of student achievement, we need to make two changes. First, the state should start using new test forms each year. Second, the Department of Education should release past tests, to serve as a study guide available to all students. In this way, we avoid rewarding certain schools for gaming the system. Call it equal opportunity gaming if you want, but it will level the playing field.

Naysayers will attack both of these ideas. The first criticism is that the test publisher may balk at the public release of old tests, which it may view as proprietary. But such objections may only speed the much-needed development of a state test that is more closely linked to California’s rigorous new content standards.

The second criticism is that new tests are expensive: test developers must field-test new questions to ensure that each year’s test is comparable in difficulty to earlier versions. But several other states, including New York, Texas, and Massachusetts, have already moved to update their tests annually while providing old questions to the public.

So, if even Massachusetts, with only one-fifth the population of California, has decided it can afford to write new tests each year, while making old tests freely available, what is holding California back? All that Sacramento needs is the resolve to spend the money to get the test right. Testing kids’ academic progress, not just their memories, would make for a better school accountability system – and a better measure of student achievement in California.
