TNReady Growth in the Age of Hackers and Dump Trucks

Last week, the state released (and celebrated) TNReady scores. Local school districts followed suit, often touting reward status based on “growth.” This “growth” is determined by a value-added formula known as TVAAS — Tennessee Value-Added Assessment System. Ken Chilton took some time to explain why the celebration should, at best, be muted.

I’d like to add that using any type of value-added formula for the purpose of evaluating a teacher (or even a school) is at least somewhat suspect.

More importantly, though, I’d like to say this: The test in 2018 was a disaster. Attempting to claim “growth” from results in 2019 using a baseline of 2018 is simply flawed.

Remember the hackers? What about the dump trucks? What about the deception:

While at the time, the hacking excuse sounded pretty far-fetched, today’s hearing confirms that the Department advanced a lie offered by the state’s testing vendor. Of course, later on in the testing cycle, a dump truck was blamed for disrupting testing. That excuse was also later proven untrue.

At a minimum, we’ve seen a mostly online test followed by a mostly pencil-and-paper test. Not only did the online test have big problems, but students also tend to score higher on pencil-and-paper tests. Here’s some analysis from a recent study on that topic:

At least in the time period that we studied, there is pretty compelling evidence that for two students who are otherwise similar, if one took the test on paper and one took the test on a computer, then the student taking the test on paper would score higher. And that’s controlling for everything we can control for, whether it’s the school that a student is in, or their previous history, or demographic information. It looks like there are pretty meaningful differences in how well students score across test modes.

We found mode effects of about 0.10 standard deviations in math and 0.25 standard deviations in English Language Arts. That amounts to up to 5.4 months of learning in math and 11 months of learning in ELA in a single year.
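To see how an effect size in standard deviations translates into "months of learning," divide the effect by a typical year's worth of score growth and multiply by the length of a school year. The sketch below does that conversion; the annual-growth figures and nine-month school year are illustrative assumptions chosen to roughly reproduce the study's reported 5.4 and 11 months, not numbers taken from the study itself.

```python
# Sketch: convert a test-mode effect (in standard deviations) into
# months of typical learning. Annual-growth values are assumptions,
# picked so the results roughly match the figures quoted above.

def sd_to_months(effect_sd, annual_growth_sd, school_year_months=9):
    """Express an effect size as months of typical student learning."""
    return effect_sd / annual_growth_sd * school_year_months

# Assumed typical annual growth: ~0.167 SD/year in math, ~0.205 in ELA.
math_months = sd_to_months(0.10, annual_growth_sd=0.167)   # ~5.4 months
ela_months = sd_to_months(0.25, annual_growth_sd=0.205)    # ~11 months

print(round(math_months, 1), round(ela_months, 1))
```

The point of the arithmetic: a mode effect of a quarter of a standard deviation is on the order of a full school year of ELA learning, which dwarfs the "growth" being celebrated.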

So: Of course moving back to paper-based tests yields “impressive” growth.

A five-year history of testing in Tennessee indicates that the results can’t be said to be reliable predictors of, well, anything. Here’s how it’s gone down:


- Cancelled test
- Pencil-and-paper TNReady
- Hackers-and-dump-trucks online TNReady
- Pencil-and-paper TNReady

We moved from a different type of test to an online test that failed, to a paper test, to another online failure, and back to a paper test. Can we really measure any actual growth under those circumstances?

TDOE says YES. They’re wrong.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support helps make publishing education news possible.