Valerie Strauss has an interesting piece over at the Washington Post dealing with Value-Added Modeling. More specifically, the post analyzes what can be learned from 20 years of the Tennessee Value-Added Assessment System (TVAAS) implemented as a result of the Education Improvement Act — the Act that created the Basic Education Program (Tennessee’s school funding formula, also known as BEP).
The promise of Value-Added Assessment was that we could learn a lot about which schools were working and which weren’t. We could learn a lot about kids and how they were progressing. We could even learn about teachers and how they were doing with all their students and with specific groups of students. With all this information, Tennessee would intervene and take action that would move schools forward.
Unfortunately, that promise has not been fulfilled. At all.
Here, I highlight the key takeaways from the Strauss piece. Tennessee parents and policymakers should take note: TVAAS is consuming tax dollars and shaping teacher evaluations, and it doesn't really work all that well.
1. Using TVAAS masked persistently low proficiency rates.
The Tennessee value-added assessment model basically identified the schools that were already making required annual proficiency targets, but it failed to distinguish between schools with rising or declining proficiency scores.
In short, the Sanders Model did little to address the essential unfairness perpetuated by NCLB proficiency requirements, which insisted that those students further behind, and with fewer resources than those in economically privileged schools, had to work harder to reach the same proficiency point. More importantly, there was no evidence that the Sanders version of value-added testing did anything to help, or even predict, the future outcomes for those furthest behind.
2. TVAAS is unstable and inappropriate for high-stakes decisions — like hiring and firing teachers, renewing licenses, or determining pay.
And despite the National Research Council and the National Academies’ flagging of value-added assessment as too unstable for high-stakes decisions in education …
…states like Tennessee rushed to implement a federally recommended system whereby value-added growth scores would come to dominate teacher evaluation for educators who teach tested subjects. And contrary to the most basic notions of accountability and fairness, two-thirds of Tennessee teachers who teach non-tested subjects are being evaluated based on school-wide scores in their schools, rather than their own.
3. Continued use of TVAAS as an indicator of “success” leaves the most vulnerable students further and further behind.
In a 2009 Carnegie-funded report, Charles Barone points out that focus on value-added gains, or growth in test scores, may downplay the need for interventions to address low proficiency rates: “Due to the projection toward proficiency being recalculated annually [in the TVAAS model], there is not necessarily a significant progression, over time toward proficiency . . . causing a delay of needed intervention at appropriate developmental times” (p. 8). So while showing academic progress, gain scores or growth scores easily mask the fact that minority and poor children are far below their well-heeled peers in becoming intellectually prepared for life and careers. And in masking the actual academic progress of the poor and minority students, the state (and the nation) is let off the hook for maintaining and supporting an adequate and equally accessible system of public education for all students. At the same time, politicians and ideologues can celebrate higher “progress rates” for poor and minority students who are, in fact, left further and further behind.
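The masking effect Barone describes can be shown with a toy calculation. This sketch uses entirely hypothetical numbers (the cutoff, starting scores, and gains are illustrative, not actual TVAAS or TCAP data): two students post identical yearly growth, so a pure growth model rates their progress the same, yet one remains far below the proficiency cut.

```python
# Toy illustration of how positive "growth" can coexist with
# persistently low proficiency. All numbers are hypothetical,
# not actual TVAAS data.

PROFICIENCY_CUT = 70  # hypothetical scale-score cutoff


def years_of_growth(start_score, gain_per_year, years):
    """Return the score trajectory for a student gaining a fixed amount each year."""
    return [start_score + gain_per_year * y for y in range(years + 1)]


# Student A starts well behind; Student B starts near the cut.
student_a = years_of_growth(start_score=40, gain_per_year=3, years=5)
student_b = years_of_growth(start_score=68, gain_per_year=3, years=5)

for name, scores in [("A", student_a), ("B", student_b)]:
    gains = [after - before for before, after in zip(scores, scores[1:])]
    proficient = scores[-1] >= PROFICIENCY_CUT
    print(f"Student {name}: yearly gain {gains[0]}, "
          f"final score {scores[-1]}, proficient: {proficient}")

# Both students show identical "growth" (+3 per year), so a growth-only
# measure reports equal progress -- yet Student A finishes at 55, still
# 15 points below the cut, while Student B crosses it.
```

A growth-only report celebrates both trajectories equally; only the proficiency column reveals that Student A needs intervention.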
4. Tennessee has actually lost ground in terms of student achievement relative to other states since the implementation of TVAAS.
Tennessee received a D on K-12 achievement when compared to other states based on NAEP achievement levels and gains, poverty gaps, graduation rates, and Advanced Placement test scores (Quality Counts 2011, p. 46). Educational progress made in other states on NAEP [from 1992 to 2011] lowered Tennessee’s rankings:
• from 36th/42 to 46th/52 in the nation in fourth-grade math
• from 29th/42 to 42nd/52 in fourth-grade reading
• from 35th/42 to 46th/52 in eighth-grade math
• from 25th/38 (1998) to 42nd/52 in eighth-grade reading.
5. TVAAS tells us almost nothing about teacher effectiveness.
While other states are making gains, Tennessee has remained stagnant or lost ground since 1992 — despite an increasingly heavy use of TVAAS data.
So, if TVAAS isn’t helping kids, it must be because Tennessee hasn’t been using it right, right? Wrong. While education policymakers in Tennessee continue to push the use of TVAAS for teacher evaluation, teacher pay, and teacher license renewal, there is little evidence that value-added data effectively differentiates between the most and least effective teachers.
In fact, this analysis demonstrates that the difference between a value-added identified “great” teacher and a value-added identified “average” teacher is about $300 in earnings per year per student. So, not that much at all. In practical terms, that difference is negligible. That’s not to say that teachers don’t impact students. It IS to say that TVAAS data tells us very little about HOW teachers impact students.
Surprisingly, Tennessee has spent roughly $326 million on TVAAS and attendant assessment over the past 20 years. That’s more than $16 million a year on a system that is not yielding much useful information. Instead, TVAAS data has been used to mask a persistent performance gap between middle- to upper-income students and their lower-income peers. Overall student achievement in Tennessee remains stagnant (which means we’re falling behind our neighboring states) while politicians and policymakers tout TVAAS-approved gains as a sure sign of progress.
In spite of mounting evidence contradicting the utility of TVAAS, Commissioner Huffman and Governor Haslam announced last week they want to “improve” Tennessee teacher salaries along the lines of merit — and in their minds, TVAAS gains are a key determinant of teacher merit.
Perhaps 2014 will at least produce questions from the General Assembly about the state’s investment in an assessment system that has over 20 years yielded incredibly disappointing results.
For more on Tennessee education politics and policy, follow us @TNEdReport
You want to know how to boost student achievement? Ask ANY teacher and they will tell you! Pure and simple, and with little if any new money required! Just tie STUDENT ACHIEVEMENT with student promotion. THEY don’t achieve, THEY don’t move to the next grade level. There you have your solution. Now, do something with it!
The more interesting component of this piece to me is the extent to which Horn and Wilburn tie TVAAS back to the growth of the charter school movement. It seems like one of their major problems is that it enables underperforming charters to remain open. I think the solution is to introduce more accountability into the system, rather than to end it completely. Let’s actually use growth scores to ensure that ineffective charters (those with negative growth) are actually shut down in a timely manner and only good ones stay open.
See “Evidence Presented in the Case Against Growth Models for High Stakes Purposes”
The question we should ask now is why does TN’s DoEd (e.g., Kevin Huffman) cling to policies that have NO empirical evidence as education improvement models?
Educators know what works as do the wealthy private schools. All children should have what Huffman chose for his own children:
• Smaller student:teacher ratios with teacher assistants
• Integrated, multi-age, heterogeneous-ability classrooms, including kids with significant disabilities
• Fully resourced classroom materials and technology
• A broad range of curricular options, including the arts
• Well-resourced libraries
• Regular PE and exercise opportunities
• Well-prepared professional educators with advanced degrees (not 5-week missionaries)
It’s disappointing, as research-oriented as Andy is, that he would write this piece.
Correlation is not causation.
To posit that TVAAS has failed to raise NAEP scores is not a fair conclusion or a fair analysis of data. (It’s not really an analysis of data at all).
NAEP scores for Tennessee have risen over time, meaning gains have been made. Other states have also made gains, yes. But who is to say that having TVAAS around wasn’t part of those scale gains for TN? I’m not going to conclude that, because I don’t have any sort of data to back that up.
Just because you say it and read it on the internet, doesn’t mean it’s true.
Also, the $326 million spent on TVAAS figure seems to be way off.
“value-added scores for individual teachers turn out to be about as reliable as performance assessments used elsewhere for high stakes decisions.”
“…the reliability of teacher evaluation systems that include value-added vs. those that do not — ignoring value-added typically lowers the reliability of personnel decisions about teachers”
“Critics of value-added methods have raised concerns about the statistical validity, reliability, and corruptibility of value-added measures. We believe the correct response to these concerns is to improve value-added measures continually and to use them wisely, not to discard or ignore the data.”
I certainly don’t think TVAAS is the holy grail of anything, but as a tool in education, it is worth the relatively low expense of having around and informing policy decisions.
To say it’s told us nothing over 20 years, or that it hasn’t led to NAEP gains for TN students, is a very unfair (and false) narrative.