Inherently Unstable

That’s how TEA’s top lobbyist described the state’s teacher evaluation system that is based on so-called “value-added” modeling. The remarks were made during testimony before the House Education Committee. Here’s more from a TEA Facebook post:


For more on education politics and policy in Tennessee, follow @TNEdReport

Your support – $5 or more – makes publishing education news possible.


The Five Pillars of Privatization

Tennesseans for Student Success recently released its 2020 policy agenda and identified the following five pillars guiding it:


Tennesseans for Student Success is kicking off the 2020 legislative session by outlining our policy pillars and how they affect student success. Our five pillars are higher academic standards, an aligned assessment to those standards, protecting accountability, innovation in education, and securing economic freedom for all. 

This sounds pretty nice, or at least rather innocuous. But, who is Tennesseans for Student Success? Here’s what their website says:


Tennesseans for Student Success is a statewide network of teachers, parents, community leaders, and volunteers who are dedicated to supporting, championing, and fighting for Tennessee’s students and their futures.

This sounds even better, right? Look! It’s everyone! All coming together to fight for our kids! We should ALL love TSS, right?!

Well, let’s take another look. It seems TSS is all about privatizing public schools. Sure, they attacked staunch public education defender and state representative Gloria Johnson a few years back. But, maybe that was an anomaly.

Then, of course, there are the candidates they strongly back.

It’s a who’s who of school voucher backers.

TSS has consistently indicated support for voucher-backers like Senators Dolores Gresham and Brian Kelsey. And, they’ve taken out ads against Republicans who dare stand in the way of Gov. Lee and the school privatization agenda:

The five pillars of TSS are nothing more than the five horsemen of the public education apocalypse. Standards and Assessment simply mean ever more testing. Protecting Accountability means using voodoo science to evaluate (and remove) teachers and keep salaries (costs) low. Education “innovation” means charter schools and vouchers (as seen in the ads above). Economic freedom for all is nothing more than saying the “market” is what should guide education policy — it’s saying we should privatize above all.

TSS is, in fact, non-partisan. They’ll attack anyone, Republican or Democrat, who stands in the way of letting privatizing profiteers get their hands on public schools.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support – $5 or more today – makes publishing education news possible.



Testing and Teachers

Republicans in the Indiana House of Representatives are taking action to remove test scores from teacher evaluations, the Indianapolis Star reports.


A bill to remove student test scores from performance evaluations that can impact teachers’ pay and promotion prospects unanimously passed a key committee Tuesday. Statehouse leadership is championing House Bill 1002, which could end up being one of the most consequential bills of the 2020 legislative session.


Current state law requires that test scores make up a significant portion of a teacher’s evaluation, which rates them as highly effective, effective, improvement necessary or ineffective. A teacher’s rating can determine their salary, whether or not they’re eligible for raises or bonuses, and impact their movement through the profession.

If this legislation is successful, Indiana will join states like Hawaii, Oklahoma, and New York in moving away from using testing — and, especially, value-added modeling — to evaluate teachers.

A study I reported on last year noted that using value-added modeling (as Tennessee does by way of TVAAS) is highly problematic. In fact, this particular study noted that value-added models suggest that your child’s teacher could impact their future height:


We find the standard deviation of teacher effects on height is nearly as large as that for math and reading achievement, raising obvious questions about validity. Subsequent analysis finds these “effects” are largely spurious variation (noise), rather than bias resulting from sorting on unobserved factors related to achievement. Given the difficulty of differentiating signal from noise in real-world teacher effect estimates, this paper serves as a cautionary tale for their use in practice.

In short, value-added data doesn’t tell us much about teacher performance. Additional data indicates further problems with value-added modeling for teacher evaluation — especially as it relates to middle school teachers:


Well, it could mean that Tennessee’s 6th and 7th grade ELA teachers are the worst in the state. Or, it could mean that math teachers in Tennessee are better teachers than ELA teachers. Or, it could mean that 8th grade ELA teachers are rock stars.


Alternatively, one might suspect that the results of Holloway-Libell’s analysis suggest both grade level and subject matter bias in TVAAS.


In short, TVAAS is an unreliable predictor of teacher performance. Or, teaching 6th and 7th grade students reading is really hard.

The study cited above showed that 6th and 7th grade ELA teachers consistently received lower TVAAS scores and that this was true across various districts. Or, as I noted: The study suggests both grade level and subject matter bias in TVAAS results.
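
To make that kind of bias check concrete, here’s a minimal sketch of the comparison (with invented numbers, not the data from the Holloway-Libell analysis): average TVAAS levels by grade and subject across districts. If 6th and 7th grade ELA sits at the bottom everywhere, grade-level bias is a more plausible explanation than a statewide shortage of good middle school reading teachers.

```python
# Hypothetical illustration only -- these numbers are invented, not the study's data.
import pandas as pd

scores = pd.DataFrame({
    "district": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "grade":    [6, 7, 8, 6, 7, 8, 6, 7, 8],
    "subject":  ["ELA"] * 9,
    "tvaas":    [1, 2, 5, 2, 1, 4, 1, 2, 5],  # TVAAS levels on the 1-5 scale
})

# Average TVAAS level by subject and grade; a consistent dip at grades 6 and 7
# across districts is the pattern that suggests grade-level bias.
print(scores.groupby(["subject", "grade"])["tvaas"].mean())
```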

Or, maybe, if your kid gets the “right” teacher, s/he WILL end up taller?!

The bottom line: Using value-added modeling to evaluate teachers is total crap.

Indiana’s lawmakers are finally catching on. It’s time for Tennessee to catch up.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support – $5 or more today – makes publishing education news possible.



Growth Scores

Get your kid assigned to the right teacher and they just might grow a little taller, new research suggests.

Tennessee has long used something called “value-added assessment” to determine the amount of academic growth students make from year to year. These “growth scores” are then used to generate a score for teachers. The formula in Tennessee is known as TVAAS — Tennessee Value-Added Assessment System. Tennessee was among the first states in the nation to use value-added assessment, and the formula became a part of teacher evaluations in 2011.
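
To make the mechanics concrete, here is a deliberately simplified sketch of the value-added idea: predict each student’s current score from the prior year’s score, then average the prediction errors for each teacher’s students. This is an illustration of the general concept only; the actual TVAAS model is a far more complex, proprietary mixed-model calculation, and the numbers below are invented.

```python
# Simplified illustration of the value-added idea -- NOT the actual TVAAS formula.
import numpy as np

rng = np.random.default_rng(0)

n_students = 300
prior = rng.normal(50, 10, n_students)                    # last year's scale scores
teacher = rng.integers(0, 10, n_students)                 # each student assigned to one of 10 teachers
current = 5 + 0.9 * prior + rng.normal(0, 8, n_students)  # this year's scale scores

# Step 1: predict this year's score from last year's score (simple linear fit).
slope, intercept = np.polyfit(prior, current, 1)
predicted = intercept + slope * prior

# Step 2: a teacher's "growth score" is the average amount by which that
# teacher's students beat (or miss) their predicted scores.
residual = current - predicted
growth_scores = {t: residual[teacher == t].mean() for t in range(10)}
print(growth_scores)
```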

Here’s how the Tennessee Department of Education describes the utility of TVAAS:


Because students’ performance is compared to that of their peers, and because their peers are moving through the same standards and assessment transitions at the same time, any drops in proficiency during these transitions have no impact on the ability of teachers, schools, and districts to earn strong TVAAS scores.

Now, research on value-added modeling indicates teacher assignment is almost as likely to predict the future height of students as it is their academic achievement. Here’s the abstract from a National Bureau of Economic Research working paper:

Estimates of teacher “value-added” suggest teachers vary substantially in their ability to promote student learning. Prompted by this finding, many states and school districts have adopted value-added measures as indicators of teacher job performance. In this paper, we conduct a new test of the validity of value-added models. Using administrative student data from New York City, we apply commonly estimated value-added models to an outcome teachers cannot plausibly affect: student height. We find the standard deviation of teacher effects on height is nearly as large as that for math and reading achievement, raising obvious questions about validity. Subsequent analysis finds these “effects” are largely spurious variation (noise), rather than bias resulting from sorting on unobserved factors related to achievement. Given the difficulty of differentiating signal from noise in real-world teacher effect estimates, this paper serves as a cautionary tale for their use in practice.

The researchers offer a word of caution:

Taken together, our results provide a cautionary tale for the naïve application of VAMs to teacher evaluation and other settings. They point to the possibility of the misidentification of sizable teacher “effects” where none exist. These effects may be due in part to spurious variation driven by the typically small samples of children used to estimate a teacher’s individual effect.

In short: Using TVAAS to make decisions regarding hiring, firing, and compensation is bad policy.
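
The noise problem the authors describe is easy to reproduce. Here’s a small simulation of my own (not the paper’s code): students are assigned to teachers at random, the outcome is something no teacher can influence, and yet classroom-sized samples still produce a visible spread of estimated “teacher effects.”

```python
# Sketch: spurious "teacher effects" on a placebo outcome no teacher can affect.
# Every true teacher effect here is exactly zero.
import numpy as np

rng = np.random.default_rng(42)

n_teachers, class_size = 200, 25
outcomes = rng.normal(0, 1, size=(n_teachers, class_size))  # placebo outcome, standardized units

# "Estimated teacher effect" = each teacher's classroom mean.
estimated_effects = outcomes.mean(axis=1)

# The spread of these purely spurious effects is roughly 1 / sqrt(class_size),
# about 0.2 standard deviations with 25 students -- large enough to look real.
print(round(estimated_effects.std(), 3))
```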

However, the authors note that policymakers thirst for low-cost, convenient solutions:

In the face of data and measurement limitations, school leaders and state education departments seek low-cost, unbiased ways to observe and monitor the impact that their teachers have on students. Although many have criticized the use of VAMs to evaluate teachers, they remain a widely-used measure of teacher performance. In part, their popularity is due to convenience: while observational protocols which send observers to every teacher’s classroom require expensive training and considerable resources to implement at scale, VAMs use existing data and can be calculated centrally at low cost.

While states like Hawaii and Oklahoma have moved away from value-added models in teacher evaluation, Tennessee remains committed to this flawed method. Perhaps Tennessee lawmakers are hoping for the formula that will ensure a crop of especially tall kids ready to bring home a UT basketball national title.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support — $5 or more today — makes publishing education news possible.



Regression to the Mean

A guest post from Ken Chilton, who teaches education policy at Tennessee State University

When organizations plan for strategic change, one tenet is to cherry-pick some easy wins early in the process to build support and momentum for the program. School districts across the state of Tennessee are doing exactly that. They are parading the recently released TVAAS data, which show big jumps in value-added achievement.

Good news should be trumpeted, but I’m not sure this is good news. Unfortunately, most people have no idea what TVAAS measures. A score of “5” out of a possible 5 sounds impressive; however, it is an extremely flawed measure of success. While TVAAS purports to measure student growth year over year, the Department of Education advises, “Growth scores should be used alongside achievement scores from TNReady to show the fuller picture of students’ performance.”

When happy education administrators state that “math scores increased from 27.6% proficient or greater to 28.1%,” what does this mean? How do we translate a school district’s TVAAS score, or an essentially meaningless increase in scores, to your child’s performance? It simply means that on one day your child took a standardized test and was considered above/below a proficiency threshold designated by an education technocrat. It provides little information on your child’s level of achievement and/or the quality of her/his teacher.

Surely, we wouldn’t spend millions of dollars annually, and weeks upon weeks of preparation, on a test that is meaningless, would we? Sadly, the answer is yes. In statistics, the term “regression to the mean” is used to explain how extremely low and high performers tend to move toward the average over time. If you start with really low scores, it’s easier to show large gains. A one-year jump in test scores or value-added scores at the school or district level does not mean your district or school is performing at a high level.

For example, let’s take two groups of kids and test them on their ability to complete 50 pushups—our chosen benchmark for measuring fitness proficiency. Let’s assume group A completed an average of 65 pushups last year. Group A participants have private trainers and nutritionists who work with them outside normal training hours. This year, Group A completes an average of 66 pushups. The trainers did not achieve much in terms of value-added.

Group B, on the other hand, has had little training. Last year, they averaged 5 pushups per participant. After concerted efforts to improve their performance, they averaged 10 pushups per participant this year. They DOUBLED their output and would likely show high value-added performance. Granted, they are still woefully below the 50-pushup benchmark.
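
Sketched in code, the example’s arithmetic shows exactly the growth-versus-proficiency gap being celebrated (a quick illustration of the numbers above, nothing more):

```python
# Growth vs. proficiency for the two hypothetical pushup groups above.
BENCHMARK = 50

groups = {"A": (65, 66), "B": (5, 10)}  # (last year's average, this year's average)

for name, (before, after) in groups.items():
    growth_pct = 100 * (after - before) / before
    status = "above" if after >= BENCHMARK else "below"
    print(f"Group {name}: {growth_pct:.0f}% growth, {status} the {BENCHMARK}-pushup benchmark")

# Group A: 2% growth, above the 50-pushup benchmark
# Group B: 100% growth, below the 50-pushup benchmark
```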

In a nutshell, superintendents across the state are celebrating a nebulous statistic. Critics of using value-added measures to gauge teacher performance have long argued that state tests—especially multiple-choice ones—are woefully inadequate measures of a teacher’s impact on learning. TVAAS assumes that teacher effects can be isolated from the array of external variables that are widely recognized as factors affecting student performance. So much of learning occurs outside the school, yet none of these factors are controlled for in value-added scores.

Here’s the good news: positive things are happening. Celebrate that. However, don’t mislead the public. One year of data does not make a trend—especially when the 2018 data were massively flawed. What matters is performance. Tennessee’s TNReady test focuses solely on Tennessee standards. As such, parents cannot compare student results to those of other states that have different standards.

If you want to know how well Tennessee performs relative to other states, focus on the National Assessment of Educational Progress (NAEP). NAEP is a good test and allows state-to-state comparison of performance using rigorous proficiency standards. It is administered every two years to randomly selected 4th, 8th, and 12th graders.

If you analyze NAEP data, Tennessee has not experienced sustained improvements in 4th and 8th grade reading and math tests over the last 3 testing periods. In 2017, 33 percent of Tennessee 4th graders and 31 percent of 8th graders achieved NAEP proficiency in reading. In math, 36 percent of 4th graders and 30 percent of 8th graders achieved NAEP proficiency.

The sad truth remains: most of the factors associated with student performance are related to socio-economic status. Inasmuch as poverty rates, absenteeism, parental involvement, household stability, and economic certainty are outside the control of school administrators and teachers, school performance data will underwhelm. Thus, we celebrate improvements in TVAAS scores generated by algorithms that are not valid predictors of teacher performance.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support – $5 or more today – helps make publishing education news possible.

Nullification

Remember when the Tennessee General Assembly first passed “hold harmless” legislation and then added “no adverse action” language so that TNReady scores from another failed administration would not negatively impact students, teachers, or schools?

It turns out, the return of TVAAS scores may in fact result in some adverse actions. I’ve reported on how the incorporation of TVAAS scores based on this year’s TNReady test into overall student growth projections could have lasting, negative impacts on teachers.

Now, Coffee County educator Mike Stein has a blog post up about this year’s TVAAS scores and a teacher’s Level of Effectiveness (LOE).

Here are a couple key takeaways:

Today is Thursday, October 25th and, as of today, I am 29% of the way into the school year. This afternoon, I received my overall teacher evaluation score from last school year (called the “level of effectiveness,” or L.O.E. for short). I have some major issues with how all of this is playing out.

To begin with, why am I just now finding out how well I did last school year? Teachers often times use the summer to make any kind of major adjustments to their curriculum and to their teaching strategies. It’s quite difficult to make changes in the middle of a unit in the middle of the second grading period–a situation where most teachers will find themselves right now. I remember a time not so long ago when teachers knew their L.O.E. by the end of the school year. Since the state’s implementation of TNReady, that hasn’t happened.

If I were a principal, I’d also be upset about the timing of the release of the L.O.E. scores. Principals shouldn’t have to wait this long into the school year before finding out who their effective and ineffective teachers were last year. Part of their job is to help the ineffective teachers get back on track. Granted, a good principal will probably already know who these teachers are, but nothing can be made official until the L.O.E. scores are released. These scores are also used to determine whether teachers are rehired the following school year and if teachers will be granted tenure. Personnel decisions should be made over the summer, and the late release of these teacher effectiveness scores is not helpful in the least.

NULLIFY

If you find out, as Mike did, that including the scores may have an undesirable impact, you have the option of nullifying your entire LOE — in fact, even if the score is good, if TNReady makes up any part of your overall LOE, you have the nullification option. Here’s more from Mike on that:

What immediately struck me is that all three of these options include my students’ growth on a flawed test that, by law, isn’t supposed to hurt me if last year’s test results are included, which they are. My overall L.O.E. score is a 4 out of 5, which still isn’t too bad, but the previous three years it has been a 5 out of 5. This means that the TNReady scores are, in fact, hurting my L.O.E. So what do I do now?

As the president of the Coffee County Education Association, I received the following message from TEA today that I quickly forwarded to my members: “To comply with the [hold harmless] legislation, teachers and principals who have 2017-18 TNReady data included in their LOE may choose to nullify their entire evaluation score (LOE) for the 2017-18 school year at their discretion. An educator’s decision to nullify the LOE can be made independently or in consultation with his/her evaluator during the evaluation summative conference. Nullification is completed by the educator in the TNCompass platform. The deadline for an educator to nullify his/her LOE is midnight CT on Nov. 30.”

In addition to the valid concerns Mike raises, I’ve heard from teachers in several districts noting mistakes in the LOE number. These may result from including TVAAS data in a way that negatively impacts a teacher or from using the incorrect option when it comes to factoring in scores. It is my understanding that several districts have alerted TDOE to these errors and are awaiting a response.

One key question is: What happens if you nullify your scores, and therefore have no LOE this year? Here’s an answer from TDOE:

Educators who choose to nullify their 2017-18 LOE may still be able to earn Professional Development Points (PDPs). Educators who choose to nullify their 2017-18 LOE may use their 2016-17 score to earn applicable PDPs.

So, PDPs are covered if you nullify. Great.

For educators who nullify their 2017-18 LOE, the number of observations required in 2018-19 will be calculated based on 2016-17 data in conjunction with the educator’s current license type.

Looks like classroom observations have also been covered.

If a teacher chooses to nullify his or her 2017-18 LOE, he or she may still become eligible for tenure this year. Pursuant to T.C.A. § 49-5-503(4), “a teacher who has met all other requirements for tenure eligibility but has not acquired an official evaluation score during the last one (1) or two (2) years of the probationary period due to an approved extended leave; transfer to another school or position within the school district; or invalidated data due to a successful local level evaluation grievance pursuant to § 49-1-302(d)(2)(A) may utilize the most recent two (2) years of available evaluation scores achieved during the probationary period.”

The bottom line: If you do nullify (and many are in situations where that’s a good idea), there should be no future adverse impact according to TDOE’s guidance.

The larger issue, in my view, is the one Mike raises: It’s pretty late in the year to be returning evaluation feedback to teachers and principals. The LOE determines the number of observations a teacher is to have (which impacts principal workload). It could, as Mike indicates, also point to areas for improvement or teachers who need additional support. But providing those numbers well into the school year significantly reduces the opportunity for meaningful action on those fronts.

Despite all these stubborn facts, Tennessee’s Commissioner of Education points to the teacher evaluation process as a “key driver” of our state’s education success.

It seems highly unlikely a process this flawed is making much of a positive impact on teachers and schools.

 

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support keeps the education news flowing!


 

Deleted

In the wake of last year’s TNReady troubles, the Tennessee General Assembly passed legislation saying “no adverse action” could be taken against teachers, students, or schools based on the results. While legislators passed the bill late in the session, the Tennessee Department of Education was left to implement policy.

Now that this school year is up and running, teachers and administrators are asking what to do with data from 2017-18. Helpfully, the TDOE released this handy guidance document. The document lets teachers know they can choose to nullify their entire Level of Effectiveness (LOE) score from 2017-18 if TNReady scores were included in any part of their overall TEAM evaluation score.

But nullifying your score could lead to unintended “adverse actions,” couldn’t it? Well, maybe. But, the always thoughtful TDOE is ahead of the game. They also have a guide to nullification.

This guide makes clear that even if a teacher chooses to nullify his or her entire LOE for 2017-18, no adverse action will impact that teacher.

Here are a couple key points:

Educators who choose to nullify their 2017-18 LOE may still be able to earn Professional Development Points (PDPs). Educators who choose to nullify their 2017-18 LOE may use their 2016-17 score to earn applicable PDPs.

So, PDPs are covered if you nullify. Great.

For educators who nullify their 2017-18 LOE, the number of observations required in 2018-19 will be calculated based on 2016-17 data in conjunction with the educator’s current license type.

Looks like classroom observations have also been covered.

If a teacher chooses to nullify his or her 2017-18 LOE, he or she may still become eligible for tenure this year. Pursuant to T.C.A. § 49-5-503(4), “a teacher who has met all other requirements for tenure eligibility but has not acquired an official evaluation score during the last one (1) or two (2) years of the probationary period due to an approved extended leave; transfer to another school or position within the school district; or invalidated data due to a successful local level evaluation grievance pursuant to § 49-1-302(d)(2)(A) may utilize the most recent two (2) years of available evaluation scores achieved during the probationary period.”

Worried about tenure? TDOE has you covered!

So far, so good, right?

Well, then there was an email sent via the Education Value-Added Assessment System (EVAAS), the platform the state’s vendor uses to calculate TVAAS.

Here’s what teachers saw in their inboxes this week:

Due to the upcoming release of TVAAS reports for the 2017-18 school year, some of the data from the 2016-17 reporting will no longer be available.

*    The current student projections will be removed and replaced with new projections based on the most recent year of assessment data.
*    Current Custom Student reports will be removed.
*    District administrators will lose access to Teacher Value-Added reports and composites for teachers who do not receive a Teacher Value-Added report in their district in 2017-18.
*    School administrators will lose access to Teacher Value-Added reports and composites for teachers in their school who do not receive a Value-Added report in 2017-18.

If you would like to save value-added and student projection data from the 2016-17 reporting, you must print or export that data by September 26. TVAAS users are reminded to follow all local data policies when exporting or printing confidential data.

But wait, the 2016-17 data is crucial for teachers who choose to nullify their 2017-18 LOE. Why is a significant portion of this data being deleted?

Also, note that student projections are being updated based on the 2017-18 scores.

What?

The 2017-18 test was plagued by hackers, dump trucks, and mixed-up tests. Still, the TDOE plans to use that data to update student projections. These projections will then be used to assign value-added scores going forward.

That’s one hell of an adverse impact. Or, it could be. It really depends on how the 2017-18 scores impact the projected performance of given students.

The legislation in plain language indicated teachers and schools would face “no adverse action” based on the 2017-18 TNReady administration. Now, teachers are being told that future student growth projections will be based on data from this test. It’s possible that could have a positive impact on a teacher’s future growth score. It certainly could also have a rather negative impact.
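
To see why re-basing projections on the 2017-18 scores matters, here’s a toy sketch with invented numbers (a made-up projection formula, not the actual TVAAS projection): a score distorted by the testing problems shifts the target a student’s next teacher is measured against, in either direction.

```python
# Toy illustration (invented numbers and a made-up projection formula) of how a
# distorted 2017-18 score shifts the projection a future teacher is measured against.
def projection(prior_score, slope=0.9, intercept=5):
    """Rough stand-in for a growth projection built from the prior year's score."""
    return intercept + slope * prior_score

clean_prior = 60        # what the student would have scored on a trouble-free test
disrupted_prior = 52    # score after logins crashed and answer sheets were mixed up
next_year_actual = 58   # what the student actually scores the following year

for label, prior in [("clean prior", clean_prior), ("disrupted prior", disrupted_prior)]:
    growth_credit = next_year_actual - projection(prior)
    print(f"{label}: projected {projection(prior):.1f}, actual {next_year_actual}, "
          f"growth credit {growth_credit:+.1f}")
```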

The potentially adverse action of allowing the 2017-18 TNReady scores to impact future growth scores for teachers and schools has not been addressed.

By the way, we now have the following set of apples, oranges, and bananas from which we are determining student growth:

2015 — TCAP

2016 — NO TNReady

2017 — pencil and paper TNReady

2018 — Hacker and Dump Truck TNReady

It’s difficult to see how any reliable growth score can be achieved using these results.

 

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support keeps the education news coming!


 

No Adverse Action

After much wrangling on a day that saw the Tennessee House of Representatives hold up proceedings in an effort to truly hold students, teachers, and schools harmless in light of this year’s TNReady trouble, it appears a compromise of sorts has been reached.

Here’s the language just adopted by the Senate and subsequently passed by the House:

SECTION 1. Tennessee Code Annotated, Title 49, Chapter 6, Part 60, is amended by adding the following language as a new section: Notwithstanding any law to the contrary, no adverse action may be taken against any student, teacher, school, or LEA based, in whole or in part, on student achievement data generated from the 2017-2018 TNReady assessments. For purposes of this section, “adverse action” includes, but is not limited to, the identification of a school as a priority school and the assignment of a school to the achievement school district.

This language does not explicitly address the issue of using TNReady for TVAAS, but it has an effect similar to legislation passed in 2016 during that year’s TNReady trouble. Yes, it seems problems with testing in Tennessee are the norm rather than the exception.

Here’s what this should mean for teachers: Yes, a TVAAS score will be calculated based on this year’s TNReady. But, if that TVAAS score lowers your overall TEAM score, it will be excluded — lowering your TEAM score would be an “adverse action.”
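
In code form, the rule described above amounts to something like the following (a sketch of my reading of the compromise, not the department’s actual calculation):

```python
def final_team_score(score_with_tvaas: float, score_without_tvaas: float) -> float:
    """Hold-harmless rule as described above: including this year's TVAAS can only help.

    A sketch of my reading of the compromise, not TDOE's actual calculation.
    """
    return max(score_with_tvaas, score_without_tvaas)

# Example: TVAAS from the botched test drags the composite down, so it is excluded.
print(final_team_score(score_with_tvaas=3.6, score_without_tvaas=4.2))  # -> 4.2
```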

While not perfect, this compromise is a victory — the TNReady data from a messed-up test will not harm grades, will not be used in the state’s A-F report card for schools, and will not be used to give a negative growth score to a teacher via TVAAS.

Yes, TVAAS is still suspect, but there’s an election in November and a new Commissioner of Education coming after that. Heading into the November election is a great time to talk with candidates for the legislature and for Governor about the importance of evaluations that are fair and not based on voodoo math like TVAAS. Remember, even under the best of circumstances, TVAAS would not have yielded valid results this year.

While it is disappointing that Senators did not want to follow the lead of their House counterparts and explicitly deal with the TVAAS issue, there’s no doubt that persistent outreach by constituents moved the needle on this issue.

For more on education politics and policy in Tennessee, follow @TNEdReport

If you enjoy the education news provided here, consider becoming a patron!


 

Would You Eat This Pie?

After last week’s TNReady failure, the Tennessee General Assembly took some action to mitigate the impact the test would have on students and teachers.

I wrote at the time that the legislature’s action was a good step, but not quite enough:

  1. The law does say that districts and schools will not receive an “A-F” score based on the results of this year’s test. It also says schools can’t be placed on the state’s priority list based on the scores. That’s good news.

  2. The law gives districts the option of not counting this year’s scores in student grades. Some districts had already said they wouldn’t count the test due to the likelihood the scores would arrive late. Now, all districts can take this action if they choose.

  3. The law says any score generated for teachers based on this year’s test cannot be used in employment/compensation decisions.

Here’s what the law didn’t say: There will be NO TVAAS scores for teachers this year based on this data.

In other words, this year’s TNReady test WILL factor into a teacher’s evaluation.

The Department of Education took some steps to clarify what that means for teachers and offered a handy pie chart to explain the evaluation process:

First, this chart makes clear that this year’s TNReady scores WILL factor into a teacher’s overall evaluation.

Second, this chart is crazy. A teacher’s growth score is based on tests from three different years and three different types of tests.

15% of the growth score comes from the old TCAP (the test given in 2014-15, because the 2015-16 test had some problems). Then, 10% comes from last year’s TNReady, which was given on paper and pencil. Last year was the first year of a full administration of TNReady, and there were a few problems with the data calculation. A final 10% comes from this year’s TNReady, given online.

So, you have data from the old test, a skipped year, data from last year’s test (the first time TNReady had truly been administered), and data from this year’s messed up test.
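
Here’s a rough reconstruction of that weighting, assuming the chart’s percentages are shares of a teacher’s overall evaluation score rather than of the growth component alone (the numbers plugged in below are hypothetical):

```python
# Reconstruction of the weighting described above -- an illustration based on the
# stated percentages, not TDOE's actual formula. Component scores are hypothetical.
weights = {
    "2014-15 TCAP":             0.15,
    "2016-17 TNReady (paper)":  0.10,
    "2017-18 TNReady (online)": 0.10,
}

components = {
    "2014-15 TCAP":             4,   # hypothetical TVAAS-style levels, 1-5
    "2016-17 TNReady (paper)":  3,
    "2017-18 TNReady (online)": 2,
}

growth_share = sum(weights.values())  # 0.35 of the overall evaluation
growth_composite = sum(weights[k] * components[k] for k in weights) / growth_share
print(f"Growth portion: {growth_share:.0%} of the evaluation; composite level: {growth_composite:.2f}")
```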

There is no way this creates any kind of valid score related to teacher performance. At all.

In fact, transitioning to a new type of test creates validity issues. The way to address that is to gather three or more years of data and then build on that.

Here’s what I noted from statisticians who study the use of value-added to assess teacher performance:

Researchers studying the validity of value-added measures asked whether value-added gave different results depending on the type of question asked. This is particularly relevant now because Tennessee is shifting to a new test with different types of questions.

Here’s what Lockwood and McCaffrey (2007) had to say in the Journal of Educational Measurement:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers. These results suggest that conclusions about individual teachers’ performance based on value-added models can be sensitive to the ways in which student achievement is measured.

These findings align with similar findings by Martineau (2006) and Schmidt et al. (2005): you get different results depending on the type of question you’re measuring.

The researchers tested various VAM models (including the type used in TVAAS) and found that teacher effect estimates changed significantly based on both what was being measured AND how it was measured.
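
As a rough illustration of what “the variation within teachers across achievement measures is larger than the variation across teachers” means, here’s a small simulation of my own under invented assumptions (not the study’s models or data): each teacher gets two value-added estimates from two tests that measure overlapping but different skills.

```python
# Sketch: two value-added estimates per teacher from two tests measuring different
# skills. Invented parameters; not Lockwood and McCaffrey's models or data.
import numpy as np

rng = np.random.default_rng(7)
n_teachers = 500

true_effect = rng.normal(0, 0.08, n_teachers)              # a teacher's "real" effect (small spread)
measure_a = true_effect + rng.normal(0, 0.20, n_teachers)  # estimate from test A
measure_b = true_effect + rng.normal(0, 0.20, n_teachers)  # estimate from test B

between_teachers = np.var((measure_a + measure_b) / 2)  # spread of teachers' average estimates
within_teachers = np.var(measure_a - measure_b) / 2     # disagreement between the two measures

# With these parameters the within-teacher variance comes out larger than the
# between-teacher variance, echoing the study's finding.
print(f"between-teacher variance: {between_teachers:.4f}, "
      f"within-teacher variance: {within_teachers:.4f}")
```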

And they concluded:

Our results provide a clear example that caution is needed when interpreting estimated teacher effects because there is the potential for teacher performance to depend on the skills that are measured by the achievement tests.

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to make some correlation between past TCAP results and TNReady scores.

I’ve written before about the shift to TNReady and any comparisons to prior tests being like comparing apples and oranges.

Here’s what the TN Department of Education’s pie chart does: It compares an apple to nothing to an orange to a banana.

Year 1: Apple (which counts 15%)

Year 2: Nothing, test was so messed up it was cancelled

Year 3: Orange – first year of TNReady (on pencil and paper)

Year 4: Banana – Online TNReady is a mess, as students experience login and submission problems across the state.

From these four events, the state is suggesting that somehow, a valid score representing a teacher’s impact on student growth can be obtained. The representative from the Department of Education at today’s House Education Instruction and Programs Committee hearing said the issue was not that important, because this year’s test only counted for 10% of the overall growth score for a teacher. Some teachers disagree.

Also, look at that chart again. Too far up? Too confusing? Don’t worry, I’ve made a simpler version:

For more on education politics and policy in Tennessee, follow @TNEdReport


 

 

Your support keeps Tennessee Education Report going strong — thank you!

TNReady and TVAAS: A Teacher’s Perspective

Nashville teacher Amanda Kail talks about the connection between TNReady and TVAAS and the importance of legislation moving TODAY that could actually hold teachers harmless.

QUESTION: I thought the legislature said the tests wouldn’t count. What’s going on?
ANSWER: The state legislature was moved by all the horror stories surrounding testing problems to tack a bunch of amendments on to the only remaining education bill of the session (HB1109/SB0987) which attempted to “hold harmless” students, teachers, and schools for the results of the test. What this technically means is that local boards of education can vote on how much they want the students’ scores to count towards their grades (0-15%), and that the data cannot be used to issue a letter grade to schools (A-F, another asinine idea designed to find new ways to punish schools that serve mostly poor kids, but I digress).
However, for teachers the bill specified only that the results of the testing could not be used for decisions regarding employment and compensation. It does not say anything about the scores not being used for EVALUATIONS. Because of this, many teachers across the state pushed TEA to go back to the legislature and demand that the legislation be amended to exclude this year’s scores from TVAAS. You can read more about the particulars of that in Andy Spears’ excellent article for the Tennessee Education Report.
As a result, the House Finance Committee voted to strip all the amendments from HB1109 and start over again with the “hold harmless” language. That needs to happen TOMORROW (4/24/18 — TODAY).
QUESTION: What is TVAAS?
ANSWER: Teachers in Tennessee have evaluations based partly on value-added measures (we call it “TVAAS” here). What this means is that the Tennessee Department of Education uses some sort of mystical secret algorithm (based on cattle propagation – REALLY!) to calculate how much growth each student will generate on statewide tests. If a student scores less growth (because, like, maybe their test crashed 10 times and they weren’t really concentrating so much anymore) than predicted, that student’s teacher receives a negative number that is factored into their yearly effectiveness score. Generally, TVAAS has been decried by everyone from our state teacher union to the American Statistical Association (and when you upset the statisticians, you have really gone too far), but the state continues to defend its use.
QUESTION: What if I am a teacher who didn’t experience any problems, and I think my students did great on the test? Why would I want to oppose using this year’s data for TVAAS?
ANSWER: Thousands of your colleagues around the state don’t have that luxury, because they DID have problems, and their students’ scores suffered as a result. In fact, even in a good year, thousands of your colleagues have effectiveness scores based on subjects they don’t even teach, because TVAAS is only based on tested subjects (math, ELA, and depending on the year science and social studies). The fact is that TVAAS is a rotten system. If it benefits you individually as a teacher, that’s great for you. But too many of your colleagues are driven out of the classroom by the absurdity of being held accountable for things completely beyond their control. As a fellow professional, I hope you see the wisdom in advocating for a sane system over one that just benefits you personally.
QUESTION: Okay. So what do we do now?
ANSWER: Contact your state house and senate representatives! TODAY! These are the last days of the legislative session, so it is IMPERATIVE that you contact them now and tell them to support amendments to HB1109 and SB0987 that will stop the use of this year’s testing data towards TVAAS. You can find your legislators here.
Don’t leave teachers holding the bag for the state’s mistakes. AGAIN.
For more on education politics and policy in Tennessee, follow @TNEdReport