Growth Scores

Get your kid assigned to the right teacher and they just might grow a little taller, new research suggests.

Tennessee has long used something called “value-added assessment” to determine the amount of academic growth students make from year to year. These “growth scores” are then used to generate a score for teachers. The formula in Tennessee is known as TVAAS — the Tennessee Value-Added Assessment System. Tennessee was among the first states in the nation to use value-added assessment, and the formula became a part of teacher evaluations in 2011.

Here’s how the Tennessee Department of Education describes the utility of TVAAS:


Because students’ performance is compared to that of their peers, and because their peers are moving through the same standards and assessment transitions at the same time, any drops in proficiency during these transitions have no impact on the ability of teachers, schools, and districts to earn strong TVAAS scores.

Now, research on value-added modeling indicates teacher assignment is almost as likely to predict the future height of students as it is their academic achievement. Here’s the abstract from a National Bureau of Economic Research working paper:

Estimates of teacher “value-added” suggest teachers vary substantially in their ability to promote student learning. Prompted by this finding, many states and school districts have adopted value-added measures as indicators of teacher job performance. In this paper, we conduct a new test of the validity of value-added models. Using administrative student data from New York City, we apply commonly estimated value-added models to an outcome teachers cannot plausibly affect: student height. We find the standard deviation of teacher effects on height is nearly as large as that for math and reading achievement, raising obvious questions about validity. Subsequent analysis finds these “effects” are largely spurious variation (noise), rather than bias resulting from sorting on unobserved factors related to achievement. Given the difficulty of differentiating signal from noise in real-world teacher effect estimates, this paper serves as a cautionary tale for their use in practice.

The researchers offer a word of caution:

Taken together, our results provide a cautionary tale for the naïve application of VAMs to teacher evaluation and other settings. They point to the possibility of the misidentification of sizable teacher “effects” where none exist. These effects may be due in part to spurious variation driven by the typically small samples of children used to estimate a teacher’s individual effect.
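To see how the noise mechanism works, here is a minimal simulation sketch (hypothetical class sizes and height distribution, not the paper’s actual model or data). With only a couple dozen students per classroom, a naive classroom-mean estimator produces a spread of teacher “effects” on height even though no teacher influences height at all:

```python
import numpy as np

rng = np.random.default_rng(42)

n_teachers = 200           # hypothetical number of teachers
students_per_teacher = 25  # typical class size used to estimate an "effect"

# Student height is pure noise with respect to teachers: every student's
# height is drawn from the same distribution, regardless of classroom.
heights = rng.normal(loc=150, scale=7, size=(n_teachers, students_per_teacher))

# A naive "value-added"-style estimate: each teacher's classroom mean,
# expressed as a deviation from the overall mean.
teacher_effects = heights.mean(axis=1) - heights.mean()

# Even though teachers have zero true effect on height, the estimated
# "effects" still vary from classroom to classroom because of sampling noise.
print(f"SD of estimated teacher 'effects' on height: {teacher_effects.std():.2f} cm")
print(f"SD of individual student heights:            {heights.std():.2f} cm")
# With 25 students per class, the spread of classroom means is roughly
# 7 / sqrt(25) = 1.4 cm -- a nontrivial "effect" created entirely by noise.
```

The spread of those estimated “effects” is governed by sample size, not by anything a teacher does, which is exactly the cautionary point the authors are making.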

In short: Using TVAAS to make decisions regarding hiring, firing, and compensation is bad policy.

However, the authors note that policymakers thirst for low-cost, convenient solutions:

In the face of data and measurement limitations, school leaders and state education departments seek low-cost, unbiased ways to observe and monitor the impact that their teachers have on students. Although many have criticized the use of VAMs to evaluate teachers, they remain a widely-used measure of teacher performance. In part, their popularity is due to convenience: while observational protocols which send observers to every teacher’s classroom require expensive training and considerable resources to implement at scale, VAMs use existing data and can be calculated centrally at low cost.

While states like Hawaii and Oklahoma have moved away from value-added models in teacher evaluation, Tennessee remains committed to this flawed method. Perhaps Tennessee lawmakers are hoping the formula will ensure a crop of especially tall kids ready to bring home a UT basketball national title.

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support — $5 or more today — makes publishing education news possible.



Regression to the Mean

A guest post from Ken Chilton, who teaches education policy at Tennessee State University

When organizations plan for strategic change, one tenet is to cherry-pick some easy wins early in the process to build support and momentum for the program. School districts across the state of Tennessee are doing exactly that: they are parading the recently released TVAAS data that show big jumps in value-added achievement.

Good news should be trumpeted, but I’m not sure this is good news. Unfortunately, most people have no idea what TVAAS measures. A score of “5” out of a possible 5 sounds impressive; however, it is an extremely flawed measure of success. While TVAAS purports to measure student growth year over year, the Department of Education advises, “Growth scores should be used alongside achievement scores from TNReady to show the fuller picture of students’ performance.”

When happy education administrators state that “math scores increased from 27.6% proficient or greater to 28.1%,” what does this mean? How do we translate a school district’s TVAAS score, or an essentially meaningless increase in scores, into your child’s performance? It simply means that on one day your child took a standardized test and was considered above/below a proficiency threshold designated by an education technocrat. It provides little information on your child’s level of achievement and/or the quality of her/his teacher.

Surely, we wouldn’t spend millions of dollars annually and weeks upon weeks of preparation on a test that is meaningless, would we? Sadly, the answer is yes. In statistics, the term “regression to the mean” is used to explain how extremely low and high performers tend to move toward the average over time. If you start with really low scores, it’s easier to show large gains. Achieving a one-year jump in test scores or value-added algorithms at the school or district level does not mean your district or school is performing at a high level.
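For a concrete sense of the mechanism, here is a minimal sketch of regression to the mean (the score distributions are made up): pick the lowest scorers on a noisy test, retest them, and their average rises even though their underlying ability never changed.

```python
import numpy as np

rng = np.random.default_rng(7)

n_students = 10_000
true_ability = rng.normal(50, 10, n_students)   # stable underlying skill
noise_sd = 10                                   # test-to-test measurement noise

year1 = true_ability + rng.normal(0, noise_sd, n_students)
year2 = true_ability + rng.normal(0, noise_sd, n_students)

# Select the students who scored in the bottom 10% in year one.
low_scorers = year1 < np.percentile(year1, 10)

print(f"Bottom-decile mean, year 1: {year1[low_scorers].mean():.1f}")
print(f"Same students, year 2:      {year2[low_scorers].mean():.1f}")
# The year-2 average is noticeably higher even though no one's true ability
# changed -- the "growth" is partly an artifact of selecting on a noisy score.
```

The same logic applies to schools and districts: start at the bottom of a noisy measure and some rebound toward the average is nearly guaranteed, with or without any real improvement.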

For example, let’s take two groups of kids and test them on their ability to complete 50 pushups—our chosen benchmark for measuring fitness proficiency. Let’s assume Group A completed an average of 65 pushups last year. Group A participants have private trainers and nutritionists who work with them outside normal training hours. This year, Group A completes an average of 66 pushups. The trainers did not achieve much in terms of value-added.

Group B, on the other hand, has had little training. Last year, they averaged 5 pushups per participant. After concerted efforts to improve their performance, they averaged 10 pushups per participant this year. They DOUBLED their output and would likely show high value-added performance. Granted, they are still woefully below the 50-pushup benchmark.

In a nutshell, superintendents across the state are celebrating a nebulous statistic. Critics of value-added tests to measure teacher performance have long argued that state tests—especially multiple-choice ones—are woefully inadequate measures of a teacher’s impact on learning. TVAAS assumes that teacher effects can be isolated from the array of external variables that are widely recognized as factors that affect student performance. So much of learning occurs outside the school, but none of these factors are controlled for in value-added scores.

Here’s the good news: positive things are happening. Celebrate that. However, don’t mislead the public. One year of data does not make a trend—especially when the 2018 data were massively flawed. What matters is performance. Tennessee’s TNReady test focuses solely on Tennessee standards. As such, parents cannot compare student results to those of other states that have different standards.

If you want to know how well Tennessee performs relative to other states, focus on the National Assessment of Educational Progress (NAEP). NAEP is a good test and allows state-to-state comparison of performance using rigorous proficiency standards. It is administered every two years to randomly selected 4th, 8th, and 12th graders.

If you analyze NAEP data, Tennessee has not experienced sustained improvements on 4th and 8th grade reading and math tests over the last three testing periods. In 2017, 33 percent of Tennessee 4th graders and 31 percent of 8th graders achieved NAEP proficiency in reading. In math, 36 percent of 4th graders and 30 percent of 8th graders achieved NAEP proficiency.

The sad truth remains: most of the factors associated with student performance are related to socio-economic status. Inasmuch as poverty rates, absenteeism, parental involvement, household stability, and economic certainty are outside the control of school administrators and teachers, school performance data will underwhelm. Thus, we celebrate improvements in TVAAS algorithms that are not valid predictors of teacher performance.  

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support — $5 or more today — helps make publishing education news possible.

Nullification


Remember when the Tennessee General Assembly first passed “hold harmless” legislation and then added “no adverse action” language so that TNReady scores from another failed administration would not negatively impact students, teachers, or schools?

It turns out, the return of TVAAS scores may in fact result in some adverse actions. I’ve reported on how the incorporation of TVAAS scores based on this year’s TNReady test into overall student growth projections could have lasting, negative impacts on teachers.

Now, Coffee County educator Mike Stein has a blog post up about this year’s TVAAS scores and a teacher’s Level of Effectiveness (LOE).

Here are a couple key takeaways:

Today is Thursday, October 25th and, as of today, I am 29% of the way into the school year. This afternoon, I received my overall teacher evaluation score from last school year (called the “level of effectiveness,” or L.O.E. for short). I have some major issues with how all of this is playing out.

To begin with, why am I just now finding out how well I did last school year? Teachers often times use the summer to make any kind of major adjustments to their curriculum and to their teaching strategies. It’s quite difficult to make changes in the middle of a unit in the middle of the second grading period–a situation where most teachers will find themselves right now. I remember a time not so long ago when teachers knew their L.O.E. by the end of the school year. Since the state’s implementation of TNReady, that hasn’t happened.

If I were a principal, I’m also upset about the timing of the release of the L.O.E. scores. They shouldn’t have to wait this long into the school year before finding out who their effective and ineffective teachers were last year. Part of their job is to help the ineffective teachers get back on track. Granted, a good principal will probably already know who these teachers are, but nothing can be made official until the L.O.E. scores are released. These scores are also used to determine whether teachers are rehired the following school year and if teachers will be granted tenure. Personnel decisions should be made over the summer, and the late release of these teacher effectiveness scores is not helpful in the least.

NULLIFY

If you find out, as Mike did, that including the scores may have an undesirable impact, you have the option of nullifying your entire LOE — in fact, even if the score is good, if TNReady makes up any part of your overall LOE, you have the nullification option. Here’s more from Mike on that:

What immediately struck me is that all three of these options include my students’ growth on a flawed test that, by law, isn’t supposed to hurt me if last year’s test results are included, which they are. My overall L.O.E. score is a 4 out of 5, which still isn’t too bad, but the previous three years it has been a 5 out of 5. This means that the TNReady scores are, in fact, hurting my L.O.E. So what do I do now?

As the president of the Coffee County Education Association, I received the following message from TEA today that I quickly forwarded to my members: “To comply with the [hold harmless] legislation, teachers and principals who have 2017-18 TNReady data included in their LOE may choose to nullify their entire evaluation score (LOE) for the 2017-18 school year at their discretion. An educator’s decision to nullify the LOE can be made independently or in consultation with his/her evaluator during the evaluation summative conference. Nullification is completed by the educator in the TNCompass platform. The deadline for an educator to nullify his/her LOE is midnight CT on Nov. 30.”

In addition to the valid concerns Mike raises, I’ve heard from teachers in several districts noting mistakes in their LOE numbers. These may result from including TVAAS data in a way that negatively impacts a teacher or from using the incorrect option when it comes to factoring in scores. It is my understanding that several districts have alerted TDOE to these errors and are awaiting a response.

One key question is: What happens if you nullify your scores, and therefore have no LOE this year? Here’s an answer from TDOE:

Educators who choose to nullify their 2017-18 LOE may still be able to earn Professional Development Points (PDPs). Educators who choose to nullify their 2017-18 LOE may use their 2016-17 score to earn applicable PDPs;

So, PDPs are covered if you nullify. Great.

For educators who nullify their 2017-18 LOE, the number of observations required in 2018-19 will be calculated based on 2016-17 data in conjunction with the educator’s current license type.

Looks like classroom observations have also been covered.

If a teacher chooses to nullify his or her 2017-18 LOE, he or she may still become eligible for tenure this year. Pursuant to T.C.A. § 49-5-503(4), “a teacher who has met all other requirements for tenure eligibility but has not acquired an official evaluation score during the last one (1) or two (2) years of the probationary period due to an approved extended leave; transfer to another school or position within the school district; or invalidated data due to a successful local level evaluation grievance pursuant to § 49-1-302(d)(2)(A) may utilize the most recent two (2) years of available evaluation scores achieved during the probationary period.”

The bottom line: If you do nullify (and many are in situations where that’s a good idea), there should be no future adverse impact according to TDOE’s guidance.

The larger issue, in my view, is the one Mike raises: It’s pretty late in the year to be returning evaluation feedback to teachers and principals. The LOE determines the number of observations a teacher is to have (which impacts principal workload). It could, as Mike indicates, also point to areas for improvement or teachers who need additional support. But providing those numbers well into the school year significantly reduces the opportunity for meaningful action on those fronts.

Despite all these stubborn facts, Tennessee’s Commissioner of Education points to the teacher evaluation process as a “key driver” of our state’s education success.

It seems highly unlikely a process this flawed is making much of a positive impact on teachers and schools.

 

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support keeps the education news flowing!


 

Deleted


In the wake of last year’s TNReady troubles, the Tennessee General Assembly passed legislation saying “no adverse action” could be taken against teachers, students, or schools based on the results. While legislators passed the bill late in the session, the Tennessee Department of Education was left to implement policy.

As this school year is up and running, teachers and administrators are asking what to do with data from 2017-18. Helpfully, the TDOE released this handy guidance document. The document lets teachers know they can choose to nullify their entire Level of Effectiveness (LOE) score from 2017-18 if TNReady scores were included in any part of a teacher’s overall TEAM evaluation score.

But nullifying your score could lead to unintended “adverse actions,” couldn’t it? Well, maybe. But, the always thoughtful TDOE is ahead of the game. They also have a guide to nullification.

This guide makes clear that even if a teacher chooses to nullify his or her entire LOE for 2017-18, no adverse action will impact that teacher.

Here are a couple key points:

Educators who choose to nullify their 2017-18 LOE may still be able to earn Professional Development Points (PDPs). Educators who choose to nullify their 2017-18 LOE may use their 2016-17 score to earn applicable PDPs;

So, PDPs are covered if you nullify. Great.

For educators who nullify their 2017-18 LOE, the number of observations required in 2018-19 will be calculated based on 2016-17 data in conjunction with the educator’s current license type.

Looks like classroom observations have also been covered.

If a teacher chooses to nullify his or her 2017-18 LOE, he or she may still become eligible for tenure this year. Pursuant to T.C.A. § 49-5-503(4), “a teacher who has met all other requirements for tenure eligibility but has not acquired an official evaluation score during the last one (1) or two (2) years of the probationary period due to an approved extended leave; transfer to another school or position within the school district; or invalidated data due to a successful local level evaluation grievance pursuant to § 49-1-302(d)(2)(A) may utilize the most recent two (2) years of available evaluation scores achieved during the probationary period.”

Worried about tenure? TDOE has you covered!

So far, so good, right?

Well, then there was an email sent by the Education Value-Added Assessment System (the vendor that calculates TVAAS).

Here’s what teachers saw in their inboxes this week:

Due to the upcoming release of TVAAS reports for the 2017-18 school year, some of the data from the 2016-17 reporting will no longer be available.

*    The current student projections will be removed and replaced with new projections based on the most recent year of assessment data.
*    Current Custom Student reports will be removed.
*    District administrators will lose access to Teacher Value-Added reports and composites for teachers who do not receive a Teacher Value-Added report in their district in 2017-18.
*    School administrators will lose access to Teacher Value-Added reports and composites for teachers in their school who do not receive a Value-Added report in 2017-18.

If you would like to save value-added and student projection data from the 2016-17 reporting, you must print or export that data by September 26. TVAAS users are reminded to follow all local data policies when exporting or printing confidential data.

But wait, the 2016-17 data is crucial for teachers who choose to nullify their 2017-18 LOE. Why is a significant portion of this data being deleted?

Also, note that student projections are being updated based on the 2017-18 scores.

What?

The 2017-18 test was plagued by hackers, dump trucks, and mixed-up tests. Still, the TDOE plans to use that data to update student projections. These projections will then be used to assign value-added scores going forward.

That’s one hell of an adverse impact. Or, it could be. It really depends on how the 2017-18 scores impact the projected performance of given students.

The legislation in plain language indicated teachers and schools would face “no adverse action” based on the 2017-18 TNReady administration. Now, teachers are being told that future student growth projections will be based on data from this test. It’s possible that could have a positive impact on a teacher’s future growth score. It certainly could also have a rather negative impact.

The potentially adverse action of allowing the 2017-18 TNReady scores to impact future growth scores for teachers and schools has not been addressed.

By the way, we now have the following set of apples, oranges, and bananas from which we are determining student growth:

2015 — TCAP

2016 — NO TNReady

2017 — pencil and paper TNReady

2018 — Hacker and Dump Truck TNReady

It’s difficult to see how any reliable growth score can be achieved using these results.

 

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support keeps the education news coming!


 

No Adverse Action


After much wrangling in a day that saw the Tennessee House of Representatives hold up proceedings in order to move forward with an effort to truly hold students, teachers, and schools harmless in light of this year’s TNReady trouble, it appears a compromise of sorts has been reached.

Here’s the language just adopted by the Senate and subsequently passed by the House:

SECTION 1. Tennessee Code Annotated, Title 49, Chapter 6, Part 60, is amended by adding the following language as a new section: Notwithstanding any law to the contrary, no adverse action may be taken against any student, teacher, school, or LEA based, in whole or in part, on student achievement data generated from the 2017-2018 TNReady assessments. For purposes of this section, “adverse action” includes, but is not limited to, the identification of a school as a priority school and the assignment of a school to the achievement school district.

This language does not explicitly address the issue of using TNReady for TVAAS, but it has an effect similar to legislation passed in 2016 during that year’s TNReady trouble. Yes, it seems problems with testing in Tennessee are the norm rather than the exception.

Here’s what this should mean for teachers: Yes, a TVAAS score will be calculated based on this year’s TNReady. But, if that TVAAS score lowers your overall TEAM score, it will be excluded — lowering your TEAM score would be an “adverse action.”

While not perfect, this compromise is a victory — the TNReady data from a messed up test will not harm grades or be used in the state’s A-F report card for schools or be used to give a negative growth score to a teacher via TVAAS.

Yes, TVAAS is still suspect, but there’s an election in November and a new Commissioner of Education coming after that. Heading into the November election is a great time to talk with candidates for the legislature and for Governor about the importance of evaluations that are fair and not based on voodoo math like TVAAS. Remember, even under the best of circumstances, TVAAS would not have yielded valid results this year.

While it is disappointing that Senators did not want to follow the lead of their House counterparts and explicitly deal with the TVAAS issue, there’s no doubt that persistent outreach by constituents moved the needle on this issue.

For more on education politics and policy in Tennessee, follow @TNEdReport

If you enjoy the education news provided here, consider becoming a patron!


 

Would You Eat This Pie?


After last week’s TNReady failure, the Tennessee General Assembly took some action to mitigate the impact the test would have on students and teachers.

I wrote at the time that the legislature’s action was a good step, but not quite enough:

  1. The law does say that districts and schools will not receive an “A-F” score based on the results of this year’s test. It also says schools can’t be placed on the state’s priority list based on the scores. That’s good news.

  2. The law gives districts the option of not counting this year’s scores in student grades. Some districts had already said they wouldn’t count the test due to the likelihood the scores would arrive late. Now, all districts can take this action if they choose.

  3. The law says any score generated for teachers based on this year’s test cannot be used in employment/compensation decisions.

Here’s what the law didn’t say: There will be NO TVAAS scores for teachers this year based on this data.

In other words, this year’s TNReady test WILL factor into a teacher’s evaluation.

The Department of Education took some steps to clarify what that means for teachers and offered a handy pie chart to explain the evaluation process:

First, this chart makes clear that this year’s TNReady scores WILL factor into a teacher’s overall evaluation.

Second, this chart is crazy. A teacher’s growth score is based on tests from three different years and three different types of tests.

15% of the growth score comes from the old TCAP (the test given in 2014-15, because the 2015-16 test had some problems). Then, 10% comes from last year’s TNReady, which was administered with paper and pencil. Last year was the first year of a full administration of TNReady, and there were a few problems with the data calculation. A final 10% comes from this year’s TNReady, given online.
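Those percentages sum to 35, matching the maximum 35% growth weight in state evaluation law (TCA 49-1-302). Reading them as shares of a teacher’s overall evaluation (my reading of the chart, not an official formula), here is a back-of-the-envelope sketch with made-up TVAAS levels showing how three different tests get blended into one growth number:

```python
# Illustrative only: blend three years of growth data using the weights
# described above (15% old TCAP, 10% paper TNReady, 10% online TNReady),
# which together appear to make up the 35% growth portion of an evaluation.
weights = {
    "2014-15 TCAP": 0.15,
    "2016-17 TNReady (paper)": 0.10,
    "2017-18 TNReady (online)": 0.10,
}

# Hypothetical TVAAS levels (1-5) a teacher might have earned each year.
scores = {
    "2014-15 TCAP": 4,
    "2016-17 TNReady (paper)": 3,
    "2017-18 TNReady (online)": 1,
}

growth_points = sum(weights[year] * scores[year] for year in weights)
max_points = sum(weights.values()) * 5

print(f"Growth contribution: {growth_points:.2f} of a possible {max_points:.2f}")
# Three different tests from three different years, blended into one number --
# which is exactly the apples-to-oranges problem described below.
```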

So, you have data from the old test, a skipped year, data from last year’s test (the first time TNReady had truly been administered), and data from this year’s messed up test.

There is no way this creates any kind of valid score related to teacher performance. At all.

In fact, transitioning to a new type of test creates validity issues. The way to address that is to gather three or more years of data and then build on that.

Here’s what I noted from statisticians who study the use of value-added to assess teacher performance:

Researchers studying the validity of value-added measures asked whether value-added gave different results depending on the type of question asked. This is particularly relevant now because Tennessee is shifting to a new test with different types of questions.

Here’s what Lockwood and McCaffrey (2007) had to say in the Journal of Educational Measurement:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers. These results suggest that conclusions about individual teachers’ performance based on value-added models can be sensitive to the ways in which student achievement is measured.

These findings align with similar findings by Martineau (2006) and Schmidt et al. (2005): you get different results depending on the type of question you’re measuring.

The researchers tested various VAM models (including the type used in TVAAS) and found that teacher effect estimates changed significantly based on both what was being measured AND how it was measured.

And they concluded:

Our results provide a clear example that caution is needed when interpreting estimated teacher effects because there is the potential for teacher performance to depend on the skills that are measured by the achievement tests.

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format from TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to establish some correlation between past TCAP results and TNReady scores.
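Here is a toy simulation of that sensitivity (invented effect sizes and class sizes, not the authors’ models or data): the same teachers are “measured” with two tests that tap only weakly related skills, and the disagreement within a teacher across the two measures ends up rivaling the differences across teachers.

```python
import numpy as np

rng = np.random.default_rng(3)

n_teachers, n_students = 500, 25
effect_sd, noise_sd = 0.15, 1.0   # hypothetical, in student-level SD units

# Each teacher has two only weakly related "true" effects, one per skill
# domain (say, procedural items vs. problem-solving items).
true_a = rng.normal(0, effect_sd, n_teachers)
true_b = 0.3 * true_a + rng.normal(0, effect_sd, n_teachers)  # weak overlap

def estimate(true_effect):
    # Estimated effect = classroom mean of (teacher effect + student noise).
    scores = true_effect[:, None] + rng.normal(0, noise_sd, (n_teachers, n_students))
    return scores.mean(axis=1)

est_a, est_b = estimate(true_a), estimate(true_b)

across = np.std((est_a + est_b) / 2)          # spread across teachers
within = np.std(est_a - est_b) / np.sqrt(2)   # per-measure disagreement within a teacher

print(f"Spread across teachers:                  {across:.3f}")
print(f"Spread within teachers, across measures: {within:.3f}")
print(f"Correlation of the two estimates:        {np.corrcoef(est_a, est_b)[0, 1]:.2f}")
```

Under these made-up settings, the within-teacher disagreement is at least as large as the across-teacher spread, and the two sets of estimates are only weakly correlated, mirroring the pattern Lockwood and McCaffrey describe.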

I’ve written before about the shift to TNReady and any comparisons to prior tests being like comparing apples and oranges.

Here’s what the TN Department of Education’s pie chart does: It compares an apple to nothing to an orange to a banana.

Year 1: Apple (which counts 15%)

Year 2: Nothing, test was so messed up it was cancelled

Year 3: Orange – first year of TNReady (on pencil and paper)

Year 4: Banana – Online TNReady is a mess; students experienced login and submission problems across the state.

From these four events, the state is suggesting that somehow, a valid score representing a teacher’s impact on student growth can be obtained. The representative from the Department of Education at today’s House Education Instruction and Programs Committee hearing said the issue was not that important, because this year’s test only counted for 10% of the overall growth score for a teacher. Some teachers disagree.

Also, look at that chart again. Too far up? Too confusing? Don’t worry, I’ve made a simpler version:

For more on education politics and policy in Tennessee, follow @TNEdReport


 

 

Your support keeps Tennessee Education Report going strong — thank you!

TNReady and TVAAS: A Teacher’s Perspective


Nashville teacher Amanda Kail talks about the connection between TNReady and TVAAS and the importance of legislation moving TODAY that could actually hold teachers harmless.

QUESTION: I thought the legislature said the tests wouldn’t count. What’s going on?

ANSWER: The state legislature was moved by all the horror stories surrounding testing problems to tack a bunch of amendments on to the only remaining education bill of the session (HB1109/SB0987) which attempted to “hold harmless” students, teachers, and schools for the results of the test. What this technically means is that local boards of education can vote on how much they want the students’ scores to count towards their grades (0-15%), and that the data cannot be used to issue a letter grade to schools (A-F, another asinine idea designed to find new ways to punish schools that serve mostly poor kids, but I digress).

However, for teachers the bill specified only that the results of the testing could not be used for decisions regarding employment and compensation. It does not say anything about the scores not being used for EVALUATIONS. Because of this, many teachers across the state pushed TEA to go back to the legislature and demand that the legislation be amended to exclude this year’s scores from TVAAS. You can read more about the particulars of that in Andy Spears’ excellent article for the Tennessee Education Report.

As a result, the House Finance Committee voted to strip all the amendments from HB1109 and start over again with the “hold harmless” language. That needs to happen TOMORROW (4/24/18 — TODAY).

QUESTION: What is TVAAS?

ANSWER: Teachers in Tennessee have evaluations based partly on value-added measures (we call it “TVAAS” here). What this means is that the Tennessee Department of Education uses some sort of mystical secret algorithm (based on cattle propagation – REALLY!) to calculate how much growth each student will generate on statewide tests. If a student scores less growth (because, like, maybe their test crashed 10 times and they weren’t really concentrating so much anymore) than predicted, that student’s teacher receives a negative number that is factored into their yearly effectiveness score. Generally, TVAAS has been decried by everyone from our state teacher union to the American Statistical Association (and when you upset the statisticians, you have really gone too far), but the state continues to defend its use.

QUESTION: What if I am a teacher who didn’t experience any problems, and I think my students did great on the test? Why would I want to oppose using this year’s data for TVAAS?

ANSWER: Thousands of your colleagues around the state don’t have that luxury, because they DID have problems, and their students’ scores suffered as a result. In fact, even in a good year, thousands of your colleagues have effectiveness scores based on subjects they don’t even teach, because TVAAS is only based on tested subjects (math, ELA, and depending on the year, science and social studies). The fact is that TVAAS is a rotten system. If it benefits you individually as a teacher, that’s great for you. But too many of your colleagues are driven out of the classroom by the absurdity of being held accountable for things completely beyond their control. As a fellow professional, I hope you see the wisdom in advocating for a sane system over one that just benefits you personally.

QUESTION: Okay. So what do we do now?

ANSWER: Contact your state house and senate representatives! TODAY! These are the last days of the legislative session, so it is IMPERATIVE that you contact them now and tell them to support amendments to HB1109 and SB0987 that will stop the use of this year’s testing data towards TVAAS. You can find your legislators here.

Don’t leave teachers holding the bag for the state’s mistakes. AGAIN.
For more on education politics and policy in Tennessee, follow @TNEdReport


 

Not So Harmless


After a fourth day of TNReady trouble, the Tennessee General Assembly took action today to make changes to how the test impacts schools, students, and teachers.

While some are billing the report of a joint committee of the House and Senate as a “hold harmless” for schools, students, and teachers, that’s not entirely accurate.

Also, the legislature stopped short of putting a stop to TNReady entirely, claiming federal law “requires” them to test students.

Here’s the deal: Federal law does say that districts should administer tests to at least 95% of students and that states should test all students in reading and math from grades 3-8 and at least once in high school, with a suggestion for additional high school testing as appropriate.

BUT: Is there really a penalty for districts (or states) where the testing threshold falls below 95%?

As I reported in 2016, the last time we had a major failure of online testing in Tennessee:

There’s just one problem: The federal government has not (yet) penalized a single district for failing to hit the 95% benchmark. In fact, in the face of significant opt-outs in New York last year (including one district where 89% of students opted-out), the U.S. Department of Education communicated a clear message to New York state education leaders:  Districts and states will not suffer a loss of federal dollars due to high test refusal rates. The USDOE left it up to New York to decide whether or not to penalize districts financially.

In other words, the likelihood of a single Tennessee district losing funds due to stopping a test that isn’t working is very close to zero. Tennessee is not having problems due to opt-outs or a low number of students being tested. Kids in districts across the state are showing up for a test that is not happening. Districts are doing everything right and a vendor and the Tennessee Department of Education are failing to serve students. Unless TNDOE is going to fine districts, there is truly no risk of funds being lost.

Now, about the “hold harmless” law (pictured below):

  1. The law does say that districts and schools will not receive an “A-F” score based on the results of this year’s test. It also says schools can’t be placed on the state’s priority list based on the scores. That’s good news.
  2. The law gives districts the option of not counting this year’s scores in student grades. Some districts had already said they wouldn’t count the test due to the likelihood the scores would arrive late. Now, all districts can take this action if they choose.
  3. The law says any score generated for teachers based on this year’s test cannot be used in employment/compensation decisions.

Here’s what the law didn’t say: There will be NO TVAAS scores for teachers this year based on this data.

Commissioner McQueen said yesterday that the data from these tests will be used to generate a TVAAS score and it will count for 20% of a teacher’s evaluation. This law does NOT change that. It just says if you get a low score based on this number, you can’t be fired or denied compensation.

Below is an excerpt from current law (taken from TCA 49-1-302, the section governing teacher evaluation):

(E)  For teachers with access to individual data representative of student growth as specified in subdivision (d)(2)(B)(ii), the following provisions shall apply:

  • (i)  In the 2016-2017 school year, the evaluation criteria identified in subdivision (d)(2)(B)(ii) shall be adjusted so that student growth data generated by assessments administered in the 2016-2017 school year shall account for ten percent (10%) of the overall evaluation criteria identified in subdivision (d)(2)(B);
  • (ii)  In the 2017-2018 school year, the evaluation criteria identified in subdivision (d)(2)(B)(ii) shall be adjusted so that student growth data generated by assessments administered in the 2016-2017 and 2017-2018 school years shall account for twenty percent (20%) of the overall evaluation criteria identified in subdivision (d)(2)(B);
  • (iii)  In the 2018-2019 school year and thereafter, the student growth component of the evaluation criteria shall be determined under subdivision (d)(2)(B)(ii);
  • (iv)  The most recent year’s student growth evaluation composite shall account for the full thirty-five percent (35%) of growth data required in a teacher’s evaluation if such use results in a higher evaluation score;
  • (v)  For the 2015-2016 through 2017-2018 school years, student growth evaluation composites generated by assessments administered in the 2015-2016 school year shall be excluded from the student growth measure as specified in subdivision (d)(2)(B)(ii) if such exclusion results in a higher evaluation score for the teacher or principal. The qualitative portion of the evaluation shall be increased to account for any necessary reduction to the student growth measure.

Here’s what this means: If the current tests give you a “good” evaluation score, it will count for 35% of your total evaluation. If the score is not “good,” it only counts for 20% this year. The legislation adopted today by way of the Conference Committee does NOT change that.
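As a rough sketch of my reading of that rule (illustrative numbers only; the real TEAM composite has more components and scaling than this), the growth data ends up counting at whichever weight yields the higher overall score:

```python
def blended_score(growth_score: float, other_components_avg: float) -> float:
    """Sketch of the weighting described above: the most recent growth
    composite counts for 35% if that produces a higher score; otherwise
    the transitional 20% weight applies. Hypothetical, simplified math."""
    at_35 = 0.35 * growth_score + 0.65 * other_components_avg
    at_20 = 0.20 * growth_score + 0.80 * other_components_avg
    return max(at_35, at_20)

# A weak growth score (2.0) still drags down strong observation scores (4.5),
# just at the smaller 20% weight; a strong growth score counts the full 35%.
print(blended_score(growth_score=2.0, other_components_avg=4.5))  # 4.0
print(blended_score(growth_score=5.0, other_components_avg=4.5))  # 4.675
```

Either way, the 2017-18 growth data is baked into the final number.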

In other words, the test data from the 2017-18 administration of TNReady WILL count in a teacher’s evaluation.

Here’s why that matters: An educator’s evaluation score factors into the number of observations they have each year as well as Professional Development Points (PDPs). PDPs are needed for license advancement or renewal.

The Department of Education addresses PDPs and notes:

Overall level of effectiveness rating (approved TN model):

Overall Score of 5 = 20 PDPs
Overall Score of 4 = 15 PDPs
Overall Score of 3 = 10 PDPs

Information is maintained by the department. No additional documentation is required; points may be accrued annually.

Even if this year’s scores only end up counting 20%, that’s enough to change a teacher’s overall TEAM rating by a level. A TEAM score below a three means no PDPs, for example. The overall TEAM score also impacts the number of observations a teacher has in a year — which also places an additional burden on administrators.

Also, districts now have to meet to decide how to handle the tests and student grades. For some, that decision has already been made. For others, this will require a meeting in pretty short order to let students, parents, and teachers know what’s happening.

Here’s the language of the conference committee report:

 

For more on education politics and policy in Tennessee, follow @TNEdReport

Your support is appreciated and helps keep content like this coming!


 

Outlier


Statisticians define an outlier as an observation point that is distant from other observations in a statistical analysis. Often, this occurs by chance. Additional modeling or deeper analysis (including more data, for example, or a longer range of data) can often correct for this. Outliers that are not the result of measurement error are often excluded from analysis of a data set.
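For a simple illustration, here is a sketch using a standard robust outlier rule (modified z-scores based on the median absolute deviation); the series of year-over-year score changes is invented, not actual NAEP data:

```python
import numpy as np

# Hypothetical year-over-year score changes for one state: one testing
# cycle shows a big jump, while the rest hover near zero.
changes = np.array([-1.0, 0.5, 7.0, 0.0, -0.5, 1.0])

median = np.median(changes)
mad = np.median(np.abs(changes - median))        # robust estimate of spread
robust_z = (changes - median) / (1.4826 * mad)   # MAD-based z-scores

for change, z in zip(changes, robust_z):
    flag = "  <-- outlier" if abs(z) > 3.5 else ""
    print(f"change {change:+.1f}  robust z {z:+.1f}{flag}")
# The single large jump stands out from the rest of the series; on its own,
# it says little about a durable trend.
```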

Today, the 2017 results from the National Assessment of Educational Progress (NAEP) were released. This release made me think of a particular outlier.

Back in 2013, Tennessee demonstrated what some heralded as an incredible achievement on the NAEP. In fact, a press release from Governor Haslam at the time noted:

Gov. Bill Haslam today announced that Tennessee had the largest academic growth on the 2013 National Assessment of Educational Progress (NAEP) of any state, making Tennessee the fastest improving state in the nation. (emphasis added)

Those words — “fastest improving state in the nation” — have been uttered by Haslam and many political leaders in our state for years now. Often, this 2013 “success” is used as justification for “keeping our foot on the gas” and continuing an aggressive agenda of test-based accountability and teacher evaluation based on methods lacking validity.

Here’s what I wrote back in 2013 when these results were released:

Yes, Tennessee should celebrate its growth.  But policymakers should use caution when seeing the results from the last 2 years as a validation of any particular policy.  Long-term trends indicate that big gains are usually followed by steady maintenance. And, even with the improvement, Tennessee has a long way to go to be competitive with our peers. Additionally, education leaders should be concerned about the troubling widening of the rich/poor achievement gap  – an outcome at odds with stated policy goals and the fundamental principle of equal opportunity.

Two years later, when the 2015 results were released, I noted:

This year’s scores, in which Tennessee remained steady relative to the 2013 scores suggest, if anything, that the 2013 jump was likely an outlier. Had the 2013 gains been followed by gains in 2015 and again in 2017, more could be suggested. And frankly, it is my hope that we see gains (especially in reading) in 2017. But, it’s problematic to suggest that any specific reform or set of reforms caused the one-time jump we saw in 2013. Saying we are the fastest improving state in the nation over the last 4 years when we only saw a jump in 2013 is like saying we started the first quarter of a football game way behind, scored a bunch in the second quarter, (so we’re not as far behind), and then scored the same number of points in the third quarter. The result is we’re still behind and still have a long way to go.

Fast forward to today. The leveling off I suggested was likely back in 2013 has happened. In fact, take a look at this chart put out by the Tennessee Department of Education:

First, notice that between 2009 and 2011, Tennessee saw drops in 4th and 8th grade reading and 8th grade math. That helps explain the “big gains” seen in 2013. Next, note that in 4th and 8th grade reading and 4th grade math, our 2017 scores are lower than the 2013 scores. There’s that leveling off I suggested was likely. Finally, note that in 4th and 8th grade reading, the 2017 scores are very close to the 2009 scores. So much for “fastest-improving.”

Tennessee is four points below the national average in both 4th and 8th grade math. When it comes to reading, we are 3 points behind the national average in 4th grade and 5 points behind in 8th grade.

All of this to say: You can’t say you’re the fastest-improving state on NAEP based on one testing cycle. You also shouldn’t make long-term policy decisions based on seemingly fabulous results in one testing cycle. Since 2013, Tennessee has doubled down on reforms with what now appears to be little positive result. Instead, as Rep. Jeremy Faison said recently, our policies are “driving teachers crazy.”

Oh, and that new TNReady test has so far not been very ready.

But what about the good policy coming from this? You know, like Governor Haslam’s plan to make Tennessee the “fastest-improving state in teacher pay?”

About that:

Average teacher salaries in the United States have improved by about 4% from the time of the Haslam promise until this year. Average teacher salaries in Tennessee improved by just under 2% over the same period. So, since Bill Haslam promised teachers we’d be the fastest-improving state in teacher pay, we’ve actually been improving at a rate that’s half the national average. No, we’re not the slowest-improving state in teacher pay, but we’re not even improving at the average rate.

Surely, though, all this focus on education since the NAEP buzz has meant meaningful investment in schools, right? Well, no:

Tennessee earns a grade of F when it comes to funding effort compared to funding ability. The researchers looked at Gross State Product and Personal Income data in order to determine a state’s funding ability then looked at dollars spent per $1000 (in either GSP or Personal Income) to determine effort. Tennessee spends $29 on schools for every $1000 generated in Gross State Product. When it comes to Personal Income, Tennessee spends just $33 per $1000 of average personal income. That’s a rank of 42 in both.

Then, the report looks at wage competitiveness — how much teachers earn relative to similarly-educated professionals. I’ve written about this before, and Tennessee typically doesn’t do well in this regard.

Maybe we’ve taken a minute to get serious about investing in programs targeting struggling students? Also, no:

One possible solution would be to embed funding for school-level RTI2 specialists in the state’s funding formula for schools, the BEP. In fact, Rep. Joe Pitts offered legislation that would do just that last year. His plan would have added funding for three RTI2 specialists at each school for a total projected cost of $167 million. Commissioner McQueen was quick to shoot that idea down and came back this year with the funding proposal of $13 million, or one specialist per district. That’s only $154 million short of adopting a plan that would actually meet the needs of a program many suggest is an important way to improve educational outcomes for Tennessee students.

Maybe we are closing achievement gaps? Again, no.

Back in 2013, Tennessee students eligible for free/reduced lunch had an average NAEP reading score of 256 and scored 20 points below the non-eligible students. Now, that average score is 252 (four points worse) and 19 points below. For 4th grade, there’s a similar story, with free/reduced lunch eligible students scoring 25 points below their non-eligible peers this year. Four years ago, it was 26 points.

We’re not moving the needle. Our most vulnerable students continue to be left behind. Meanwhile, we hear nice words from top policymakers and see little actual result in terms of tangible improved investment in schools or any meaningful upgrade in teacher pay. Our testing system has yet to be proven.

Maybe now Tennessee policymakers will stop repeating the “fastest-improving” line and start doing the actual work of investing in and supporting our schools.

In any case, the next time you hear someone spout off that tired “fastest-improving” line, just yell back: OUTLIER!

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Personally


It’s that time of year again: the time when the Tennessee Department of Education asks teachers for feedback so they can compile it, put it into pretty graphs, and ignore absolutely all of the responses.

Well, one teacher from Sumner County received an email about having not yet responded. Here it is:

My name is Isaiah Bailey, and I am part of a team working to amplify educators’ voices through various means, including the annual Tennessee Educator Survey. I consider it an honor to be so deeply engaged with advancing the interests of Tennessee educators, and look forward to continuing this work.

I have included your personalized Tennessee Educator Survey link here. I understand that you may have been asked to complete various other surveys around this time of year, and I apologize for any confusion this may have caused. Please note that this is the same survey for which you received an invitation from the Tennessee Education Research Alliance.

More than 31,000 educators around the state have already shared their thoughts on various issues including school climate, testing, professional learning, and more via this year’s survey. But given that the current teacher participation rate for Sumner County is 38 percent, it feels especially important that this survey incorporate more of the perspectives that only you and your colleagues in Sumner County can speak to.

At the same time, the current teacher participation rate for your school is 34%. If your school reaches at least 67 percent by the end of the day tomorrow, your staff will become eligible for a drawing that will award several grants of $500 to be used toward staff appreciation.

Please let me know if you have any questions.

Since Mr. Bailey asked, this teacher responded:

Isaiah,
Thank you for personally reaching out. I have some thoughts I’d like to share.
First, as a veteran educator, I’m familiar with the adage that “students don’t care how much you know until they know how much you care.”
You are correct, I have not responded to the state’s survey. Your email indicates that a majority of my colleagues at my school and in my district have not responded, either.
Here’s why: The Tennessee Department of Education has demonstrated time and again that you don’t care.
Teachers speak out on testing, portfolios, RTI, adequate resources, and pay – and year after year we are ignored.
Teachers inquire about the validity of measures such as TVAAS and we are ignored.
Teachers clamor for schools staffed with guidance counselors and nurses to care for the children we teach, and we are ignored.
I’ve filled out this survey in the past, and nothing has changed.
Tennessee keeps building the plane while it is flying — this is unacceptable.
You asked for my personal perspective. Now, you have it.
I’d suggest you share it with your bosses, but I know that even if you did, nothing would change.
Now, I’ll go back to showing my students I care — about them, their interests, their futures.

It seems Mr. Bailey and the TDOE broke a key rule — don’t ask a question if you don’t actually want the answer.
For more on education politics and policy in Tennessee, follow @TNEdReport