Testing Time

While Tennessee teachers are raising concerns about the amount of time spent on testing and test preparation, the Department of Education is lauding the new TNReady tests as an improvement for Tennessee students.

According to an AP story:

However, the survey of nearly 37,000 teachers showed 60 percent say they spend too much time helping students prepare for statewide exams, and seven out of ten believe their students spend too much time taking exams.

“What teachers recognize is the unfortunate fact that standardized testing is the only thing valued by the state,” said Jim Wrye, assistant executive director of the Tennessee Education Association, the state’s largest teachers’ union.

“Teachers and parents know there are so many things that affect future student success that are not measured by these tests, like social and emotional skills, cooperative behaviors, and academic abilities that do not lend themselves to be measured this way.”

Despite teacher concerns, the Department of Education says the new tests will be better indicators of student performance, noting that it will be harder for students to “game” the tests. That’s because the tests will include some open-ended questions.

What they don’t mention is that the company administering the tests, Measurement, Inc., is seeking test graders on Craigslist. And, according to a recent New York Times story, graders of tests like TNReady have “…the possibility of small bonuses if they hit daily quality and volume targets.” The more you grade, the more you earn, in other words.

Chalkbeat summarizes the move to TNReady like this:

The state was supposed to move in 2015 to the PARCC, a Common Core-aligned assessment shared by several states, but the legislature voted in 2014 to stick to its multiple-choice TCAP test while state education leaders searched for a test similar to the PARCC but designed exclusively for Tennessee students.

Except the test is not exactly exclusive to Tennessee.  That’s because Measurement, Inc. has a contract with AIR to use test questions already in use in Utah for tests in Florida, Arizona, and Tennessee.

And, for those concerned that students already spend too much time taking standardized tests, the DOE offers this reassurance about TNReady:

The estimated time for TNReady includes 25-50 percent more time per question than on the prior TCAP for English and math. This ensures that all students have plenty of time to answer each test question, while also keeping each TNReady test short enough to fit into a school’s regular daily schedule.

According to the schedule, the first phase of testing will start in February/March and the second phase in April/May. That means the tests are not only longer, but they also start earlier and consume more instructional time.

For teachers, that means it is critical to get as much curriculum covered as possible by February. This is because teachers are evaluated in part based on TVAAS — Tennessee Value-Added Assessment System — a particularly problematic statistical formula that purports to measure teacher impact on student learning.

So, if you want Tennessee students to spend more time preparing for and taking tests that will be graded by people recruited on Craigslist and paid bonuses based on how quickly they grade, TNReady is for you. And, you’re in luck, because testing time will start earlier than ever this year.

Interestingly, the opt-out movement hasn’t gotten much traction in Tennessee yet. TNReady may be just the catalyst it needs.

For more on education politics and policy in Tennessee, follow @TNEdReport

Quickly Inflated

Jon Alfuth has a piece over at Bluff City Ed that answers the question: Did this year’s method of calculating quick scores on TCAP result in grade inflation? The short answer is yes.

The post is complete with math and graphs that explain the two different methods for calculating quick scores and the possible grade inflation that resulted this year when the TN Department of Education switched to the cube root method.

Here’s an excerpt that explains the point difference that would be expected based on the different methods for calculation:

The cube root method yielded on average a quick score, the score that goes for a grade, of 4.46 points higher. In other words, a student scoring basic with a raw score of 30 or higher would, on average, receive an extra 4.46% on their final quick score grade, which goes on their report card. A student who scored a 70 last year could expect to receive a 74 under the new quick score calculation.

The additional points do drop as one goes up the raw score scale, however. For the average basic student grades 3-8 with a raw score between 30 and 47, they would receive an extra 5.41 extra points under the new method.

The average proficient student grades 3-8 with a raw score between 48 and 60 would get 4.32 extra points under the new method.

The average advanced student grades 3-8 with a raw score of between 61 and 67 would receive an extra 1.97 extra points under the new method.

The difference varies much more widely for below basic students, but the difference can be as much as 25 points in some cases.

In short, final grades in subjects where TCAP scores are required to count toward the grade were higher this year than they have been in the past. In some cases, these “extra points” would have moved a student up a full letter grade.
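To see why a cube-root conversion behaves this way, here is a minimal Python sketch. Both conversion functions are hypothetical stand-ins; the actual TDOE conversions are built around the state’s cut scores and are detailed in Alfuth’s post, so the absolute boosts below are much larger than the handful of points in the excerpt. The point is only the shape of the effect: because the cube root of a fraction between 0 and 1 is always larger than the fraction itself, every score below the maximum gets a lift, and the lift shrinks as raw scores approach the top of the scale, matching the basic/proficient/advanced pattern described above.

```python
# Illustrative sketch only: these are NOT the actual TDOE quick score formulas.
# A hypothetical straight-percentage conversion is compared with a hypothetical
# cube-root conversion to show why the latter lifts mid-range scores the most.

def linear_quick_score(raw: int, max_raw: int) -> float:
    """Hypothetical baseline: quick score as a straight percentage of items correct."""
    return 100 * raw / max_raw

def cube_root_quick_score(raw: int, max_raw: int) -> float:
    """Hypothetical cube-root conversion: cube root of the fraction correct, scaled to 100.
    For 0 < x < 1, x ** (1/3) > x, so every sub-maximum score gets a boost."""
    return 100 * (raw / max_raw) ** (1 / 3)

if __name__ == "__main__":
    MAX_RAW = 67  # assumed test length, roughly matching the raw-score ranges quoted above
    for raw in (30, 48, 61, 67):  # roughly basic, proficient, advanced, perfect
        lin = linear_quick_score(raw, MAX_RAW)
        cube = cube_root_quick_score(raw, MAX_RAW)
        print(f"raw {raw:2d}: linear {lin:5.1f}  cube root {cube:5.1f}  boost {cube - lin:+5.1f}")
```

Running the sketch shows the boost falling steadily from the low end of the scale to zero at a perfect score: the same direction of effect, if not the same magnitude, as the quick score differences Alfuth documents.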

Commissioner McQueen has indicated that this method will be used going forward as the state transitions to the TNReady test, starting next year. Of course, that test is entirely different from TCAP, so comparisons between the two are of limited value — at least until there are multiple years of TNReady data to use for comparative analysis.

More on Quick Scores:

A Call for Testing Transparency

That Was Quick

Quick and Confusing

 

For more on education politics and policy in Tennessee, follow @TNEdReport

 

A Call for Testing Transparency

Advocacy groups from across the state have issued a call for testing transparency, even starting an online petition calling for the ability to review questions and answers after standardized tests are administered.

Here’s the latest email from TREE:

Tennessee’s public education system finds itself mired in TCAP controversy for the second year in a row. The Tennessee Department of Education’s (TDOE) release of seemingly inflated quick scores, without clarification on how they were calculated, left educators and parents befuddled and upset. After considerable questioning of the TDOE’s actions they released a statement attempting to clarify the situation, claiming a lack of communication on their part as the culprit, but didn’t actually address the gross deficits of a testing system that is completely lacking in transparency and accountability. The TDOE continues to move the goal posts of a high stakes testing system that remains off limits for public scrutiny. Tennesseans are tired of blindly accepting TCAP results from the TDOE. So, TREE has joined with more than a dozen grassroots organizations that support strong public schools across Tennessee to demand accountability from the TDOE in the wake of confusion created by the latest release of “quick scores” and associated raw “cut scores” from recent TCAP tests. [view press release]

We also want to draw attention to another concerning problem with standardized testing: Our children are losing immeasurable amounts of instruction time due to test preparation and administration. Please review the testing calendar graphic attached to this post, based on the 2014-15 school year.

[Graphic: 2014-15 testing calendar. The TN Department of Education’s state testing calendar and information from teachers were used as reference to create this calendar. 2015-16 TDOE testing calendar>>]

As you can see, our children are spending the large majority of their school year taking or preparing for tests. It is unfair to our children, teachers, and our society that data collection and high stakes testing has trumped instruction time. Public education was created to provide our society with a well-educated electorate and work force. It is the single most important factor in making our country the world leader it is today. But our nation’s leaders are fixated on excessive data collection with a focus solely on subjects covered on high stakes tests. This has led to the devaluation of a well-rounded education and in some instances the removal of arts, language and music education in our schools. Our reputation for being the most creative and innovative country in the world is in jeopardy as our nation now values honing test scores over fostering critical thinking and creativity. There are ways of evaluating the academic growth of a student that do not limit instruction and enable our teachers to hone their education delivery in turn fostering student achievement. Some examples include portfolio reviews, research projects, peer review committees, and standards-based evaluations, etc.


Sign the petition to demand transparency. E-mail Commissioner McQueen and Governor Haslam and tell them you want our tax dollars to go to teaching, not testing.

Commissioner McQueen – Commissioner.McQueen@tn.gov
Governor Haslam – bill.haslam@tn.gov

Then contact your legislators and send them a copy of this testing calendar and post. Tell them why you are concerned about the excessive testing and demand transparency for the standardized tests that our state’s legislature and department of education require our students to take. Let them know you are holding them accountable and urge them to explore alternatives to boxing in our students and schools with high stakes testing. With their and your help, we can take back our schools and turn them into breeding grounds for a level of creativity, critical thinking, and problem solving that has never before been seen in human history.


Thank you to our growing number of grassroots organizations coming together to support strong public schools across Tennessee and demand accountability from the TDOE. Groups participating in this network include:

Strong Schools (Sumner County)

Williamson Strong (Williamson County)

SPEAK (Students, Parents, Educators Across Knox County)

SOCM (Statewide Organizing for Community eMpowerment)

Momma Bears Blog

Gideon’s Army, Grassroots Army for Children (Nashville)

Advocates for Change in Education (Hamilton County)

Concerned Parents of Franklin County (Franklin County)

The Dyslexia Spot

Parents of Wilson County, TN, Schools

Friends of Oak Ridge Schools (City of Oak Ridge Schools)

TNBATs (State branch of National BATs)

East Nashville United

Tennessee Against Common Core (Statewide)

**For full disclosure, I’m a co-founder and the volunteer Executive Director of Strong Schools, a co-signer of the call for testing transparency.

More on TNReady, next year’s standardized test replacing TCAP

An Alternative to Standardized Testing

For more on education politics and policy in Tennessee, follow @TNEdReport

That Was Quick

The Tennessee Department of Education is out with an apology for miscommunication that caused confusion regarding this year’s standardized testing “quick scores.”

Grace Tatter over at Chalkbeat has the story, along with this quote from a letter sent to directors of schools by Assistant Commissioner Nakia Towns:

“Our goal is to communicate early and often regarding the calculation and release of student assessment data. Unfortunately, it appears the office of assessment logistics did not communicate decisions made in fall 2014 regarding the release and format of quick scores for the 2014-15 school year in a timely manner. . . . We regret this oversight, and we will continue to improve our processes such that we uphold our commitment to transparency, accuracy, and timeliness with regard to data returns, even as we experience changes in personnel.”

As Tatter notes, this is the second year in a row that release of quick scores has been a problem for the Department of Education.

Read her full story and see the complete text of the letter sent to Directors.

It remains to be seen whether the “commitment to transparency” referenced in the letter from Towns will mean that parents and teachers can see the test questions and answers after next year’s TNReady test is administered.

For more on education politics and policy in Tennessee, follow @TNEdReport

Quick and Confusing

Over at Bluff City Ed, Jon Alfuth digs into the questions surrounding this year’s release of TCAP quick scores and their correlation to student performance on the TCAP.

This year, the way quick scores are calculated from raw scores was changed so that grades 3-8 (TCAP) quick scores are computed the same way as the EOC quick scores students see in high school.

One key question is why make this change in the last year of TCAP? Next year, Tennessee students will see TNReady — so, making the calculation change now doesn’t seem to serve much purpose.

Alfuth does a nice job of explaining what’s going on and why it matters. Here are some key highlights:

Lack of Communication

They (TN DOE) didn’t make it clear to teachers, parents or students that they were changing the policy, resulting in a lot of confusion and frustration over the past few days as everyone grapples with these new quick scores.

An Explanation?

From the second memo, they note that they changed to raw scores because of concerns about getting final quick scores out on time during the transition to a new test, stating that if they did it based on proficiency, it would take until the middle of the summer to make them happen.

I’d buy that…except that the Department of Education has always been able to get the quick scores out on time before. And last I checked, we weren’t transitioning to TNReady this year – the transition occurs next year. So why mess with the cut scores this year? Is this just a trial run, an experiment? It feels like we’re either not getting the whole story, or that if we are there is some serious faulty logic behind this decision that someone is just trying to explain away.

It’s worth noting that last year, the quick scores weren’t available on time and most districts received a waiver from including TCAP scores in student grades. I note this to say that concern about getting quick scores out on time has some merit given recent history.

To me, though, this raises the question: Why are TCAP scores factored into a student’s grades? Ostensibly, this is so 1) students take the tests seriously and 2) how a teacher assesses a student matches up with the desired proficiency levels on the appropriate standards.

Of course, quick scores are only available for tested subjects, leaving one to wonder if other subjects are less important or valuable to a student’s overall academic well-being. Or, if there’s another way to assess student learning beyond a bubble-in test or even a test with some constructed response, such as TNReady.

I’d suggest a project-based learning approach as a means of assessing what students have actually learned across disciplines. Shifting to project-based learning with some grade-span testing would allow for the accountability necessary to ensure children are meeting state standards while also giving students (and their teachers) a real opportunity to demonstrate the learning that has occurred over an academic year.

Trust

The Department has also opened itself to some additional criticism that it is “massaging” the scores – that is, trying to make parents happy by bringing grades up in the last year under the old testing regime. We can’t say for certain that this is the motivating factor behind this step, but in taking this step without more transparency the Department of Education has opened itself up to this charge. And there will definitely be some people who accuse the state of doing this very thing, especially given the reasons that they cited in their memo. I personally don’t ascribe any sinister motives to the state, but you have to admit that it looks a little fishy.

In fact, TC Weber is raising some important questions about the process. He notes:

If people don’t believe in the fidelity of the system, it becomes too easy to attribute outside factors to the results. In other words, they start to feel that data is being manipulated to augment an agenda that they are not privy to and not included in. I’m not saying results are being manipulated or not being manipulated when it comes to our student evaluation system, but I am saying that there seems to be a growing belief that they are, and without some kind of change, that perception will only grow. I’ve always maintained that perception is nine-tenths of reality.

As both Alfuth and Weber note, the central problem is lack of communication and transparency. As we shift to a new testing regime with uncertain results, establishing confidence in the system and those administering it is critical. After last year’s late score debacle and this year’s quick score confusion, establishing that trust will be difficult. Open communication and a transparent process can go a long way to improving perception and building support.

For more on education politics and policy in Tennessee, follow @TNEdReport

Candice Clarifies

Commissioner of Education Candice McQueen issued an email to teachers today clarifying an email she sent Monday regarding Tennessee standards and the upcoming TNReady tests.

It seems there was some confusion about what standards to teach in the 2015-16 academic year and what Tennessee standards may look like going forward.

Below is today’s email followed by the one sent Monday:

Teachers,

I’m writing to clarify information I shared on Monday about the standards review and development process. We have received several questions about which standards teachers should use during the 2015-16 school year. We want to make sure that your questions are answered quickly, so you can move into summer with clear expectations for the upcoming school year.

Tennessee teachers should continue to use the state’s current academic standards in English language arts and math, not the previous SPI’s. The current state standards are available on our website.

TNReady, the state’s new and improved TCAP test in English language arts and math, will assess the state’s current academic standards in English language arts and math, not SPI’s.

As we shared on Monday, the standards review and development process that Gov. Haslam and the State Board of Education established last fall will continue. Teams of educators will work to review public input and will then recommend new sets of math and English language arts standards to the State Board of Education to be fully implemented during the 2017-18 school year. TNReady will evolve as our math and English language arts standards do, ensuring that our state assessment will continue to match what is being taught in Tennessee classrooms.

Please feel free to reach out with additional questions or clarifications. We look forward to sharing more information about TNReady and the standards review and development process in the coming weeks.

Best,
Candice

_________________________________________________________________
From: Commissioner.McQueen@tn.gov
Date: Monday, May 11, 2015 3:20 PM
To: Tennessee teachers
Subject: Update on Standards Review Process

Teachers,

The Tennessee General Assembly recently voted to support our administration’s efforts to ensure that Tennessee students graduate from high school ready for post-secondary education or the workforce.

The vote complements the academic standards review and development process established by Gov. Haslam and the State Board of Education last October, and it will maintain the participation of Tennessee educators and parents in the process.

At the conclusion of the review process, Tennessee’s new academic standards, which will include public input and are established by Tennessee educators, will replace the existing set of standards in English language arts and math. These standards will be fully implemented during the 2017-18 school year.

In addition to the teams of educators established by the State Board of Education that will review the existing standards, the adopted legislation also provides for a 10-member standards recommendation committee appointed by the Governor, Lieutenant Governor, and Speaker of the House. This committee will review the recommendations of our educator groups and will then make a final recommendation to the State Board of Education for consideration and approval.

In addition, the state’s academic standards in math and English language arts will also inform and help guide the state’s new assessment, TNReady. TNReady begins during the 2015-16 school year, and it will be aligned to the state’s existing academic standards in math and English language arts. TNReady will then evolve as the standards do, ensuring that our state assessment matches what is actually being taught in Tennessee classrooms.

As I travel around the state listening to teachers, I continue to hear teachers’ confidence in Tennessee’s higher standards and the positive impact they are having on students. I also continue to hear your desire for stability and alignment, so teachers and school leaders can make informed decisions about what works best for your students. We hope this process encourages you to continue on the path that you boldly started – great teaching to high expectations every day – as we all continue to work together to improve the standards during the review process.

We are proud that Tennessee is the fastest-improving state in the nation in student achievement, and your work this year to ensure that Tennessee stays on a path of high academic standards to help continue that success has been critical. Thank you to those that commented on the math and English language arts standards on the review website, www.tn.gov/standardsreview.

I am confident that the process that the General Assembly has now adopted will only enhance our efforts to improve outcomes for all of our students.

We look forward to sharing more updates with you as the standards review and development process continues this summer. Thank you again for all you do in support of Tennessee families and students.

Best,
Candice

For more on education politics and policy in Tennessee, follow @TNEdReport

The End of an Era

Over at Bluff City Ed, Jon Alfuth celebrates the end of the EOC testing era. Those tests will be replaced with TNReady next year.

Alfuth notes that there are many challenges with the current testing regime, including gaming the system and misalignment with current standards.

Here’s what he says he hopes the new tests provide:

First, I’d personally like to see aligned pre- and formative assessments to allow teachers to track tests throughout the year. These could be given to the districts and used to develop a benchmark for where students are starting and track their progress throughout the year. These should be designed by Measurement Inc. to ensure close alignment to the actual test.

Second, we need to see shorter tests. Asking students to sit for between 2 to 4 three hour assessments in a four day period is a lot, and it does stress kids out. I’d like to see the number of questions reduced on the new TNReady assessments to reflect this reality.

Third, we need better special education and special needs accommodations. I’m not a special education teacher myself, but from talking to some of my colleagues my understanding is that the accommodations for the EOC regime aren’t the greatest. Hopefully a technologically advanced test like TNReady (it can be given on paper or on a computer) could include better accommodations for kids with special needs. I also hope it makes automatic adjustments for students who, say, speak English as a second language.

Fourth, we need to see a substantial increase of resources aligned to the new assessments and SOON. Teachers need time to internalize the format and the types of questions that students will be asked to complete on the new assessments. That was one of the failings of PARCC and one reason I believe we no longer have it in Tennessee – teachers didn’t have enough supporting resources and backed off support for the assessment. Let’s hope that TNReady doesn’t make the same mistake.

More on TNReady:

TNReady to Borrow Questions from Utah

Transition to TNReady Creates TVAAS Problems

For more on education politics and policy, follow @TNEdReport

A Little Less Bad

From a story in Chalkbeat:

Tennessee’s teacher evaluation system is more accurate than ever in measuring teacher quality…

That’s the conclusion drawn from a report on the state’s teacher evaluation system conducted by the State Department of Education.

The idea is that the system is improving.

Here’s the evidence the report uses to justify the claim of an improving evaluation system:

1) Teacher observation scores now more closely align with teacher TVAAS scores — TVAAS is the value-added modeling system used to determine a teacher’s impact on student growth

2) More teachers in untested subjects are now being evaluated using the portfolio system rather than TVAAS data from students they never taught

On the second item, I’d note that previously, 3 districts were using a portfolio model and now 11 districts use it. This model allows related-arts teachers and those in other untested subjects to present a portfolio of student work to demonstrate that teacher’s impact on growth. The model is generally applauded by teachers who have a chance to use it.

However, there are 141 districts in Tennessee and 11 use this model. Part of the reason is the time it takes to assess portfolios well and another reason is the cost associated with having trained evaluators assess the portfolios. Since the state has not (yet) provided funding for the use of portfolios, it’s no surprise more districts haven’t adopted the model. If the state wants the evaluation model to really improve (and thereby improve teaching practice), they should support districts in their efforts to provide meaningful evaluation to teachers.

A portfolio system could work well for all teachers, by the way. The state could move to a system of project-based learning and thus provide a rich source of material for both evaluating student mastery of concepts AND teacher ability to impact student learning.

On to the issue of TVAAS and observation alignment. Here’s what the report noted:

Among the findings, state education leaders are touting the higher correlation between a teacher’s value-added score (TVAAS), which estimates how much teachers contribute to students’ growth on statewide assessments, and observation scores conducted primarily by administrators.

First, the purpose of using multiple measures of teacher performance is not to find perfect alignment, or even strong correlation, but to utilize multiple inputs to assess performance. Pushing for alignment suggests that the department is actually looking for a way to make TVAAS the central input driving teacher evaluation.

Advocates of this approach will suggest that student growth can be determined accurately by TVAAS and that TVAAS is a reliable predictor of teacher performance.

I would suggest that TVAAS, like most value-added models, is not a significant differentiator of teacher performance. I’ve written before about the need for caution when using value-added data to evaluate teachers.

More recently, I wrote about the problems inherent in attempting to assign growth scores when shifting to a new testing regime, as Tennessee will do next year when it moves from TCAP to TNReady. In short, it’s not possible to assign valid growth scores when comparing two entirely different tests.  Researchers at RAND noted:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers. These results suggest that conclusions about individual teachers’ performance based on value-added models can be sensitive to the ways in which student achievement is measured.

These findings align with those of Martineau (2006) and Schmidt et al. (2005): you get different results depending on the type of question you’re measuring.

The researchers tested various VAM models (including the type used in TVAAS) and found that teacher effect estimates changed significantly based on both what was being measured AND how it was measured. 

And they concluded:

Our results provide a clear example that caution is needed when interpreting estimated teacher effects because there is the potential for teacher performance to depend on the skills that are measured by the achievement tests.

So, even if you buy the idea that TVAAS is a significant differentiator of teacher performance, growth estimates drawn from next year’s TNReady results simply won’t be reliable.
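For readers who want to see that sensitivity concretely, here is a small simulation sketch in Python. It is not the TVAAS model, which is far more elaborate; every parameter below is invented for illustration. Each simulated teacher has a “true” effect, each of their students is scored on two hypothetical tests that weight that effect, student ability, and noise differently, and a crude classroom-average estimate is computed from each test. Because students are randomly assigned in the simulation, the classroom average is a fair (if noisy) stand-in for a growth estimate, and the rankings it produces still shift depending on which test is used.

```python
# Simulation sketch (NOT the actual TVAAS model): how much do crude teacher-effect
# estimates move when the same students are measured with two different tests?
# All parameters are invented for illustration.

import random
from statistics import mean

random.seed(42)

N_TEACHERS = 20
STUDENTS_PER_CLASS = 25

results = []
for teacher_id in range(N_TEACHERS):
    true_effect = random.gauss(0, 1)  # hypothetical "true" teacher contribution
    test_a_scores, test_b_scores = [], []
    for _ in range(STUDENTS_PER_CLASS):
        ability = random.gauss(0, 1)
        # Two tests that emphasize different skills: each mixes the teacher's
        # contribution, student ability, and measurement noise in different proportions.
        test_a_scores.append(true_effect + 0.7 * ability + random.gauss(0, 1.0))
        test_b_scores.append(0.6 * true_effect + 0.7 * ability + random.gauss(0, 1.4))
    # Crude effect estimate from each measure: the classroom average.
    results.append((teacher_id, mean(test_a_scores), mean(test_b_scores)))

rank_a = {t: r for r, (t, _, _) in enumerate(sorted(results, key=lambda x: -x[1]), start=1)}
rank_b = {t: r for r, (t, _, _) in enumerate(sorted(results, key=lambda x: -x[2]), start=1)}

moved = sum(1 for t, _, _ in results if abs(rank_a[t] - rank_b[t]) >= 3)
print(f"{moved} of {N_TEACHERS} teachers move 3 or more ranks when the test changes")
```

Run it with different seeds and you should generally see a nontrivial number of simulated teachers shuffle by several rank positions even though nothing about their teaching changed; only the measure did.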

The state is touting improvement in a flawed system that may now be a little less bad. And because it insists on estimating growth from two different tests with differing methodologies, the growth estimates in 2016 will be unreliable at best. If the state wanted to improve the system, it would take two to three years to build growth data based on TNReady — and that would mean two to three years of NO TVAAS data in teacher evaluation.

Alternatively, the state could move to a system of project-based learning and teacher evaluation and professional development based on a Peer Assistance and Review Model. Such an approach would be student-centered and would give teachers the professional respect they deserve. It also carries a price tag — but our students are worth the work of both reallocating existing education dollars and finding new ways to invest in our schools.

For more on education politics and policy in Tennessee, follow @TNEdReport


Validating the Invalid?

The Tennessee House of Representatives passed legislation today (HB 108) that makes changes to current practice in teacher evaluation as Tennessee transitions to its new testing regime, TNReady.

The changes adjust the percentage of a teacher’s evaluation that is dependent on TVAAS scores to 10% next year, 20% the following year, and back to the current 35% by the 2017-18 academic year.
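To make the arithmetic of that phase-in concrete, here is a minimal sketch. Only the growth weights (10, 20, and 35 percent) come from the legislation as described above; the other evaluation components and the assumed 75/25 split of the remaining weight between observation and other measures are hypothetical placeholders, not the state’s actual formula.

```python
# Minimal sketch of how the phase-in changes the weight TVAAS carries in a composite
# evaluation score. Only the growth weights come from HB 108 as described above; the
# remaining components and their split are hypothetical placeholders.

GROWTH_WEIGHT_BY_YEAR = {"2015-16": 0.10, "2016-17": 0.20, "2017-18": 0.35}

def composite_score(year: str, tvaas: float, observation: float, other: float) -> float:
    """All inputs are assumed to be on the same 1-5 scale; the non-growth weight is
    split between observation and 'other' measures in an assumed 75/25 ratio."""
    growth_w = GROWTH_WEIGHT_BY_YEAR[year]
    remaining = 1 - growth_w
    return growth_w * tvaas + remaining * 0.75 * observation + remaining * 0.25 * other

# Example: a teacher with strong observations (4.5) but a low growth estimate (2.0).
for year in GROWTH_WEIGHT_BY_YEAR:
    print(year, round(composite_score(year, tvaas=2.0, observation=4.5, other=4.0), 2))
```

The example shows the practical stakes: the same low growth estimate drags a strongly observed teacher’s composite score down only modestly at a 10 percent weight, and considerably more once the weight snaps back to 35 percent.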

This plan is designed to allow for a transition period to the new TNReady tests which will include constructed-response questions and be aligned to the so-called Tennessee standards which match up with the Common Core State Standards.

Here’s the problem: There is no statistically valid way to predict expected growth on a new test based on the historic results of TCAP. First, the new test has (supposedly) not been fully designed. Second, the test is in a different format: it is both computer-based and contains constructed-response questions. That is, students must write out answers and/or demonstrate their work.

Since Tennessee has never had a test like this, it’s impossible to predict growth at all. Not even with 10% confidence. Not with any confidence. It is the textbook definition of comparing apples to oranges.

Clearly, legislators feel like at the very least, this is an improvement. A reasonable accommodation to teachers as our state makes a transition.

But, how is using 10% of an invalid number a good thing? Should any part of a teacher’s evaluation be made up of a number that reveals nothing at all about that teacher’s performance?

While value-added data alone is a relatively poor predictor of teacher performance, the value-added estimate used next year is especially poor because it is not at all valid.

But, don’t just take my word for it. Researchers studying the validity of value-added measures asked whether value-added gave different results depending on the type of question asked. Particularly relevant now because Tennessee is shifting to a new test with different types of questions.

Here’s what Lockwood and McCaffrey (2007) had to say in the Journal of Educational Measurement:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers. These results suggest that conclusions about individual teachers’ performance based on value-added models can be sensitive to the ways in which student achievement is measured.

These findings align with those of Martineau (2006) and Schmidt et al. (2005): you get different results depending on the type of question you’re measuring.

The researchers tested various VAM models (including the type used in TVAAS) and found that teacher effect estimates changed significantly based on both what was being measured AND how it was measured. 

And they concluded:

Our results provide a clear example that caution is needed when interpreting estimated teacher effects because there is the potential for teacher performance to depend on the skills that are measured by the achievement tests.

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to establish some correlation between past TCAP results and TNReady scores.

Or, if the state is determined to use growth scores (and wants to use them with accuracy), it should wait several years and build completely new growth models based on TNReady alone. At least three years of data would be needed in order to build such a model.

It seems likely that the Senate will follow the House’s lead on Monday and overwhelmingly support the proposed evaluation changes. But in doing so, they should be asking themselves if it’s really ok to base any part of a teacher’s evaluation on numbers that reliably predict nothing.

More on Value-Added:

Real World Harms of Value-Added Data

Struggles with Value-Added Data

 

TNReady … Already?

Back in November, the State of Tennessee awarded a contract to Measurement Inc. to develop the new assessment that would replace TCAP.

This assessment is to be aligned to state standards (largely based on Common Core State Standards) and should take into account feedback from Tennesseans.

Measurement Inc. will be paid $108 million for the contract.

Chalkbeat noted at the time the contract was awarded:

Measurement Inc. is subcontracting to AIR, a much larger player in the country’s testing market. AIR already has contracts with Utah and Florida, so Tennessee educators will be able to compare scores of Tennessee students with students from those states “with certainty and immediately.” AIR is also working with Smarter Balanced, one of two federally funded consortia charged with developing Common Core-aligned exams. That means that educators in Tennessee will also likely be able to measure their students’ progress with students in the 16 states in the Smarter Balanced Consortium.

The Department of Education notes on its website:

Comparability: While the assessments will be unique to Tennessee, TNReady will allow Tennesseans to compare our student progress to that of other states. Through a partnership between Measurement Inc. and American Institutes for Research, TNReady will offer Tennessee a comparison of student performance with other states, likely to include Florida and Utah.

While Measurement Inc. has an interesting approach to recruiting test graders, another item about the contract is also noteworthy.

The Department and Chalkbeat both noted the ability to compare Tennessee test scores with other states, including Utah and Florida.

Here’s why that’s possible. On December 5th, the Utah Board of Education approved the use of revenue from test licensing agreements with Florida, Arizona, and Tennessee based on contracts with AIR, the organization with which Measurement Inc. has a contract, as noted by Chalkbeat.

The contract notes that Utah’s arrangement with Tennessee is expected to be worth $2.3 million per year (running from 2015 to 2017) and that Tennessee will use questions licensed for the Utah assessment in math and ELA on its 2015-16 assessment.

So, Tennessee’s new test will use questions developed for Utah’s assessment and also licensed to Florida and Arizona.

The contract further notes that any release of the questions, either by accident or as required by law, will result in a fee of $5,000 per test item released. That means if Tennessee wants to release a bank of questions generated from the Utah test and used for Tennessee’s assessment, the state would pay $5,000 per question.

While Tennessee has said it may change or adapt the test going forward, it seems that the 2016 edition of the test may be well underway in terms of its development.

For more on education politics and policy in Tennessee, follow @TNEdReport