Ready for a Fight

Yesterday, Williamson County Director of Schools Mike Looney issued a statement saying that, in addition to the state’s suspension of the grades 3-8 TNReady tests, his district would not be administering the high school end-of-course tests.

Commissioner McQueen is not very happy about that. She served notice to Looney and all other directors that refusing to administer the EOC would be considered a violation of state law.

Here’s the email she sent to Directors of Schools:

First, I want to thank you for your partnership and support as we have worked together to implement and administer the first year of a new assessment. I know you share my disappointment and frustration with the inability of our vendor to deliver on this higher quality assessment in grades 3-8, and I truly appreciate your patience and leadership.


I want to reiterate that the state’s termination of its contract with the testing vendor Measurement Incorporated (MI) and the related suspension of grades 3-8 testing does not apply to high school and End of Course (EOC) exams, and, therefore, all school districts are required to administer these assessments.


The state of Tennessee and local districts are under an obligation under both federal and state law, as well as state board of education rules and regulations, to administer annual assessments to our students. My decision to suspend grade 3-8 testing was based on the impossibility of testing and made in close consultation with the U.S. Department of Education (USDOE). Based on the fact that testing in grades 3-8 was not feasible due to the failure of MI to meet its contractual obligations, the USDOE has acknowledged that the department made a good faith effort to administer the assessments to all students in grades 3-8. Unlike grades 3-8, districts are in receipt of EOC exams and the challenges associated with the delivery of grades 3-8 do not exist.


Because EOC exams have been delivered, students should have the opportunity to show what they know to measure their progress toward postsecondary and the workforce. Failure to administer the high school assessments will adversely impact students who will not only lose the experience of an improved, high quality test aligned to our higher standards but also the information we plan to provide to students, parents and educators relative to student performance. In addition, districts will eliminate the option for their teachers to use this year’s student achievement data as part of their teacher evaluation if the data results in a higher score.


Because of these factors and because state or district action to cancel high school testing would willfully violate the laws that have been set forth relative to state assessment, neither the state nor districts have the authority to cancel EOC exams. Districts that have taken action to cancel EOC exams or communicated such action are in violation of the law and should rescind this action or communication.

What Does This Mean?

In response to the Murfreesboro City School Board considering refusing to administer Phase II of TNReady, the Department of Education issued a statement noting that doing so would be considered a major violation of state law and that withholding state funds was a possible penalty.

McQueen doesn’t say what the penalty would be if districts like Williamson proceed with their refusal to administer the EOCs, but she may well attempt to impose a financial penalty.

In her email, McQueen says:

Failure to administer the high school assessments will adversely impact students who will not only lose the experience of an improved, high quality test aligned to our higher standards but also the information we plan to provide to students, parents and educators relative to student performance.

Just what students want and need: Another test. Some have proposed using the ACT battery of tests as the high school testing measure rather than the current EOC structure.

McQueen also says:

In addition, districts will eliminate the option for their teachers to use this year’s student achievement data as part of their teacher evaluation if the data results in a higher score. 

While the idea of flexibility seems nice, I want to reiterate that any data gleaned from this year’s test is invalid as a value-added indicator of teacher performance. As such, there’s no useful information about teacher performance to be gained from this year’s EOCs. Put another way, McQueen’s argument about depriving teachers of an opportunity falls flat.

While the use of value-added data to assess teacher performance is of limited usefulness under optimum conditions, under this year’s transition, it is clearly and plainly invalid. If the goal of using such data is to improve teacher performance, why use data that yields essentially no information?

I have not yet seen a response from Dr. Looney or any other directors. But a fight could be brewing.

For more on education politics and policy in Tennessee, follow @TNEdReport


TEA on TNReady

The Tennessee Education Association is out with a statement on TNReady:

“Tennessee teachers and students have lost countless hours of instruction time this school year preparing for the new TNReady assessment,” said TEA President Barbara Gray. “The call to cancel this year’s test should have come more than two months ago when the first phase was such a disaster.”

“The state is so focused on testing that it overlooked the opportunity to salvage what was left of the school year and let teachers get back to educating our students. Instead, the state placed gathering data above the best interests of Tennessee students.”

“Moving forward, we have serious concerns about the state’s ability to find a new vendor and have an assessment ready to go next school year,” Gray continued. “It is time to slow way down on the state’s testing craze and make sure we are doing what is best for our students.”

“The passage of the Every Student Succeeds Act at the federal level gives Tennessee a chance to reevaluate how it measures student and teacher performance. The new law allows for the development of innovative assessments, giving states a way out of the test-and-punish system we have operated under for many years. It will also allow us to look at other success indicators, as opposed to relying on a single test to determine if a school is meeting students’ needs.”

“We have the opportunity now to not just continue with the way things have always been done, but instead explore the opportunities afforded to us through ESSA to make sure every student receives a quality education.”



Why TNReady Wasn’t

Grace Tatter over at Chalkbeat has an informative interview with the President of Measurement Inc., the company charged with delivering TNReady this year.

As I read the interview, a couple of items stood out. First, the company had never delivered an entire state’s online testing program. Tatter notes:

It was also an unprecedented task for Measurement Inc., which had never before developed and delivered a state’s entire online testing program.

Despite this, they somehow won the bid to deliver Tennessee’s program.

Second, the magnitude of the failure. Tatter:

About 48,000 students logged on that day, and about 18,000 submitted assessments. It’s unknown how many students weren’t having trouble with the test but stopped after McQueen sent an email instructing districts to halt testing.

“It was a failure in some respects because we were supposed to design a system that would take 100,000 students in at one time… We had a problem with 48,000,” Scherich said.

Read that again. Measurement Inc. was tasked with developing an online platform that could handle 100,000 students taking a test at the same time. The system they developed couldn’t handle 48,000 students. They didn’t even develop a system that could handle HALF of what they were contracted to provide.

The company president goes on to detail the challenges of printing the tests in a short timeframe. However, back in February, Education Commissioner Candice McQueen expressed confidence in the printed tests:

“I want to stress to you that the paper version of TNReady is still TNReady,” McQueen wrote of the new test aligned to the state’s current Common Core academic standards.

She said the paper tests are being shipped to each district at no additional taxpayer cost.

Unfortunately, that confidence proved misplaced. Phase I tests did arrive, albeit quite late. And Phase II tests were not delivered in time to be administered this year.

Now, the state is seeking another vendor who can deliver the test in the 2016-17 academic year.





One Step Further

On the heels of the announcement from the Tennessee Department of Education that TNReady testing was being suspended for grades 3-8, Williamson County Director of Schools Mike Looney went one step further and suspended end of course testing for high school students in his district.

Here’s the email he sent yesterday:

You are an incredible group of professionals and I am exceedingly proud of your work. This year has been full of surprises and uncertainty as it relates to state assessment and yet you still have prepared students for success. Your work is important and matters. I am proud of you.

Unfortunately, sometimes events happen outside of our direct control. Today the Commissioner of Education announced the suspension of Part II of the TnReady/TCAP Assessment in grades 3-8.

In addition, because of my continued concerns, I am suspending End of Course tests at the high school level.

I truly believe in the importance of measuring student progress. It is, from my perspective, a critical piece of our work. And I look forward to us being able to appropriately assess students as soon as possible.

Mike Looney



Ready for What’s Next?

Following yesterday’s announcement that Measurement Inc. had been fired and TNReady testing suspended, Commissioner of Education Candice McQueen sent an email to teachers explaining the decision and providing information for what’s next.

The email included a line with a bit of an apology: “You were ready, and I am sorry that we were unable to provide all students with a consistent and complete testing experience this year.”

Because, of course, all students want a consistent and complete testing experience.

Here’s the email in its entirety:

Earlier today, I announced that the department is terminating its statewide assessment contract with Measurement Inc., effective immediately. The state awarded Measurement Inc. a contract in November 2014 to provide an online testing platform with a paper and pencil backup. Not only did Measurement Inc. fail to deliver a reliable online platform for students across the state, it has also failed to print and ship paper tests by deadlines the company had set. Terminating our contract with Measurement Inc. was a challenging decision because we’ve been working to honor the effort and investment of Tennessee teachers and students, but we’ve exhausted every option in problem solving with this vendor to get these tests delivered. TNReady was designed to provide Tennessee students, teachers, and families with better information about what students know and understand, and the failure of this vendor has let down the educators and students of our state.

As a result of repeated failures from this vendor, we are suspending Part II of TNReady for grades 3-8 this year. However, because districts have already received all high school testing materials, high school testing will move forward as planned.

While Measurement Inc. had previously assured us it would have all grade 3-8 materials delivered by April 27, the third deadline the company had set and missed this month, 100 percent of districts are still waiting on some 3-8 materials to arrive, and few districts have complete sets of tests for any grade or subject. Last week, the department told districts that we would not ask them to continue to change schedules and extend the testing window beyond May 10. We know the transition to TNReady has presented unexpected challenges for educators, schools, and districts, and we will not ask you to further disrupt your end of year schedule.

Many of you have shared with me that, despite the challenges with implementation this year, you were excited for your students to show what they were able to do on a new assessment. I understand that many of you will share in my disappointment that we won’t have detailed score reports from this year’s assessment for students in grades 3-8. We will provide as much information to schools and students as possible related to results from Part I testing for grades 3-8. This will be used for informational purposes only, and no score will be associated with Part I for grades 3-8. High school tests will be scored, and these results will be shared in the fall.

The transition to a new assessment this year has been challenging, but aligned assessments are critical to ensure that all students are making progress on their path to postsecondary opportunities and the workforce. I have seen first-hand how hard you have worked to align your instruction to our newer, more rigorous standards. You and your students have risen to higher expectations, and I hope you are encouraged by the growth in critical thinking and problem solving that you have seen in your classrooms every day. You have been flexible and patient beyond what we should expect, especially as you planned for Part II amid important field trips and end-of-year celebrations. You were ready, and I am sorry that we were unable to provide all students with a consistent and complete testing experience this year.

I recognize that our apology is not enough. You also deserve clear guidance as we look ahead to next year. We know teachers’ jobs don’t stop when the final bell rings in May. You spend a great deal of time in the summer months planning for the following year. We’ll share updated math and ELA blueprints in May, which will help you plan for the coming year.

While navigating the challenges of this year’s administration of TNReady, we’ve also been working to improve our assessments for next year. Earlier this month, we shared with districts several changes to TNReady for next year. We’re eliminating Part I for the math assessment, and we will include one to three math problems on Part II called integrated tasks that will maintain the rigor of the performance tasks. As a result of eliminating Part I for math, we will be able to reduce overall testing time for math in grades 3-11. Additionally, we’re adding multiple choice items to ELA Part I, which will encourage students to closely engage with reading passages before they write their constructed response. This will reduce the number of items on ELA Part II and decrease the overall testing time for ELA. These changes were made in response to feedback we received from teachers, as well as school and district leaders, and we’re dedicated to continuously improving our assessments as we move forward with a new assessment vendor.

The department is currently working with the state’s Central Procurement Office to expedite the selection of a vendor for both the scoring of this year’s high school assessment and the administration of next year’s test. In the meantime, I encourage you to read our new frequently asked questions document (here), which offers more detailed information about how this latest announcement will affect you, your schools, and your districts. We will continue to update this resource. It’s also important to note that today’s announcement doesn’t change the flexibility that has already been provided on teacher evaluation. If a teacher has 2015-16 TNReady data, in this case high school teachers, TNReady data will only be used if it helps the teacher.

Thank you for your hard work and dedication to your students. I appreciate your patience and understanding as we look ahead to the 2016-17 school year.




Not Ready at All

It turns out TNReady wasn’t ready at all. From a technological failure on day one to a shipping delay in March to all-out chaos in April, the transition to the new test has been, in the words of one Director of Schools, an “unmitigated disaster.”

As of today, in the face of yet another delay, the Tennessee Department of Education has terminated the contract with Measurement, Inc. and suspended further testing in grades 3-8 for this year.

Here’s the full press release from the Department of Education:

Education Commissioner Candice McQueen announced today the department will terminate its statewide testing contract with Measurement Inc., effective immediately. While high school testing will continue as planned, the state will suspend grade 3-8 testing during the 2015-16 school year due to Measurement Inc.’s inability to deliver all testing materials.

After revising their shipping schedule for a third time this month, the state’s testing vendor, Measurement Inc., failed to meet its most recent deadline. As of this morning, all districts were still waiting on some grade 3-8 materials to arrive with a total of two million documents yet to be shipped. In February, the department was forced to move from the originally planned online assessment delivery to a paper-based format due to the failure of the vendor’s online platform.

“Measurement Inc.’s performance is deeply disappointing. We’ve exhausted every option in problem solving with this vendor to assist them in getting these tests delivered,” Commissioner Candice McQueen said. “Districts have exceeded their responsibility and obligation to wait for grade 3-8 materials, and we will not ask districts to continue waiting on a vendor that has repeatedly failed us.”

If districts have received materials for a complete grade or subject in grades 3-8 (i.e. fifth-grade math), they will have the option to administer that specific grade or subject level; however, the department will only be able to deliver limited student performance information for these particular grades and subjects. High school tests will be fully scored, and these results will be delivered later this fall.

“Challenges with this test vendor have not diverted us from our goals as a state. Tennessee has made historic and tremendous growth over the past several years. Higher standards and increased accountability have been a key part of this progress,” Commissioner McQueen said. “Our work toward an aligned assessment plays a critical role in ensuring that all students are continuing to meet our high expectations and are making progress on their path to postsecondary and the workforce.”

Flexibility that has already been provided for teacher evaluation through recent legislation will remain. If a teacher has TNReady data, in this case high school teachers, TNReady data will only be used if it helps the teacher. If a teacher does not have TNReady data, their evaluation will rely on data from prior years.

The department is currently working with the state’s Central Procurement Office to expedite the selection of a vendor for both the scoring of this year’s high school assessment and the development of next year’s test. The department has also been in close contact with the United States Department of Education to ensure that Tennessee is in compliance with federal requirements and will continue to work with them on this issue.

TNReady, the state’s new assessment in math and English language arts in grades 3-11, was designed to be administered in two parts. Part I was given in late February and early March, and Part II was scheduled to begin on April 25.
Additional resources can be found in the department’s frequently asked questions document.

MORE on Measurement, Inc.:

Replacing TCAP

TNReady Already?

Ready to Grade?



Will TNReady testing resume this year? For some students, maybe not.

The President of Measurement, Inc. said yesterday that there was no guarantee his company would make the testing window.

The Memphis Daily News reports:

The president of a North Carolina-based testing company said Monday that he can’t guarantee all students in Tennessee will receive the test on time.

Measurement Inc. president and CEO Henry “Hank” Scherich said his company is working furiously to get the new TNReady materials to students.

“I wish I could promise them,” Scherich said. He added they were doing everything humanly possible to get the tests to the students on time.

All of the students have at least some of the testing materials, he said, but the company has found itself scrambling to print and ship 5 million test booklets for Tennessee.

This follows last week’s news that a Friday deadline would be missed.

That event caused some lawmakers to call for this year’s testing to be cancelled. The Department of Education has still not agreed to that solution.




A Letter of Concern

Prior to the latest TNReady debacle, the Director of Schools in Oak Ridge sent Commissioner of Education Candice McQueen a letter outlining some concerns about this year’s testing. Her response includes the questions he posed and makes for some interesting reading regarding the challenges districts faced this year.

McQueen’s response is published here in its entirety:

April 11, 2016
Dr. Borchers,
Thank you for sharing your request and thoughts about TNReady and testing this year. I know this has been a tremendous transition for our families and schools, and I do not take these concerns lightly.
I want to address each of the issues you raised, but first I want to make sure you and your educators are aware of the new flexibility we have offered for accountability for the 2015-16 year, in part because of the unexpected issues we experienced on Part I.
First, both teachers and school leaders will not have results from this year’s tests included in their student growth (TVAAS) score unless it benefits them to do so. In other words, if results from this year give a teacher a higher score, they will be included, but if they hurt a teacher’s evaluation, they will be excluded. Educators will automatically receive the best option. You can read more information by clicking here.
In addition, you as a director can provide educators with the option to select a new achievement measure, and those who had originally chosen a schoolwide TVAAS measure can switch to a non-TVAAS option. Also, per the Tennessee Teaching Evaluation Enhancement Act, districts have complete discretion in how they choose to factor test data into employment decisions like promotion, retention, termination, and compensation. And as we had stated earlier, because the scores will be back later this year, districts do not have to include students’ scores in their grades.
Schools also have flexibility for accountability. When we run the Priority School list next year, we will provide a safe harbor for schools who may have seen a decline in performance in 2015-16 that would have resulted in being placed on the list. Instead, we will also look at school performance excluding 2015-16 data, and if that removes the school from being in the bottom 5 percent, they will not be considered a Priority School.
We have already taken steps through our ESEA waiver to revise district accountability this year. For 2015-16, districts will receive the better of two options for purposes of the achievement and gap closure statuses: a one-year growth measure or their relative rank in the state. If a district’s achievement scores decline, but their peers across the state decline in tandem, a district’s relative rank will remain stable. Similar to the governor’s proposal for teachers, districts will automatically receive the option that yields the higher score.

We still believe in the important role state assessment plays in accountability, and this year’s results will provide a baseline from which we can grow. We have a responsibility as a state to make sure all of our students are making progress toward college and career, and state tests give us the best and fairest measure of how all of our children, in all subgroups, are performing. We also have a responsibility to tell taxpayers about how our children are performing given their investment in our education system. No one test is ever a perfect measure of a child’s readiness or full demonstration of everything they have learned, but each feedback loop provides one angle or piece of data that can be considered within a broader context. That is what we hope TNReady will do – and we are equally committed to our responsibility to continue to improve the test and strengthen the data it provides you each year.
To address your specific concerns:
1. Students who were in the middle of testing on the day of the crash saw the exact same questions and prompts when they took the paper-based version. This gave those students a substantial advantage over their peers. 
There were approximately 20,000 students who successfully completed a Part I assessment online on Feb. 8. Those students did not retake the Part I test on paper. There were also 28,000 students who began an online assessment and were not able to complete the ELA, math, or social studies exam. We believe it would have been unfair to penalize those students because of the system disruptions. The department felt it was critical and fair to provide these students another opportunity.
It is highly unlikely that any of the students that attempted to take their Part I assessment online would have encountered the same writing prompt (or math and social studies items) when they took the paper test. There were 1.8 million tests submitted for Part I, compared to the 28,000 students who had logged in but not completed their Part I test. Because of multiple forms and versions being created for both the online and paper versions of the test, only about 16,000 of those students could have possibly been exposed to items on the paper-based test that were on the online versions, and all of those students were ones who likely experienced significant technical interruptions that may have prevented them from moving through or even seeing much or all of the test.
In addition, the students did not receive any feedback on what they may have previously completed, so they had no idea if their response was on track or not. Also, because the prompts for ELA were specific to the passages provided, students were not able to do additional research or practice composing their answer, since they would have needed to reference specific examples in the text to address a particular prompt. Simply seeing a question would not give a student any more of an advantage than a student who has practiced with the test questions on MICA or MIST.
Overall, this means that less than 1% of students may have been exposed to items 2-3 weeks prior to actual administration, received no feedback on their responses, had no access to items or passages until the paper administration, and experienced severe technical disruptions. Therefore, we don’t believe that those students had any advantage. In contrast, we believe these students would have been at a substantial disadvantage if they had not been allowed to complete the assessment via the paper version.
Finally, we will conduct a test mode effect study to determine if students who completed the assessment online in the fall or on Feb. 8 had any significant difference in performance from those who completed the paper-based version this spring.  If we find such differences, then we will make adjustments in the scoring, as is best practice for large-scale assessments (like ACT) that are administered both via paper and online.
2. The online assessment for geometry given in the fall included a reference sheet. In addition, the TNReady blueprint states that a reference sheet would be provided for all high school math exams; however, the students who tested this February did not receive a reference sheet with the paper/pencil assessment. This led to a great deal of concern from the students and will lead to inconsistent results. 
The reference sheet is intended for algebra I & II, and in the proctor script for Part I this spring, there is a reminder to give the math reference sheet to students for algebra I and algebra II only. It would not have benefited students in geometry, as there are no items on the geometry Part I assessment that the reference sheet will help a student answer.
The reference sheets were printed and shipped to districts along with test booklets and answer documents. We did not hear from Oak Ridge if they did not receive these, but let us know if they did not arrive.
3. In secondary math, students have reported questions that did not match the major work of the grade and item types that did not match the percent distribution that we were given with the blueprints. Despite many requests to the Department of Education for accurate blueprints providing accurate item type breakdowns for parts 1 and 2 of the TNReady, we have not been given updated blueprints. This has led to confusion about what students will be tested on and what item types to prepare for on the assessments.
Apologies if you reached out to our team and we were not responsive. We developed these blueprints for the first time this year to try to help educators understand how to pace their teaching over the course of the year and give them a sense of what standards would be covered on which parts of TNReady. We are learning from our educators about how to better support them in that vision, so we are going to be making some changes in the design of the blueprints for next year.
However, to address your concerns about this year’s blueprints, I want to provide context about what we shared with all districts and what students experienced on Part I. In March 2015, the department held regional assessment meetings introducing the test design for the 2015-16 school year. During those meetings, we included the following slides to highlight the content differences for grades 3-8 versus high school:
There is no language in the high school summary that should have indicated only major work of the grade would be covered in Part I for high school math courses. Moreover, the blueprint for geometry indicates that there are standards outside of major work of the grade that is assessed in all high school courses (see below). These blueprints were released in April 2015 and updated in September 2015, both times including standards beyond the major work of the grade in Part I. Those clusters that are not major work of the grade are highlighted below.   (See full letter for graphics)
There are no item type distributions in any of the mathematics blueprints. We shared some very preliminary projections last spring on item type distribution in the regional assessment meetings to give districts a sense of the mix of items. At that time, we emphasized that there would still be multiple choice and multiple select items, and students would have seen a variety of question types if they practiced on MICA or MIST over the fall and winter.
Just as we did for Part I, we have also shared a document with examples of how math questions will appear in the test booklet and on corresponding answer document, which you can view by clicking here. This illustrates the variety of item types which students may see on Part II.
4. On one of the High School EOCs, we were shipped two different answer documents. That normally wouldn’t be a problem except that we were shipped only one test form. Thus many of our students had an answer document that did not match the test on one of the questions. This not only invalidates that question but may also invalidate the responses immediately after that question because students may have started putting their answers in the wrong place so that it better matched the answer document. 
There was a minor printing error on one of the geometry answer documents, and we appreciate the notification from Jim Hundertmark, the assessment director at Oak Ridge. We advised him that we followed up with the department’s assessment design team and Measurement Inc. This issue has been flagged for the scorers who will complete the hand-scoring process for geometry Part I, so they will be aware of it as they score students’ responses.
This issue was not widespread and was limited to one printing batch from one of the eight vendors who supplied Part I answer documents. As always, any item that creates irregularity in scoring may ultimately be excluded from student scores such that there is no impact on final performance results.
5. The test document and answer document did not match. As examples, on one test a grid was numbered by one’s in the test booklet, but the grid on the answer document was numbered by another scale. On another 3-8 test the answer document had a box for the answer but the question in the book showed multiple choice. This was misleading for students who transferred work from test booklet to answer booklet and caused a great deal of confusion. 
We are aware of only one issue with a table in 7th grade math. The table in the answer document included the variable “p” in one of the math terms, while there was no “p” in the test booklet. This was not a widespread issue, and, as with the geometry item noted above, this is a hand-scored component that scorers have been made aware of.
This table is the only issue we are aware of in which the answer document and test booklet did not match. As with any assessment, items that cause irregularity in scoring will ultimately be excluded from student scores such that there is no impact on final performance.
6. Students repeatedly reported that boxes for some short answer responses were too small and students were not able to fit the entire answer in the box. This led some students to believe that their answers were not correct, causing them to rework problems, wasting precious time, and quite possibly changing correct work to incorrect work. 
The math answer documents required only numerical responses. If a student’s handwriting extended beyond the box, this is not an issue, as the items are hand-scored.
7. On the first day of testing for grades 3 – 8, the scripts that the proctors were supposed to read did not match the students’ test booklets. Specifically, the students’ test booklets had sample questions; however, the proctor scripts said nothing about sample questions. Some students caught this, many did not. This alone could invalidate each math test for grades 3-8 because students were looking at sample questions and the proctor’s instructions to them were to begin testing, thus resulting in the answers for the sample questions being put in the place of non-sample questions in the answer book.

We were made aware of this and provided a supplemental document for proctors to address the sample questions. However, it is important to note that the design of the test booklet would have made it extremely difficult for students to confuse the sample questions with actual test items:

The sample items were not numbered; they were labeled “Sample A” and “Sample B” immediately after the directions, and each sample item had an answer block immediately below it. On the following page, the correct answers for Sample A and Sample B were shown, with the correct method of completing the answer block, and there was a clear STOP sign at the bottom of the page. The following page noted that there was “no test material on this page.” The actual assessment began two pages after the sample questions and started with number 1, as did the answer document. There was no Sample A or Sample B on the answer document.
Please see graphics below. It is unlikely that the students answered the sample questions on their test documents given all the visual cues in the test booklet that distinguished sample questions from actual test items.

8. Students have reported to their parents that prompts were confusing, using words such as “at” and “by” in inconsistent ways so that students did not know what they were being asked to do. One parent said, “My concern is that some students are dealing with the stress of thinking about the faulty test instead of being able to focus on the actual questions.” 
It may be helpful to remember that all of our questions are vetted by hundreds of Tennessee teachers each year, and that every test question that is operational – or in other words, scored – is field-tested with students in the same grade and subject prior to being made an operational test question. Those teachers approve and edit each question for content, appropriateness, bias, and sensitivity, and after students take field-test items, the results are thoroughly vetted to ensure the question was understood and is appropriate for students to take.
Certainly, though, the rigor of this year’s test was higher than we have had in the past, and we understand that some students have had anxiety about this increased level of expectation.
9.  In the practice tests, the answers were written in the same sheets as the questions but the actual test had separate sheets. We have had many students report that they were unsure about where they were supposed to answer their questions. This was a major cause of confusion, especially with our elementary school students. 
We know that managing the paper materials has been challenging for some of our younger students. When students did not transfer responses from test booklets to answer documents, teachers, proctors, or other adults transferred the answers under the same test security provisions as we have for student transcription.
10. Because the test was originally online, the students in our elementary schools were not taught to bubble correctly throughout the year. We had students who circled answers or put checks in the bubbles, or who were not even sure how to answer. This put the students who did not have knowledge of how to properly bubble at a disadvantage when compared to their peers.
We did not receive any reports from Oak Ridge or any other districts regarding students not understanding how to bubble their answers. Only our third-grade students would never have taken a paper-based TCAP assessment in a prior year, and for those students, districts may have provided the opportunity to complete a sample answer form prior to the assessment if they needed to practice.
Additionally, all 3-8 students take the science TCAP on paper each year, and students are expected to fill out their form for that assessment each spring.
11. For the MSAA alternative assessment, questions were written in such a way as to ignore the student’s current achievement levels. 
While this is the first year that Tennessee has given the MSAA, it is the second year for the operational MSAA, which has been given in many other states. After the assessment last year, the tests were not only scored, but the questions were again reviewed to determine if they are appropriate and accessible for students who qualify for the alternate assessment. This reviewer group includes special education teachers, parents, speech language pathologists, directors, and test design specialists. There may be questions that feel too difficult because this assessment is designed for all students who qualify for the alternate, including those who are not as impacted by their disability as others.
Our hope, as we have shared, is for the entire test to be adjusted for student level based on the Learner Characteristics Inventory, and this is not yet entirely possible. This means that Part I will include questions for all levels of students.  After they complete Part I, Part II will adjust and be more reflective of the student’s current skill level. There will still be challenging questions because that is important exposure for all students, but there will be fewer that are a challenge and far more at their level.

The TCAP-Alt Portfolio design was very different, and in that model, teachers selected an API that they were confident the student would master. With MSAA, students will see questions they may not know the answer to, and that is not only okay, but expected. This is the same experience all other students have in school. That is part of learning. We expect results ranging from all questions missed to all or almost all correct, and everything in between. This is expected and appropriate. You may have a student who misses all the items, and that is okay because it reflects their current understanding and mastery. That is just an honest reflection of them at this point in time. Congratulate them for trying. With another year of meaningful and rigorous core instruction, they might get more right next year, and that will be an awesome celebration.
I want to close by stressing that TNReady is still a valid test. We take that responsibility very seriously because we know if we want parents and teachers – along with the broader education community – to be able to use this data, it needs to be reliable.
The paper forms that were produced contained items and questions that had undergone a rigorous review process – led by Tennessee teachers – and the forms were constructed in advance, as we had always planned a paper back-up option. Though the switch from online to paper-based testing created a number of logistical challenges for administration – and we know those challenges were great – the student experience of paper-based testing was similar to our historical experience.
In our historical technical reports, as well as in this year’s report, we will conduct tests of content and construct validity to ensure the test is statistically sound. In addition, we perform tests of reliability and produce a comparability analysis. Our decision to move to a paper-based assessment was, in part, to ensure that the overwhelming majority of our students experienced the same test conditions, as opposed to the variability that would have come with technical disruptions. We have two full-time psychometricians on our staff to ensure we are maintaining the integrity of our testing program, and we are confident that the psychometrics, logistics, and design processes we have completed will allow the prudent use of student assessment results from the 2015-16 school year.
I hope this has helped to address some of your concerns, but I also want to reiterate that we are committed to improving our TCAP tests, including TNReady, each year, and I look forward to continuing to work with you and your educators in this work.
Thank you again for your thoughts and for your commitment to high expectations for our kids. Thank you as well for your and your educators’ efforts during this transition. I continue to be proud and grateful to see our educators and leaders go above and beyond every single day.
Dr. Candice McQueen

Commissioner of Education



For more on education politics and policy in Tennessee, follow @TNEdReport


Not Our Fault

Measurement, Inc., the state’s vendor for the TNReady tests, is saying it’s not the company’s fault that, for the third time in a row, it has failed to deliver a testing product.

The failure has lawmakers and other critics calling for the test to be stopped and for Measurement, Inc. to be fired.

The Department of Education said:

“We share our districts’ frustration that we do not know specific delivery timelines due to [Measurement Inc’s] failure to provide shipping projections and find this lack of information extremely unsatisfactory,” spokesperson Ashley Ball said in a statement.

But the company’s president responded:

“You just can’t take the test off line and put it on a printing press,” President Henry Sherich said by phone Friday. “We’re not failing to deliver. We are delivering as fast as possible.”

Sherich revealed that his company is working with only one printer, as the other printers it works with are booked. This comes after a delay in delivering Phase I of the tests in March.

Sherich didn’t offer an apology or express concern for the students, parents, and teachers who have suffered as a result of this delay.

While the Department of Education has said it will be flexible with districts as they respond to this new delay, it has not yet said it plans to fire Measurement, Inc.



Lamberth: Stop the Test

In response to the latest failure to deliver TNReady, State Representative William Lamberth issued the following statement via his Facebook page:

I have lost faith in Measurement Inc. and I believe it is time to cancel the test for this year and start over. Local school districts who have received the material should have the option of going forward with testing or not at their discretion. I agree that we need a TN specific test that is designed to evaluate how well TN children are learning certain subjects. That test should be designed by TN teachers and TN administrators to be easily implemented and should reflect what is actually being relayed in our classrooms. TN contracted with this company to accomplish this task and they have failed miserably in delivering a computerized version and now can’t even ship the paper version on time. It is time to start over. Measurement Inc. has failed TN teachers and TN students and should not get one red cent of our money. That’s just my opinion.

While the Department of Education has said it will grant districts flexibility in modifying testing schedules, it has not yet said it will cancel the tests or the contract with Measurement, Inc.
