One Step Further

On the heels of the announcement from the Tennessee Department of Education that TNReady testing was being suspended for grades 3-8, Williamson County Director of Schools Mike Looney went one step further and suspended end of course testing for high school students in his district.

Here’s the email he sent yesterday:

You are an incredible group of professionals and I am exceedingly proud of your work. This year has been full of surprises and uncertainty as it relates to state assessment and yet you still have prepared students for success. Your work is important and matters. I am proud of you.

Unfortunately, sometimes events happen outside of our direct control. Today the Commissioner of Education announced the suspension of Part II of the TnReady/TCAP Assessment in grades 3-8.

In addition, because of my continued concerns, I am suspending End of Course tests at the high school level.

I truly believe in the importance of measuring student progress. It is, from my perspective, a critical piece of our work. And I look forward to us being able to appropriately assess students as soon as possible.

Mike Looney


For more on education politics and policy in Tennessee, follow @TNEdReport

Ready for What’s Next?

Following yesterday’s announcement that Measurement Inc. had been fired and TNReady testing suspended, Commissioner of Education Candice McQueen sent an email to teachers explaining the decision and providing information for what’s next.

The email included a line with a bit of an apology: “You were ready, and I am sorry that we were unable to provide all students with a consistent and complete testing experience this year.”

Because, of course, all students want a consistent and complete testing experience.

Here’s the email in its entirety:

Earlier today, I announced that the department is terminating its statewide assessment contract with Measurement Inc., effective immediately. The state awarded Measurement Inc. a contract in November 2014 to provide an online testing platform with a paper and pencil backup. Not only did Measurement Inc. fail to deliver a reliable online platform for students across the state, it has also failed to print and ship paper tests by deadlines the company had set. Terminating our contract with Measurement Inc. was a challenging decision because we’ve been working to honor the effort and investment of Tennessee teachers and students, but we’ve exhausted every option in problem solving with this vendor to get these tests delivered. TNReady was designed to provide Tennessee students, teachers, and families with better information about what students know and understand, and the failure of this vendor has let down the educators and students of our state.

As a result of repeated failures from this vendor, we are suspending Part II of TNReady for grades 3-8 this year. However, because districts have already received all high school testing materials, high school testing will move forward as planned.

While Measurement Inc. had previously assured us it would have all grade 3-8 materials delivered by April 27, the third deadline the company had set and missed this month, 100 percent of districts are still waiting on some 3-8 materials to arrive, and few districts have complete sets of tests for any grade or subject. Last week, the department told districts that we would not ask them to continue to change schedules and extend the testing window beyond May 10. We know the transition to TNReady has presented unexpected challenges for educators, schools, and districts, and we will not ask you to further disrupt your end of year schedule.

Many of you have shared with me that, despite the challenges with implementation this year, you were excited for your students to show what they were able to do on a new assessment. I understand that many of you will share in my disappointment that we won’t have detailed score reports from this year’s assessment for students in grades 3-8. We will provide as much information to schools and students as possible related to results from Part I testing for grades 3-8. This will be used for informational purposes only, and no score will be associated with Part I for grades 3-8. High school tests will be scored, and these results will be shared in the fall.

The transition to a new assessment this year has been challenging, but aligned assessments are critical to ensure that all students are making progress on their path to postsecondary opportunities and the workforce. I have seen first-hand how hard you have worked to align your instruction to our newer, more rigorous standards. You and your students have risen to higher expectations, and I hope you are encouraged by the growth in critical thinking and problem solving that you have seen in your classrooms every day. You have been flexible and patient beyond what we should expect, especially as you planned for Part II amid important field trips and end-of-year celebrations. You were ready, and I am sorry that we were unable to provide all students with a consistent and complete testing experience this year.

I recognize that our apology is not enough. You also deserve clear guidance as we look ahead to next year. We know teachers’ jobs don’t stop when the final bell rings in May. You spend a great deal of time in the summer months planning for the following year. We’ll share updated math and ELA blueprints in May, which will help you plan for the coming year.

While navigating the challenges of this year’s administration of TNReady, we’ve also been working to improve our assessments for next year. Earlier this month, we shared with districts several changes to TNReady for next year. We’re eliminating Part I for the math assessment, and we will include one to three math problems on Part II called integrated tasks that will maintain the rigor of the performance tasks. As a result of eliminating Part I for math, we will be able to reduce overall testing time for math in grades 3-11. Additionally, we’re adding multiple choice items to ELA Part I, which will encourage students to closely engage with reading passages before they write their constructed response. This will reduce the number of items on ELA Part II and decrease the overall testing time for ELA. These changes were made in response to feedback we received from teachers, as well as school and district leaders, and we’re dedicated to continuously improving our assessments as we move forward with a new assessment vendor.

The department is currently working with the state’s Central Procurement Office to expedite the selection of a vendor for both the scoring of this year’s high school assessment and the administration of next year’s test. In the meantime, I encourage you to read our new frequently asked questions document (here), which offers more detailed information about how this latest announcement will affect you, your schools, and your districts. We will continue to update this resource. It’s also important to note that today’s announcement doesn’t change the flexibility that has already been provided on teacher evaluation. If a teacher has 2015-16 TNReady data, in this case high school teachers, TNReady data will only be used if it helps the teacher.

Thank you for your hard work and dedication to your students. I appreciate your patience and understanding as we look ahead to the 2016-17 school year.




Not Ready at All

It turns out TNReady wasn’t ready at all. From a technological failure on day one to a shipping delay in March to all-out chaos in April, the transition to the new test has been, in the words of one Director of Schools, an “unmitigated disaster.”

As of today, in the face of yet another delay, the Tennessee Department of Education has terminated the contract with Measurement, Inc. and suspended further testing in grades 3-8 for this year.

Here’s the full press release from the Department of Education:

Education Commissioner Candice McQueen announced today the department will terminate its statewide testing contract with Measurement Inc., effective immediately. While high school testing will continue as planned, the state will suspend grade 3-8 testing during the 2015-16 school year due to Measurement Inc.’s inability to deliver all testing materials.
After revising their shipping schedule for a third time this month, the state’s testing vendor, Measurement Inc., failed to meet its most recent deadline. As of this morning, all districts were still waiting on some grade 3-8 materials to arrive with a total of two million documents yet to be shipped. In February, the department was forced to move from the originally planned online assessment delivery to a paper-based format due to the failure of the vendor’s online platform.
“Measurement Inc.’s performance is deeply disappointing. We’ve exhausted every option in problem solving with this vendor to assist them in getting these tests delivered,” Commissioner Candice McQueen said. “Districts have exceeded their responsibility and obligation to wait for grade 3-8 materials, and we will not ask districts to continue waiting on a vendor that has repeatedly failed us.”
If districts have received materials for a complete grade or subject in grades 3-8 (i.e. fifth-grade math), they will have the option to administer that specific grade or subject level; however, the department will only be able to deliver limited student performance information for these particular grades and subjects. High school tests will be fully scored, and these results will be delivered later this fall.
“Challenges with this test vendor have not diverted us from our goals as a state. Tennessee has made historic and tremendous growth over the past several years. Higher standards and increased accountability have been a key part of this progress,” Commissioner McQueen said. “Our work toward an aligned assessment plays a critical role in ensuring that all students are continuing to meet our high expectations and are making progress on their path to postsecondary and the workforce.”
Flexibility that has already been provided for teacher evaluation through recent legislation will remain. If a teacher has TNReady data, in this case high school teachers, TNReady data will only be used if it helps the teacher. If a teacher does not have TNReady data, their evaluation will rely on data from prior years.
The department is currently working with the state’s Central Procurement Office to expedite the selection of a vendor for both the scoring of this year’s high school assessment and the development of next year’s test. The department has also been in close contact with the United States Department of Education to ensure that Tennessee is in compliance with federal requirements and will continue to work with them on this issue.
TNReady, the state’s new assessment in math and English language arts in grades 3-11, was designed to be administered in two parts. Part I was given in late February and early March, and Part II was scheduled to begin on April 25.
Additional resources can be found in this frequently asked questions document:
https://tn.gov/assets/entities/education/attachments/tnready_faq_suspension.pdf.

MORE on Measurement, Inc.:

Replacing TCAP

TNReady Already?

Ready to Grade?


Seriously?

Will TNReady testing resume this year? For some students, maybe not.

The President of Measurement, Inc. said yesterday that there was no guarantee his company would make the testing window.

The Memphis Daily News reports:

The president of a North Carolina-based testing company said Monday that he can’t guarantee all students in Tennessee will receive the test on time.

Measurement Inc. president and CEO Henry “Hank” Scherich said his company is working furiously to get the new TNReady materials to students.

“I wish I could promise them,” Scherich said. He added they were doing everything humanly possible to get the tests to the students on time.

All of the students have at least some of the testing materials, he said, but the company has found itself scrambling to print and ship 5 million test booklets for Tennessee.

This follows last week’s news that a Friday deadline would be missed.

That event caused some lawmakers to call for this year’s testing to be cancelled. The Department of Education has still not agreed to that solution.



A Letter of Concern

Prior to the latest TNReady debacle, the Director of Schools in Oak Ridge sent a letter outlining some concerns about this year’s testing to Commissioner of Education Candice McQueen. Her response includes the questions he posed and makes for some interesting reading regarding the challenges faced by districts this year.

McQueen’s response is published here in its entirety:

April 11, 2016
Dr. Borchers,
Thank you for sharing your request and thoughts about TNReady and testing this year. I know this has been a tremendous transition for our families and schools, and I do not take these concerns lightly.
I want to address each of the issues you raised, but first I want to make sure you and your educators are aware of the new flexibility we have offered for accountability for the 2015-16 year, in part because of the unexpected issues we experienced on Part I.
First, both teachers and school leaders will not have results from this year’s tests included in their student growth (TVAAS) score unless it benefits them to do so. In other words, if results from this year give a teacher a higher score, they will be included, but if they hurt a teacher’s evaluation, they will be excluded. Educators will automatically receive the best option. You can read more information by clicking here.
In addition, you as a director can provide educators with the option to select a new achievement measure, and those who had originally chosen a schoolwide TVAAS measure can switch to a non-TVAAS option. Also, per the Tennessee Teaching Evaluation Enhancement Act, districts have complete discretion in how they choose to factor test data into employment decisions like promotion, retention, termination, and compensation. And as we had stated earlier, because the scores will be back later this year, districts do not have to include students’ scores in their grades.
Schools also have flexibility for accountability. When we run the Priority School list next year, we will provide a safe harbor for schools who may have seen a decline in performance in 2015-16 that would have resulted in being placed on the list. Instead, we will also look at school performance excluding 2015-16 data, and if that removes the school from being in the bottom 5 percent, they will not be considered a Priority School.
We have already taken steps through our ESEA waiver to revise district accountability this year. For 2015-16, districts will receive the better of two options for purposes of the achievement and gap closure statuses: a one-year growth measure or their relative rank in the state. If a district’s achievement scores decline, but their peers across the state decline in tandem, a district’s relative rank will remain stable. Similar to the governor’s proposal for teachers, districts will automatically receive the option that yields the higher score.

We still believe in the important role state assessment plays in accountability, and this year’s results will provide a baseline from which we can grow. We have a responsibility as a state to make sure all of our students are making progress toward college and career, and state tests give us the best and fairest measure of how all of our children, in all subgroups, are performing. We also have a responsibility to tell taxpayers about how our children are performing given their investment in our education system. No one test is ever a perfect measure of a child’s readiness or full demonstration of everything they have learned, but each feedback loop provides one angle or piece of data that can be considered within a broader context. That is what we hope TNReady will do – and we are equally committed to our responsibility to continue to improve the test and strengthen the data it provides you each year.
To address your specific concerns:
1. Students who were in the middle of testing on the day of the crash saw the exact same questions and prompts when they took the paper-based version. This gave those students a substantial advantage over their peers. 
There were approximately 20,000 students who successfully completed a Part I assessment online on Feb. 8. Those students did not retake the Part I test on paper. There were also 28,000 students who began an online assessment and were not able to complete the ELA, math, or social studies exam. We believe it would have been unfair to penalize those students because of the system disruptions. The department felt it was critical and fair to provide these students another opportunity.
It is highly unlikely that any of the students that attempted to take their Part I assessment online would have encountered the same writing prompt (or math and social studies items) when they took the paper test. There were 1.8 million tests submitted for Part I, compared to the 28,000 students who had logged in but not completed their Part I test. Because of multiple forms and versions being created for both the online and paper versions of the test, only about 16,000 of those students could have possibly been exposed to items on the paper-based test that were on the online versions, and all of those students were ones who likely experienced significant technical interruptions that may have prevented them from moving through or even seeing much or all of the test.
In addition, the students did not receive any feedback on what they may have previously completed, so they had no idea if their response was on track or not. Also, because the prompts for ELA were specific to the passages provided, students were not able to do additional research or practice composing their answer, since they would have needed to reference specific examples in the text to address a particular prompt. Simply seeing a question would not give a student any more of an advantage than a student who has practiced with the test questions on MICA or MIST.
Overall, this means that less than 1% of students may have been exposed to items 2-3 weeks prior to actual administration, received no feedback on their responses, had no access to items or passages until the paper administration, and experienced severe technical disruptions. Therefore, we don’t believe that those students had any advantage. In contrast, we believe these students would have been at a substantial disadvantage if they had not been allowed to complete the assessment via the paper version.
Finally, we will conduct a test mode effect study to determine if students who completed the assessment online in the fall or on Feb. 8 had any significant difference in performance from those who completed the paper-based version this spring.  If we find such differences, then we will make adjustments in the scoring, as is best practice for large-scale assessments (like ACT) that are administered both via paper and online.
2. The online assessment for geometry given in the fall included a reference sheet. In addition, the TNReady blueprint states that a reference sheet would be provided for all high school math exams; however, the students who tested this February did not receive a reference sheet with the paper/pencil assessment. This led to a great deal of concern from the students and will lead to inconsistent results. 
The reference sheet is intended for algebra I & II, and in the proctor script for Part I this spring, there is a reminder to give the math reference sheet to students for algebra I and algebra II only. It would not have benefited students in geometry, as there are no items on the geometry Part I assessment that the reference sheet will help a student answer.
The reference sheets were printed and shipped to districts along with test booklets and answer documents. We did not hear from Oak Ridge if they did not receive these, but let us know if they did not arrive.
3. In secondary math, students have reported questions that did not match the major work of the grade and item types that did not match the percent distribution that we were given with the blueprints. Despite many requests to the Department of Education for accurate blueprints providing accurate item type breakdowns for parts 1 and 2 of the TNReady, we have not been given updated blueprints. This has led to confusion about what students will be tested on and what item types to prepare for on the assessments.
Apologies if you reached out to our team and we were not responsive. We developed these blueprints for the first time this year to try to help educators understand how to pace their teaching over the course of the year and give them a sense of what standards would be covered on which parts of TNReady. We are learning from our educators about how to better support them in that vision, so we are going to be making some changes in the design of the blueprints for next year.
However, to address your concerns about this year’s blueprints, I want to provide context about what we shared with all districts and what students experienced on Part I. In March 2015, the department held regional assessment meetings introducing the test design for the 2015-16 school year. During those meetings, we included the following slides to highlight the content differences for grades 3-8 versus high school:
There is no language in the high school summary that should have indicated only major work of the grade would be covered in Part I for high school math courses. Moreover, the blueprint for geometry indicates that there are standards outside of major work of the grade that are assessed in all high school courses (see below). These blueprints were released in April 2015 and updated in September 2015, both times including standards beyond the major work of the grade in Part I. Those clusters that are not major work of the grade are highlighted below. (See full letter for graphics)
There are no item type distributions in any of the mathematics blueprints. We shared some very preliminary projections last spring on item type distribution in the regional assessment meetings to give districts a sense of the mix of items. At that time, we emphasized that there would still be multiple choice and multiple select items, and students would have seen a variety of question types if they practiced on MICA or MIST over the fall and winter.
Just as we did for Part I, we have also shared a document with examples of how math questions will appear in the test booklet and on corresponding answer document, which you can view by clicking here. This illustrates the variety of item types which students may see on Part II.
4. On one of the High School EOCs, we were shipped two different answer documents. That normally wouldn’t be a problem except that we were shipped only one test form. Thus many of our students had an answer document that did not match the test on one of the questions. This not only invalidates that question but may also invalidate the responses immediately after that question because students may have started putting their answers in the wrong place so that it better matched the answer document. 
There was a minor printer error found on one of the geometry answer documents, and we appreciate the notification from Jim Hundertmark, the assessment director at Oak Ridge. We advised him that we followed up with the department’s assessment design team and Measurement Inc. This issue has been flagged for scorers who will complete the hand-scoring process for geometry Part I, so they will be aware as they score students’ responses.
This issue was not widespread and was limited to one printing batch from one of the eight vendors who supplied Part I answer documents. As always, any item that creates irregularity in scoring may ultimately be excluded from student scores such that there is no impact on final performance results.
5. The test document and answer document did not match. As examples, on one test a grid was numbered by one’s in the test booklet, but the grid on the answer document was numbered by another scale. On another 3-8 test the answer document had a box for the answer but the question in the book showed multiple choice. This was misleading for students who transferred work from test booklet to answer booklet and caused a great deal of confusion. 
We are aware of only one issue with a table in 7th grade math. The table in the answer document included the variable “p” on one of the math terms, while there was no “p” in the test booklet.  This was not a widespread issue, and, as with the geometry item as noted above, this is a hand-scored component that scorers have been made aware of.
This table is the only issue we are aware of in which the answer document and test booklet did not match. As with any assessment, items that cause irregularity in scoring will ultimately be excluded from student scores such that there is no impact on final performance.
6. Students repeatedly reported that boxes for some short answer responses were too small and students were not able to fit the entire answer in the box. This led some students to believe that their answers were not correct, causing them to rework problems, wasting precious time, and quite possibly changing correct work to incorrect work. 
Student response on math answer documents only required numerical responses. If student handwriting was larger than the box, this is not an issue, as the items are hand scored.
7. On the first day of testing for grades 3 – 8, the scripts that the proctors were supposed to read did not match the students’ test booklets. Specifically, the students’ test booklets had sample questions; however, the proctor scripts said nothing about sample questions. Some students caught this, many did not. This alone could invalidate each math test for grades 3-8 because students were looking at sample questions and the proctor’s instructions to them were to begin testing, thus resulting in the answers for the sample questions being put in the place of non-sample questions in the answer book.

We were made aware of this and provided a supplemental document for proctors to address the sample. However, it is important to note that the design of the test booklet would have made it extremely difficult for students to confuse the sample questions with actual test items:

The sample items were not numbered. They were labeled “Sample A” and “Sample B” immediately after the directions. Each sample item had an answer block immediately below it. On the following page, the correct answers for Sample A and Sample B were shown, with the correct method of completing the answer block. There was a clear STOP sign at the bottom of the page. The following page noted that there was “no test material on this page.” The actual assessment began two pages after the sample questions and then started with number 1, as did the answer document. There was no Sample A or Sample B on the answer document.
Please see graphics below. It is unlikely that the students answered the sample questions on their test documents given all the visual cues in the test booklet that distinguished sample questions from actual test items.

8. Students have reported to their parents that prompts were confusing, using words such as “at” and “by” in inconsistent ways so that students did not know what they were being asked to do. One parent said, “My concern is that some students are dealing with the stress of thinking about the faulty test instead of being able to focus on the actual questions.” 
It may be helpful to remember that all of our questions are vetted by hundreds of Tennessee teachers each year, and that every test question that is operational – or in other words, scored – is field tested with students in the same grade and subject prior to being made an operational test question. Those teachers approve and edit each question for content, appropriateness, bias, and sensitivity, and after students take field test items, the results are thoroughly vetted to ensure the question was understood and is appropriate for students to take.
Certainly, though, the rigor of this year’s test was higher than we have had in the past, and we understand that some students have had anxiety about this increased level of expectation.
9.  In the practice tests, the answers were written in the same sheets as the questions but the actual test had separate sheets. We have had many students report that they were unsure about where they were supposed to answer their questions. This was a major cause of confusion, especially with our elementary school students. 
We know the paper management has been challenging for some of our younger students. When students did not transfer responses from test booklets to answer documents, teachers, proctors, or other adults transferred answers under the same test security provisions as we have for student transcription.
10. Because the test was originally online, the students in our elementary schools were not taught to bubble correctly throughout the year. We had students who circled answers or put checks in the bubbles, or who were not even sure how to answer. This put the students who did not have knowledge of how to properly bubble at a disadvantage when compared to their peers.
We did not receive any reports from Oak Ridge or any other districts regarding students not understanding how to bubble their answers. Only our third grade students would have never taken a paper-based TCAP assessment in a prior year, and for those students, districts may have provided the opportunity to complete a sample answer form prior to the assessment if they needed to practice.
Additionally, all 3-8 students take the science TCAP on paper each year, and students are expected to fill out their form for that assessment each spring.
11. For the MSAA alternative assessment, questions were written in such a way as to ignore the student’s current achievement levels. 
While this is the first year that Tennessee has given the MSAA, it is the second year for the operational MSAA, which has been given in many other states. After the assessment last year, the tests were not only scored, but the questions were again reviewed to determine if they are appropriate and accessible for students who qualify for the alternate assessment. This reviewer group includes special education teachers, parents, speech language pathologists, directors, and test design specialists. There may be questions that feel too difficult because this assessment is designed for all students who qualify for the alternate, including those who are not impacted by their disability as much as others.
Our hope, as we have shared, is for the entire test to be adjusted for student level based on the Learner Characteristics Inventory, and this is not yet entirely possible. This means that Part I will include questions for all levels of students.  After they complete Part I, Part II will adjust and be more reflective of the student’s current skill level. There will still be challenging questions because that is important exposure for all students, but there will be fewer that are a challenge and far more at their level.

The TCAP-Alt Portfolio design was very different and in that model, teachers selected an API that they were confident the student would master. With MSAA, students will see questions they may not know the answer to, and that is not only okay, but expected. This is the same experience all other students have in school. That is part of learning. We expect results from all questions missed, to all or almost all correct, and everything in between. This is expected and appropriate.  You may have a student that misses all the items, and that is okay because that reflects their current understanding and mastery.  That is just an honest reflection of them at this point in time.  Congratulate them for trying. With another year of meaningful and rigorous core instruction, they might get more right next year and that will be an awesome celebration.
I want to close by stressing that TNReady is still a valid test. We take that responsibility very seriously because we know if we want parents and teachers – along with the broader education community – to be able to use this data, it needs to be reliable.
The paper forms that were produced contained items and questions that had undergone a rigorous review process – led by Tennessee teachers – and the forms were constructed in advance, as we had always planned a paper back-up option. Though the switch from online to paper-based testing created a number of logistical challenges for administration – and we know those challenges were great – the student experience of paper-based testing was similar to our historical experience.
In our historical technical reports, as well as in this year’s report, we will conduct tests of content and construct validity to ensure the test is statistically sound. In addition, we perform tests of reliability and produce a comparability analysis. Our decision to move to a paper-based assessment was, in part, to ensure that the overwhelming majority of our students experienced the same test conditions, as opposed to the variability that would have come with technical disruptions. We have two full-time psychometricians on our staff to ensure we are maintaining the integrity of our testing program, and we are confident that the psychometrics, logistics, and design processes we have completed will allow the prudent use of student assessment results from the 2015-16 school year.
I hope this has helped to address some of your concerns, but I also want to reiterate that we are committed to improving our TCAP tests, including TNReady, each year, and I look forward to continuing to work with you and your educators in this work.
Thank you again for your thoughts and for your commitment to high expectations for our kids. Thank you as well for your and your educators’ efforts during this transition. I continue to be proud and grateful to see our educators and leaders go above and beyond every single day.
Best,
Dr. Candice McQueen

Commissioner of Education


For more on education politics and policy in Tennessee, follow @TNEdReport
Not Our Fault

Measurement, Inc., the state’s vendor for the TNReady tests, says it’s not the company’s fault that, for the third time in a row, it has failed to deliver a testing product.

The failure has lawmakers and other critics calling for the test to be stopped and for Measurement, Inc. to be fired.

The Department of Education said:

“We share our districts’ frustration that we do not know specific delivery timelines due to [Measurement Inc’s] failure to provide shipping projections and find this lack of information extremely unsatisfactory,” spokesperson Ashley Ball said in a statement.

But the company’s president responded:

“You just can’t take the test off line and put it on a printing press,” President Henry Sherich said by phone Friday. “We’re not failing to deliver. We are delivering as fast as possible.”

Sherich revealed that his company is working with only one printer, as the other printers it uses are booked. This comes after a delay in delivering Phase I of the tests in March.

Sherich didn’t offer an apology or express concern for the students, parents, and teachers who have suffered as a result of this delay.

While the Department of Education has said it will be flexible with districts as they respond to this new delay, they have not yet said they plan to fire Measurement, Inc.

For more on education politics and policy in Tennessee, follow @TNEdReport
Lamberth: Stop the Test

In response to the latest failure to deliver TNReady, State Representative William Lamberth issued the following statement via his Facebook page:

I have lost faith in Measurement Inc. and I believe it is time to cancel the test for this year and start over. Local school districts who have received the material should have the option of going forward with testing or not at their discretion. I agree that we need a TN specific test that is designed to evaluate how well TN children are learning certain subjects. That test should be designed by TN teachers and TN administrators to be easily implemented and should reflect what is actually being relayed in our classrooms. TN contracted with this company to accomplish this task and they have failed miserably in delivering a computerized version and now can’t even ship the paper version on time. It is time to start over. Measurement Inc. has failed TN teachers and TN students and should not get one red cent of our money. That’s just my opinion.

While the Department of Education has said it will grant districts flexibility in modifying testing schedules, they have not yet said they will cancel the tests or the contract with Measurement, Inc.

For more on education politics and policy in Tennessee, follow @TNEdReport
A Modification

As we reported yesterday, Phase II of TNReady is not so ready. In fact, Grace Tatter reports that the problem is statewide, impacting grades 3-8.

Tatter cites an email from the Department of Education indicating the state is not sure when the Phase II tests will be delivered to districts.

The email also says:

“…Districts may modify their testing schedules as needed, without any prior approval or notice to the (state).”

The thing is, some districts have already been trying to modify their schedules by not giving the test at all. The idea of refusing to administer Phase II surfaced in Murfreesboro in late March and early April. The state responded by issuing a vague threat regarding withholding BEP funds.

Tullahoma City Schools on Monday unanimously approved a resolution calling on the state to cancel testing for the remainder of this year.

All of this was before the realization that Phase II tests would not make it to Tennessee districts on time. Now, though, the Department of Education’s own words suggest that districts may modify as they see fit without consulting the state. One possible modification would be to not administer the test at all. Another would be to schedule it for a time in June when students aren’t in school. Districts could say they offered the test, but no one showed up to take it.

The state has also made a big fuss about what happens to students/districts if students simply refuse to take the test. Trouble is, the state’s memo is based on some pretty fuzzy reasoning.

As this piece was being written, the Department of Education announced it will not ask districts to reschedule tests beyond the current testing window, which expires on May 10th. That means if materials are not received in time for administration by that date, districts don’t have to administer the tests. The Department also indicated it would provide additional flexibility to districts.

From Jason Gonzales:

The Tennessee Department of Education announced to districts Friday it won’t reschedule the TNReady testing window again this year and for those districts that don’t receive tests on time, will provide flexibility.

“We will not ask districts to reschedule again beyond what has been communicated to date, and we will not extend the testing window beyond May 10,” according to a statement sent to districts Friday.

So, what’s next? Will the state cancel the contract with testing vendor Measurement, Inc.? Will Commissioner McQueen assume responsibility for the failed transition to a new test?

Only time will tell, and there’s not much time left.
For more on education politics and policy in Tennessee, follow @TNEdReport

Still Not F*&#ing Ready

TNReady Phase II is supposed to be starting, except it won’t. It seems that shipping delays will prevent at least eight school districts from starting the planned administration of Phase II next week.

In Sumner County, emails have gone out confirming the delay and a new planned start date of May 2nd.

Officials in seven other districts have confirmed they have yet to receive the testing materials.

This comes after a disastrous first day of TNReady testing back in February and subsequent shipping delays of Phase I paper materials.

It also comes after the Murfreesboro City School Board discussed refusing to administer Phase II and the Tullahoma City Schools considered a resolution calling on the state to stop any further testing this year.

From the start, the transition to TNReady has been bungled. While Commissioner McQueen continues to make excuses, blame the vendor, and promise a better outcome next time, students in Tennessee schools face disrupted schedules and loss of learning time.

Instead of issuing threats to districts, perhaps the Department of Education should have been developing solutions or simply responding to the frustrations of students, parents, and teachers across the state. Maybe stopping after Phase I would have allowed for a true course correction.

In any case, we’re still not TNReady.

For more on education politics and policy in Tennessee, follow @TNEdReport

Mary Pierce on the ASD Resolution

Nashville School Board Member Mary Pierce took to Facebook to discuss the passage of a resolution calling for a moratorium on school takeovers by the ASD. Her comments are below.

After getting a few confused questions about the ASD Resolution passed on Tuesday night, I’m posting the YouTube link of the meeting & this discussion begins around 1:53 mark. The initial stated purpose of the proposed resolution called for a moratorium of ASD takeovers based on the first year of TN Ready Scores. Given that the state has acknowledged TN Ready issues and excluded use of scores in teacher evaluations, this type of resolution made sense to me.

However, when I received our agenda packet, I read the resolution presented as one that went well beyond this call for a one-year moratorium. It is my opinion that it made subjective allegations against the ASD, referenced that MNPS *might* implement the same type of IZone as Shelby County Schools and asked for funding to do just that (yet our board has never discussed this), and generally was written with a tone of which I did not agree. And, as I stated Tuesday night, it must be owned that there was nothing preventing Shelby County from implementing an iZone prior to the external pressure applied by the presence of the ASD. I also find it ironic that the gains heralded by many about the SCS iZone are based on the very same TCAP/TVAAS scores deemed flawed by those same people when used on district schools that are not performing as well. But that’s a whole other topic.

I amended the resolution (below with the original and my tracked changes) which still requested a one year reprieve from ASD takeovers based on the first year TN Ready scores, and also asked for local education agencies (LEAs) to be included in the legislative committee summer study the TN DOE has announced for “ASD Clean-Up,” including plans to return the takeover schools back to the LEAs as soon as practicable. (Edit: Click here to see original post with picture)

This amended version failed in a 4:4 vote (Dr. Gentry had left for a community meeting) and then Mr. Pinkston’s original resolution passed 5:3 with Elissa Kim, Tyese Hunter and I voting against.

By the way, resolutions are simply statements of resolve and often a request–like this one–but they have no binding authority.

https://www.youtube.com/watch?v=nfq6F4Q8d6k