Ready for a Break

Following the Day One failure of TNReady testing, the state proposed switching to paper-and-pencil tests only. Last week, the first sign of trouble on that front developed, as Dickson County reported a delay in receiving the printed materials.

While the Department of Education reports that most districts have received their materials, Chalkbeat reported yesterday that at least a dozen districts have had to reschedule testing due to printing delays. Those districts include:

  • Tennessee Achievement School District
  • Bartlett
  • Hamblen County
  • Maury County
  • Madison County
  • Murfreesboro City
  • Putnam County
  • Robertson County
  • Sevier County
  • Sullivan County
  • Tipton County
  • Wilson County

Despite these delays, TNReady testing will continue; in fact, many districts have already begun some paper-and-pencil testing.

Still, it seems that TNReady just can’t catch a break in its first year.

For more on education politics and policy in Tennessee, follow @TNEdReport


The Paper Chase

Following the failure of TNReady on Day One, Commissioner Candice McQueen announced a simple solution: tests will now be administered with paper and pencil. Except, it turns out, it’s not so simple. What if the paper tests don’t arrive on time?

The Dickson Herald reports:

Dickson County Schools have delayed administering the paper version of the state’s new TN Ready standardized tests until March 7 after a delay in receiving the testing materials, the schools director said.

Schools Director Dr. Danny Weeks alerted parents to the issue in a SchoolReach phone message and he also discussed the matter with the county School Board on Thursday night.

Educators and parents had prepared for administering the paper tests on Monday. However, Weeks said the school system had not yet received confirmation the print testing materials had yet shipped Thursday.

The ongoing saga of the TNReady challenges reminds me of the time the legislature pulled Tennessee out of PARCC just as we were preparing to have our first year with the Common Core aligned tests. Instead of a year without a test, we administered another year of TCAP — a test not aligned with our state’s current standards, and thus not an accurate indicator of student mastery or teacher impact.

Governor Haslam and Commissioner McQueen have announced that teachers and students alike won’t be held accountable for test results this year, but what about just taking the year off and getting it right?

For more on education politics and policy in Tennessee, follow @TNEdReport

Of Hope and TNReady

Natalie Coleman is a 7th grade language arts teacher in Sumner County and a 2015-16 Tennessee Hope Street Group Fellow.

Are we ready?

This question is front-and-center in the conversation surrounding education in Tennessee.

This is the question ringing in classrooms across the state, the question plaguing teachers working tirelessly to adjust instruction to more rigorous expectations, striving to help students reach heights monumentally higher than they’ve ever been asked to, much less prepared to, before.

This is the question of parents, nervous their children’s scores will not be as high as they’re accustomed to, worried that everything they’ve heard about the standards and Race to the Top and the over-testing is true, worried that the changes happening in our state may not be good for our children.

This is the question of students whose target has been moved each year, who have been told TCAP counts as a grade (and that it doesn’t), that it’s the last year for TCAP tests (and that it’s not), and that now it is time for us to be TNReady. As a state, we have even branded our new test with a name that echoes our question—Are we ready? Are we TNReady?

For anyone in the state closely connected to education, TNReady is a name that carries with it fear of the unknown, of unrealistic standards, and of unwarranted pressures on teachers, parents, and students. At the same time, though, it resonates with the hope of what we as a state want to achieve—readiness in our students.

We want them to be ready for the next steps in their educations and in their lives. We want them to be prepared to succeed. We do not want to continue reading that students in the first Tennessee Promise cohort aren’t making it, even when college is free, because it’s “too hard.” We do not want to continue hearing from employers that Tennessee’s young workforce is simply not ready.

I will admit that, as a teacher, I am nervous about TNReady because of the pressure it puts on my students. I fear that my classroom will progressively become more and more of a test preparation center and less of a place where students can cultivate creativity, curiosity, interest, and wonder. I am concerned that the testing may take too much of our time and focus, may not be developmentally appropriate, may not be amply vetted, may overwhelm our low-budget school technology resources. I believe that teacher and parent groups are right to raise questions and concerns, right to warn that TNReady may not itself be ready and that its incorporation into student grades and teacher evaluations is problematic and potentially unfair.

Yet, the prospect of TNReady also fills me with hope because of the aspiration it represents. As a state, we have said that it’s time for our students to be ready, time to stop selling them short with watered-down standards and bubble-sheet assessments, time to do what’s necessary for our students to be able to read and write at levels that will make them ready for the literacy demands of college and careers.

In the previous six years I’ve taught, I’ve felt a great tension between what I believe has always been the heart of our language arts standards and how those standards were ultimately assessed. At first, I idealistically believed that teaching language arts the way I learned to teach—authentically and deeply rooted in reading and writing—would automatically translate to test success as well. My achievement levels and TVAAS scores told a different story. Over time, I learned that achieving the desired results required shifting gears to TCAP-specific strategies and drills as the test approached. Test scores improved greatly, but I don’t know what my students actually gained, besides a good score, from those weeks of lessons.

Now, though, my students are preparing for TNReady Part I, a test that will require them to read rigorous texts and synthesize the information from them into a sophisticated essay. This new test has the potential to be one that matches the authenticity I strive for in my classroom.

When I tell my students that the writing we are doing in class right now is to prepare not only for TNReady Part I but for many kinds of writing they will need to do in the future, I can mean it. The skills we are honing to prepare for this test are skills that will help them write successfully in high school, on AP exams, for college admissions essays, in college classes, and even in their careers.

Right now in Tennessee, because of our raised standards and the assessments that come with them, our students are learning skills that will make them ready. I believe this and hope for more growth because of the amazing growth that I’ve already seen.

As our state has undergone massive educational shifts, our students have borne the changes and adapted. When we first began piloting text-based essay prompts a few years ago in my district, many of the students in my class stared at them blankly, merely copied the text word-for-word, or wrote a half-page “essay” that displayed a complete misunderstanding of the task. The writing was often missing basic components such as topic sentences, indentation, or even paragraph breaks. As I worked to help my students prepare in those early days, for tests that were pilots, my students groaned when we were “writing again.” Even though I worked to make writing fun and to give students opportunities to write for genuine purposes throughout the year, writing assessment preparation was an arduous task for everyone, and students were often frustrated.

Each year, though, the frustration has diminished a bit. In the beginning, just making sure students learned the basics of an essay format seemed an impossible task; now, they come to me knowing how to tackle prompts and organize their thoughts into paragraphs. There is still much room for growth, but where my students start every year and where they end are both well beyond those markers for the class before. Each year is better and better, and—best of all—the groaning is gone. Put two complex texts and a writing prompt in front of my students now, and they set right to work, staying focused for over an hour at a time, writing away. They’re open to revision and work to make changes. They ask for help, and they take pride in making their writing the best they can.

This is progress I would have considered miraculous three years ago, yet it is commonplace now, and I am grateful for the growth I see in students’ abilities each year.

When February comes and brings with it text-based writing tasks for my seventh graders that look more like something I would have learned to do in pre-AP classes in high school, when April comes with a second computer-based test, this one filled with rigorous and lengthy texts to read and a large dose of an entirely new breed of multi-select, drop-down box, click-and-drag multiple choice questions, will my students be ready?

I am not sure that they will be completely ready. Yet.

No matter how my students score on TNReady this year, though, they are undoubtedly stronger for what we’ve done. No matter what problems we encounter with the test and what we need to do to fix it, I hope we never lose sight of the goal behind it. I hope we keep our standards high, I hope we keep striving to make our assessments authentic measures of the skills we want our students to attain, and I hope we see that the end result is students who are ready.

For more on education politics and policy in Tennessee, follow @TNEdReport

What Does ESSA Mean to You?

Jon Alfuth is the newest addition to the Tennessee Education Report team. In his inaugural post, he breaks down the newly-signed Every Student Succeeds Act.

This last week saw the passage of the successor to No Child Left Behind (NCLB), the Every Student Succeeds Act. After months and months of negotiations, this legislation is suddenly a reality. I’m here to break it down and give you an idea of what it means for districts across Tennessee.

NCLB, Waivers and Race to the Top

First, you have to start with No Child Left Behind and education policy under the Obama administration. The legislation massively ramped up the federal government’s involvement in what was traditionally a state-dominated education system. The 2002 law set the ambitious long-term goal that every student would be proficient by an agreed-upon date, required states to establish systems to track student performance, and set stiff penalties for schools that failed to make adequate yearly progress toward those goals.

It didn’t go as planned. Early in the Obama administration (and arguably before), it was clear that the 100 percent proficiency goal and its timeframe were an admirable dream, but a dream nonetheless. The Obama administration chose to grant states waivers from many provisions of federal policy, but only if the states adjusted their education policy to fit the administration’s education agenda. Specifically, states had to implement college- and career-ready expectations for students, target low-performing schools and student subgroups, and create teacher and principal evaluation systems with student growth as a component.

Then came Race to the Top, a competition among states for funding to implement education reform policies. The program required states to build assessment systems for their standards, adopt data systems, support teachers and school leaders, and create interventions in low-performing schools. Tennessee was one of the first states to receive funding; with TVAAS already in place, we were a natural fit, as two of the key requirements were already met. One of the biggest innovations to come out of Race to the Top is the Achievement School District, which was spurred largely by federal money.

ESSA

Now we get to the Every Student Succeeds Act. The act tones down much of the federal government’s direct and indirect influence over local education policy while keeping the “spirit” of NCLB in place.

The goal of the compromise bill that has now been signed into law is to keep in place the structure preferred by Democrats, which forces states to report on and take action to rectify educational inequities, while at the same time catering to Republican desires for more state and local control.

Here are some of the highlights of how this bill differs from NCLB and Obama-era policy:

  • Testing – Under NCLB, testing occurred once a year in each of grades 3-8, plus one test in high school. ESSA keeps that frequency in place but allows states more flexibility in which tests are given and when in the year they are administered.
  • Standards – ESSA takes the same tack as NCLB in supporting higher standards, but it includes an interesting provision that prohibits the Secretary of Education from “influencing, incentivizing or coercing” states to adopt Common Core.
  • Accountability – ESSA pulls back from the NCLB era significantly, essentially allowing states to come up with their own accountability goals, as long as those plans are submitted to the Department of Education. This contrasts with NCLB, which prescribed interventions from the top down. ESSA also relaxes the role test scores are required to play in accountability systems.
  • School evaluation – Under NCLB, evaluation focused mostly on test scores. ESSA allows states to expand the scope of their evaluations to include “other measures” such as graduation rates, student engagement, and disciplinary data.
  • Low-performing schools – Under NCLB, states had to address low-performing schools using mostly prescribed methods. ESSA specifies that states must address the bottom 5% of schools by assessment scores, as well as high schools with low graduation rates or underperforming subgroups, but again leaves it up to the states to decide how.
  • Overtesting – The law contains a provision encouraging states to eliminate unnecessary state and local tests and provides funding to do so. It would also support districts in analyzing the amount of time spent on testing, with the end goal of reducing that time.

Dramatic Change?

Looking over these provisions, the overall theme of ESSA in my eyes is state-designed accountability monitored by the federal government. This differs markedly from the spirit of NCLB, which used heavy-handed, top-down methods to impose change. Now states are much more on the hook to come up with their own strategies to improve schools.

That said, this isn’t that different from what has been done under the waivers granted by the Obama administration. Waivers required states to submit a plan, which was then reviewed by the federal government and approved or turned down. The same concept seems to be at play within ESSA, but with more freedom granted to the states to decide what to address and how, within the requirements embedded in the law.

Short Term Impact

Here in Tennessee, we already do much of what the flexibility under ESSA would allow. We’ve already started breaking up our assessments over the course of the year with the upcoming implementation of TNReady, where students will take their yearly assessment in two different sessions in the spring. We’ve also started down the road of eliminating unnecessary assessments.

We also have two existing interventions for low-performing schools (the ASD and the iZone), we report our test data, and we have established data systems in place.

We also effectively tackled the standards issue by writing our own standards, revising and adding to the Common Core.

In sum, I don’t think we’re going to see a dramatic transformation of how we conduct education in Tennessee a la Race to the Top. I think the more likely outcome is that teachers and schools will start seeing small tweaks here and there to the education policy frameworks established over the past few years.

One area in which I think we could see some movement is reducing redundant testing. My hope is that the Tennessee DOE takes advantage of the funds available through ESSA to study the number of tests we give and the time spent preparing for and taking them.

Longer Term Impact

My final take on all of this is that much of the result of ESSA locally will depend on the actions taken by constituents and their interactions with state elected officials. I’ve already explained why I don’t think much will happen, primarily because we’ve taken advantage of much of the flexibility already afforded us under NCLB waivers.

But that could change quickly depending on constituent mobilization. Local and state elected officials are much more responsive to public opinion, and Tennessee’s legislators seem especially so. For example, if we see a tremendous upswing in the opt-out movement, we might see a large rollback in the amount, frequency, and design of our accountability measures.

For advocates of the current system, much of ESSA will come down to defending what has already been won in the past decade. The systems for standards-based accountability are in place, and those who support this vision of education will likely need to fight tooth and nail to keep them intact.

In the end, movement will depend on advocates for the new status quo pushing to keep what we already have and on opponents of the system pushing for the changes they want to see. That’s something that is very difficult to predict.

For more on education policy and politics in Tennessee, follow @TNEdReport


Thoughts on Annual Student Assessments

Dan Lawson is the Director of Schools at Tullahoma City Schools

The Issue: Assessment of student academic progress. As you well know, the state of Tennessee is transitioning from our former assessment suite to the new TNReady assessments. Furthermore, you are also well aware that many of the standards on which the current assessment is based are under review, with consideration for additions and removals.


The Background: In the scenario I described about teachers and growth scores, a senior teacher representing a lauded math department was able to present data that clearly and convincingly aligned our instruction with two critical components of our academic program in Tullahoma City Schools: ACT and Advanced Placement. As he visited with me, he did so with a concern that can best be characterized by this summary: Our instructional path produces ACT and Advanced Placement scores significantly above the state average, and if we teach all the prescribed TNReady standards in timeframes aligned with TNReady assessments, we are concerned that our student performance on ACT and Advanced Placement assessments will decline.

Certainly, that statement is based on the experiences and anecdotes of my staff members, but there is tremendous logic in the fundamental question it raises. Since one of our primary purposes and expected outcomes is to produce students who are “college and career ready” as measured by the ACT or SAT, why don’t we allow schools and districts that wish to do so to assess using the ACT or SAT suite of services, aligned with the very measure we aspire to meet? While the issue of assessment is often directly linked to the issue of accountability, I submit that the accountability of most schools and districts would be enhanced by reporting scores that both our students and their parents readily understand. To that end, nearly every student enrolled in a Tennessee high school clearly understands the difference between a “15” and a “30” on the ACT. That understanding makes it much easier for a teacher and school leader to discuss and propose interventions to address the “15” that has been reported for that student.


A Proposed Solution: There has been a misalignment in the testing and teaching standards from SAT 10 to TCAP to ACT, and this misalignment has led some systems to experience low TVAAS scores on K-2, 3-8, and 9-12 assessments. Until we pick a plan and follow that plan, we will be hard pressed to see college and career readiness expand in Tennessee. If college and career readiness is really our goal, then don’t we need a clearly and cleanly aligned set of standards to reach that goal?


Align the state assessment with the ACT or SAT suite of services. I understand there are concerns that we cannot accomplish that outcome and remain compliant with state procurement rules, but I am also well aware that other states utilize the ACT suite today. I am confident that we have the ability to accomplish anything that the state of Alabama has accomplished.

For more on education politics and policy in Tennessee, follow @TNEdReport

A Matter of Fairness

A coalition of education advocacy groups released an online petition today calling for a one year waiver from using student test scores in teacher evaluations in Tennessee.

Here’s the press release:

A coalition of groups supporting public education today launched an online petition asking the Tennessee General Assembly and Governor Bill Haslam to grant teachers a grace period from the use of student test scores in their evaluations in the first year of new TNReady tests. The petition tracks language adopted unanimously by the Knox County School Board, which passed a resolution last week opposing the use of student test scores in teacher evaluation for this academic year.

“The state has granted waivers so that TNReady scores aren’t required to be counted in student grades for this year,” said Lyn Hoyt, president of Tennesseans Reclaiming Educational Excellence (TREE). “If TNReady won’t count in student grades, it’s only fair that it shouldn’t count for teacher evaluation.” Hoyt noted that the transition to the new test means entering uncharted territory in terms of student scores and impact on teacher evaluation scores. As such, she said, there should be a one year or more grace period to allow for adjustment to the new testing regime.

“TNReady is different than the standardized tests we’ve had in the past,” Hoyt said. “Our students and teachers both deserve a reasonable transition period. We support the Knox County resolution and we are calling on the General Assembly to take notice and take action. Taking a thoughtful path transitioning to the new test can also build confidence and trust in the process.”

Hoyt also cited a recent policy statement by the American Educational Research Association that cautions against using value-added data in teacher evaluations and for high-stakes purposes. “Researchers who study value-added data are urging states to be cautious in how it is used to evaluate teachers,” Hoyt said. “The transition to TNReady is the perfect time to take a closer look at how test scores are used in teacher evaluations. Let’s take a year off, and give our students and teachers time to adjust. It’s a matter of fundamental fairness.”

Groups supporting the petition include:

Strong Schools (Sumner County)
Williamson Strong (Williamson County)
SPEAK (Students, Parents, Educators Across Knox County)
SOCM (Statewide Organizing for Community eMpowerment)
Middle TN CAPE (Coalition Advocating for Public Education)
Momma Bears Blog
Advocates for Change in Education (Hamilton County)
Concerned Parents of Franklin County (Franklin County)
Parents of Wilson County, TN, Schools
Friends of Oak Ridge Schools (City of Oak Ridge Schools)
TNBATs (State branch of National BATs)
TREE (Tennesseans Reclaiming Educational Excellence)
TEA (Tennessee Education Association)

For more on education politics and policy in Tennessee, follow @TNEdReport

New and Not Ready

Connie Kirby and Carol Bomar-Nelson, English teachers at Warren County High School, share their frustration with the transition to TNReady and what it means for teacher evaluation.

Connie Kirby:

This is going to be long, but I don’t usually take to social media to “air my grievances.” Today I feel like there’s no better answer than to share how I feel. It’s been a long year with some of the highest of the highs and lowest of the lows. I work in a wonderful department at a great school with some of the most intelligent, hard-working people I know. As the years have progressed, we have gone through many changes together and supported each other through the good and the bad (personally and professionally). We do our best to “comply” with the demands that the state has put on us, but this year everything that we’ve been hearing about and preparing for for years has come to fruition. We’re finally getting familiar with the “real deal” test, instead of dealing with EOCs and wondering how it’s going to change. I’ve seen the posts and rants about Common Core and have refrained from jumping on the bandwagon because I have had no issues with the new standards. I do, however, see an issue with the new assessment, so I have held my hand in the hopes that I might find something worth sharing and putting my name next to. Today, I witnessed an exchange between one of my colleagues and the state, and I couldn’t have said it better myself. With her permission, I am sharing her words.

Carol Bomar-Nelson:

I don’t know how to fix the problems with the test. I agree that teachers should have accountability, and I think student test scores are one way of doing that. Having said that, if the state is going to hold teachers accountable for student test scores, then the test needs to be fair. From what I have seen, I firmly believe that is not the case. I am not just basing this conclusion on the one “Informational Test” in MICA. Other quizzes I have generated in MICA have had similar flaws. When my department and I design common assessments in our PLCs, we all take the tests and compare answers to see which questions are perhaps ambiguous or fallacious in some way. I do not see any evidence that the state is doing this for the tests that it is manufacturing. A team of people can make a test that is perfect with respect to having good distractors, clear wording, complex passages, and all the other components that make up a “good” test, but until several people take the test, compare answers, and discuss what they missed, that test is not ready for students to take – especially not as a high-stakes test that is supposed to measure teacher effectiveness.

I understand that this is the first year of this test. I am sympathetic to the fact that everyone is going through a ‘learning process’ as they adapt to the new test. Students have to learn how to use the technology; teachers have to learn how to prepare their students for a new type of test; administrators have to figure out how to administer the test; the state has to work out the kinks in the test itself… The state is asking everyone to be “patient” with the new system. But what about the teachers? Yes, the teacher effectiveness data only counts for 10% this year, but that 10% still represents how I am as a teacher.

In essence, this new test is like a pretest, correct? A pretest to get a benchmark of where students stand at the end of the year on this new test that has so many flaws and so many unknowns. In the teaching profession, I think all would agree that it is bad practice to count a pretest AT ALL toward a student’s grade. Not 35%, not 25%, not even 10%. So how is it acceptable practice to count a flawed test for 10% of a teacher’s evaluation? We can quibble all day about which practice questions…are good and which questions are flawed, but that will not fix the problem. The problem lies in the test development process. If the practice questions go through the same process as the real questions, it would stand to reason that the real test questions are just as flawed as the practice questions. My students have to take that test; I never get to see it to determine if it is a fair test or not, and yet it still counts as 10% of my evaluation that shows my effectiveness as a teacher. How is that fair in any way whatsoever? In what other profession are people evaluated on something that they never get to see? Especially when that evaluation ‘tool’ is new and not ready for use?

I know how to select complex texts. I know how to collaborate with my PLC. I can teach my students how to read, think critically, analyze, and write. When I do not know how to do something, I have no problem asking other teachers or administrators for suggestions, advice, and help. I am managing all of the things that are in my control to give my students the best possible education. Yet in the midst of all of these things, my teacher accountability is coming from a test that is generated by people who have no one holding them accountable. And at the end of the year, when those scores come back to me, I have no way to see the test to analyze its validity and object if it is flawed.

For more on education politics and policy in Tennessee, follow @TNEdReport

CAPEd Crusaders

At last night’s MNPS Board meeting, members of newly-formed education advocacy group CAPE spoke out about the time spent testing students this year as the state shifts to new TNReady tests.

Here’s what one member and teacher had to say to WSMV:

“It disrupts our schedules. It demoralizes the students. It demoralizes the teachers. It creates chaos,” Kale said. “Our students don’t even know what their schedules are … because they’re interrupted so many times for testing.”

The new state tests significantly increase the time students will spend testing, especially in the earlier grades.

The increased time spent testing comes at a time when a state task force has recommended both reduced testing and more testing transparency.

While the 2016 session of the Tennessee General Assembly may take up the issue, that likely won’t stop the administration of this year’s TNReady.

For more on education politics and policy in Tennessee, follow @TNEdReport

Not Yet TNReady?

As students and teachers prepare for this year’s standardized tests, there is more anxiety than usual due to the switch to the new TNReady testing regime, according to a story in the Tennessean by Jason Gonzalez.

Teachers ask for “grace”

In his story, Gonzalez notes:

While teachers and students work through first-year struggles, teachers said the state will need to be understanding. At the Governor’s Teacher Cabinet meeting Thursday in Nashville, 18 educators from throughout the state told Gov. Bill Haslam and McQueen there needs to be “grace” over this year’s test.

The state has warned this year’s test scores will likely dip as it switches to a new baseline measure. TCAP scores can’t be easily compared to TNReady scores.

Despite the fact that the scores “can’t be easily compared,” the state will still use them in teacher evaluations. At the same time, the state is allowing districts to waive the requirement that the scores count toward student grades, as the TCAP and End of Course tests have in the past.

In this era of accountability, it seems odd that students would be relieved of accountability while teachers will still be held accountable.

While that may be one source of anxiety, another is that by using TNReady in the state’s TVAAS formula, the state is introducing a highly suspect means of evaluating teachers. It is, in fact, a statistically invalid approach.

As noted back in March, citing an article from the Journal of Educational Measurement:

These results suggest that conclusions about individual teachers’ performance based on value-added models can be sensitive to the ways in which student achievement is measured.

The researchers tested various VAM models (including the type used in TVAAS) and found that teacher effect estimates changed significantly based on both what was being measured AND how it was measured. 


That means the shift to TNReady will change the way TVAAS estimates teacher effect. How? No one knows. We can’t know. We can’t know because the test hasn’t been administered, so we don’t have any results. Without results, we can’t compare TNReady to TCAP. And even once we have this year’s results, we can’t fairly establish a pattern — because we will only have one year of data. What if this year’s results are an anomaly? With three or more years of results, we MAY be able to make some estimates as to how TCAP compares to TNReady and then possibly translate those findings into teacher effect estimates. But we could just end up compounding error rates.
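To see why the measure matters, consider a small illustrative sketch, emphatically not the actual TVAAS model (which is far more complex): it simulates two tests that weight student skills differently, computes a simple residual-based “teacher effect” under each, and checks how well the two sets of estimates agree. Every number, weight, and modeling choice here is hypothetical.

```python
import numpy as np

# Toy illustration only: NOT the TVAAS model. It shows how a simple value-added
# style "teacher effect" estimate can shift when the outcome test changes.
rng = np.random.default_rng(42)

n_teachers, n_students = 20, 30
teacher_ids = np.repeat(np.arange(n_teachers), n_students)

prior = rng.normal(50, 10, size=teacher_ids.size)      # prior-year score
skill_a_gain = rng.normal(5, 2, size=n_teachers)       # hypothetical gain on skill A
skill_b_gain = rng.normal(5, 2, size=n_teachers)       # hypothetical gain on skill B

def noise():
    return rng.normal(0, 5, size=teacher_ids.size)

# Test 1 weights skill A heavily; Test 2 weights skill B heavily.
test1 = prior + 0.9 * skill_a_gain[teacher_ids] + 0.1 * skill_b_gain[teacher_ids] + noise()
test2 = prior + 0.2 * skill_a_gain[teacher_ids] + 0.8 * skill_b_gain[teacher_ids] + noise()

def toy_teacher_effects(current, prior_scores, ids):
    """Teacher effect = mean residual after regressing current scores on prior scores."""
    slope, intercept = np.polyfit(prior_scores, current, 1)
    residuals = current - (slope * prior_scores + intercept)
    return np.array([residuals[ids == t].mean() for t in range(n_teachers)])

effects1 = toy_teacher_effects(test1, prior, teacher_ids)
effects2 = toy_teacher_effects(test2, prior, teacher_ids)

# Same teachers, same students, same prior scores; only the outcome test changed.
print("Correlation between the two sets of teacher effects:",
      round(np.corrcoef(effects1, effects2)[0, 1], 2))
```

In this toy setup the two sets of estimates are far from identical, which is the point the researchers make: change what the test measures, and the “teacher effect” attached to the same teacher changes with it.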

Nevertheless, the state will count the TNReady results on this year’s teacher evaluations using a flawed TVAAS formula. And the percentage these results will count will grow in subsequent years, even if the confidence we have in the estimate does not. Meanwhile, students are given a reprieve…some “grace” if you will.

I’d say that’s likely to induce some anxiety.

For more on education politics and policy in Tennessee, follow @TNEdReport

Phil Williams, Testing, and MNPS

NewsChannel5’s Phil Williams sent this tweet today teasing his story on alleged testing irregularities in MNPS:

Phil Williams (@NC5PhilWilliams)
Coming up on @NC5 at 6, #NC5investIgates: Have some Metro high schools been #FakingTheGrade? pic.twitter.com/tRRYeUl4lk

Here’s the full response from MNPS:

Tonight, November 2, 2015, investigative reporter Phil Williams of News Channel 5 plans to air a story containing accusations about end-of-course exams in Metro Schools. Below is our full and detailed response to Phil, as well as a record of our communication with him during his reporting.


Beginning late in the week of October 19 and continuing throughout the week of October 26, there have been regular email and telephone conversations – often daily – to address your questions related to accusations that some Metro high schools are using various methods to avoid administering state-mandated End-of-Course (EOC) exams to certain students in order to inflate their performance data. As stated numerous times throughout these conversations, we take these accusations extremely seriously. We asked for evidence of specific wrong-doing in your possession so that the instances in question can be thoroughly investigated and to allow us to fully respond to your story.

Below is a comprehensive response to the questions you have posed thus far related to the “general EOC concerns” story you say is scheduled to air this evening, Monday, Nov. 2, 2015. This response includes questions and requests of us, along with a summary of how we have fulfilled them. Further responses may follow related to other specific concerns you plan to address in future stories.

General Statement on EOC Exams

Students are required to take all state-mandated EOC exams at the end of the second semester of a course, regardless of when or how they complete the course. To determine whether there is evidence of a widespread trend of students not completing the required EOCs, over the last week our Research and Evaluation department has been carefully reviewing transcript and EOC exam files for the most recent cohort of MNPS graduates.

Records reviewed to date indicate that there is no evidence of systematic avoidance of EOC exams. We have found a relatively small number of students who received a regular high school diploma in the spring of 2015 and who took EOC courses in our schools but do not appear to have ever attempted the EOC exam. The department went through several years of files in order to track students’ course and test history. Our investigation is focused on the courses for which the Tennessee Department of Education establishes accountability targets, called Annual Measurable Objectives (AMOs), which require each high school to have a 95% participation rate on EOC exams.

With a 2015 graduating class of 4,221 students, they should have collectively taken 16,884 exams with AMOs over the course of their high school careers. Of those 16,884 exams, the district lacks a test record for only 231, or 1.37%. These cases appear to be spread out and are not unusually high for any particular school. All high schools fall within the 1-2% range. Given an average daily attendance rate of 93%, there will be students who never make up an EOC. There may also be some who took the EOC at another time outside of MNPS or whose student ID was incorrectly coded on an EOC answer sheet and who do not match our course enrollment files.

The 231 missed EOC exams are broken down as follows: There were 44 students missing an Algebra I EOC test record and 10 students marked absent. An answer sheet is supposed to be turned in for every student enrolled in the course, and those that do not test or make up the test should be coded as absent. It is likely that many, if not most, of those students missing an EOC document were absent during testing and an answer sheet marked “absent” was not submitted. There were 32 missing an Algebra II EOC and 32 more marked absent. For English II, 26 had no test record and 16 were shown as absent. There were 35 missing for English III and 36 absent.

If NewsChannel 5 is in possession of documentation that contradicts the district’s findings of its own internal review described above, Metro Schools requests to be given access to the documentation immediately to allow us to thoroughly investigate the claims. Likewise, if former or current MNPS employees are in possession of documentation that indicates a systematic attempt to inflate performance data for individual schools, those individuals are urged to bring their concerns forward to district leadership so that they can be properly investigated. We have no record of an open complaint of this nature.

Use of Credit Recovery in High Schools

Metro Nashville Public Schools has made personalized learning the focus of our instructional practice. Our goal is to prepare every student for success in college and career, which personalized learning allows us to do. Personalized learning involves teachers meeting students where they are, regularly monitoring their progress, and moving students forward only when they’re able to demonstrate mastery of the content. This includes intervening as early as possible when a student’s performance indicates he or she is failing to master the content of a course.

As part of this approach, credit recovery is offered to high school students who fail a semester of a course. If a student fails a course in the fall to the degree that grade-averaging the two semesters is unlikely to result in the student passing the course as a whole, the student is given the option to retake the fall course through credit recovery before proceeding to the spring course. For example, a student who fails “Algebra I Fall” will be given the option to retake the fall course of Algebra I during the spring semester. The student will then take “Algebra I Spring” during the summer semester or subsequent fall semester. All attempts are made to place the student in “Algebra I Spring” during the following summer or fall. If there is a scheduling conflict, the student may have to wait until the following spring to take the spring course.

It is in the best interest of the student to take this approach because if he or she has not mastered the content of a fall course, he or she will be ill-prepared to succeed in the spring course, which builds on the content knowledge from the fall. The decision to enter into credit recovery is made by the student and his or her parent/guardian in consultation with the teacher and the student’s counselor.

If a student takes a spring course during the summer or fall semester, he or she will take the EOC at that time. This means a student who fails Algebra I this fall may take the Algebra I EOC in July or December of 2016, depending on when he or she completes both courses.

The opinion that this approach to instruction is intended solely to inflate EOC scores is misguided. This is a standard practice used by school districts in our state. The fact that the state’s testing calendar allows for EOCs to be taken in the spring and summer is evidence that this practice is supported by the state. The state does not use EOCs to measure the academic performance of a specific grade level. Unlike in grades K through 8, high school courses are offered to students based on their individual academic level. For example, an advanced student may take Algebra I in eighth grade instead of ninth grade, in which case the EOC score is calculated into the middle school’s math data rather than that of the high school the student goes on to attend. Similarly, students who take AP classes do not take EOC exams for those subjects, so their academic performance is not included in the high school’s overall EOC data. EOC data is intended to reflect the high school’s ability to successfully teach the state standards in main subject areas, regardless of when the student takes the course during his or her time in high school. There is a clear disincentive for high schools to unnecessarily delay a student’s promotion among courses, since the state calculates a high school’s graduation rate based on “on-time” graduates, defined as students who graduate within four years and one summer of starting high school. Because all students are required to earn four math credits and four English credits, a delay in completing one of those required credits risks requiring the student to take more than four years to graduate.

Most importantly, our focus is on helping students succeed. Ultimately, our goal is to prepare every student for college and career. If a student requires extra time to successfully master the content of a course, we believe the student should be allowed that time. Forcing students to progress in course schedules when they are not prepared to understand or master the content would equate to setting our students up for failure.    

Use of Content Recovery in High Schools

In addition to “credit recovery,” which is a student re-taking a failed semester of a course, Metro Schools also offers “content recovery” courses to support students who are struggling with the foundational skills needed to succeed in an EOC course.

For example, the district offers “Algebra I A,” a content recovery course to support students enrolled in Algebra I. The Algebra I A course may cover basic math skills, such as fractions, based on what underlying knowledge is needed for a student to understand the Algebra lessons. Similar classes are offered for English courses and are listed as “English I CAR,” with “CAR” standing for Content Area Reading.

It is district practice for students to be enrolled in content recovery courses either simultaneously or prior to taking an EOC course. A content recovery course cannot be taken in place of an EOC course. Although students do earn credits for content recovery courses, the credits do not qualify for the math or English credits required for graduation. Additionally, enrollment in a content recovery course does not negate a student’s requirement to take the EOC exam at the end of the second semester of the EOC course.

Pearl-Cohn Entertainment Magnet High School

  • You claim:
    • Pearl-Cohn has removed students from EOC exam classes and placed them in independent study courses as a means of keeping their scores from affecting the school’s overall EOC score. You intimate in an email to Principal Sonia Stewart that direction for this practice is coming from supervisors in the district office.
  • We responded:
    • Verbally on the phone the week of Oct. 26 explaining the district’s practice of remediation with students who are failing EOC classes. Further detail and explanation is provided above in the statements on credit recovery and content recovery.
  • You asked for:
    • All course offerings for Fall 2015 and number of students enrolled in each class
  • We fulfilled this request on Friday, Oct. 30, 2015

Stratford STEM Magnet School

  • You claim:
    • Students being “physically pulled” from EOC exam rooms or barred from entering EOC exam rooms.
  • We responded:
    • Verbally on the phone the week of Oct. 26 explaining Stratford’s EOC participation rate is consistently 95% or above for the last two years. The data is as follows:
      • Algebra I – 100% in 2014 and 97% in 2015
      • Algebra II – 95% in 2014 and 96% in 2015
      • English II – 98% in 2014 and 98% in 2015
      • English III – 96% in 2014 and 95% in 2015
    • We further explained that, given the 95% participation AMO and an average daily attendance of 93%, there is no incentive for principals to withhold students from EOC exams; doing so would risk failing to meet the AMO.
  • You asked for:
    • All course offerings for Fall 2015 and number of students enrolled in each class
  • We fulfilled this request on Friday, Oct. 30, 2015.

Hunters Lane High School

  • You claim:
    • Hunters Lane has removed students from EOC exam classes and placed them in elective courses as a means of keeping their scores from affecting the school’s overall EOC score.
  • We responded:
    • Verbally on the phone the week of Oct. 26 explaining the district’s practice of remediation with students who are failing EOC classes. Further detail and explanation is provided in the above statements on credit recovery and content recovery.
  • You asked for:
    • All course offerings for Fall 2015 and number of students enrolled in each class
  • We fulfilled this request on Oct. 30, 2015.
  • On Oct. 29, you asked for:
    • Insight into the situation of a specific Hunters Lane student who was allegedly removed from EOC courses she was passing.
  • Our response:
    • We are still investigating the details of this student, including a close look at the student’s data. However, there are extenuating circumstances surrounding this particular student, which are part of her private record and may not be discussed with you without a written waiver from the parent/guardian.

Maplewood High School

  • You claim:
    • Without knowing the specific mechanism being used, that students are being either pulled from EOC classes or prevented from taking EOC exams.
  • We responded:
    • Verbally on the phone the week of Oct. 26 explaining the district’s practice of remediation with students who are failing EOC classes. Further detail and explanation is provided in the above statements on credit recovery and content recovery.
  • You claim:
    • A source reported to you seeing an email from Jay Steele giving direction in this practice.
  • We responded:
    • Verbally on the phone the week of Oct. 26 that no such email is known to exist, but that it could have been confused with an email sent by Aimee Wyatt on Feb. 11, 2014, to high school principals giving guidance on how to use credit recovery for course remediation. You were provided a copy of this email.
  • You asked for:
    • All course offerings for Fall 2015 and number of students enrolled in each class
  • We fulfilled this request on Oct. 30, 2015.
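Finally, a quick arithmetic check of the participation figures quoted in the district’s statement above, using only the numbers MNPS reports: the per-subject “no record” and “absent” counts do sum to 231, and 231 of the 16,884 expected exams (4,221 graduates times four AMO subjects) works out to roughly 1.37 percent. A minimal sketch of that check:

```python
# Back-of-the-envelope check of the EOC figures quoted in the MNPS statement above.
graduates = 4221
expected_exams = graduates * 4                 # four EOC subjects carry AMOs
missing_by_subject = {
    "Algebra I":   44 + 10,                    # no test record + marked absent
    "Algebra II":  32 + 32,
    "English II":  26 + 16,
    "English III": 35 + 36,
}
total_missing = sum(missing_by_subject.values())

print(expected_exams)                                    # 16884
print(total_missing)                                     # 231
print(round(100 * total_missing / expected_exams, 2))    # 1.37
```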


For more on education politics and policy in Tennessee, follow @TNEdReport