Muddy Waters

Laura Faith Kebede of Chalkbeat reports on the challenges in generating reliable TVAAS scores as a result of TNReady trouble last year. Her story cites a statistician from the Center for Assessment who explains the issue this way:

Damian Betebenner, a senior associate at the Center for Assessment, which regularly consults with state departments, said missing data on top of a testing transition “muddies the water” on results.

“When you look at growth over two years, so how much the student grew from third to fifth grade, then it’s probably going to be a meaningful quantity,” he said. “But to then assert that it isolates the school contribution becomes a pretty tenuous assertion… It adds another thing that’s changing underneath the scene.”

In other words, it’s difficult to get a meaningful result given the current state of testing in Tennessee. I wrote recently about this very issue and the problem with the validity of the growth scores this year.

Additionally, two years ago, I pointed out the challenges the state would face when shifting to a new test. Keep in mind, this was before all the TNReady trouble that further muddied the waters. Here’s what I said in March of 2015:

Here’s the problem: There is no statistically valid way to predict expected growth on a new test based on the historic results of TCAP. First, the new test has (supposedly) not been fully designed. Second, the test is in a different format. It’s both computer-based and it contains constructed-response questions. That is, students must write out answers and/or demonstrate their work.

Since Tennessee has never had a test like this, it’s impossible to predict growth at all. Not even with 10% confidence. Not with any confidence. It is the textbook definition of comparing apples to oranges.

The way to address this issue? Build multiple years of data in order to obtain reliable results:

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to make some correlation between past TCAP results and TNReady scores.

So, now we have two challenges: We have two different types of tests AND we have a missing year of data. Either one of these challenges creates statistical problems. The combination of the two calls for a serious reset of the state’s approach to accountability.

As I suggested yesterday, taking the time to get this right would mean not using the TNReady data for accountability for teachers, students, or schools until 2019 at the earliest. If our state is committed to TNReady, we should be committed to getting it right. We’re spending a lot of money on both TNReady and on TVAAS. If we’re going to invest in these approaches, we should also take the time to be sure that investment yields useful, reliable information.

Why does any of this matter? Because, as Kebede points out:

At the same time, TVAAS scores for struggling schools will be a significant factor to determine which improvement tracks they will be placed on under the state’s new accountability system as outlined in its plan to comply with the federal Every Student Succeeds Act. For some schools, their TVAAS score will be the difference between continuing under a local intervention model or being eligible to enter the state-run Achievement School District. The school growth scores will also determine which charter schools are eligible for a new pot of state money for facilities.

TVAAS scores also count in teacher evaluations. TNReady scores were supposed to count in student grades, too, but the quick scores didn’t come back in time. If all goes well with the online administration of TNReady this year, the scores will count for students.

The state says TNReady matters. The state evaluates schools based on TVAAS scores. The state’s teacher evaluation formula includes TVAAS scores for teachers and allows TNReady results to be selected as one measure of achievement.

In short: Getting this right matters.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Seeping Scores Sour School Board

Members of the Murfreesboro City School Board are not happy with the slow pace of results coming from the state’s new TNReady test. All seven elected board members sent a letter to Commissioner of Education Candice McQueen expressing their concerns.

The Daily News Journal reports:

“However, currently those test scores seep ever-so-slowly back to their source of origin from September until January,” the letter states. “And every year, precious time is lost. We encourage you to do everything possible to get test results — all the test results — to schools in a timely manner.

“We also encourage you to try to schedule distribution of those results at one time so that months are not consumed in interpreting, explaining and responding to those results,” the letter continued.

Department of Education spokesperson Sara Gast suggested the state wants the results back sooner, too:

“We know educators, families and community members want these results so they can make key decisions and improve, and we want them to be in their hands as soon as possible,” Gast said. “We, at the department, also desire these results sooner.”

Of course, this is the same department that continues to have trouble releasing quick score data in time for schools to use it in student report cards. In fact, this marked the fourth consecutive year there has been a problem with end-of-year data, whether with releasing that data on time or with calculating it clearly.

Gast went further in distancing the department from blame:

Local schools should go beyond TNReady tests in determining student placement and teacher evaluations, Gast said.

“All personnel decisions, including retaining, placing, and paying educators, are decisions that are made locally, and they are not required to be based on TNReady results,” Gast said. “We hope that local leaders use multiple sources of feedback in making those determinations, not just one source, but local officials have discretion on their processes for those decisions.”

Here’s the problem with that statement: This is THE test. It is THE test that determines a school’s achievement and growth score. It is THE test used to calculate an (albeit invalid) TVAAS score for teachers. It is THE test used in student report cards (when the quick scores come back on time). This is THE test.

Teachers are being asked RIGHT NOW to make choices about the achievement measure they will be evaluated on for their 2017-18 TEAM evaluation. One choice: THE test. The TNReady test. But there aren’t results available to allow teachers and principals to make informed choices.

One possible solution to the concern expressed by the Murfreesboro School Board is to press the pause button. That is, get the testing right before using it for any type of accountability measure. Build some data in order to establish the validity of the growth scores. Administer the test, get the results back, and use the time to work out any challenges. Set a goal of 2019 to have full use of TNReady results.

Another solution is to move to a different set of assessments. Students in Tennessee spend a lot of time taking tests. Perhaps a set of assessments that was less time-consuming could allow for both more instructional time and more useful feedback. I’ve heard some educators suggest the ACT suite of assessments could be adapted in a way that’s relevant to Tennessee classrooms.

It will be interesting to see if more school districts challenge the Department of Education on the current testing situation.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

What’s in the Water?

At some schools in Nashville, the answer is unacceptable amounts of lead.

Phil Williams reports:

A NewsChannel 5 investigation discovers information potentially affecting the health of school children across Nashville — information that has not been shared with parents.

It reveals children are still drinking lead-contaminated water when they go to school — despite the district’s assurances that there’s nothing to worry about.

This past summer, Metro Schools tested every water fountain in the district after questions raised by NewsChannel 5 Investigates.

As the school year started, officials only shared the worst results with the public.

But we obtained the raw data, which shows there’s a lot more to the story.

Doctors and health officials suggest MNPS needs to do more — which may involve replacing contaminated pipes and finding alternative water sources in the meantime.

Read more about the risks posed and the challenge of addressing this problem.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Disappointing

That’s the word from Commissioner of Education Candice McQueen in response to a refusal by both Shelby County and Nashville school districts to hand over student data.

As the Data Wars continue, Chalkbeat reports on McQueen’s reaction:

“We are disappointed that these districts are choosing to withhold information from parents about the options that are available to their students while routinely saying they desire more parental engagement,” she said. “Allowing parents to be informed of their educational options is the epitome of family engagement and should be embraced by every school official.”

McQueen seemed to indicate that firmer consequences could lie ahead. “We must consider all options available in situations where a district actively chooses to ignore the law,” she said in the statement. McQueen told lawmakers in a conference call last month that she was not discussing withholding state funds as a penalty at the time, according to Rep. John Clemmons, who was on the call.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

TC Talks Chattanooga

Nashville-based education blogger TC Weber takes some time to explain a bit more about what’s happening with Chattanooga and the state’s Achievement School District in a recent post.

Here’s how he explains what’s happening since the threat of an ASD expansion team in Hamilton County became more real:

Let’s take a quick trip down to Chattanooga where last night a historic vote took place. The Hamilton County School Board voted 7-2 to continue the conversation about creating a partnership zone with the Tennessee Department of Education. In case you are not familiar with the Partnership Zone plan, it’s the latest quick-fix scheme developed by the TNDOE because people have started to catch on to the dumpster fire that is the Achievement School District. Under the Partnership Zone plan, both the county and the state would work together to improve underperforming schools in the district.

The plan calls for the creation of an appointed board that would oversee the Partnership Zone. This creates a bit of a conundrum. Under current law, school governing boards can only be elected entities. So this would require a change in legislation. A change that could open a virtual Pandora’s box, because what’s to stop other districts from switching to an all-appointed board, a hybrid, or turning control over to the mayor or other appointed officials?

The term partner is a little bit of a misnomer. The state is making it perfectly clear who wears the pants in this relationship right from the outset. The HCS Board was told that they could choose not to pursue the “Partnership Zone,” but if they didn’t, State Superintendent Candice McQueen would take all 5 of the priority schools plus two more schools and dump them in the Achievement District. If this is in fact a threat she was prepared to follow through with, it’s a little troubling and a clear sign that she’s willing to play politics with kids. The ASD is an unmitigated failure that should be ended this legislative session, not used as a stick to ensure district compliance.

As Weber points out, McQueen has lately been relying on the threat of aggressive state action (takeovers, fines) to try to get her way. So far, neither Nashville nor Memphis has yielded. It will be interesting to see how the Partnership Zone plays out in Chattanooga.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

The Data Wars: A New Hope?

The ongoing Data Wars between the state’s two largest school districts and the Tennessee Department of Education continue, with today being the deadline set by Commissioner Candice McQueen for districts to hand over the data or face consequences.

Yesterday, Anna Shepherd and Chris Caldwell, chairs of the Boards of Education in Nashville and Memphis, respectively, penned an op-ed detailing their opposition to the data demand from McQueen.

They wrote:

Tennessee Education Commissioner Candice McQueen has demanded that Metro Nashville Public Schools and Shelby County Schools surrender personal contact information for a large number of students and families in our school systems, which represent approximately 20 percent of Tennessee’s K-12 public school students.

Her argument: A new state law requires us to hand over personal information to ASD charter schools so these taxpayer-funded private schools can use the data to fill thousands of empty seats by recruiting students away from public schools.

In addition to violating student and family privacy — the right to privacy is a fundamental American principle — the problem with McQueen’s data demand is this: The ASD now is universally viewed as a failed experiment in education reform.

Shepherd and Caldwell contend that their districts’ students will not be well-served by marketing efforts from charter schools operating under the banner of the Achievement School District:

Instead, McQueen proposes to shift the cost burden of the failing ASD to local taxpayers in Memphis and Nashville. She wants to confiscate our student data and information in order to stage marketing raids on our schools — which would redirect local taxpayer funds to the ASD and its charter operators at the expense of our school systems.

With today’s deadline looming, it appears school leaders in Memphis and Nashville are holding firm against releasing the data McQueen has demanded. Should that position hold, the question will be: What will McQueen do about it? Will she unleash her ultimate weapon and withhold state funds from these districts as punishment?

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Apples and Oranges

Here’s what Director of Schools Dorsey Hopson had to say amid reports that schools in his Shelby County district showed low growth according to recently released state test data:

Hopson acknowledged concerns over how the state compares results from “two very different tests which clearly are apples and oranges,” but he added that the district won’t use that as an excuse.

“Notwithstanding those questions, it’s the system upon which we’re evaluated on and judged,” he said.

State officials stand by TVAAS. They say drops in proficiency rates resulting from a harder test have no impact on the ability of teachers, schools and districts to earn strong TVAAS scores, since all students are experiencing the same change.

That’s all well and good, except that when the system upon which you are evaluated is seriously flawed, there’s an obligation to speak out and fight back.

Two years ago, ahead of what should have been the first year of TNReady, I wrote about the challenges of creating valid TVAAS scores while transitioning to a new test. TNReady was not just a different test; it was (and is) a different type of test from the previous TCAP. For example, it included constructed-response questions rather than only multiple-choice bubble-in questions.

Here’s what I wrote:

Here’s the problem: There is no statistically valid way to predict expected growth on a new test based on the historic results of TCAP. First, the new test has (supposedly) not been fully designed. Second, the test is in a different format. It’s both computer-based and it contains constructed-response questions. That is, students must write out answers and/or demonstrate their work.

Since Tennessee has never had a test like this, it’s impossible to predict growth at all. Not even with 10% confidence. Not with any confidence. It is the textbook definition of comparing apples to oranges.

To support this claim, I cited Lockwood and McCaffrey (2007) in the Journal of Educational Measurement. Here’s what they had to say:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers.
You get different value-added results depending on the type of test you use. That is, you can’t just say, “This is a new test, but we’ll compare peer groups from the old test and see what happens.” Plus, TNReady presents the added challenge of not having been fully administered last year, so you’re now looking at data from two years ago and extrapolating to this year’s results.

Of course, the company that is paid millions to crunch the TVAAS numbers says this transition presents no problem at all. Here’s what its technical document has to say about the matter:

In 2015-16, Tennessee implemented new End-of-Course (EOC) assessments in math and English/language arts. Redesigned assessments in math and English/language arts were also implemented in grades 3-8 during the 2016-17 school year. Changes in testing regimes occur at regular intervals within any state, and these changes need not disrupt the continuity and use of value-added reporting by educators and policymakers. Based on twenty years of experience with providing value-added and growth reporting to Tennessee educators, EVAAS has developed several ways to accommodate changes in testing regimes.

Prior to any value-added analyses with new tests, EVAAS verifies that the test’s scaling properties are suitable for such reporting. In addition to the criteria listed above, EVAAS verifies that the new test is related to the old test to ensure that the comparison from one year to the next is statistically reliable. Perfect correlation is not required, but there should be a strong relationship between the new test and old test. For example, a new Algebra I exam should be correlated to previous math scores in grades seven and eight and to a lesser extent other grades and subjects such as English/language arts and science. Once suitability of any new assessment has been confirmed, it is possible to use both the historical testing data and the new testing data to avoid any breaks or delays in value-added reporting.

There are at least three problems with this. First, there was NO complete administration of a new testing regime in 2015-16. It didn’t happen.

Second, EVAAS doesn’t get paid if there’s not a way to generate these “growth scores,” so it is in the company’s interest to find some justification for comparing the two very different tests.

Third, researchers who study value-added modeling are highly skeptical of the reliability of comparisons between different types of tests when it comes to generating value-added scores. I noted Lockwood and McCaffrey (2007) above. Here are some more:

John Papay (2011) did a similar study using three different reading tests, with similar results. He stated his conclusion as follows:

[T]he correlations between teacher value-added estimates derived from three separate reading tests — the state test, SRI [Scholastic Reading Inventory], and SAT [Stanford Achievement Test] — range from 0.15 to 0.58 across a wide range of model specifications. Although these correlations are moderately high, these assessments produce substantially different answers about individual teacher performance and do not rank individual teachers consistently. Even using the same test but varying the timing of the baseline and outcome measure introduces a great deal of instability to teacher rankings.

Two points worth noting here: First, different tests yield different value-added scores. Second, even using the same test but varying the timing can create instability in growth measures.

Then, there’s data from the Measures of Effective Teaching (MET) Project, which included data from Memphis. In terms of reliability when using value-added among different types of tests, here’s what MET reported:

Once more, the MET study offered corroborating evidence. The correlation between value-added scores based on two different mathematics tests given to the same students the same year was only .38. For 2 different reading tests, the correlation was .22 (the MET Project, 2010, pp. 23, 25).
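
To put those correlation figures in perspective, here is a rough, illustrative simulation of my own (not anything EVAAS or the MET researchers ran). It draws pairs of value-added estimates that correlate at roughly the MET mathematics level, about 0.4, and then checks how often the two tests agree about which teachers land in the top 20 percent. Every number in it is invented for illustration.

```python
# Illustrative only: a hypothetical sketch, not the EVAAS/TVAAS model.
# Question: if value-added estimates from two different tests correlate at
# about 0.4 (roughly the MET mathematics figure), how often do the two tests
# agree on which teachers fall in the top quintile?

import numpy as np

rng = np.random.default_rng(42)
n_teachers = 1000
correlation = 0.4  # assumed correlation between the two sets of estimates

# Draw paired value-added estimates with the assumed correlation.
cov = [[1.0, correlation], [correlation, 1.0]]
test_a, test_b = rng.multivariate_normal([0.0, 0.0], cov, size=n_teachers).T

# Flag the top 20 percent of teachers on each test.
top_a = test_a >= np.quantile(test_a, 0.8)
top_b = test_b >= np.quantile(test_b, 0.8)

# What share of test A's "top" teachers are also "top" on test B?
agreement = (top_a & top_b).sum() / top_a.sum()

print(f"Correlation between the two sets of estimates: "
      f"{np.corrcoef(test_a, test_b)[0, 1]:.2f}")
print(f"Top-quintile agreement between the two tests: {agreement:.0%}")
```

In runs like this, only around 35 to 40 percent of the teachers one test puts in the top quintile land there on the other test. That is the practical meaning of correlations in the .38 range: which test you happen to use matters nearly as much as the teaching being measured.
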
Despite the claims of EVAAS, the academic research raises significant concerns about extrapolating results from different types of tests. In short, when you move to a different test, you get different value-added results. As I noted in 2015:

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to make some correlation between past TCAP results and TNReady scores.

Or, if the state is determined to use growth scores (and wants to use them with accuracy), it should wait several years and build completely new growth models based on TNReady alone. At least three years of data would be needed in order to build such a model.

Dorsey Hopson and other Directors of Schools should be pushing back aggressively. Educators should be outraged. After all, this unreliable data will be used as a portion of their teacher evaluations this year. Schools are being rated on a 1-5 scale based on a growth model grounded in suspect methods.

How much is this apple like last year’s orange? How much will this apple ever be like last year’s orange?

If we’re determined to use value-added modeling to measure school-wide growth or district performance, we should at least be determined to do it in a way that ensures valid, reliable results.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Charters on the March?

Charter schools have not gained much ground outside of Memphis and Nashville, but that doesn’t mean potential charter operators and the Tennessee Charter School Center aren’t trying. Just a few years ago, there was quite a fight over a proposed charter school in Cheatham County. That application was ultimately denied.

Yesterday, the Clarksville Rotary Club hosted charter school lobbyist Emily Lilley to talk about charter schools and the process of creating one.

Of course, Clarksville residents might not be too eager to “think outside the box” as their current public schools appear to be performing quite well.

Where else are charter proponents planning to expand?

For more on education politics and policy in Tennessee, follow @TNEdReport


 

The Data Wars: Herb Strikes Back

Yes, the Data Wars continue. Metro Nashville Public Schools (MNPS) gained new hope recently when 33 members of Nashville’s Metro Council penned a letter supporting resistance to the Achievement School District’s request for student data.

Now, Tennessee’s Attorney General has weighed in and says the alliance of MNPS and Shelby County must comply with the ASD’s request. What happens if they don’t? Nate Rau notes in the Tennessean:

McQueen’s warning leaves open the possibility the state would dock education dollars from Metro and Shelby schools if they continue to deny her request.

It wouldn’t be the first time for Nashville, as the Haslam administration withheld $3.4 million in state funds in 2012 after the school board refused to approve the controversial Great Hearts charter school.

Withholding state BEP funds is a favorite “ultimate weapon,” used in the Great Hearts controversy and also threatened during the TNReady debacle in year one, the test that wasn’t.

During the debate that ultimately saw Nashville schools lose funds in a BEP penalty, Commissioner Kevin Huffman and the Department of Education had an ally in then-Nashville Mayor Karl Dean. Joey Garrison reported in the (now defunct) City Paper at the time:

By this point, Huffman had already facilitated a July 26 meeting to discuss Great Hearts’ next move, a gathering that took place just hours before Great Hearts’ revised application would go before the Metro board for second consideration. The meeting site: the office of Mayor Karl Dean, also a Great Hearts backer. In attendance, among others, were Huffman, Dean, Barbic, Deputy Mayor Greg Hinote, Great Hearts officials Dan Scoggin and Peter Bezanson, and Bill DeLoache, a wealthy Nashville investor and one of the state’s leading charter school proponents.

As Rau points out, the current controversy stems from a newly passed state law giving charter schools the opportunity to request student data from district schools. It seems, however, that there is some dispute over the intent of that law. Rau explains:

Slatery’s opinion also said that the student data may be used for the ASD to promote its schools to prospective students. State Rep. John Forgety, who chairs a House education committee and supported the legislation, told The Tennessean the intent was not to create a law that allowed districts to market to each other’s students.

So it seems the legislature may need to revisit the issue to clear things up.

Also unclear: Where do the current candidates for Governor stand on protecting student data vs. providing marketing information to competing districts and schools?

Stay tuned for more. Will the Shelby-MNPS alliance continue their resistance? Will Commissioner McQueen unleash the power of BEP fund withholding? Will this issue end up in court?

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Goodbye, Grace Tatter

Chalkbeat’s Grace Tatter wrote her last piece for the online education news site this week. She’s moving on to Harvard for graduate school.

I could go back and count the number of blog posts of mine that included the words “Grace Tatter reported…” or some variation of that phrase, but that would take too long.

Tatter did an incredible job providing comprehensive coverage of education policy in Tennessee. She was there and wrote about numerous key events.

Her stories were clear, concise, and accessible.

Often, a paragraph from a Grace Tatter story would inspire me to dig a little deeper, find out a little more, and write a post of my own.

So, I say goodbye to Grace Tatter. Your work will be missed. I know the newer faces at Chalkbeat will continue doing sound work, and I look forward to it.

And, one last time, I’ll cite something Grace said to make a point:

Even when stories don’t seem to be about money, they usually are. How much money is being spent on testing, teacher salaries, school discipline reform? How much should be available for wraparound services? Why do some schools have more money than others? Is there enough to go around? Tennessee leaders have steadily upped public education spending, but the state still invests less than most other states, and the disparities among districts are gaping. That’s why more than a handful of school districts are battling with the state in court. Conversations about money are inextricable from conversations about improving schools.

Once again, in typical Tatter fashion, she nails it. We can’t have the conversation about improving our schools without the conversation about investing in our schools. Money matters.

For more on education politics and policy in Tennessee, follow @TNEdReport