TC Talks Testing

Nashville education blogger TC Weber talks about testing (and a lot of other things) in his latest post.

Specifically, he talks about the release of data on TNReady tests and the comparisons being made to previous TCAP tests.

Keep in mind: We didn’t have a complete administration of TNReady in 2016, which means 2017 was the first year students actually completed TNReady. It also means the comparisons being made rest on a different test taken two years earlier. So, you have 5th grade results and “growth” on TNReady being compared to 3rd grade results on TCAP.

It’s apples and oranges. 

Here’s what TC has to say:

Let’s approach this in a different manner though. Say I annually run a 5k race and each year my timing goes up a little bit, so I’m feeling like I want something different. After year 5 I change to a 10k race. My time for that race is substantially lower. What conclusions can I draw from that difference in time? Am I really not that good a 5k runner? Is the course really that much harder than the 5k I was running? Is my training off? Am I not that good a runner?
I’d say there are very few conclusions, based on comparing the results between my 5k and my 10k time, that can be drawn. It could be that the length of the course was a bigger adjustment than anticipated. It could be that conditions were worse on the day I ran the 10k vs the 5k. It could be that one course was flatter and one was hillier. A kid could be good at bubble-in questions but not write-ins. How do we know that improvement isn’t contingent just on familiarity with the course? Or the test?
I know people will argue that we should all be training to run hills instead of flat races. But does running hills well really indicate that I am a better runner? Terrain is just another variable. My liberal arts education always explained to me that in order to get the most accurate measurement possible you need to remove as many of the variables as possible.
One year of data is not a real indication of anything other than that kids are not very good at taking this test. In order to draw any meaningful conclusions, you would have to have a set of data that you could analyze for trends. Simply taking a 10k race and comparing its results to a 5k race’s results, just because both are races, is not a valid means to draw conclusions about a runner’s abilities. The same holds true for students and testing.
If TNReady really is the amazing test we’ve all been waiting for, why not take the time to build a reliable set of data? The results from year one don’t really tell us much of anything. Because we skipped* 2016, it’s even MORE difficult to draw meaningful conclusions about the transition from TCAP to TNReady.
TC talks about these challenges and several other issues. Check it out.
*We didn’t actually skip the 2016 test. Instead, many students attempted to take the test only to face glitches with the online system. Schools were then given various new start dates for testing, only to have those dates changed and, ultimately, to see the test cancelled.
Kids were jerked around with messages about how the “important test” was coming up next week only to have it not happen. Teachers were told they’d be proctoring tests and instead had to quickly plan lessons. Our schools and students adapted, to be sure. But, there is no way to give back the instructional time lost in 2016.
Now, we have students taking THE test in 2017 only to see a slow drip of data come back. Students are told the test matters, it will count toward their grades. Teachers have growth scores based on it. Schools are assigned ratings based on it. But, getting it right doesn’t matter. Well, unless it does.
Oh, and we spend a lot of money on a testing system that produces questionable results, with data coming back too late to be of much use.
What’s next? This year, we’ll try again to administer TNReady online across the state. That didn’t work so well with the previous vendor, but maybe it will this time. Of course, online administration adds another variable to the mix. So, 2018 will be the first time many students have taken a fully online TNReady test. Assuming it works, online administration could address the challenges of getting results back in a timely fashion. But, the transition could impact student performance, once again calling into question the legitimacy of growth scores assigned to students and schools.
For more on education politics and policy in Tennessee, follow @TNEdReport


 

Muddy Waters

Laura Faith Kebede of Chalkbeat reports on the challenges in generating reliable TVAAS scores as a result of TNReady trouble last year. Her story cites a statistician from the Center for Assessment who explains the issue this way:

Damian Betebenner, a senior associate at the Center for Assessment, which regularly consults with state departments, said missing data on top of a testing transition “muddies the water” on results.

“When you look at growth over two years, so how much the student grew from third to fifth grade, then it’s probably going to be a meaningful quantity,” he said. “But to then assert that it isolates the school contribution becomes a pretty tenuous assertion… It adds another thing that’s changing underneath the scene.”

In other words, it’s difficult to get a meaningful result given the current state of testing in Tennessee. I wrote recently about this very issue and the problem with the validity of the growth scores this year.

Additionally, two years ago, I pointed out the challenges the state would face when shifting to a new test. Keep in mind, this was before all the TNReady trouble that further muddied the waters. Here’s what I said in March of 2015:

Here’s the problem: There is no statistically valid way to predict expected growth on a new test based on the historic results of TCAP. First, the new test has (supposedly) not been fully designed. Second, the test is in a different format. It’s both computer-based and it contains constructed-response questions. That is, students must write out answers and/or demonstrate their work.

Since Tennessee has never had a test like this, it’s impossible to predict growth at all. Not even with 10% confidence. Not with any confidence. It is the textbook definition of comparing apples to oranges.

The way to address this issue? Build multiple years of data in order to obtain reliable results:

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to make some correlation between past TCAP results and TNReady scores.

So, now we have two challenges: We have two different types of tests AND we have a missing year of data. Either one of these challenges creates statistical problems. The combination of the two calls for a serious reset of the state’s approach to accountability.

As I suggested yesterday, taking the time to get this right would mean not using the TNReady data for accountability for teachers, students, or schools until 2019 at the earliest. If our state is committed to TNReady, we should be committed to getting it right. We’re spending a lot of money on both TNReady and on TVAAS. If we’re going to invest in these approaches, we should also take the time to be sure that investment yields useful, reliable information.

Why does any of this matter? Because, as Kebede points out:

At the same time, TVAAS scores for struggling schools will be a significant factor to determine which improvement tracks they will be placed on under the state’s new accountability system as outlined in its plan to comply with the federal Every Student Succeeds Act. For some schools, their TVAAS score will be the difference between continuing under a local intervention model or being eligible to enter the state-run Achievement School District. The school growth scores will also determine which charter schools are eligible for a new pot of state money for facilities.

TVAAS scores also count in teacher evaluations. TNReady scores were expected to count in student grades until it became clear the quick scores wouldn’t be back in time. If all goes well with the online administration of TNReady this year, the scores will count for students.

The state says TNReady matters. The state evaluates schools based on TVAAS scores. The state teacher evaluation formula includes TVAAS scores for teachers and TNReady scores as one measure of achievement that can be selected.

In short: Getting this right matters.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Seeping Scores Sour School Board

Members of the Murfreesboro City School Board are not happy with the slow pace of results coming from the state’s new TNReady test. All seven elected board members sent a letter to Commissioner of Education Candice McQueen expressing their concerns.

The Daily News Journal reports:

“However, currently those test scores seep ever-so-slowly back to their source of origin from September until January,” the letter states. “And every year, precious time is lost. We encourage you to do everything possible to get test results — all the test results — to schools in a timely manner.

“We also encourage you to try to schedule distribution of those results at one time so that months are not consumed in interpreting, explaining and responding to those results,” the letter continued.

Department of Education spokesperson Sara Gast suggested the state wants the results back sooner, too:

“We know educators, families and community members want these results so they can make key decisions and improve, and we want them to be in their hands as soon as possible,” Gast said. “We, at the department, also desire these results sooner.”

Of course, this is the same department that continues to have trouble releasing quick score data in time for schools to use it in student report cards. In fact, this marked the fourth consecutive year there’s been a problem with end-of-year data — either timely release of that data or clear calculation of it.

Gast went further in distancing the department from blame, saying:

Local schools should go beyond TNReady tests in determining student placement and teacher evaluations, Gast said.

“All personnel decisions, including retaining, placing, and paying educators, are decisions that are made locally, and they are not required to be based on TNReady results,” Gast said. “We hope that local leaders use multiple sources of feedback in making those determinations, not just one source, but local officials have discretion on their processes for those decisions.”

Here’s the problem with that statement: This is THE test. It is the test that determines a school’s achievement and growth score. It is THE test used to calculate an (albeit invalid) TVAAS score for teachers. It is THE test used in student report cards (when the quick scores come back on time). This is THE test.

Teachers are being asked RIGHT NOW to make choices about the achievement measure they will be evaluated on for their 2017-18 TEAM evaluation. One choice: THE test. The TNReady test. But there aren’t results available to allow teachers and principals to make informed choices.

One possible solution to the concern expressed by the Murfreesboro School Board is to press the pause button. That is, get the testing right before using it for any type of accountability measure. Build some data in order to establish the validity of the growth scores. Administer the test, get the results back, and use the time to work out any challenges. Set a goal of 2019 to have full use of TNReady results.

Another solution is to move to a different set of assessments. Students in Tennessee spend a lot of time taking tests. Perhaps a set of assessments that was less time-consuming could allow for both more instructional time and more useful feedback. I’ve heard some educators suggest the ACT suite of assessments could be adapted in a way that’s relevant to Tennessee classrooms.

It will be interesting to see if more school districts challenge the Department of Education on the current testing situation.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Apples and Oranges

Here’s what Director of Schools Dorsey Hopson had to say amid reports that schools in his Shelby County district showed low growth according to recently released state test data:

Hopson acknowledged concerns over how the state compares results from “two very different tests which clearly are apples and oranges,” but he added that the district won’t use that as an excuse.

“Notwithstanding those questions, it’s the system upon which we’re evaluated on and judged,” he said.

State officials stand by TVAAS. They say drops in proficiency rates resulting from a harder test have no impact on the ability of teachers, schools and districts to earn strong TVAAS scores, since all students are experiencing the same change.

That’s all well and good, except when the system upon which you are evaluated is seriously flawed, it seems there’s an obligation to speak out and fight back.

Two years ago, ahead of what should have been the first year of TNReady, I wrote about the challenges of creating valid TVAAS scores while transitioning to a new test. TNReady was not just a different test, it was (is) a different type of test than the previous TCAP test. For example, it included constructed-response questions instead of simply multiple-choice bubble-in questions.

Here’s what I wrote:

Here’s the problem: There is no statistically valid way to predict expected growth on a new test based on the historic results of TCAP. First, the new test has (supposedly) not been fully designed. Second, the test is in a different format. It’s both computer-based and it contains constructed-response questions. That is, students must write out answers and/or demonstrate their work.

Since Tennessee has never had a test like this, it’s impossible to predict growth at all. Not even with 10% confidence. Not with any confidence. It is the textbook definition of comparing apples to oranges.

Here’s a statement from the academic article I cited to support this claim, Lockwood and McCaffrey (2007) in the Journal of Educational Measurement:

We find that the variation in estimated effects resulting from the different mathematics achievement measures is large relative to variation resulting from choices about model specification, and that the variation within teachers across achievement measures is larger than the variation across teachers.

You get different value-added results depending on the type of test you use. That is, you can’t just say this is a new test but we’ll compare peer groups from the old test and see what happens. Plus, TNReady presents the added challenge of not having been fully administered last year, so you’re now looking at data from two years ago and extrapolating to this year’s results.
Of course, the company paid millions to crunch the TVAAS numbers says that this transition presents no problem at all. Here’s what their technical document has to say about the matter:

In 2015-16, Tennessee implemented new End-of-Course (EOC) assessments in math and English/language arts. Redesigned assessments in Math and English/language arts were also implemented in grades 3-8 during the 2016-17 school year. Changes in testing regimes occur at regular intervals within any state, and these changes need not disrupt the continuity and use of value-added reporting by educators and policymakers. Based on twenty years of experience with providing value-added and growth reporting to Tennessee educators, EVAAS has developed several ways to accommodate changes in testing regimes.

Prior to any value-added analyses with new tests, EVAAS verifies that the test’s scaling properties are suitable for such reporting. In addition to the criteria listed above, EVAAS verifies that the new test is related to the old test to ensure that the comparison from one year to the next is statistically reliable. Perfect correlation is not required, but there should be a strong relationship between the new test and old test. For example, a new Algebra I exam should be correlated to previous math scores in grades seven and eight and to a lesser extent other grades and subjects such as English/language arts and science. Once suitability of any new assessment has been confirmed, it is possible to use both the historical testing data and the new testing data to avoid any breaks or delays in value-added reporting.
A few problems with this. First, there was NO complete administration of a new testing regime in 2015-16. It didn’t happen.

Second, EVAAS doesn’t get paid if there’s not a way to generate these “growth scores,” so it is in their interest to find some justification for comparing the two very different tests.

Third, researchers who study value-added modeling are highly skeptical of the reliability of comparisons between different types of tests when it comes to generating value-added scores. I noted Lockwood and McCaffrey (2007) above. Here are some more:

John Papay (2011) did a similar study using three different reading tests, with similar results. He stated his conclusion as follows: [T]he correlations between teacher value-added estimates derived from three separate reading tests — the state test, SRI [Scholastic Reading Inventory], and SAT [Stanford Achievement Test] — range from 0.15 to 0.58 across a wide range of model specifications. Although these correlations are moderately high, these assessments produce substantially different answers about individual teacher performance and do not rank individual teachers consistently. Even using the same test but varying the timing of the baseline and outcome measure introduces a great deal of instability to teacher rankings.

Two points worth noting here: First, different tests yield different value-added scores. Second, even using the same test but varying the timing can create instability in growth measures.

Then, there’s data from the Measures of Effective Teaching (MET) Project, which included data from Memphis. In terms of reliability when using value-added among different types of tests, here’s what MET reported:

Once more, the MET study offered corroborating evidence. The correlation between value-added scores based on two different mathematics tests given to the same students the same year was only .38. For 2 different reading tests, the correlation was .22 (the MET Project, 2010, pp. 23, 25).
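
To get a feel for what correlations in the .2 to .6 range mean in practice, here’s a minimal, purely illustrative simulation in Python. It uses synthetic data and made-up noise levels, not TVAAS or EVAAS methodology: each teacher gets a “true” effect, two different tests each add their own measurement error, and we check how well the two resulting sets of value-added estimates agree.

```python
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 1000

# Hypothetical "true" teacher effect, in student test-score standard deviations.
true_effect = rng.normal(0.0, 0.15, n_teachers)

# Value-added estimates from two different tests: the same underlying effect,
# but each test contributes its own independent measurement error.
est_test_a = true_effect + rng.normal(0.0, 0.20, n_teachers)
est_test_b = true_effect + rng.normal(0.0, 0.20, n_teachers)

# Correlation between the two sets of estimates.
r = np.corrcoef(est_test_a, est_test_b)[0, 1]

# How often do the two tests agree that a teacher is in the top 20 percent?
top_a = est_test_a >= np.quantile(est_test_a, 0.8)
top_b = est_test_b >= np.quantile(est_test_b, 0.8)
agreement = (top_a & top_b).sum() / top_a.sum()

print(f"correlation between the two tests' value-added estimates: {r:.2f}")
print(f"share of test A's top-20% teachers also in test B's top 20%: {agreement:.0%}")
```

With these assumed noise levels, the two tests’ estimates correlate at roughly .35, in the same ballpark as the MET figures above, yet well under half of the teachers one test puts in the top 20 percent stay there on the other test. The specific numbers are invented; the point is simply that a moderate correlation leaves plenty of room for rankings to flip.
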
Despite the claims of EVAAS, the academic research raises significant concerns about extrapolating results from different types of tests. In short, when you move to a different test, you get different value-added results. As I noted in 2015:

If you measure different skills, you get different results. That decreases (or eliminates) the reliability of those results. TNReady is measuring different skills in a different format than TCAP. It’s BOTH a different type of test AND a test on different standards. Any value-added comparison between the two tests is statistically suspect, at best. In the first year, such a comparison is invalid and unreliable. As more years of data become available, it may be possible to make some correlation between past TCAP results and TNReady scores.

Or, if the state is determined to use growth scores (and wants to use them with accuracy), it should wait several years and build completely new growth models based on TNReady alone. At least three years of data would be needed in order to build such a model.

Dorsey Hopson and other Directors of Schools should be pushing back aggressively. Educators should be outraged. After all, this unreliable data will be used as a portion of their teacher evaluations this year. Schools are being rated on a 1-5 scale based on a growth model grounded in suspect methods.

How much is this apple like last year’s orange? How much will this apple ever be like last year’s orange?

If we’re determined to use value-added modeling to measure school-wide growth or district performance, we should at least be determined to do it in a way that ensures valid, reliable results.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

The Data Wars: Herb Strikes Back

Yes, the Data Wars continue. Metro Nashville Public Schools (MNPS) gained new hope recently when 33 members of Nashville’s Metro Council penned a letter supporting resistance to the Achievement School District’s request for student data.

Now, Tennessee’s Attorney General has weighed in and says the alliance of MNPS and Shelby County must comply with the ASD’s request. What happens if they don’t? Nate Rau notes in the Tennessean:

McQueen’s warning leaves open the possibility the state would dock education dollars from Metro and Shelby schools if they continue to deny her request.

It wouldn’t be the first time for Nashville, as the Haslam administration withheld $3.4 million in state funds in 2012 after the school board refused to approve the controversial Great Hearts charter school.

Withholding state BEP funds is a favorite “ultimate weapon,” used in the Great Hearts controversy and also threatened during the TNReady debacle in year one of that test that wasn’t.

During the debate that ultimately saw Nashville schools lose funds in a BEP penalty, Commissioner Kevin Huffman and the Department of Education had an ally in then-Nashville Mayor Karl Dean. Joey Garrison reported in the (now defunct) City Paper at the time:

By this point, Huffman had already facilitated a July 26 meeting to discuss Great Hearts’ next move, a gathering that took place just hours before Great Hearts’ revised application would go before the Metro board for second consideration. The meeting site: the office of Mayor Karl Dean, also a Great Hearts backer. In attendance, among others, were Huffman, Dean, Barbic, Deputy Mayor Greg Hinote, Great Hearts officials Dan Scoggin and Peter Bezanson, and Bill DeLoache, a wealthy Nashville investor and one of the state’s leading charter school proponents.

As Rau points out, the current controversy stems from a newly-passed state law giving charter schools the opportunity to request student data from district schools. It seems, however, that there is some dispute over the intent of that law. Rau explains:

Slatery’s opinion also said that the student data may be used for the ASD to promote its schools to prospective students. State Rep. John Forgety, who chairs a House education committee and supported the legislation, told The Tennessean the intent was not to create a law that allowed districts to market to each other’s students.

So it seems the legislature may need to revisit the issue to clear things up.

Also unclear: Where do the current candidates for Governor stand on protecting student data vs. providing marketing information to competing districts and schools?

Stay tuned for more. Will the Shelby-MNPS alliance continue their resistance? Will Commissioner McQueen unleash the power of BEP fund withholding? Will this issue end up in court?

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Metro Council Members Back MNPS in Data Wars

I’ve written before about the escalating Data Wars between the state’s Achievement School District (ASD) and the two largest school districts – Shelby County and MNPS.

Now, Nashville’s Metro Council is weighing in, at least in the form of a letter signed by 33 Council Members to MNPS Board Chair Anna Shepard.

The Tennessean notes:

The 33 Nashville Metro Council members signed a letter, dated Tuesday, that commends the district for “taking steps to protect the personal information of students and families.”

“We understand the state has taken a confrontational position on this issue, seeking to compel Nashville and Memphis schools to continue sharing personal information in opposition to federal [law] and without state statute supporting their position,” the letter reads. “However, as elected representatives of the same constituents whose privacy rights are being violated, we encourage you to continue to advocate for our families by the just and proper means that are available to you.”

As I’ve noted before, Commissioner McQueen has asked for an Attorney General’s opinion on the various interpretations of a new state law that some suggest mandates the data-sharing the ASD seeks.

What happens if MNPS doesn’t share the data? There’s always the possibility the state will punish them by withholding some BEP funds.

That happened back in 2012 over the Great Hearts controversy. Those who follow MNPS closely will recall that then-Mayor Karl Dean was a prime backer of Great Hearts, which put him at odds with the elected School Board at that time.

As Joey Garrison, writing for the City Paper at the time, reported:

Emails show DeLoache, long known as an unofficial education adviser to Dean, served as a resource for Huffman, as well. After the Metro board denied Great Hearts in May, DeLoache told Huffman he hoped its rejection might “provide an opportunity to highlight to the Governor” the need to push for a statewide charter school authorizer during the 2013 legislative session. (A statewide charter authorizer would effectively supersede and therefore negate authority of local charter authorizers such as Metro.)

That’s Bill DeLoache, the wealthy Nashville investor and charter proponent who has spent heavily in the past to help elect pro-charter candidates to the MNPS School Board.

Will MNPS and Shelby County Schools face fines if they continue on their current path of protecting student data from the ASD? Will more Metro leaders stand up and support the School Board?

The Data Wars continue.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Next?

Tennessee’s Achievement School District (ASD) is again looking for a Superintendent as it was announced today that current Superintendent Malika Anderson is on her way out.

Chalkbeat has the story:

Malika Anderson, who has sought to steer Tennessee’s school turnaround district to stability after its contentious early work in Memphis and Nashville, is stepping down as its second superintendent at the end of this month.

Education Commissioner Candice McQueen had this to say about the move:

“This transition in no way disrupts our work,” McQueen said in a press release. “We are taking what we have learned about school improvement over the past five years and using that knowledge to maximize students’ success by putting in place a strong set of evidence-based options that will drive improvements in students’ performance.”

Anderson is the second Superintendent in the ASD’s short history, replacing Chris Barbic. Barbic noted on his departure:

In his email early Friday, Barbic offered a dim prognosis on that pioneering approach. “As a charter school founder, I did my fair share of chest pounding over great results,” he wrote. “I’ve learned that getting these same results in a zoned neighborhood school environment is much harder.”

The ASD has been plagued with both lackluster results and challenges connecting with the communities it serves during its brief but tumultuous existence.

According to the Department of Education’s release, a search will begin immediately for Anderson’s replacement. In the meantime, Deputy Commissioner of Education Kathleen Airhart will serve as Interim Superintendent. Before coming to the Department of Education, Airhart was the Director of Schools in Putnam County.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Herb Backs Down

Earlier this year, I featured an excerpt from a piece written by Mike Stein about Tennessee’s Attorney General, Herb Slatery, and his support for ending the DACA (Deferred Action for Childhood Arrivals) program.

Stein notes that while Slatery joined with Attorneys General in several other states in sending a letter to U.S. Attorney General Jeff Sessions calling for an end to DACA (and threatening a lawsuit), the libertarian Cato Institute actually supports maintaining DACA for its economic benefits.

Now that some reports suggest President Trump may be taking action to end DACA, let’s look at who Herb Slatery would have deported.

Chalkbeat had this report of a Nashville student-turned-educator who is also a beneficiary of the DACA program:

Ruiz knows what it’s like to live with uncertainty about his future.

His mother brought him to the United States to give him a better shot at graduating from high school and going to college, which she hadn’t been able to do in Mexico.

He attended public schools in Nashville, where he mastered English by the third grade.

When DACA was announced during his freshman year at Trevecca, Ruiz applied on the very first day. “DACA was an avenue for me to work hard and do what I wanted with that,” he said. “It made me feel in control and empowered.”

After graduating with a degree in history, Ruiz applied to Teach For America and was assigned to an elementary school in Denver. Realizing that his passion is working with high school students, he moved this year to STRIVE Prep Excel, a charter high school where he teaches Spanish.

For background, here’s how Chalkbeat describes DACA:

The policy gives protections, but not citizenship, for two years at a time to undocumented immigrants who came here as children.

Carlos Ruiz was brought to Nashville at age 6. He didn’t ask to come here. He didn’t deliberately evade the nation’s laws. He attended public schools in Nashville. He graduated from a college in Nashville. He decided to become a teacher.

Slatery sent a letter TODAY to Tennessee’s U.S. Senators announcing he’s pulling Tennessee out of litigation over DACA. Specifically, Slatery notes:

There is a human element to this, however, that is not lost on me and should not be ignored. Many of the DACA recipients, some of whose records I reviewed, have outstanding accomplishments and laudable ambitions, which if achieved, will be of great benefit and service to our country. They have an appreciation for the opportunities afforded them by our country.

The sad reality is that our Congress hasn’t taken a serious look at immigration reform that would address situations like Ruiz’s. Until it does, DACA provides protection for the children of immigrants, children like Carlos Ruiz, who has decided to take the opportunity he was given and serve others.

Slatery’s letter calls for a legislative solution, seemingly in direct opposition to the Trump Administration’s position.

*An earlier version of this story did not include details of Slatery’s letter released today.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Trade Offer

I reported last week on the Data Wars brewing between the state’s two largest school districts and the Tennessee Department of Education.

Now, as both Nashville and Memphis dig in, MNPS is offering a trade of sorts.

Chalkbeat reports on a letter sent by MNPS Board Chair Anna Shepard to Education Commissioner Candice McQueen.

In her letter, Shepard proposes cooperation between the state’s Achievement School District (ASD) and MNPS based on several conditions.

Specifically:

I would personally be willing to consider a coordinated initiative under which MNPS, using its existing communications infrastructure, would inform families about ASD choice options — if they choose to “opt in” to such communications. I cannot speak for my board colleagues until such time as we have had the opportunity to deliberate on this concept.

Shepard’s conditions:

  1. A moratorium on ASD expansion
  2. State subsidies for schools that lose students to the ASD
  3. State engagement in discussions around a new “fiscal impact” component of the BEP to address the impact charter schools have on local school districts

Regarding that fiscal impact, an audit of MNPS published in 2015 noted this:

“The key question for determining fiscal impacts is whether enrollment reductions allow a district to achieve expenditure reductions commensurate with revenue reductions. Fixed costs are incurred regardless of whether students attend traditional or charter schools. The problem is that some fixed costs, such as building maintenance, computer network infrastructure, and health services do not vary based on enrollment. Therefore, teachers and their salaries are a key cost driver tied to student enrollment … However, it is not always possible to reduce teacher costs proportionate to losses in revenue. For these costs to be reduced significantly, the school would need to close altogether.”
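
To make the audit’s point concrete, here’s a back-of-the-envelope sketch with entirely hypothetical numbers; the per-pupil figure and the fixed/variable split below are assumptions for illustration, not MNPS data.

```python
# Entirely hypothetical numbers, for illustration only (not MNPS figures).
per_pupil_revenue = 10_000        # dollars that follow each student who leaves
students_leaving = 100            # students enrolling in an ASD or charter school

revenue_lost = per_pupil_revenue * students_leaving

# Assume only about 60% of per-pupil spending is truly variable (e.g., teacher
# positions that can be consolidated); the rest is fixed regardless of enrollment
# (building maintenance, network infrastructure, health services, and so on).
variable_share = 0.60
costs_actually_cut = revenue_lost * variable_share
stranded_fixed_cost = revenue_lost - costs_actually_cut

print(f"Revenue lost:            ${revenue_lost:,}")
print(f"Costs the district cuts: ${costs_actually_cut:,.0f}")
print(f"Stranded fixed cost:     ${stranded_fixed_cost:,.0f}")
```

Under these assumptions, a district losing 100 students gives up $1 million in revenue but can realistically trim only about $600,000 in costs, leaving roughly $400,000 in stranded fixed costs, which is the gap a “fiscal impact” component of the BEP would be meant to address.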

As for the ASD moratorium, it seems that the turnaround district continues to produce underwhelming results. Combine this with a track record of poor communication and you begin to understand why districts aren’t eager for the ASD to open more schools in their backyards.

For her part, Commissioner McQueen is seeking an Attorney General’s opinion on the MNPS and Shelby County interpretation of the data-sharing law passed in the 2017 legislative session.

It seems unlikely that McQueen would agree to the conditions set forth by Shepard. It seems possible both MNPS and Shelby County will face the threat of fines should they continue resisting.

Stay tuned as the Data Wars heat up.

For more on education politics and policy in Tennessee, follow @TNEdReport


 

Holden: Trust Teachers

Former teacher and current education blogger Mary Holden offers her thoughts on how to address the teacher shortage. Of course, even if there wasn’t a teacher shortage, this is the right way to treat teachers. Here’s some of what she has to say:

Trusting teachers to do their job – BECAUSE THEY ARE TRAINED PROFESSIONALS – should be commonplace practice in every school district. But it’s not.

An article from last year in The Atlantic discussed what happened when some Finnish teachers taught here in the U.S. Guess what they noticed? The lack of autonomy. And that’s a very bad thing: “According to a National Center for Education Statistics (NCES) report, teacher autonomy is positively associated with teachers’ job satisfaction and retention. And while most U.S. public-school teachers report a moderate amount of control in the classroom, many say they have little autonomy.”

Let me repeat that in a different way: We have a teacher shortage. Want to retain teachers and attract new ones? Then trust them to do their jobs. Ask them what they need, and then give them the support they need. (Oh, and paying them more would help, too!)

Holden also offers this suggestion:

I wish more districts would recognize what teachers have been saying for years – stop focusing on the data and the test scores and all the punitive measures that have been in place since the dawn of the accountability movement, and instead, focus on what matters: People. Relationships. Community. Developing the joy of learning. And trust our teachers to teach the subjects for which they are trained to teach.

This (and the rest of her article) offers sound advice on how to support and nurture teachers. I hear people say all the time that decisions in education should be made based on what’s good for kids instead of what’s good for adults — as if the two are mutually exclusive. Guess what? Supported teachers who are given autonomy are happy teachers. Which means they are better teachers. Which is GREAT for kids.

For more on education politics and policy in Tennessee, follow @TNEdReport